I am trying to use the Math.NET Numerics implementation of the FFT algorithm, but I must be doing something wrong because the output is always `unit`.
The following is the setup:
open System
open MathNet.Numerics
open MathNet.Numerics.Statistics
open MathNet.Numerics.IntegralTransforms

let rnd = new Random()
let rnddata = Array.init 100 (fun u -> rnd.NextDouble())
let x = rnddata |> Array.Parallel.map (fun d -> MathNet.Numerics.complex.Create(d, 0.0))
then when I run this:
let tt = MathNet.Numerics.IntegralTransforms.Fourier.BluesteinForward(x, FourierOptions.Default)
I receive this empty output:
val tt : unit = ()
Any ideas why?
I think the Fourier.BluesteinForward method stores the results in the input array (by overwriting whatever was there originally).
If you do not need the input after running the transform, you can just read the results from x (this saves some memory copying, which is why Math.NET does it this way by default). Otherwise, you can clone the array and wrap the call in more functional-style code like this:
let bluesteinForward input =
    let output = Array.copy input
    MathNet.Numerics.IntegralTransforms.Fourier.BluesteinForward(output, FourierOptions.Default)
    output
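The same copy-then-wrap pattern works for any in-place .NET API. Here is a minimal self-contained sketch of the idea (using `Array.sortInPlace` instead of Math.NET so that it runs standalone; the function name is mine, for illustration):

```fsharp
// Wrapping an in-place API functionally: copy the input first, mutate
// the copy, and return it, leaving the caller's array untouched.
let sortedCopy (input: int[]) =
    let output = Array.copy input
    Array.sortInPlace output
    output

let xs = [| 3; 1; 2 |]
let ys = sortedCopy xs
// ys is [| 1; 2; 3 |]; xs is still [| 3; 1; 2 |]
```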
I am doing a direct combination of 2 functions using a lambda function. Because indentation is a problem, I am copying and pasting directly from Visual Studio Code.
Version 1 of the function is:
module myrun =
    ...
    let mywrite (price: string, volume: string, time: string) =
        use cmd99 = new SqlCommandProvider<"INSERT INTO swirepacific(ric, price, volume, inst, ttime, views, strike, etime) VALUES (@st1, @st2, @st3, @st4, @st5, @st6, @st7, @st8)", connStr>(connStr)
        cmd99.Execute(st1=myric, st2=float price, st3=int volume, st4=inst, st5=System.DateTime.Parse time, st6=myview, st7=strike, st8=System.DateTime.Parse etime)
        |> ignore

    let toDB (data: seq<IData>) =
        data
        |> Seq.map (fun (r: IData) ->
            (r.AllFields.["VALUE"].ToString(),
             r.AllFields.["VOLUME"].ToString(),
             r.Timestamp.Value.ToString("yyyy-MM-dd hh:mm:ss.fff")))
        |> Seq.iter mywrite
Version 1 is a write to database code. It uses SqlCommandProvider to parse a SQL statement directly.
The problem with version 1 is that while it accepts price, volume, and time from the previous functions, myric, inst, strike, and etime are global variables.
With version 2, I want to skip global variables completely and take all inputs as parameters. That should be better practice, and it is pretty much required if I want to run the functions in parallel.
Version 2 of the function is:
module myrun =
    ...
    let toDB2 (data: seq<IData>, aric: Aric, ainst: Ainst, astrike: Astrike, aetime: Aetime) =
        data
        |> Seq.map (fun (r: IData) ->
            (r.AllFields.["VALUE"].ToString(),
             r.AllFields.["VOLUME"].ToString(),
             r.Timestamp.Value.ToString("yyyy-MM-dd hh:mm:ss.fff")))
        |> Seq.iter (fun (price, volume, time) ->
            (use cmd99 = new SqlCommandProvider<"INSERT INTO swirepacific(ric, price, volume, inst, ttime, views, strike, etime) VALUES (@st1, @st2, @st3, @st4, @st5, @st6, @st7, @st8)", connStr>(connStr)
             cmd99.Execute(st1=aric, st2=float price, st3=int volume, st4=ainst, st5=System.DateTime.Parse time, st6=myview, st7=astrike, st8=System.DateTime.Parse aetime)
             |> ignore)
There is a red squiggle under the let in "let toDB2". The error is:
Incomplete value or function definition. If this is an expression, the body of the expression must be indented to the same column as the 'let' keyword
I suspect it has something to do with the indentation of the lambda function.
Can you please help me? Thanks!
I think you're just missing a single closing parenthesis. I added one at the end to close your second lambda expression and the compiler is happy:
module myrun =
    let toDB2 (data: seq<IData>, aric: Aric, ainst: Ainst, astrike: Astrike, aetime: Aetime) =
        data
        |> Seq.map (fun (r: IData) ->
            (r.AllFields.["VALUE"].ToString(),
             r.AllFields.["VOLUME"].ToString(),
             r.Timestamp.Value.ToString("yyyy-MM-dd hh:mm:ss.fff")))
        |> Seq.iter (fun (price, volume, time) ->
            (use cmd99 = new SqlCommandProvider<"INSERT INTO swirepacific(ric, price, volume, inst, ttime, views, strike, etime) VALUES (@st1, @st2, @st3, @st4, @st5, @st6, @st7, @st8)", connStr>(connStr)
             cmd99.Execute(st1=aric, st2=float price, st3=int volume, st4=ainst, st5=System.DateTime.Parse time, st6=myview, st7=astrike, st8=System.DateTime.Parse aetime)
             |> ignore))
The answer from @Aron shows how to correct your code without changing its structure. However, I think there is not much benefit in using functions like map in your particular case.
You are essentially writing an imperative bit of code that iterates over some data and writes it to a database. This performs I/O and so it is inherently imperative. In such cases, I often find it easier to write F# in a more imperative style; a nice side effect of doing this is that anyone reading the code will immediately see that it is, in fact, imperative and will not confuse it with functional data processing.
So, my version would actually remove all lambdas from this snippet:
module myrun =
    [<Literal>]
    let InsertQuery =
        "INSERT INTO swirepacific(ric, price, volume, inst, ttime, views, strike, etime)
         VALUES (@st1, @st2, @st3, @st4, @st5, @st6, @st7, @st8)"

    let toDB2 (data: seq<IData>, aric: Aric, ainst: Ainst, astrike: Astrike, aetime: Aetime) =
        for r in data do
            let price = r.AllFields.["VALUE"].ToString()
            let volume = r.AllFields.["VOLUME"].ToString()
            let time = r.Timestamp.Value.ToString("yyyy-MM-dd hh:mm:ss.fff")
            use cmd99 = new SqlCommandProvider<InsertQuery, connStr>(connStr)
            cmd99.Execute
                ( st1=aric, st2=float price, st3=int volume, st4=ainst,
                  st5=System.DateTime.Parse time, st6=myview, st7=astrike,
                  st8=System.DateTime.Parse aetime) |> ignore
I need to write a Deedle FrameData (including "ID" column and additional "Delta" column with blank entries) to CSV. While I can generate a 2D array of the FrameData, I am unable to write it correctly to a CSV file.
module SOQN =

    open System
    open System.IO
    open Deedle
    open FSharp.Data

    // TestInput.csv
    // ID,Alpha,Beta,Gamma
    // 1,no,1,hi
    // ...

    // TestOutput.csv
    // ID,Alpha,Beta,Gamma,Delta
    // 1,"no","1","hi",""
    // ...

    let inputCsv = @"D:\TestInput.csv"
    let outputCsv = @"D:\TestOutput.csv"

    let (df: Frame<obj, string>) =
        Frame.ReadCsv(inputCsv, hasHeaders=true, inferTypes=false, separators=",", indexCol="ID")

    // See http://www.fssnip.net/sj/title/Insert-Deedle-frame-into-Excel
    let data4Frame (frame: Frame<_,_>) = frame.GetFrameData()

    let boxOptional obj =
        match obj with
        | Deedle.OptionalValue.Present obj -> box (obj.ToString())
        | _ -> box ""

    let frameToArray (data: FrameData) =
        let transpose (array: 'T[,]) =
            Array2D.init (array.GetLength(1)) (array.GetLength(0)) (fun i j -> array.[j, i])
        data.Columns
        |> Seq.map (fun (typ, vctr) -> vctr.ObjectSequence |> Seq.map boxOptional |> Array.ofSeq)
        |> array2D
        |> transpose

    [<EntryPoint>]
    let main argv =
        printfn ""
        printfn "Output Deedle FrameData To CSV"
        printfn ""
        let dff = data4Frame df
        let rzlt = frameToArray dff
        printfn "rzlt: %A" rzlt
        use writer = new StreamWriter(outputCsv)
        writer.WriteLine("ID,Alpha,Beta,Gamma,Delta")
        // writer.WriteLine rzlt
        0
What am I missing?
I would not use FrameData to do this - frame data is mostly internal and while there are some legitimate uses for it, I don't think it makes sense for this task.
If you simply want to add an empty Delta column to your input CSV, then you can do this:
let df : Frame<int, _> = Frame.ReadCsv("C:/temp/test-input.csv", indexCol="ID")
df.AddColumn("Delta", [])
df.SaveCsv("C:/temp/test-output.csv", ["ID"])
This does almost everything you need: it writes the ID column and the extra Delta column.
The only caveat is that it does not add extra quotes around the data. The CSV specification does not require quotes unless you need to escape a comma in a field, and I don't think there is an easy way to get Deedle to add them anyway.
So you would then have to write the CSV file yourself. The following shows how, but note that it does not correctly escape quotes and commas (which is why you should prefer SaveCsv even though it omits the quotes when they are not needed):
use writer = new StreamWriter("C:/temp/test-output.csv")
writer.WriteLine("ID,Alpha,Beta,Gamma,Delta")
for key, row in Series.observations df.Rows do
    writer.Write(key)
    for value in Series.valuesAll row do
        writer.Write(",")
        writer.Write(sprintf "\"%O\"" (if value.IsSome then value.Value else box ""))
    writer.WriteLine()
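To fix the escaping limitation, you could add a minimal RFC 4180-style field escaper. This is a sketch (the helper name is mine, not part of Deedle), quoting a field only when it contains a comma, quote, or newline, and doubling any embedded quotes:

```fsharp
// Quote a CSV field only when needed; double embedded quotes (RFC 4180).
let escapeCsvField (s: string) =
    if s.Contains(",") || s.Contains("\"") || s.Contains("\n") then
        "\"" + s.Replace("\"", "\"\"") + "\""
    else
        s
```

`writer.Write(escapeCsvField (string value))` would then replace the unconditional quoting in the loop.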
You can take the CSV-writing example from the source of the library (it uses FrameData there).
After adding wrapper:
type FrameData with
    member frameData.SaveCsv(path: string, ?includeRowKeys, ?keyNames, ?separator, ?culture) =
        use writer = new StreamWriter(path)
        writeCsv writer (Some path) separator culture includeRowKeys keyNames frameData
you could write like this:
dff.SaveCsv outputCsv
I'm writing a small console application in F#.
[<EntryPoint>]
let main argv =
    high_lvl_funcs.print_opt
    let opt = Console.ReadLine()
    match opt with
    | "0" -> printfn "%A" (high_lvl_funcs.calculate_NDL)
    | "1" -> printfn ("not implemented yet")
    | _ -> printfn "%A is not an option" opt
From module high_lvl_funcs:
let print_opt =
    let options = [|"NDL"; "Deco"|]
    printfn "Enter the number of the option you want"
    Array.iteri (fun i x -> printfn "%A: %A" i x) options

let calculate_NDL =
    printfn ("enter Depth in m")
    let depth = lfuncs.m_to_absolute(float (Console.ReadLine()))
    printfn ("enter amount of N2 in gas (assuming o2 is the rest)")
    let fn2 = float (Console.ReadLine())
    let table = lfuncs.read_table
    let tissue = lfuncs.create_initialise_Tissues ATM WATERVAPOUR
    lfuncs.calc_NDL depth fn2 table lfuncs.loading_constantpressure tissue 0.0
lfuncs.calc_NDL returns a float.
This produces:
Enter the number of the option you want
0: "NDL"
1: "Deco"
enter Depth in m
which means it prints what it's supposed to, then jumps straight to high_lvl_funcs.calculate_NDL.
I wanted it to produce
Enter the number of the option you want
0: "NDL"
1: "Deco"
then, let's assume 0 is entered, and only then calculate high_lvl_funcs.calculate_NDL.
After some thinking and searching, I assume this is because F# wants to evaluate all values before it runs the rest. Then I thought I would need to declare a variable without assigning it, but people seem to agree that this is bad practice in functional programming. From another question: Declaring a variable without assigning.
So my question is: is it possible to rewrite the code so that I get the flow I want, while avoiding declaring variables without assigning them?
You can fix this by making calculate_NDL a function of no arguments, instead of a float value whose body is evaluated once at module initialization:
let calculate_NDL () =
Then call it as a function in your match like this:
match opt with
| "0" -> printfn "%A" (high_lvl_funcs.calculate_NDL())
However I'd suggest refactoring this code so that calculate_NDL takes any necessary inputs as arguments rather than reading them from the console i.e. read the inputs from the console separately and pass them to calculate_NDL.
let calculate_NDL depth fn2 =
    let absDepth = lfuncs.m_to_absolute(depth)
    let table = lfuncs.read_table
    let tissue = lfuncs.create_initialise_Tissues ATM WATERVAPOUR
    lfuncs.calc_NDL absDepth fn2 table lfuncs.loading_constantpressure tissue 0.0
It's generally a good idea to write as much code as possible as pure functions that don't rely on I/O (like reading from stdin).
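The difference the `()` makes can be seen in a tiny self-contained sketch (the names here are mine, for illustration only):

```fsharp
let mutable sideEffects = 0

// A parameterless let binding is a *value*: its body runs exactly once,
// when the binding is initialized, and the result is cached.
let asValue =
    sideEffects <- sideEffects + 1
    sideEffects

// Adding () makes it a *function*: its body runs on every call.
let asFunction () =
    sideEffects <- sideEffects + 1
    sideEffects
```

Reading `asValue` repeatedly never re-runs its body, while each `asFunction()` call does; this is why the original calculate_NDL started prompting for input as soon as the module was loaded rather than when "0" was entered.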
The Stats.expandingXXXX functions are pretty fast. However, there is no public API to do an expanding-window apply. The following solution I created is really slow on large datasets (around 100k rows). Any suggestions are appreciated.
let ExpWindowApply f minSize data =
    let keys = data.Keys
    let startKey = data.FirstKey()
    let values =
        keys
        |> Seq.map (fun k ->
            let ds = data.Between(startKey, k)
            match ds with
            | _ when ds.ValueCount >= minSize -> f ds.Values
            | _ -> Double.NaN)
    let result = Series(keys, values)
    result
I understand the Stats.expandingXXX functions are actually special cases, where the function being applied can be calculated iteratively from the previous loop's state, and not every function can take advantage of state from a previous calculation. Is there a better way than Series.Between to create a window of data?
Update
For those who are also interested in this issue: the answer provides an alternative implementation and insight into the rarely documented series vector and index operations, but it doesn't improve performance.
The expanding functions in Deedle are fast because they are using an efficient online algorithm that makes it possible to calculate the statistics on the fly with just one pass - rather than actually building the intermediate series for the sub-ranges.
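For intuition, here is a sketch of the one-pass idea (not Deedle's actual implementation): an expanding mean only needs to carry a running count and total, so each new point is folded in O(1) and no sub-series is ever materialized.

```fsharp
// Expanding mean in a single pass: Seq.scan threads the (count, total)
// state through the sequence, then each state is turned into a mean.
let expandingMean (values: seq<float>) =
    values
    |> Seq.scan (fun (n, total) v -> (n + 1, total + v)) (0, 0.0)
    |> Seq.skip 1  // drop the (0, 0.0) seed state
    |> Seq.map (fun (n, total) -> total / float n)

// expandingMean [1.0; 2.0; 3.0] yields the prefix means 1.0; 1.5; 2.0
```

A function like the `f` passed to ExpWindowApply above cannot in general be decomposed into such a state update, which is exactly why the generic version has to rebuild each window.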
There is a built-in function aggregate that lets you do something like this, though it works in the reversed way. For example, if you want to sum all elements starting from the current one to the end, you can write:
let s = series [ for i in 1 .. 10 -> i, float i ]

s |> Series.aggregateInto
      (Aggregation.WindowWhile(fun _ _ -> true))
      (fun seg -> seg.Data.FirstKey())
      (fun seg -> OptionalValue(Stats.sum seg.Data))
If you want to do the same thing using the underlying representation, you can directly use the addressing scheme that Deedle uses to link the keys (in the index) with values (in the data vector). This is an ugly mutable sample, but you can encapsulate it into something nicer:
[ let firstAddr = s.Index.Locate(s.FirstKey())
  for k in s.Index.KeySequence ->
    let lastAddr = s.Index.Locate(k)
    seq {
      let a = ref firstAddr
      while !a <> lastAddr do
        yield s.Vector.GetValue(!a).Value
        a := s.Index.AddressOperations.AdjustBy(!a, +1L) }
    |> Seq.sum ]
I'm looking for a clean set of ways to manage Test Specific Equality in F# unit tests. 90% of the time, the standard Structural Equality fits the bill and I can leverage it with unquote to express the relation between my result and my expected.
TL;DR: I can't find a clean way to have a custom equality function for one or two properties in a value which is 90% well served by Structural Equality. Does F# have a way to match an arbitrary record with custom equality for just one or two of its fields?
Example of a general technique that works for me
When verifying a function that performs a 1:1 mapping from one datatype to another, I'll often extract matching tuples from both sides and compare the input and output sets. For example, I have an operator:
let (====) x y = (x |> Set.ofSeq) = (y |> Set.ofSeq)
So I can do:
let inputs = ["KeyA",DateTime.Today; "KeyB",DateTime.Today.AddDays(1.0); "KeyC",DateTime.Today.AddDays(2.0)]
let trivialFun (a:string,b) = a.ToLower(),b
let expected = inputs |> Seq.map trivialFun
let actual = inputs |> MyMagicMapper
test <# expected ==== actual #>
This enables me to Assert that each of my inputs has been mapped to an output, without any superfluous outputs.
The problem
The problem is when I want to have a custom comparison for one or two of the fields.
For example, if my DateTime is being passed through a slightly lossy serialization layer by the SUT, I need a test-specific tolerant DateTime comparison. Or maybe I want to do a case-insensitive verification of a string field.
Normally, I'd use Mark Seemann's SemanticComparison library's Likeness<Source,Destination> to define a Test Specific equality, but I've run into some roadblocks:
tuples: F# hides .ItemX on Tuple so I can't define the property via a .With strongly typed field name Expression<T>
record types: TTBOMK these are sealed by F# with no opt-out so SemanticComparison can't proxy them to override Object.Equals
My ideas
All I can think of is to create a generic Resemblance proxy type that I can include in a tuple or record.
Or maybe using pattern matching (Is there a way I can use that to generate an IEqualityComparer and then do a set comparison using that?)
Alternate failing test
I'm also open to using some other function to verify the full mapping (i.e. not abusing F# Set or involving too much third-party code), i.e. something that makes this pass:
let sut (a:string,b:DateTime) = a.ToLower(),b + TimeSpan.FromTicks(1L)
let inputs = ["KeyA",DateTime.Today; "KeyB",DateTime.Today.AddDays(1.0); "KeyC",DateTime.Today.AddDays(2.0)]
let toResemblance (a,b) = TODO generate Resemblance which will case insensitively compare fst and tolerantly compare snd
let expected = inputs |> List.map toResemblance
let result = inputs |> List.map sut
test <# expected = result #>
Firstly, thanks to all for the inputs. I was largely unaware of SemanticComparer<'T>, and it definitely provides a good set of building blocks for generalized facilities in this space. Nikos' post gives excellent food for thought in the area too. I shouldn't have been surprised Fil exists too - @ptrelford really does have a lib for everything (the FSharpValue point is also very valuable)!
We've thankfully arrived at a conclusion to this. Unfortunately it's not a single all-encompassing tool or technique, but even better, a set of techniques that can be used as necessary in a given context.
Firstly, the issue of ensuring a mapping is complete is really an orthogonal concern. The question refers to an ==== operator:-
let (====) x y = (x |> Set.ofSeq) = (y |> Set.ofSeq)
This is definitely the best default approach - lean on Structural Equality. One thing to note is that, being reliant on F# persistent sets, it requires your type to support : comparison (as opposed to just : equality).
When doing set comparisons off the proven Structural Equality path, a useful technique is to use HashSet<T> with a custom IEqualityComparer:-
[<AutoOpen>]
module UnorderedSeqComparisons =
    let seqSetEquals ec x y =
        HashSet<_>(x, ec).SetEquals(y)

    let (==|==) x y equals =
        let funEqualityComparer =
            { new IEqualityComparer<_> with
                member this.GetHashCode(obj) = 0
                member this.Equals(x, y) = equals x y }
        seqSetEquals funEqualityComparer x y
The equals parameter of ==|== is 'a -> 'a -> bool, which lets you use pattern matching to destructure the args for the purposes of comparison. This works well if either the input or the result side is naturally already a tuple. Example:
sut.Store( inputs)
let results = sut.Read()
let expecteds = seq { for x in inputs -> x.Name, x.ValidUntil }
test <# expecteds ==|== results
        <| fun (xN, xD) (yN, yD) ->
            xN = yN
            && xD |> equalsWithinASecond <| yD #>
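The equalsWithinASecond helper used above isn't shown in the post; a plausible minimal definition (my sketch, with the one-second tolerance being an arbitrary choice for illustration) would be:

```fsharp
open System

// Two DateTimes compare equal when they are less than one second apart.
let equalsWithinASecond (x: DateTime) (y: DateTime) =
    abs (x - y).TotalSeconds < 1.0
```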
While SemanticComparer<'T> can do the job, it's simply not worth bothering with for tuples when you have the power of pattern matching. For example, using SemanticComparer<'T>, the above test can be expressed as:
test <# expecteds ==~== results
        <| [ funNamedMemberComparer "Item2" equalsWithinASecond ] #>
using the helper:
[<AutoOpen>]
module MemberComparerHelpers =
    let funNamedMemberComparer<'T> name equals =
        { new IMemberComparer with
            member this.IsSatisfiedBy(request: PropertyInfo) =
                request.PropertyType = typedefof<'T>
                && request.Name = name
            member this.IsSatisfiedBy(request: FieldInfo) =
                request.FieldType = typedefof<'T>
                && request.Name = name
            member this.GetHashCode(obj) = 0
            member this.Equals(x, y) =
                equals (x :?> 'T) (y :?> 'T) }

    let valueObjectMemberComparer() =
        { new IMemberComparer with
            member this.IsSatisfiedBy(request: PropertyInfo) = true
            member this.IsSatisfiedBy(request: FieldInfo) = true
            member this.GetHashCode(obj) = hash obj
            member this.Equals(x, y) = x.Equals(y) }

    let (==~==) x y mcs =
        let ec = SemanticComparer<'T>(seq {
            yield valueObjectMemberComparer()
            yield! mcs })
        seqSetEquals ec x y
All of the above is best understood by reading Nikos Baxevanis' post NOW!
For types or records, the ==|== technique can work (except, critically, you lose Likeness<'T>'s verification of field coverage). However, its succinctness can make it a valuable tool for certain sorts of tests:
sut.Save( inputs)
let expected = inputs |> Seq.map (fun x -> Mapped( base + x.ttl, x.Name))
let likeExpected x = expected ==|== x <| (fun x y -> x.Name = y.Name && x.ValidUntil = y.ValidUntil)
verify <# repo.Store( is( likeExpected)) #> once