How to cumulate (scan) Deedle data frame values - f#

I'm loading a sequence of records into a Deedle data frame (from a database table). Is it possible to accumulate the values (for example, sum them cumulatively) and get back a data frame? For example, there is Series.scanValues but there is no Frame.scanValues. There is Frame.map, but it didn't do what I expected: it left all values as they were.
#if INTERACTIVE
#r "FSharp.Charting"
#load @"..\..\Deedle.fsx"
#endif
open System
open FSharp.Charting
open FSharp.Charting.ChartTypes
open Deedle

type SeriesX = {
    DataDate : DateTime
    Series1 : float
    Series2 : float
    Series3 : float
}

let rnd = new System.Random()
rnd.NextDouble() - 0.5

let data =
    [ for i in [100 .. -1 .. 1] ->
        { SeriesX.DataDate = DateTime.Now.AddDays(float -i)
          SeriesX.Series1 = rnd.NextDouble() - 0.5
          SeriesX.Series2 = rnd.NextDouble() - 0.5
          SeriesX.Series3 = rnd.NextDouble() - 0.5 } ]

// now comes the Deedle frame:
let df = data |> Frame.ofRecords
let df = df.IndexRows<DateTime>("DataDate")

df.["Series1"] |> Chart.Line
df.["Series1"].ScanValues((fun acc x -> acc + x), 0.0) |> Chart.Line

let df' = df |> Frame.mapValues (Seq.scan (fun acc x -> acc + x) 0.0)
df'.["Series1"] |> Chart.Line
The last two lines just give me back the original values, while I would like to have the accumulated values, like in df.["Series1"].ScanValues, for Series1, Series2, and Series3.

For filtering and projection, series provides Where and Select methods
and corresponding Series.map and Series.filter functions (there is
also Series.mapValues and Series.mapKeys if you only want to transform
one aspect).
So you just apply your function to each Series:
let allSum =
    df.Columns
    |> Series.mapValues (Series.scanValues (fun acc v -> acc + (v :?> float)) 0.0)
    |> Frame.ofColumns
Frame.ofColumns then converts the result back to a Frame.
Edit:
If you need to select only the numeric columns, you can use Frame.getNumericCols:
let allSum =
    df
    |> Frame.getNumericCols
    |> Series.mapValues (Series.scanValues (+) 0.0)
    |> Frame.ofColumns
Without the explicit type cast, the code is nicer :)

There is a Series.scanValues function. You can obtain a series from every column in your data frame like this: frame?column, which gets you a Series.
If you need all the columns at once to do the scan, you could first map each row into a single value (a tuple, for example) and then apply Series.scanValues to that new series.
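A minimal sketch of that idea, using the df frame from the question; the rowTuples and runningTotals names are illustrative, and the three columns are collapsed into a tuple per row:
// Collapse each row into a tuple, then scan over that single series.
// Assumes the frame df from the question: indexed by DateTime, with
// float columns Series1, Series2 and Series3.
let rowTuples : Series<DateTime, float * float * float> =
    df.Rows
    |> Series.mapValues (fun row ->
        row.GetAs<float>("Series1"),
        row.GetAs<float>("Series2"),
        row.GetAs<float>("Series3"))

let runningTotals =
    rowTuples
    |> Series.scanValues
        (fun (a1, a2, a3) (x1, x2, x3) -> a1 + x1, a2 + x2, a3 + x3)
        (0.0, 0.0, 0.0)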

Related

What would be an example for Frame.mapCols?

I am trying to create examples for all the methods in Deedle, with mixed success. I was able to provide an example for Frame.filterColValues but not for Frame.mapCols. Stealing from one of Tomas Petricek's numerous writeups, I defined a DataFrame as follows:
open System
open Deedle

let dates =
    [ DateTime(2013,1,1);
      DateTime(2013,1,4);
      DateTime(2013,1,8) ]
let values = [ 10.0; 20.0; 30.0 ]
let first = Series(dates, values)

/// Generate date range from 'first' with 'count' days
let dateRange (first:System.DateTime) count =
    seq { for i in 0 .. (count - 1) -> first.AddDays(float i) }

/// Generate 'count' number of random doubles
let rand count =
    let rnd = System.Random()
    seq { for i in 0 .. (count - 1) -> rnd.NextDouble() }

// A series with values for 10 days
let second = Series(dateRange (DateTime(2013,1,1)) 10, rand 10)

// df1 has two columns
let df1 = Frame(["first"; "second"], [first; second])
Next I was able to provide an example for Frame.filterColValues:
let df2 = Frame.filterColValues (fun s -> (s.GetAtAs<double>(0) > 5.0)) df1
// df2 has only one column
Frame.toArray2D(df2)
But I could not (and I tried hard) create an example for Frame.mapCols. The best I could come up with was:
let df3 = Frame.mapCols (fun k x -> x.GetAtAs<double>(0)) df1
error FS0001: The type 'double' is not compatible with the type 'ISeries<'a>'
What am I doing wrong? Can someone post an example? Or, even better, point to a place where there are examples for the Deedle methods?
The Frame.mapCols function lets you transform a column series into another column series.
The most basic example is just to return the column series unchanged:
df1
|> Frame.mapCols (fun k col -> col)
As Foggy Finder mentioned in a comment, you can fill all missing values in a column using this - the body of the lambda has to return a series:
df1
|> Frame.mapCols (fun k v -> Series.fillMissingWith 0.0 v)
If you wanted, you could return a series with just a single element (this turns your frame into a frame with one row - containing data from the first row - and original number of columns):
df1
|> Frame.mapCols (fun k col -> series [ 0 => col.GetAtAs<float>(0) ])
In your code snippet, it looks like you wanted just a series (with a single value for each column), which can be done by getting the columns and then using Series.map:
df1.Columns
|> Series.map (fun k col -> col.GetAtAs<float>(0))

How do I do an efficient temporal join of records in an array?

I would like to join a record to the next record at least X days/minutes/seconds into the future. I need to do this with arrays with a few hundred thousand records. I am open to sequences/lists/arrays but I believe arrays are likely to be fastest.
I can do this quickly in Deedle with Frame.joinAlign JoinKind.Left Lookup.ExactOrGreater, but I have an easier time reasoning about transformations using standard arrays/sequences/lists.
The following example is fine with 1,000 records but very slow with 100k. A comment here suggests a binary search, but I do not see how to do that here, where the search is based on an inequality.
open System

type Test1 = {
    Date : DateTime
    Value : float
}

type Test2 = {
    Date1 : DateTime
    Value1 : float
    Date2 : DateTime
    Value2 : float
}

let rng = System.Random()
let rng2 = System.Random()

let rs =
    [| for i = 1 to 1000 do
           let baseDay = DateTime(2016,1,1).AddDays(float i)
           let actualDay = baseDay.AddDays(float (rng2.Next(7)))
           yield { Date = actualDay; Value = rng.NextDouble() } |]

[| for r in rs do
       let futureDay = r.Date.AddDays(float 4)
       let r2 =
           rs
           |> Array.filter (fun x -> x.Date > futureDay)
           |> Array.tryHead
       let nr =
           match r2 with
           | Some x -> Some { Date1 = r.Date; Value1 = r.Value; Date2 = x.Date; Value2 = x.Value }
           | None -> None
       if nr.IsSome then yield nr.Value |]
The problem is this expression:
let r2 =
    rs
    |> Array.filter (fun x -> x.Date > futureDay)
    |> Array.tryHead
This filters the entire array and creates a new array with all the matching items, when you really just want the first matching item. And this is happening for every r. Try this instead:
let r2 = rs |> Array.tryFind (fun x -> x.Date > futureDay)
N.b. your logic would have been fine if you were dealing with sequences rather than arrays, as the filter would have been evaluated lazily; but of course sequences are generally slower than arrays. The thing to keep in mind is that whereas the Seq module is lazy (with some exceptions), every step in an Array or List (or Set, Map, etc.) pipeline eagerly allocates a new collection, which can be very expensive when working with large collections.
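A tiny illustration of the difference (an assumed example, not from the original answer):
// Lazy: Seq.filter + Seq.tryHead stops as soon as the first even number is found.
let firstEvenLazy =
    [| 1; 3; 4; 6 |] |> Seq.filter (fun x -> x % 2 = 0) |> Seq.tryHead

// Eager: Array.filter builds the full [| 4; 6 |] array before tryHead sees anything.
let firstEvenEager =
    [| 1; 3; 4; 6 |] |> Array.filter (fun x -> x % 2 = 0) |> Array.tryHead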
If sorting rs doesn't affect your logic or expected output, a further improvement can be made by using Array.FindIndex to start searching at r's index rather than from the beginning of the array each time:
Array.sortInPlace rs

rs
|> Seq.mapi (fun i r ->
    let futureDay = r.Date.AddDays 4.0
    let r2Index = Array.FindIndex(rs, i, (fun x -> x.Date > futureDay))
    match r2Index with
    | -1 -> None
    | i' ->
        let x = rs.[i']
        Some { Date1 = r.Date; Value1 = r.Value; Date2 = x.Date; Value2 = x.Value })
|> Seq.choose id
|> Array.ofSeq
This should offer a significant improvement over even the Array.tryFind approach as only a handful of array elements will need to be scanned each time.
Here are FSI timings from my ageing tablet with the system under otherwise nil load:
10k elements:
unsorted + Array.filter (original code): 00:00:08.783
unsorted + Array.tryFind: 00:00:03.844
Array.sort + Seq.mapi: 00:00:00.027
100k elements:
unsorted + Array.filter: I didn't bother.
unsorted + Array.tryFind: 00:06:14.288
Array.sort + Seq.mapi: 00:00:00.305
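Since the question also mentions binary search: once rs is sorted by Date, the "first element strictly after a threshold" lookup can also be done with a hand-rolled lower-bound search instead of Array.FindIndex. A minimal sketch, assuming the Test1 type above and an array sorted ascending by Date (firstIndexAfter is an illustrative name):
// Binary search for the first index whose Date is strictly greater than threshold.
// Assumes arr is sorted ascending by Date.
let firstIndexAfter (arr: Test1[]) (threshold: DateTime) =
    let rec go lo hi =
        if lo >= hi then lo
        else
            let mid = lo + (hi - lo) / 2
            if arr.[mid].Date > threshold then go lo mid
            else go (mid + 1) hi
    let idx = go 0 arr.Length
    if idx < arr.Length then Some idx else None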

Sampling in F# : is Set adequate?

I have an array of items, from which I'd like to sample.
I was under the impression that a Set would be a good structure to sample from, in a fold where I'd give back either the original set or a modified set with the retrieved element missing, depending on whether I want replacement or not.
However, there seems to be no method to retrieve an element directly from a Set.
Is there something I am missing? Or should I use a Set of indices, along with a surrogate function that starts at some random position < Set.count and goes up until it finds a member?
That is, something along these lines:
open System

module Seq =
    let modulo (n:int) start =
        let rec next i = seq { yield (i + 1) % n; yield! next (i + 1) }
        next start

module Array =
    let Sample (withReplacement:bool) seed (entries:'T array) =
        let prng, indexes = new Random(seed), Set(Seq.init (entries |> Array.length) id)
        Seq.unfold (fun set ->
            let N = set |> Set.count
            let next =
                Seq.modulo N (prng.Next(N))
                |> Seq.truncate N
                |> Seq.tryFind (fun i -> set |> Set.exists ((=) i))
            if next.IsSome then
                Some(entries.[next.Value], if withReplacement then set else Set.remove next.Value set)
            else
                None)
            indexes   // seed the unfold with the full index set
Edit: tracking what I have already given, instead of tracking what I can still give, would make this simpler and more efficient.
For sampling without replacement, you could just permute the source seq and take however many elements you want to sample:
let rnd = System.Random()   // shared random source assumed by both sampling functions below

let sampleWithoutReplacement n s =
    let a = Array.ofSeq s
    seq {
        for i = a.Length downto 1 do
            let j = rnd.Next i
            yield a.[j]
            a.[j] <- a.[i - 1]
    }
    |> Seq.take n
To sample with replacement, just pick a random element n times from the source seq
let sampleWithReplacement n s =
    let a = Array.ofSeq s
    Seq.init n (fun _ -> a.[rnd.Next(a.Length)])
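For example (a hypothetical usage, assuming the rnd binding above):
let data = [ 1 .. 100 ]
let tenDistinct = data |> sampleWithoutReplacement 10 |> Seq.toList
let tenRepeats = data |> sampleWithReplacement 10 |> Seq.toList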
These may not be the most efficient methods with huge data sets however
Continuing our comments... if you want to randomly sample a sequence without slurping the entire thing into memory, you could generate a set of random indices the size of your desired sample (not too different from what you already have):
let rand count max =
    System.Random()
    |> Seq.unfold (fun r -> Some(r.Next(max), r))
    |> Seq.distinct
    |> Seq.take count
    |> set

let takeSample sampleSize inputSize input =
    let indices = rand sampleSize inputSize
    input
    |> Seq.mapi (fun idx x ->
        if Set.contains idx indices then Some x else None)
    |> Seq.choose id

let inputSize = 100000
let input = Seq.init inputSize id
let sample = takeSample 50 inputSize input
printfn "%A" (Seq.toList sample)

What's the style for immutable set and map in F#

I have just solved problem 23 in Project Euler, in which I need a set to store all abundant numbers. F# has an immutable set; I can use Set.empty.Add(i) to create a new set containing number i. But I don't know how to use an immutable set to do more complicated things.
For example, in the following code, I need to see if a number x could be written as the sum of two numbers in a set. I resort to a sorted array and the array's binary search algorithm to get the job done.
Please also comment on my style of the following program. Thanks!
let problem23 =
    let factorSum x =
        let mutable sum = 0
        for i = 1 to x/2 do
            if x % i = 0 then
                sum <- sum + i
        sum
    let isAbundant x = x < (factorSum x)
    let abuns = {1..28123} |> Seq.filter isAbundant |> Seq.toArray
    let inAbuns x = Array.BinarySearch(abuns, x) >= 0
    let sumable x =
        abuns |> Seq.exists (fun a -> inAbuns (x - a))
    {1..28123} |> Seq.filter (fun x -> not (sumable x)) |> Seq.sum
the updated version:
let problem23b =
    let factorSum x =
        {1..x/2} |> Seq.filter (fun i -> x % i = 0) |> Seq.sum
    let isAbundant x = x < (factorSum x)
    let abuns = Set({1..28123} |> Seq.filter isAbundant)
    let inAbuns x = Set.contains x abuns
    let sumable x =
        abuns |> Seq.exists (fun a -> inAbuns (x - a))
    {1..28123} |> Seq.filter (fun x -> not (sumable x)) |> Seq.sum
This version runs in about 27 seconds, while the first takes about 23 seconds (I've run both several times). So an immutable red-black tree is actually not much slower than a sorted array with binary search. The total number of elements in the set/array is 6965.
Your style looks fine to me. The different steps in the algorithm are clear, which is the most important part of making something work. This is also the tactic I use for solving Project Euler problems. First make it work, and then make it fast.
As already remarked, replacing Array.BinarySearch by Set.contains makes the code even more readable. I find that in almost all PE solutions I've written, I only use arrays for lookups. I've found that using sequences and lists as data structures is more natural within F#. Once you get used to them, that is.
I don't think using mutability inside a function is necessarily bad. I've optimized problem 155 from almost 3 minutes down to 7 seconds with some aggressive mutability optimizations. In general though, I'd save that as an optimization step and start out writing it using folds/filters etc. In the example case of problem 155, I did start out using immutable function composition, because it made testing and most importantly, understanding, my approach easy.
Picking the wrong algorithm is much more detrimental to a solution than using a somewhat slower immutable approach first. A good algorithm is still fast even if it's slower than the mutable version (cough hello captain obvious! cough).
Edit: let's look at your version
Your problem23b() took 31 seconds on my PC.
Optimization 1: use a new algorithm.
// Useful optimization: if m divides n, n/m divides n also,
// so you only have to check m up to sqrt(n)
let factorSum2 n =
    let rec aux acc m =
        match m with
        | m when m*m = n -> acc + m
        | m when m*m > n -> acc
        | m -> aux (acc + (if n % m = 0 then m + n/m else 0)) (m + 1)
    aux 1 2
This is still very much in functional style, but using this updated factorSum in your code, the execution time went from 31 seconds to 8 seconds.
Everything's still in immutable style, but let's see what happens when an array lookup is used instead of a set:
Optimization 2: use an array for lookup:
let absums() =
    // create abundant numbers as an array for (very) fast lookup
    let abnums = [|1..28128|] |> Array.filter (fun n -> factorSum2 n > n)
    // create a second lookup:
    // a boolean array where arr.[x] = true means x is a sum of two abundant numbers
    let arr = Array.zeroCreate 28124
    for x in abnums do
        for y in abnums do
            if x + y <= 28123 then arr.[x+y] <- true
    arr

let euler023() =
    absums() // the array lookup
    |> Seq.mapi (fun i isAbsum -> if isAbsum then 0 else i) // mapi: i is the position in the sequence
    |> Seq.sum

// I always write a test once I've solved a problem.
// In this way, I can easily see if changes to the code break stuff.
let test() = euler023() = 4179871
Execution time: 0.22 seconds (!).
This is what I like so much about F#, it still allows you to use mutable constructs to tinker under the hood of your algorithm. But I still only do this after I've made something more elegant work first.
You can easily create a Set from a given sequence of values.
let abuns = Set (seq {1..28123} |> Seq.filter isAbundant)
inAbuns would therefore be rewritten to
let inAbuns x = abuns |> Set.contains x
Seq.exists would be changed to Set.exists
But the array implementation is fine too ...
Note that there is no need to use mutable values in factorSum:
let factorSum x = seq { 1..x/2 } |> Seq.filter (fun i -> x % i = 0) |> Seq.sum
Here is a simple functional solution that is shorter than your original and over 100× faster:
let problem23 =
    let rec isAbundant i t x =
        if i > x/2 then x < t else
        if x % i = 0 then isAbundant (i+1) (t+i) x else
        isAbundant (i+1) t x
    let xs = Array.Parallel.init 28124 (isAbundant 1 0)
    let ys = Array.mapi (fun i b -> if b then Some i else None) xs |> Array.choose id
    let f x a = x - a < 0 || not xs.[x-a]
    Array.init 28124 (fun x -> if Array.forall (f x) ys then x else 0)
    |> Seq.sum
The first trick is to record which numbers are abundant in an array indexed by the number itself rather than using a search structure. The second trick is to notice that all the time is spent generating that array and, therefore, to do it in parallel.

Get a random subset from a set in F#

I am trying to think of an elegant way of getting a random subset from a set in F#
Any thoughts on this?
Perhaps this would work: say we have a set of 2x elements and we need to pick a subset of y elements. If we could generate a 2x-bit random number that contains exactly y set bits (powers of two), we would effectively have a random mask with y holes in it. We could keep generating new random numbers until we get one satisfying this constraint, but is there a better way?
If you don't want to convert to an array you could do something like this. This is O(n*m) where m is the size of the set.
open System

let rnd = Random(0)
let set = Array.init 10 (fun i -> i) |> Set.ofArray

let randomSubSet n set =
    seq {
        let i = set |> Set.toSeq |> Seq.item (rnd.Next(set.Count))
        yield i
        yield! set |> Set.remove i
    }
    |> Seq.take n
    |> Set.ofSeq

let result = set |> randomSubSet 3

for x in result do
    printfn "%A" x
Agree with @JohannesRossel. There's an F# shuffle-an-array algorithm here you can modify suitably. Convert the Set into an array, then loop until you've selected enough random elements for the new subset.
Not having a really good grasp of F# and what might be considered elegant there, you could just do a shuffle on the list of elements and select the first y. A Fisher-Yates shuffle even helps you in this respect as you also only need to shuffle y elements.
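A sketch of that partial-shuffle idea (the sampleByShuffle name and the explicit rnd parameter are illustrative, not from the answer):
// Partial Fisher-Yates shuffle: only the first y positions need to be fixed,
// which is enough to read off a random y-element subset (requires y <= length).
let sampleByShuffle (rnd: System.Random) y (source: seq<'a>) =
    let a = Seq.toArray source
    for i in 0 .. y - 1 do
        // pick a random element from the not-yet-placed tail and swap it into position i
        let j = rnd.Next(i, a.Length)
        let tmp = a.[i]
        a.[i] <- a.[j]
        a.[j] <- tmp
    a.[0 .. y - 1]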
rnd must be outside the subset function:
open System

let rnd = new Random()

let rec subset xs =
    let removeAt n xs = (Seq.item (n-1) xs, Seq.append (Seq.take (n-1) xs) (Seq.skip n xs))
    match xs with
    | [] -> []
    | _ ->
        let (rem, left) = removeAt (rnd.Next(List.length xs) + 1) xs
        let next = subset (List.ofSeq left)
        if rnd.Next(2) = 0 then rem :: next else next
Do you mean a random subset of any size?
For the case of a random subset of a specific size, there's a very elegant answer here:
Select N random elements from a List<T> in C#
Here it is in pseudocode:
RandomKSubset(list, k):
    n = len(list)
    needed = k
    result = {}
    for i = 0 to n:
        if rand() < needed / (n-i)
            push(list[i], result)
            needed--
    return result
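A direct F# translation of that pseudocode might look like this (a sketch; randomKSubset and the explicit rnd parameter are illustrative names):
// Pick k elements uniformly at random from a list, preserving their order.
// Element i is kept with probability needed / (n - seen), as in the pseudocode above.
let randomKSubset (rnd: System.Random) k (items: 'a list) =
    let n = List.length items
    items
    |> List.fold (fun (needed, seen, acc) x ->
        if needed > 0 && rnd.Next(n - seen) < needed
        then (needed - 1, seen + 1, x :: acc)
        else (needed, seen + 1, acc)) (k, 0, [])
    |> fun (_, _, acc) -> List.rev acc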
Using Seq.fold with lazy evaluation to construct a random subset:
open System

let rnd = new Random()

let subset2 xs =
    let insertAt n xs x = Seq.concat [Seq.take n xs; seq [x]; Seq.skip n xs]
    let randomInsert xs = insertAt (rnd.Next((Seq.length xs) + 1)) xs
    xs
    |> Seq.fold randomInsert Seq.empty
    |> Seq.take (rnd.Next(Seq.length xs) + 1)
