How to read float data from a file in F#

I have a file with float numbers and I want to read it and save the values in an array to do some math operations. Then I need to print the new array to another file. All the ways that I found read the file as a string, and I also don't know how to convert a string to a float array.

If we can assume that each float in the file is on its own line, then you could do something like this:
open System
open System.IO

let readFloats filePath =
    let strs = File.ReadLines(filePath)                  // Read file line by line
    strs
    |> Seq.map (fun str -> System.Double.TryParse(str))  // TryParse returns a (bool * float) pair; the bool is true if the string parsed correctly
    |> Seq.filter (fun (success, _) -> success)          // filter out the strings that did not parse
    |> Seq.map snd                                       // keep only the float from each (bool * float) pair
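The question also asks about writing the results to another file. A minimal sketch of that part, continuing from readFloats (the file names and the math operation are made up):
let floats  = readFloats "input.txt" |> Array.ofSeq           // materialize as an array for the math operations
let results = floats |> Array.map (fun x -> x * 2.0)          // placeholder for the real math
File.WriteAllLines("output.txt", results |> Array.map string)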

Related

Read two integers in the same line as tuple in f#

I'm trying to read two integers which are going to be taken as input from the same line. My attempt so far:
let separator: char =
    ' '

Console.ReadLine().Split separator
|> Array.map Convert.ToInt32
But this returns a two-element array and I have to access the individual indices to access the integers. Ideally, what I would like is the following:
let (a, b) =
    Console.ReadLine().Split separator
    |> Array.map Convert.ToInt32
    |> (some magic to convert the two element array to a tuple)
How can I do that?
I'm afraid there's no magic. You have to explicitly convert it into a tuple:
let a, b =
    Console.ReadLine().Split separator
    |> Array.map int
    |> (fun arr -> arr.[0], arr.[1])
Edit: you can use reflection as @dbc suggested, but that's slow and probably overkill for what you're doing.
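Another option, not from the original answers, is to pattern match on the array, which also lets you fail cleanly on unexpected input (a sketch):
let a, b =
    match Console.ReadLine().Split separator |> Array.map int with
    | [| x; y |] -> x, y
    | _ -> failwith "expected exactly two integers"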

F# deedle transform Series<string, obj> to Series<string, float>?

If I get a row of a Frame by using the .Rows.[rowIndex] operation, Deedle will return me an Object Series. Sometimes I know this only contains floats. How do I convert all the obj values into a float series in one shot?
In Deedle, series are generic, so ideally it should be possible to get a float series right away. But since the reason you get a series of Objects is not clear, you can still convert the values to floats by mapping an appropriate type-casting function:
#load @"..\packages\Deedle.1.2.4\Deedle.fsx"
open Deedle
open System
// Let's prepare a sample series
let keys = ["1";"2";"3"]
let values = [1.1 :> Object;1.2 :> Object;1.3 :> Object]
let series = Series(keys, values)
// Now apply the map taking the Series<string,System.Object> series to Series<string,float>
series |> Series.map (fun _ v -> v :?> float)
// as #Foggy Finder pointed out, there is a convenience function to only map values
series |> Series.mapValues (fun v -> v :?> float)
// Alternatively, use the tryMap function that takes the Series<int,Object> series
// to Series<int,TryValue<float>>
series |> Series.tryMap (fun _ v -> v :?> float)
The type of the Series.map function is (('a -> 'b -> 'c) -> Series<'a,'b> -> Series<'a,'c>) when 'a : equality. This means that the first argument of the mapping function is the key which we ignore using the underscore as it is not needed to make the type cast. As Foggy Finder pointed out, there is a convenience function Series.mapValues that hides the keys.

F# Write Mapped Input to Output

I am new to F# and am starting with a simple project to get going.
I have large txt files that I process - usually about 10 million records. What I want to do is read the file, filter out some specific rows, map the fields to only take a subset of the columns from the original file, and then output the result.
The 2 questions I have are:
How do I filter based on the map? The file has about 30 fields.
How do I take the output of the map and write it to a new TXT file?
//Open the file
let lines = seq { use r = new StreamReader(kDir + kfName)
                  while not r.EndOfStream do yield r.ReadLine() }

//Filter the file
let sFilt = "Detached Houses,Upper Middle"
let out1 =
    lines
    |> Seq.filter (fun x -> x.Contains(sFilt))

//Write out the filtered file - this works great
//val out1 : seq<string>
File.WriteAllLines("c:\\temp\\out1.txt", out1)

//Here is where I have an issue
//I am trying to just get 2 of the columns to an output file
//val out2 : seq<string * string> - this has a different pattern than out1
let out2 =
    out1
    |> Seq.map (fun x2 -> x2.Split [|','|])
    |> Seq.map (fun x3 -> x3.[0], x3.[3])
I get the following error on this line - I know the out1 and out2 are different. How can I resolve this difference?
Error message:
Possible overload: 'File.WriteAllLines(path: string, contents: IEnumerable<string>) : unit'. Type constraint mismatch. The type seq<string * string> is not compatible with type IEnumerable<string>
The type 'string' does not match the type 'string * string'.
What you can do is map back to a seq<string> from your seq<string*string>.
Seq.map (fun (str1, str2) -> sprintf "%s, %s" str1 str2)
You can just add that to your existing chain of map operations:
let out2 =
    out1
    |> Seq.map (fun x2 -> x2.Split [|','|])
    |> Seq.map (fun x3 -> x3.[0], x3.[3])
    |> Seq.map (fun (str1, str2) -> sprintf "%s, %s" str1 str2)
Then, once again, you have a sequence of strings which you can write to your file.
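For completeness, a minimal sketch of the final write (the output path is made up):
File.WriteAllLines("c:\\temp\\out2.txt", out2)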
fun x3 -> x3.[0], x3.[3] creates a tuple of strings string * string. You need to concat them, e.g. fun x3 -> sprintf "%s,%s" x3.[0] x3.[3] (if you want the comma in the output) or just fun x3 -> x3.[0] + x3.[3].
It's also possible that you want to use the CsvProvider if the file is properly structured; there's no reason to handle the IO by hand in that case.
Then you get typed data, column names etc. "for free"...
If it is not entirely well structured, you might also use CsvParser for less strictness in reading/handling the file.
Take a look at:
https://fsharp.github.io/FSharp.Data/library/CsvProvider.html
or
https://fsharp.github.io/FSharp.Data/library/CsvFile.html

F#: Converting tuples into a hashtable

I am new to programming, and this is my first time working with a typed, functional, and .NET language, so pardon me if my question is silly/trivial.
I have a list of tuples, and I would like to store the first item (which is a string) of each tuple as a value in the hashtable, and the second item (which is a byte array) of each tuple as a key. How may I go about doing so?
Here is my code:
let readAllBytes (tupleOfFileLengthsAndFiles) =
    let hashtable = new Hashtable()
    tupleOfFileLengthsAndFiles
    |> snd
    |> List.map (fun eachFile -> (eachFile, File.ReadAllBytes eachFile))
    |> hashtable.Add(snd eachTuple, fst eachTuple)
However, the last line is underscored in red. How can I improve it? Thanks in advance for your help.
The easiest way is to use dict.
It will convert a sequence of tuples into a Dictionary, which is a strongly typed Hashtable.
> dict [(1,"one"); (2,"two"); ] ;;
val it : System.Collections.Generic.IDictionary<int,string> =
  seq [[1, one] {Key = 1; Value = "one";}; [2, two] {Key = 2; Value = "two";}]
If you are really interested in a Hashtable you can use this simple function:
let convert x =
    let d = Hashtable()
    x |> Seq.iter d.Add
    d
So, I'm not sure what you want to do with this; it seems you're also interested in reading the file in the middle of the conversion. Maybe something like this:
let readAllBytes (tupleOfFileLengthsAndFiles: seq<'a * 'b>) =
    tupleOfFileLengthsAndFiles
    |> Seq.map (fun (x, y) -> x, File.ReadAllBytes y)
    |> convert
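A small usage sketch (the file names and lengths are made up; the resulting Hashtable is keyed by the first element of each tuple):
let files = [ (3, "a.txt"); (5, "b.txt") ]   // hypothetical (length, fileName) tuples
let table = readAllBytes files               // Hashtable mapping each length to the file's bytes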

Handy F# snippets [closed]

There are already two questions about F#/functional snippets.
However what I'm looking for here are useful snippets, little 'helper' functions that are reusable. Or obscure but nifty patterns that you can never quite remember.
Something like:
open System.IO

let rec visitor dir filter =
    seq { yield! Directory.GetFiles(dir, filter)
          for subdir in Directory.GetDirectories(dir) do
              yield! visitor subdir filter }
I'd like to make this a kind of handy reference page. As such there will be no right answer, but hopefully lots of good ones.
EDIT Tomas Petricek has created a site specifically for F# snippets http://fssnip.net/.
Perl style regex matching
let (=~) input pattern =
    System.Text.RegularExpressions.Regex.IsMatch(input, pattern)
It lets you match text using let test = "monkey" =~ "monk.+" notation.
Infix Operator
I got this from http://sandersn.com/blog//index.php/2009/10/22/infix-function-trick-for-f go to that page for more details.
If you know Haskell, you might find yourself missing infix sugar in F#:
// standard Haskell call has function first, then args just like F#. So obviously
// here there is a function that takes two strings: string -> string -> string
startsWith "kevin" "k"
//Haskell infix operator via backQuotes. Sometimes makes a function read better.
"kevin" `startsWith` "K"
While F# doesn't have a true 'infix' operator, the same thing can be accomplished almost as elegantly via a pipeline and a 'backpipeline' (who knew of such a thing??)
// F# 'infix' trick via pipelines
"kevin" |> startsWith <| "K"
Multi-Line Strings
This is pretty trivial, but it seems to be a feature of F# strings that is not widely known.
let sql = "select a,b,c \
from table \
where a = 1"
This produces:
val sql : string = "select a,b,c from table where a = 1"
When the F# compiler sees a back-slash followed by a carriage return inside a string literal, it will remove everything from the back-slash to the first non-space character on the next line. This allows you to have multi-line string literals that line up, without using a bunch of string concatenation.
Generic memoization, courtesy of the man himself
let memoize f =
    let cache = System.Collections.Generic.Dictionary<_,_>(HashIdentity.Structural)
    fun x ->
        let ok, res = cache.TryGetValue(x)
        if ok then res
        else
            let res = f x
            cache.[x] <- res
            res
Using this, you could do a cached reader like so:
let cachedReader = memoize reader
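A small sketch of the effect, using a deliberately slow function (the function itself is made up):
let slowSquare (x: int) = System.Threading.Thread.Sleep 500; x * x
let fastSquare = memoize slowSquare
fastSquare 9   // first call takes ~500 ms
fastSquare 9   // repeat calls with the same argument return immediately from the cache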
Simple read-write to text files
These are trivial, but make file access pipeable:
open System.IO
let fileread f = File.ReadAllText(f)
let filewrite f s = File.WriteAllText(f, s)
let filereadlines f = File.ReadAllLines(f)
let filewritelines f ar = File.WriteAllLines(f, ar)
So
let replace f (r:string) (s:string) = s.Replace(f, r)

"C:\\Test.txt"
|> fileread
|> replace "teh" "the"
|> filewrite "C:\\Test.txt"
And combining that with the visitor quoted in the question:
let filereplace find repl path =
    path |> fileread |> replace find repl |> filewrite path

let recurseReplace root filter find repl =
    visitor root filter |> Seq.iter (filereplace find repl)
Update Slight improvement if you want to be able to read 'locked' files (e.g. csv files which are already open in Excel...):
let safereadall f =
    use fs = new FileStream(f, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
    use sr = new StreamReader(fs, System.Text.Encoding.Default)
    sr.ReadToEnd()

let split sep (s:string) = System.Text.RegularExpressions.Regex.Split(s, sep)

let fileread f = safereadall f
let filereadlines f = f |> safereadall |> split System.Environment.NewLine
For performance intensive stuff where you need to check for null
let inline isNull o = System.Object.ReferenceEquals(o, null)
if isNull o then ... else ...
is about 20x faster than
if o = null then ... else ...
Active Patterns, aka "banana clips", are a very handy construct that lets one match against multiple regular expression patterns. This is much like AWK, but without the high performance of DFAs, because the patterns are matched in sequence until one succeeds.
#light
open System
open System.Text.RegularExpressions
let (|Test|_|) pat s =
    if (new Regex(pat)).IsMatch(s)
    then Some()
    else None

let (|Match|_|) pat s =
    let opt = RegexOptions.None
    let re = new Regex(pat, opt)
    let m = re.Match(s)
    if m.Success
    then Some(m.Groups)
    else None
Some examples of use:
let HasIndefiniteArticle = function
    | Test "(?: |^)(a|an)(?: |$)" _ -> true
    | _ -> false

type Ast =
    | IntVal of string * int
    | StringVal of string * string
    | LineNo of int
    | Goto of int

let Parse = function
    | Match "^LET\s+([A-Z])\s*=\s*(\d+)$" g ->
        IntVal( g.[1].Value, Int32.Parse(g.[2].Value) )
    | Match "^LET\s+([A-Z]\$)\s*=\s*(.*)$" g ->
        StringVal( g.[1].Value, g.[2].Value )
    | Match "^(\d+)\s*:$" g ->
        LineNo( Int32.Parse(g.[1].Value) )
    | Match "^GOTO \s*(\d+)$" g ->
        Goto( Int32.Parse(g.[1].Value) )
    | s -> failwithf "Unexpected statement: %s" s
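A few more usage sketches for the Parse pattern (the input lines are made up):
Parse "LET A = 42"   // IntVal ("A", 42)
Parse "10 :"         // LineNo 10
Parse "GOTO 20"      // Goto 20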
Maybe monad
type maybeBuilder() =
    member this.Bind(v, f) =
        match v with
        | None -> None
        | Some(x) -> f x
    member this.Delay(f) = f()
    member this.Return(v) = Some v

let maybe = maybeBuilder()
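A small usage sketch, assuming a lookup that can fail:
let tryDivide x y = if y = 0 then None else Some (x / y)

let result =
    maybe {
        let! a = tryDivide 100 5
        let! b = tryDivide a 4
        return a + b
    }
// result = Some 25; any failing step short-circuits the whole block to None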
Here's a brief intro to monads for the uninitiated.
Option-coalescing operators
I wanted a version of the defaultArg function that had a syntax closer to the C# null-coalescing operator, ??. This lets me get the value from an Option while providing a default value, using a very concise syntax.
/// Option-coalescing operator - this is like the C# ?? operator, but works with
/// the Option type.
/// Warning: Unlike the C# ?? operator, the second parameter will always be
/// evaluated.
/// Example: let foo = someOption |? default
let inline (|?) value defaultValue =
    defaultArg value defaultValue
/// Option-coalescing operator with delayed evaluation. The other version of
/// this operator always evaluates the default value expression. If you only
/// want to create the default value when needed, use this operator and pass
/// in a function that creates the default.
/// Example: let foo = someOption |?! (fun () -> new Default())
let inline (|?!) value f =
    match value with Some x -> x | None -> f()
'Unitize' a function which doesn't handle units
Using the FloatWithMeasure function http://msdn.microsoft.com/en-us/library/ee806527(VS.100).aspx.
let unitize (f:float -> float) (v:float<'u>) =
    LanguagePrimitives.FloatWithMeasure<'u> (f (float v))
Example:
[<Measure>] type m
[<Measure>] type kg
let unitize (f:float -> float) (v:float<'u>) =
    LanguagePrimitives.FloatWithMeasure<'u> (f (float v))
//this function doesn't take units
let badinc a = a + 1.
//this one does!
let goodinc v = unitize badinc v
goodinc 3.<m>
goodinc 3.<kg>
OLD version:
let unitize (f:float -> float) (v:float<'u>) =
    let unit = box 1. :?> float<'u>
    unit * (f (v/unit))
Kudos to kvb
Scale/Ratio function builder
Again, trivial, but handy.
//returns a function which will convert from a1-a2 range to b1-b2 range
let scale (a1:float<'u>, a2:float<'u>) (b1:float<'v>, b2:float<'v>) =
    let m = (b2 - b1)/(a2 - a1) //gradient of line (evaluated once only..)
    (fun a -> b1 + m * (a - a1))
Example:
[<Measure>] type m
[<Measure>] type px
let screenSize = (0.<px>, 300.<px>)
let displayRange = (100.<m>, 200.<m>)
let scaleToScreen = scale displayRange screenSize
scaleToScreen 120.<m> //-> 60.<px>
Transposing a list (seen on Jomo Fisher's blog)
///Given list of 'rows', returns list of 'columns'
let rec transpose lst =
    match lst with
    | (_::_)::_ -> List.map List.head lst :: transpose (List.map List.tail lst)
    | _ -> []
transpose [[1;2;3];[4;5;6];[7;8;9]] // returns [[1;4;7];[2;5;8];[3;6;9]]
And here is a tail-recursive version which (from my sketchy profiling) is mildly slower, but has the advantage of not throwing a stack overflow when the inner lists are longer than 10000 elements (on my machine):
let transposeTR lst =
    let rec inner acc lst =
        match lst with
        | (_::_)::_ -> inner (List.map List.head lst :: acc) (List.map List.tail lst)
        | _ -> List.rev acc
    inner [] lst
If I was clever, I'd try and parallelise it with async...
F# Map <-> C# Dictionary
(I know, I know, System.Collections.Generic.Dictionary isn't really a 'C#' dictionary)
C# to F#
(dic :> seq<_>) //cast to seq of KeyValuePair
|> Seq.map (|KeyValue|) //convert KeyValuePairs to tuples
|> Map.ofSeq //convert to Map
(From Brian, here, with improvement proposed by Mauricio in comment below. (|KeyValue|) is an active pattern for matching KeyValuePair - from FSharp.Core - equivalent to (fun kvp -> kvp.Key, kvp.Value))
Interesting alternative
To get all of the immutable goodness, but with the O(1) lookup speed of Dictionary, you can use the dict operator, which returns an immutable IDictionary (see this question).
I currently can't see a way to directly convert a Dictionary using this method, other than
(dic :> seq<_>) //cast to seq of KeyValuePair
|> Seq.map (fun kvp -> kvp.Key, kvp.Value) //convert KeyValuePairs to tuples
|> dict //convert to immutable IDictionary
F# to C#
let dic = Dictionary()
map |> Map.iter (fun k t -> dic.Add(k, t))
dic
What is weird here is that FSI will report the type as (for example):
val it : Dictionary<string,int> = dict [("a",1);("b",2)]
but if you feed dict [("a",1);("b",2)] back in, FSI reports
IDictionary<string,int> = seq[[a,1] {Key = "a"; Value = 1; } ...
Tree-sort / Flatten a tree into a list
I have the following binary tree:
            ___ 77 _
           /        \
    ______ 47 __     99
   /            \
  21 _           54
      \         /  \
       43      53   74
      /
     39
    /
   32
Which is represented as follows:
type 'a tree =
    | Node of 'a tree * 'a * 'a tree
    | Nil

let myTree =
    Node
        (Node
            (Node (Nil,21,Node (Node (Node (Nil,32,Nil),39,Nil),43,Nil)),47,
             Node (Node (Nil,53,Nil),54,Node (Nil,74,Nil))),77,Node (Nil,99,Nil))
A straightforward method to flatten the tree is:
let rec flatten = function
    | Nil -> []
    | Node(l, a, r) -> flatten l @ a :: flatten r
This isn't tail-recursive, and I believe the @ operator causes it to be O(n log n) or O(n^2) with unbalanced binary trees. With a little tweaking, I came up with this tail-recursive O(n) version:
let flatten2 t =
    let rec loop acc c = function
        | Nil -> c acc
        | Node(l, a, r) ->
            loop acc (fun acc' -> loop (a::acc') c l) r
    loop [] (fun x -> x) t
Here's the output in fsi:
> flatten2 myTree;;
val it : int list = [21; 32; 39; 43; 47; 53; 54; 74; 77; 99]
LINQ-to-XML helpers
namespace System.Xml.Linq

// hide warning about op_Explicit
#nowarn "77"

[<AutoOpen>]
module XmlUtils =

    /// Converts a string to an XName.
    let xn = XName.op_Implicit
    /// Converts a string to an XNamespace.
    let xmlns = XNamespace.op_Implicit

    /// Gets the string value of any XObject subclass that has a Value property.
    let inline xstr (x : ^a when ^a :> XObject) =
        (^a : (member get_Value : unit -> string) x)

    /// Gets a strongly-typed value from any XObject subclass, provided that
    /// an explicit conversion to the output type has been defined.
    /// (Many explicit conversions are defined on XElement and XAttribute)
    /// Example: let value:int = xval foo
    let inline xval (x : ^a when ^a :> XObject) : ^b =
        ((^a or ^b) : (static member op_Explicit : ^a -> ^b) x)

    /// Dynamic lookup operator for getting an attribute value from an XElement.
    /// Returns a string option, set to None if the attribute was not present.
    /// Example: let value = foo?href
    /// Example with default: let value = defaultArg foo?Name "<Unknown>"
    let (?) (el:XElement) (name:string) =
        match el.Attribute(xn name) with
        | null -> None
        | att -> Some(att.Value)

    /// Dynamic operator for setting an attribute on an XElement.
    /// Example: foo?href <- "http://www.foo.com/"
    let (?<-) (el:XElement) (name:string) (value:obj) =
        el.SetAttributeValue(xn name, value)
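A small usage sketch (the element and attribute names are made up):
open System.Xml.Linq

let link = XElement(xn "link", XAttribute(xn "href", "http://www.foo.com/"))
let href = defaultArg link?href "<none>"   // "http://www.foo.com/"
link?rel <- "nofollow"                     // adds a rel attribute to the element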
OK, this has nothing to do with snippets, but I keep forgetting this:
If you are in the interactive window, you hit F7 to jump back to the code window (without deselecting the code which you just ran...)
Going from code window to F# window (and also to open the F# window) is Ctrl Alt F
(unless CodeRush has stolen your bindings...)
Weighted sum of arrays
Calculating a weighted [n-array] sum of a [k-array of n-arrays] of numbers, based on a [k-array] of weights
(Copied from this question, and kvb's answer)
Given these arrays
let weights = [|0.6;0.3;0.1|]
let arrs = [| [|0.0453;0.065345;0.07566;1.562;356.6|] ;
              [|0.0873;0.075565;0.07666;1.562222;3.66|] ;
              [|0.06753;0.075675;0.04566;1.452;3.4556|] |]
We want a weighted sum (by column), given that both dimensions of the arrays can be variable.
Array.map2 (fun w -> Array.map ((*) w)) weights arrs
|> Array.reduce (Array.map2 (+))
First line: Partial application of the first Array.map2 function to weights yields a new function (Array.map ((*) w)), which is applied (for each weight w) to each array in arrs.
Second line: Array.reduce is like fold, except it starts on the second value and uses the first as the initial 'state'. In this case each value is a 'line' of our array of arrays. So applying an Array.map2 (+) on the first two lines means that we sum the first two arrays, which leaves us with a new array, which we then (Array.reduce) sum again onto the next (in this case last) array.
Result:
[|0.060123; 0.069444; 0.07296; 1.5510666; 215.40356|]
Performance testing
(Found here and updated for latest release of F#)
open System
open System.Diagnostics

module PerformanceTesting =

    let Time func =
        let stopwatch = new Stopwatch()
        stopwatch.Start()
        func()
        stopwatch.Stop()
        stopwatch.Elapsed.TotalMilliseconds

    let GetAverageTime timesToRun func =
        Seq.initInfinite (fun _ -> (Time func))
        |> Seq.take timesToRun
        |> Seq.average

    let TimeOperation timesToRun =
        GC.Collect()
        GetAverageTime timesToRun

    let TimeOperations funcsWithName =
        let randomizer = new Random(int DateTime.Now.Ticks)
        funcsWithName
        |> Seq.sortBy (fun _ -> randomizer.Next())
        |> Seq.map (fun (name, func) -> name, (TimeOperation 100000 func))

    let TimeOperationsAFewTimes funcsWithName =
        Seq.initInfinite (fun _ -> (TimeOperations funcsWithName))
        |> Seq.take 50
        |> Seq.concat
        |> Seq.groupBy fst
        |> Seq.map (fun (name, individualResults) -> name, (individualResults |> Seq.map snd |> Seq.average))
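A small usage sketch (the functions being timed are made up):
open PerformanceTesting

let results =
    [ "List.map",  (fun () -> [1 .. 1000]     |> List.map  ((*) 2) |> ignore)
      "Array.map", (fun () -> [| 1 .. 1000 |] |> Array.map ((*) 2) |> ignore) ]
    |> TimeOperations
    |> List.ofSeq   // list of (name, average milliseconds per call) pairs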
DataSetExtensions for F#, DataReaders
System.Data.DataSetExtensions.dll adds the ability to treat a DataTable as an IEnumerable<DataRow> as well as unboxing the values of individual cells in a way that gracefully handles DBNull by supporting System.Nullable. For example, in C# we can get the value of an integer column that contains nulls, and specify that DBNull should default to zero with a very concise syntax:
var total = myDataTable.AsEnumerable()
.Select(row => row.Field<int?>("MyColumn") ?? 0)
.Sum();
There are two areas where DataSetExtensions are lacking, however. First, it doesn't support IDataReader and second, it doesn't support the F# option type. The following code does both - it allows an IDataReader to be treated as a seq<IDataRecord>, and it can unbox values from either a reader or a dataset, with support for F# options or System.Nullable. Combined with the option-coalescing operator in another answer, this allows for code such as the following when working with a DataReader:
let total =
    myReader.AsSeq
    |> Seq.map (fun row -> row.Field<int option>("MyColumn") |? 0)
    |> Seq.sum
Perhaps a more idiomatic F# way of ignoring database nulls would be...
let total =
    myReader.AsSeq
    |> Seq.choose (fun row -> row.Field<int option>("MyColumn"))
    |> Seq.sum
Further, the extension methods defined below are usable from both F# and from C#/VB.
open System
open System.Data
open System.Reflection
open System.Runtime.CompilerServices
open Microsoft.FSharp.Collections
/// Ported from System.Data.DatasetExtensions.dll to add support for the Option type.
[<AbstractClass; Sealed>]
type private UnboxT<'a> private () =

    // This class generates a converter function based on the desired output type,
    // and then re-uses the converter function forever. Because the class itself is generic,
    // different output types get different cached converter functions.

    static let referenceField (value:obj) =
        if value = null || DBNull.Value.Equals(value) then
            Unchecked.defaultof<'a>
        else
            unbox value

    static let valueField (value:obj) =
        if value = null || DBNull.Value.Equals(value) then
            raise <| InvalidCastException("Null cannot be converted to " + typeof<'a>.Name)
        else
            unbox value

    static let makeConverter (target:Type) methodName =
        Delegate.CreateDelegate(typeof<Converter<obj,'a>>,
                                typeof<UnboxT<'a>>
                                    .GetMethod(methodName, BindingFlags.NonPublic ||| BindingFlags.Static)
                                    .MakeGenericMethod([| target.GetGenericArguments().[0] |]))
        |> unbox<Converter<obj,'a>>
        |> FSharpFunc.FromConverter

    static let unboxFn =
        let theType = typeof<'a>
        if theType.IsGenericType && not theType.IsGenericTypeDefinition then
            let genericType = theType.GetGenericTypeDefinition()
            if typedefof<Nullable<_>> = genericType then
                makeConverter theType "NullableField"
            elif typedefof<option<_>> = genericType then
                makeConverter theType "OptionField"
            else
                invalidOp "The only generic types supported are Option<T> and Nullable<T>."
        elif theType.IsValueType then
            valueField
        else
            referenceField

    static member private NullableField<'b when 'b : struct and 'b :> ValueType and 'b:(new:unit -> 'b)> (value:obj) =
        if value = null || DBNull.Value.Equals(value) then
            Nullable<_>()
        else
            Nullable<_>(unbox<'b> value)

    static member private OptionField<'b> (value:obj) =
        if value = null || DBNull.Value.Equals(value) then
            None
        else
            Some(unbox<'b> value)

    static member inline Unbox =
        unboxFn
/// F# data-related extension methods.
[<AutoOpen>]
module FsDataEx =

    type System.Data.IDataReader with

        /// Exposes a reader's current result set as seq<IDataRecord>.
        /// Reader is closed when sequence is fully enumerated.
        member this.AsSeq =
            seq { use reader = this
                  while reader.Read() do yield reader :> IDataRecord }

        /// Exposes all result sets in a reader as seq<seq<IDataRecord>>.
        /// Reader is closed when sequence is fully enumerated.
        member this.AsMultiSeq =
            let rowSeq (reader:IDataReader) =
                seq { while reader.Read() do yield reader :> IDataRecord }
            seq {
                use reader = this
                yield rowSeq reader
                while reader.NextResult() do
                    yield rowSeq reader
            }

        /// Populates a new DataSet with the contents of the reader. Closes the reader after completion.
        member this.ToDataSet () =
            use reader = this
            let dataSet = new DataSet(RemotingFormat=SerializationFormat.Binary, EnforceConstraints=false)
            dataSet.Load(reader, LoadOption.OverwriteChanges, [| "" |])
            dataSet

    type System.Data.IDataRecord with

        /// Gets a value from the record by name.
        /// DBNull and null are returned as the default value for the type.
        /// Supports both nullable and option types.
        member this.Field<'a> (fieldName:string) =
            this.[fieldName] |> UnboxT<'a>.Unbox

        /// Gets a value from the record by column index.
        /// DBNull and null are returned as the default value for the type.
        /// Supports both nullable and option types.
        member this.Field<'a> (ordinal:int) =
            this.GetValue(ordinal) |> UnboxT<'a>.Unbox

    type System.Data.DataRow with

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (columnName:string) =
            this.[columnName] |> UnboxT<'a>.Unbox

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (columnIndex:int) =
            this.[columnIndex] |> UnboxT<'a>.Unbox

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (column:DataColumn) =
            this.[column] |> UnboxT<'a>.Unbox

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (columnName:string, version:DataRowVersion) =
            this.[columnName, version] |> UnboxT<'a>.Unbox

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (columnIndex:int, version:DataRowVersion) =
            this.[columnIndex, version] |> UnboxT<'a>.Unbox

        /// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
        member this.Field2<'a> (column:DataColumn, version:DataRowVersion) =
            this.[column, version] |> UnboxT<'a>.Unbox

/// C# data-related extension methods.
[<Extension; AbstractClass; Sealed>]
type CsDataEx private () =

    /// Populates a new DataSet with the contents of the reader. Closes the reader after completion.
    [<Extension>]
    static member ToDataSet(this:IDataReader) =
        this.ToDataSet()

    /// Exposes a reader's current result set as IEnumerable{IDataRecord}.
    /// Reader is closed when sequence is fully enumerated.
    [<Extension>]
    static member AsEnumerable(this:IDataReader) =
        this.AsSeq

    /// Exposes all result sets in a reader as IEnumerable{IEnumerable{IDataRecord}}.
    /// Reader is closed when sequence is fully enumerated.
    [<Extension>]
    static member AsMultipleEnumerable(this:IDataReader) =
        this.AsMultiSeq

    /// Gets a value from the record by name.
    /// DBNull and null are returned as the default value for the type.
    /// Supports both nullable and option types.
    [<Extension>]
    static member Field<'T> (this:IDataRecord, fieldName:string) =
        this.Field<'T>(fieldName)

    /// Gets a value from the record by column index.
    /// DBNull and null are returned as the default value for the type.
    /// Supports both nullable and option types.
    [<Extension>]
    static member Field<'T> (this:IDataRecord, ordinal:int) =
        this.Field<'T>(ordinal)
Handling arguments in a command line application:
//We assume that the actual meat is already defined in function
// DoStuff (string -> string -> string -> unit)
let defaultOutOption = "N"
let defaultUsageOption = "Y"

let usage =
    "Scans a folder for and outputs results.\n" +
    "Usage:\n\t MyApplication.exe FolderPath [IncludeSubfolders (Y/N) : default=" +
    defaultUsageOption + "] [OutputToFile (Y/N): default=" + defaultOutOption + "]"

let HandlArgs arr =
    match arr with
    | [|d;u;o|] -> DoStuff d u o
    | [|d;u|]   -> DoStuff d u defaultOutOption
    | [|d|]     -> DoStuff d defaultUsageOption defaultOutOption
    | _ ->
        printf "%s" usage
        Console.ReadLine() |> ignore

[<EntryPoint>]
let main (args : string array) =
    args |> HandlArgs
    0
(I had a vague memory of this technique being inspired by Robert Pickering, but can't find a reference now)
A handy cache function that keeps up to max (key, reader(key)) pairs in a dictionary and uses a LinkedList to track the most recently used keys
let Cache (reader: 'key -> 'value) max =
    let cache = new Dictionary<'key, LinkedListNode<'key * 'value>>()
    let keys = new LinkedList<'key * 'value>()

    fun (key : 'key) -> (
        let found, value = cache.TryGetValue key
        match found with
        | true ->
            keys.Remove value
            keys.AddFirst value |> ignore
            (snd value.Value)
        | false ->
            let newValue = key, reader key
            let node = keys.AddFirst newValue
            cache.[key] <- node
            if (keys.Count > max) then
                let lastNode = keys.Last
                cache.Remove (fst lastNode.Value) |> ignore
                keys.RemoveLast() |> ignore
            (snd newValue))
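A small usage sketch (the reader function and paths are made up; assumes open System.IO and open System.Collections.Generic are in scope):
let cachedRead = Cache (fun path -> File.ReadAllText path) 100
let text  = cachedRead "C:\\Test.txt"   // first call reads from disk
let text2 = cachedRead "C:\\Test.txt"   // a repeat call for the same key is served from the cache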
Creating XElements
Nothing amazing, but I keep getting caught out by the implicit conversion of XNames:
#r "System.Xml.Linq.dll"
open System.Xml.Linq
//No! ("type string not compatible with XName")
//let el = new XElement("MyElement", "text")
//better
let xn s = XName.op_Implicit s
let el = new XElement(xn "MyElement", "text")
//or even
let xEl s o = new XElement(xn s, o)
let el = xEl "MyElement" "text"
Pairwise and pairs
I always expect Seq.pairwise to give me [(1,2);(3;4)] and not [(1,2);(2,3);(3,4)]. Given that neither exist in List, and that I needed both, here's the code for future reference. I think they're tail recursive.
//converts to 'windowed tuples' ([1;2;3;4;5] -> [(1,2);(2,3);(3,4);(4,5)])
let pairwise lst =
    let rec loop prev rem acc =
        match rem with
        | hd::tl -> loop hd tl ((prev, hd)::acc)
        | _ -> List.rev acc
    loop (List.head lst) (List.tail lst) []

//converts to 'paged tuples' ([1;2;3;4;5;6] -> [(1,2);(3,4);(5,6)])
let pairs lst =
    let rec loop rem acc =
        match rem with
        | l::r::tl -> loop tl ((l, r)::acc)
        | l::[] -> failwith "odd-numbered list"
        | _ -> List.rev acc
    loop lst []
Naive CSV reader (i.e., won't handle anything nasty)
(Using filereadlines and List.transpose from other answers here)
///Given a file path, returns a List of row lists
let ReadCSV =
    filereadlines
    >> Array.map (fun line -> line.Split([|','; ';'|]) |> List.ofArray)
    >> Array.toList

///takes list of col ids and list of rows,
/// returns array of columns (in requested order)
let GetColumns cols rows =
    //Create filter
    let pick cols (row:list<'a>) = List.map (fun i -> row.[i]) cols

    rows
    |> transpose    //change list of rows to list of columns
    |> pick cols    //pick out the columns we want
    |> Array.ofList //an array output is easier to index for user
Example
"C:\MySampleCSV"
|> ReadCSV
|> List.tail //skip header line
|> GetColumns [0;3;1] //reorder columns as well, if needs be.
Date Range
simple but useful list of dates between fromDate and toDate
let getDateRange fromDate toDate =
    let rec dates (fromDate:System.DateTime) (toDate:System.DateTime) =
        seq {
            if fromDate <= toDate then
                yield fromDate
                yield! dates (fromDate.AddDays(1.0)) toDate
        }
    dates fromDate toDate
    |> List.ofSeq
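A small usage sketch (the dates are made up; both endpoints are included):
getDateRange (System.DateTime(2010, 1, 1)) (System.DateTime(2010, 1, 3))
// -> a three-element list: 1, 2 and 3 January 2010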
toggle code to sql
More trivial than most on this list, but handy nonetheless:
I'm always taking sql in and out of code to move it to a sql environment during development. Example:
let sql = "select a,b,c "
+ "from table "
+ "where a = 1"
needs to be 'stripped' to:
select a,b,c
from table
where a = 1
keeping the formatting. It's a pain to strip out the code symbols for the sql editor, then put them back again by hand when I've got the sql worked out. These two functions toggle the sql back and forth from code to stripped:
// reads the file with the code quoted sql, strips code symbols, dumps to FSI
let stripForSql fileName =
    File.ReadAllText(fileName)
    |> (fun s -> Regex.Replace(s, "\+(\s*)\"", ""))
    |> (fun s -> s.Replace("\"", ""))
    |> (fun s -> Regex.Replace(s, ";$", ""))   // end of line semicolons
    |> (fun s -> Regex.Replace(s, "//.+", "")) // get rid of any comments
    |> (fun s -> printfn "%s" s)
then when you are ready to put it back into your code source file:
let prepFromSql fileName =
    File.ReadAllText(fileName)
    |> (fun s -> Regex.Replace(s, @"\r\n", " \"\r\n+\"")) // matches newline
    |> (fun s -> Regex.Replace(s, @"\A", " \""))
    |> (fun s -> Regex.Replace(s, @"\z", " \""))
    |> (fun s -> printfn "%s" s)
I'd love to get rid of the input file but can't even begin to grok how to make that happen. anyone?
edit:
I figured out how to eliminate the requirement of a file for these functions by adding a windows forms dialog input/output. Too much code to show, but for those who would like to do such a thing, that's how I solved it.
Pascal's Triangle (hey, someone might find it useful)
So we want to create something like this:
    1
   1 1
  1 2 1
 1 3 3 1
1 4 6 4 1
Easy enough:
let rec next = function
    | [] -> []
    | x::y::xs -> (x + y)::next (y::xs)
    | x::xs -> x::next xs

let pascal n =
    seq { 1 .. n }
    |> Seq.scan (fun acc _ -> next (0::acc)) [1]
The next function returns a new list where each item[i] = item[i] + item[i + 1].
Here's the output in fsi:
> pascal 10 |> Seq.iter (printfn "%A");;
[1]
[1; 1]
[1; 2; 1]
[1; 3; 3; 1]
[1; 4; 6; 4; 1]
[1; 5; 10; 10; 5; 1]
[1; 6; 15; 20; 15; 6; 1]
[1; 7; 21; 35; 35; 21; 7; 1]
[1; 8; 28; 56; 70; 56; 28; 8; 1]
[1; 9; 36; 84; 126; 126; 84; 36; 9; 1]
[1; 10; 45; 120; 210; 252; 210; 120; 45; 10; 1]
For the adventurous, here's a tail-recursive version:
let rec next2 cont = function
    | [] -> cont []
    | x::y::xs -> next2 (fun l -> cont <| (x + y)::l) <| y::xs
    | x::xs -> next2 (fun l -> cont <| x::l) <| xs

let pascal2 n =
    seq { 1 .. n }
    |> Seq.scan (fun acc _ -> next2 id <| 0::acc) [1]
Flatten a List
if you have something like this:
let listList = [[1;2;3;];[4;5;6]]
and want to 'flatten' it down to a singe list so the result is like this:
[1;2;3;4;5;6]
it can be done thusly:
let flatten (l: 'a list list) =
    seq {
        for a in l do yield! a
    }
    |> List.ofSeq
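For reference, the built-in List.concat (or List.collect id) does the same thing:
List.concat listList // [1; 2; 3; 4; 5; 6]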
List comprehensions for float
This [23.0 .. 1.0 .. 40.0] was marked as deprecated a few versions back.
But apparently, this works:
let dl = 9.5 / 11.
let min = 21.5 + dl
let max = 40.5 - dl
let a = [ for z in min .. dl .. max -> z ]
let b = a.Length
(BTW, there's a floating point gotcha in there. Discovered at fssnip - the other place for F# snippets)
Parallel map
let pmap f s =
    seq { for a in s -> async { return f a } }
    |> Async.Parallel
    |> Async.RunSynchronously
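A small usage sketch (the work function is made up; the result is an array):
pmap (fun x -> x * x) [1 .. 10] // [|1; 4; 9; 16; 25; 36; 49; 64; 81; 100|]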
