I have these records:
type Name = string
type PhoneNumber = int
type Sex = Male | Female
type YearOfBirth = int
type Interests = string list
type Client = {name: Name; phone: PhoneNumber; sex: Sex; birth: YearOfBirth; interests: Interests}
let client1 = {name = "Jon"; phone = 37613498; sex = Male; birth = 1980; interests = ["Cars"; "Boats"; "Airplanes"]}
let client2 = {name = "Jonna"; phone = 31852654; sex = Female; birth = 1970; interests = ["Makeup"; "Sewing"; "Bananas"]}
Which I put into a list:
let file1 = [client1;client2]
I'm then trying to write a function using Map that filters file1 and returns only the clients whose year of birth matches the one given to the function.
Example:
requestMap 1980
Would return map [("Jon", (37613498, Male, 1980, ["Cars"; "Boats"; "Airplanes"]))] in this case.
I've fumbled my way to a function, but now I'm a bit stuck.
let requestMap yob =
Map.ofList [for f in file1 do yield f.name,(f.phone,f.sex,f.birth,f.interests)] |>
Map.filter (fun key value -> )
I'm having trouble figuring out how to get at birth inside the current map's value, because as it stands it's hidden inside value, which is a PhoneNumber * Sex * YearOfBirth * Interests tuple.
Any hints?
To access elements of a tuple, you can use pattern matching:
Map.filter (fun key (phone, sex, birth, interests) -> birth = yob)
Or, if you're not interested in anything except birth year, you can ignore all other fields using underscore:
Map.filter (fun _ (_, _, birth, _) -> birth = yob)
That said, I would recommend filtering first and creating the map afterwards; this is less expensive:
let requestMap yob =
file1
|> List.filter (fun x -> x.birth = yob)
|> List.map (fun f -> f.name,(f.phone,f.sex,f.birth,f.interests))
|> Map.ofList
And while we're on the subject: why do you need to create that huge tuple in the first place? Can't you make the original records be values in your map? Like this:
let requestMap yob =
file1
|> List.filter (fun x -> x.birth = yob)
|> List.map (fun f -> f.name, f)
|> Map.ofList
Here are some more options not mentioned in other answers:
Store the whole Client in the Map value:
[for f in file1 do yield f.name, f]
|> Map.ofList
|> Map.filter (fun _ f -> f.birth = yob)
Do a conditional yield with an if:
[ for f in file1 do
if f.birth = yob then
yield f.name, f ]
|> Map.ofList
I think it would be better to filter the file1 list before transforming it into a map rather than creating the map and then filtering it.
let requestMap yob =
let filtered =
List.filter (fun client ->
match client with
| { Client.birth = year } when year = yob -> true
| _ -> false) file1
Map.ofList [for f in filtered do yield f.name,(f.phone,f.sex,f.birth,f.interests)]
I want to build a dictionary from a list of items.
An item has the following definition:
type Item =
| A of TotalPrice * Special
| B of TotalPrice * Special
| C of TotalPrice
| D of TotalPrice
I want the keys of the dictionary to map to the case ids:
| A
| B
| C
| D
I would then have the values for the case id be a list.
How do I separate the case ids from the case values?
Example:
let dictionary = items |> List.map (fun item -> item) // uh...
Appendix:
module Checkout
(*Types*)
type UnitPrice = int
type Qty = int
type Special =
| ThreeForOneThirty
| TwoForFourtyFive
type TotalPrice = { UnitPrice:int ; Qty:int }
type Item =
| A of TotalPrice * Special
| B of TotalPrice * Special
| C of TotalPrice
| D of TotalPrice
(*Functions*)
let totalPrice (items:Item list) =
let dictionary = items |> List.map (fun item -> item) // uh...
0
(*Tests*)
open FsUnit
open NUnit.Framework
[<Test>]
let ``buying 2 A units, B unit, A unit = $160`` () =
// Setup
let items = [A ({UnitPrice=50; Qty=2} , ThreeForOneThirty)
B ({UnitPrice=30; Qty=1} , TwoForFourtyFive)
A ({UnitPrice=50; Qty=1} , ThreeForOneThirty)]
items |> totalPrice |> should equal 160
Your data is badly defined for your use case. If you want to refer to the kinds of items by themselves, you need to define them by themselves:
type ItemKind = A | B | C | D
type Item = { Kind: ItemKind; Price: TotalPrice; Special: Special option }
Then you can easily build a dictionary of items:
let dictionary = items |> List.map (fun i -> i.Kind, i) |> dict
Although I must note that such a dictionary may not be possible: if the items list contains several items of the same kind, some of them will not make it into the dictionary, because it can't contain duplicate keys. Perhaps I didn't understand what kind of dictionary you're after.
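Since the question mentions wanting a list of values per case, one workaround (a sketch, assuming items has been re-declared with the ItemKind/record definitions above) is to group first and build a map of lists, so duplicate kinds don't collide:
// Group items by kind; each key maps to the list of items of that kind (sketch only)
let byKind =
    items
    |> List.groupBy (fun i -> i.Kind)
    |> Map.ofList
// byKind.[A] would then be the list of all A items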
If you want to create a dictionary with keys like A, B, C and D, you will fail, because A and B are constructors of type TotalPrice * Special -> Item while C and D are constructors of type TotalPrice -> Item, so the dictionary would have no single key type to settle on.
Getting a DU case's name via reflection is doable, but is it really necessary for your case?
A different type structure will probably serve you better, i.e. Fyodor Soikin's proposal.
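For completeness, here is roughly what the reflection route looks like (a sketch using FSharp.Reflection; whether it's worth doing is another matter):
open Microsoft.FSharp.Reflection

// Returns the union case name of any DU value, e.g. "A", "B", "C" or "D"
let caseName (x: 'T) =
    let case, _ = FSharpValue.GetUnionFields(x, typeof<'T>)
    case.Name

// Group the original Item values under their case names
let byCase = items |> List.groupBy caseName |> Map.ofList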
Maybe the following will clarify somewhat why the data structure and code are no good as they stand, and in doing so also show that this is mainly not an FP issue, as was suggested in some of the comments et al.
My guess is that the question is related to "how can this be grouped", and lo and behold, there is in fact a groupBy function!
(*Types*)
type UnitPrice = int
type Qty = int
type Special =
| ThreeForOneThirty
| TwoForFourtyFive
type TotalPrice = { UnitPrice:int ; Qty:int }
type Item =
| A of TotalPrice * Special
| B of TotalPrice * Special
| C of TotalPrice
| D of TotalPrice
let items = [A ({UnitPrice=50; Qty=2} , ThreeForOneThirty)
B ({UnitPrice=30; Qty=1} , TwoForFourtyFive)
A ({UnitPrice=50; Qty=1} , ThreeForOneThirty)]
let speciallyStupidTransformation =
function
| ThreeForOneThirty -> 34130
| TwoForFourtyFive -> 2445
let stupidTransformation =
function
| A (t,s) -> "A" + (s |> speciallyStupidTransformation |> string)
| B (t,s) -> "B" + (s |> speciallyStupidTransformation |> string)
| C (t) -> "C"
| D(t) -> "D"
let someGrouping = items |> List.groupBy(stupidTransformation)
val it : (string * Item list) list =
[("A34130",
[A ({UnitPrice = 50;
Qty = 2;},ThreeForOneThirty); A ({UnitPrice = 50;
Qty = 1;},ThreeForOneThirty)]);
("B2445", [B ({UnitPrice = 30;
Qty = 1;},TwoForFourtyFive)])]
Yeah, it's still a bad idea. But it's now grouped by a unique key, and may be misused further to aggregate some sums or whatever.
Adding some more code for that, like the following:
let anotherStupidTransformation =
function
| A(t,_) -> (t.UnitPrice, t.Qty)
| B(t,_) -> (t.UnitPrice, t.Qty)
| C(t) -> (t.UnitPrice, t.Qty)
| D(t) -> (t.UnitPrice, t.Qty)
let x4y x y tp q =
if q%x = 0 then y*q/x else tp/q*(q%x)+(q-q%x)/x*y
let ``34130`` = x4y 3 130
let ``2445`` = x4y 2 45
let getRealStupidTotal =
function
| (s, (tp,q)) ->
(s|> List.ofSeq, (tp,q))
|> function
| (h::t, (tp,q)) ->
match t |> List.toArray |> System.String with
| "34130" -> ``34130`` tp q
| "2445" -> ``2445`` tp q
| _ -> tp
let totalPrice =
items
|> List.groupBy(stupidTransformation)
|> List.map(fun (i, l) -> i,
l
|> List.map(anotherStupidTransformation)
|> List.unzip
||> List.fold2(fun acc e1 e2 ->
((fst acc + e1) * e2, snd acc + e2) ) (0,0))
|> List.map(getRealStupidTotal)
|> List.sum
val totalPrice : int = 160
This might or might not get other test cases right.
For the test data above, as far as I can tell, the initial code is at least correct. The sum does come out as 160...
Would I use this code anywhere? Nope.
Is it readable? Nope.
Is it fixable? Not without changing the way the data are structured to avoid several of the stupid transformations...
There are already two questions about F#/functional snippets.
However what I'm looking for here are useful snippets, little 'helper' functions that are reusable. Or obscure but nifty patterns that you can never quite remember.
Something like:
open System.IO
let rec visitor dir filter=
seq { yield! Directory.GetFiles(dir, filter)
for subdir in Directory.GetDirectories(dir) do
yield! visitor subdir filter}
I'd like to make this a kind of handy reference page. As such there will be no right answer, but hopefully lots of good ones.
EDIT: Tomas Petricek has created a site specifically for F# snippets: http://fssnip.net/.
Perl style regex matching
let (=~) input pattern =
System.Text.RegularExpressions.Regex.IsMatch(input, pattern)
It lets you match text using let test = "monkey" =~ "monk.+" notation.
Infix Operator
I got this from http://sandersn.com/blog//index.php/2009/10/22/infix-function-trick-for-f (go to that page for more details).
If you know Haskell, you might find yourself missing infix sugar in F#:
// standard Haskell call has function first, then args just like F#. So obviously
// here there is a function that takes two strings and returns a bool: string -> string -> bool
startsWith "kevin" "k"
//Haskell infix operator via backQuotes. Sometimes makes a function read better.
"kevin" `startsWith` "K"
While F# doesn't have a true 'infix' operator, the same thing can be accomplished almost as elegantly via a pipeline and a 'backpipeline' (who knew of such a thing??)
// F# 'infix' trick via pipelines
"kevin" |> startsWith <| "K"
Multi-Line Strings
This is pretty trivial, but it seems to be a feature of F# strings that is not widely known.
let sql = "select a,b,c \
from table \
where a = 1"
This produces:
val sql : string = "select a,b,c from table where a = 1"
When the F# compiler sees a back-slash followed by a carriage return inside a string literal, it will remove everything from the back-slash to the first non-space character on the next line. This allows you to have multi-line string literals that line up, without using a bunch of string concatenation.
Generic memoization, courtesy of the man himself
let memoize f =
let cache = System.Collections.Generic.Dictionary<_,_>(HashIdentity.Structural)
fun x ->
let ok, res = cache.TryGetValue(x)
if ok then res
else let res = f x
cache.[x] <- res
res
Using this, you could do a cached reader like so:
let cachedReader = memoize reader
Simple read-write to text files
These are trivial, but make file access pipeable:
open System.IO
let fileread f = File.ReadAllText(f)
let filewrite f s = File.WriteAllText(f, s)
let filereadlines f = File.ReadAllLines(f)
let filewritelines f ar = File.WriteAllLines(f, ar)
So
let replace f (r:string) (s:string) = s.Replace(f, r)
"C:\\Test.txt" |>
fileread |>
replace "teh" "the" |>
filewrite "C:\\Test.txt"
And combining that with the visitor quoted in the question:
let filereplace find repl path =
path |> fileread |> replace find repl |> filewrite path
let recurseReplace root filter find repl =
visitor root filter |> Seq.iter (filereplace find repl)
Update: a slight improvement if you want to be able to read 'locked' files (e.g. csv files which are already open in Excel...):
let safereadall f =
use fs = new FileStream(f, FileMode.Open, FileAccess.Read, FileShare.ReadWrite)
use sr = new StreamReader(fs, System.Text.Encoding.Default)
sr.ReadToEnd()
let split sep (s:string) = System.Text.RegularExpressions.Regex.Split(s, sep)
let fileread f = safereadall f
let filereadlines f = f |> safereadall |> split System.Environment.NewLine
For performance intensive stuff where you need to check for null
let inline isNull o = System.Object.ReferenceEquals(o, null)
if isNull o then ... else ...
It is about 20x faster than
if o = null then ... else ...
Active Patterns, aka "Banana Clips", are a very handy construct that lets one match against multiple regular expression patterns. This is much like AWK, but without the high performance of DFAs, because the patterns are matched in sequence until one succeeds.
#light
open System
open System.Text.RegularExpressions
let (|Test|_|) pat s =
if (new Regex(pat)).IsMatch(s)
then Some()
else None
let (|Match|_|) pat s =
let opt = RegexOptions.None
let re = new Regex(pat,opt)
let m = re.Match(s)
if m.Success
then Some(m.Groups)
else None
Some examples of use:
let HasIndefiniteArticle = function
| Test "(?: |^)(a|an)(?: |$)" _ -> true
| _ -> false
type Ast =
| IntVal of string * int
| StringVal of string * string
| LineNo of int
| Goto of int
let Parse = function
| Match "^LET\s+([A-Z])\s*=\s*(\d+)$" g ->
IntVal( g.[1].Value, Int32.Parse(g.[2].Value) )
| Match "^LET\s+([A-Z]\$)\s*=\s*(.*)$" g ->
StringVal( g.[1].Value, g.[2].Value )
| Match "^(\d+)\s*:$" g ->
LineNo( Int32.Parse(g.[1].Value) )
| Match "^GOTO \s*(\d+)$" g ->
Goto( Int32.Parse(g.[1].Value) )
| s -> failwithf "Unexpected statement: %s" s
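A couple of hypothetical calls, just to show the shape of the results:
Parse "LET A = 5"                     // IntVal ("A", 5)
Parse "GOTO 10"                       // Goto 10
HasIndefiniteArticle "this is a test" // true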
Maybe monad
type maybeBuilder() =
member this.Bind(v, f) =
match v with
| None -> None
| Some(x) -> f x
member this.Delay(f) = f()
member this.Return(v) = Some v
let maybe = maybeBuilder()
Here's a brief intro to monads for the uninitiated.
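A small usage sketch (tryDivide is a made-up helper): if any step returns None, the whole block is None.
let tryDivide x y = if y = 0 then None else Some (x / y)

let result =
    maybe {
        let! a = tryDivide 100 5   // Some 20
        let! b = tryDivide a 2     // Some 10
        return a + b
    }
// result = Some 30; had either division returned None, result would be None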
Option-coalescing operators
I wanted a version of the defaultArg function that had a syntax closer to the C# null-coalescing operator, ??. This lets me get the value from an Option while providing a default value, using a very concise syntax.
/// Option-coalescing operator - this is like the C# ?? operator, but works with
/// the Option type.
/// Warning: Unlike the C# ?? operator, the second parameter will always be
/// evaluated.
/// Example: let foo = someOption |? default
let inline (|?) value defaultValue =
defaultArg value defaultValue
/// Option-coalescing operator with delayed evaluation. The other version of
/// this operator always evaluates the default value expression. If you only
/// want to create the default value when needed, use this operator and pass
/// in a function that creates the default.
/// Example: let foo = someOption |?! (fun () -> new Default())
let inline (|?!) value f =
match value with Some x -> x | None -> f()
'Unitize' a function which doesn't handle units
Using the FloatWithMeasure function http://msdn.microsoft.com/en-us/library/ee806527(VS.100).aspx.
let unitize (f:float -> float) (v:float<'u>) =
LanguagePrimitives.FloatWithMeasure<'u> (f (float v))
Example:
[<Measure>] type m
[<Measure>] type kg
let unitize (f:float -> float) (v:float<'u>) =
LanguagePrimitives.FloatWithMeasure<'u> (f (float v))
//this function doesn't take units
let badinc a = a + 1.
//this one does!
let goodinc v = unitize badinc v
goodinc 3.<m>
goodinc 3.<kg>
OLD version:
let unitize (f:float -> float) (v:float<'u>) =
let unit = box 1. :?> float<'u>
unit * (f (v/unit))
Kudos to kvb
Scale/Ratio function builder
Again, trivial, but handy.
//returns a function which will convert from a1-a2 range to b1-b2 range
let scale (a1:float<'u>, a2:float<'u>) (b1:float<'v>,b2:float<'v>) =
let m = (b2 - b1)/(a2 - a1) //gradient of line (evaluated once only..)
(fun a -> b1 + m * (a - a1))
Example:
[<Measure>] type m
[<Measure>] type px
let screenSize = (0.<px>, 300.<px>)
let displayRange = (100.<m>, 200.<m>)
let scaleToScreen = scale displayRange screenSize
scaleToScreen 120.<m> //-> 60.<px>
Transposing a list (seen on Jomo Fisher's blog)
///Given list of 'rows', returns list of 'columns'
let rec transpose lst =
match lst with
| (_::_)::_ -> List.map List.head lst :: transpose (List.map List.tail lst)
| _ -> []
transpose [[1;2;3];[4;5;6];[7;8;9]] // returns [[1;4;7];[2;5;8];[3;6;9]]
And here is a tail-recursive version which (from my sketchy profiling) is mildly slower, but has the advantage of not throwing a stack overflow when the inner lists are longer than 10000 elements (on my machine):
let transposeTR lst =
let rec inner acc lst =
match lst with
| (_::_)::_ -> inner (List.map List.head lst :: acc) (List.map List.tail lst)
| _ -> List.rev acc
inner [] lst
If I was clever, I'd try and parallelise it with async...
F# Map <-> C# Dictionary
(I know, I know, System.Collections.Generic.Dictionary isn't really a 'C#' dictionary)
C# to F#
(dic :> seq<_>) //cast to seq of KeyValuePair
|> Seq.map (|KeyValue|) //convert KeyValuePairs to tuples
|> Map.ofSeq //convert to Map
(From Brian, here, with improvement proposed by Mauricio in comment below. (|KeyValue|) is an active pattern for matching KeyValuePair - from FSharp.Core - equivalent to (fun kvp -> kvp.Key, kvp.Value))
Interesting alternative
To get all of the immutable goodness, but with the O(1) lookup speed of Dictionary, you can use the dict operator, which returns an immutable IDictionary (see this question).
I currently can't see a way to directly convert a Dictionary using this method, other than
(dic :> seq<_>) //cast to seq of KeyValuePair
|> Seq.map (fun kvp -> kvp.Key, kvp.Value) //convert KeyValuePairs to tuples
|> dict //convert to immutable IDictionary
F# to C#
let dic = Dictionary()
map |> Map.iter (fun k t -> dic.Add(k, t))
dic
What is weird here is that FSI will report the type as (for example):
val it : Dictionary<string,int> = dict [("a",1);("b",2)]
but if you feed dict [("a",1);("b",2)] back in, FSI reports
IDictionary<string,int> = seq[[a,1] {Key = "a"; Value = 1; } ...
Tree-sort / Flatten a tree into a list
I have the following binary tree:
___ 77 _
/ \
______ 47 __ 99
/ \
21 _ 54
\ / \
43 53 74
/
39
/
32
Which is represented as follows:
type 'a tree =
| Node of 'a tree * 'a * 'a tree
| Nil
let myTree =
Node
(Node
(Node (Nil,21,Node (Node (Node (Nil,32,Nil),39,Nil),43,Nil)),47,
Node (Node (Nil,53,Nil),54,Node (Nil,74,Nil))),77,Node (Nil,99,Nil))
A straightforward method to flatten the tree is:
let rec flatten = function
| Nil -> []
| Node(l, a, r) -> flatten l @ a :: flatten r
This isn't tail-recursive, and I believe the @ operator causes it to be O(n log n) or O(n^2) with unbalanced binary trees. With a little tweaking, I came up with this tail-recursive O(n) version:
let flatten2 t =
let rec loop acc c = function
| Nil -> c acc
| Node(l, a, r) ->
loop acc (fun acc' -> loop (a::acc') c l) r
loop [] (fun x -> x) t
Here's the output in fsi:
> flatten2 myTree;;
val it : int list = [21; 32; 39; 43; 47; 53; 54; 74; 77; 99]
LINQ-to-XML helpers
namespace System.Xml.Linq
// hide warning about op_Explicit
#nowarn "77"
[<AutoOpen>]
module XmlUtils =
/// Converts a string to an XName.
let xn = XName.op_Implicit
/// Converts a string to an XNamespace.
let xmlns = XNamespace.op_Implicit
/// Gets the string value of any XObject subclass that has a Value property.
let inline xstr (x : ^a when ^a :> XObject) =
(^a : (member get_Value : unit -> string) x)
/// Gets a strongly-typed value from any XObject subclass, provided that
/// an explicit conversion to the output type has been defined.
/// (Many explicit conversions are defined on XElement and XAttribute)
/// Example: let value:int = xval foo
let inline xval (x : ^a when ^a :> XObject) : ^b =
((^a or ^b) : (static member op_Explicit : ^a -> ^b) x)
/// Dynamic lookup operator for getting an attribute value from an XElement.
/// Returns a string option, set to None if the attribute was not present.
/// Example: let value = foo?href
/// Example with default: let value = defaultArg foo?Name "<Unknown>"
let (?) (el:XElement) (name:string) =
match el.Attribute(xn name) with
| null -> None
| att -> Some(att.Value)
/// Dynamic operator for setting an attribute on an XElement.
/// Example: foo?href <- "http://www.foo.com/"
let (?<-) (el:XElement) (name:string) (value:obj) =
el.SetAttributeValue(xn name, value)
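A quick usage sketch, assuming the XmlUtils module above is opened (the element and attribute names are made up):
let el = XElement(xn "link", XAttribute(xn "href", "http://www.foo.com/"))
let href = defaultArg (el?href) "<none>"   // "http://www.foo.com/"
el?rel <- "nofollow"                       // adds/sets the rel attribute
let contents = xstr el                     // string value of the element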
OK, this has nothing to do with snippets, but I keep forgetting this:
If you are in the interactive window, you hit F7 to jump back to the code window (without deselecting the code which you just ran...)
Going from code window to F# window (and also to open the F# window) is Ctrl Alt F
(unless CodeRush has stolen your bindings...)
Weighted sum of arrays
Calculating a weighted [n-array] sum of a [k-array of n-arrays] of numbers, based on a [k-array] of weights
(Copied from this question, and kvb's answer)
Given these arrays
let weights = [|0.6;0.3;0.1|]
let arrs = [| [|0.0453;0.065345;0.07566;1.562;356.6|] ;
[|0.0873;0.075565;0.07666;1.562222;3.66|] ;
[|0.06753;0.075675;0.04566;1.452;3.4556|] |]
We want a weighted sum (by column), given that both dimensions of the arrays can be variable.
Array.map2 (fun w -> Array.map ((*) w)) weights arrs
|> Array.reduce (Array.map2 (+))
First line: Partial application of the first Array.map2 function to weights yields a new function (Array.map ((*) weight)) which is applied (for each weight) to the corresponding array in arrs.
Second line: Array.reduce is like fold, except it starts on the second value and uses the first as the initial 'state'. In this case each value is a 'line' of our array of arrays. So applying an Array.map2 (+) on the first two lines means that we sum the first two arrays, which leaves us with a new array, which we then (Array.reduce) sum again onto the next (in this case last) array.
Result:
[|0.060123; 0.069444; 0.07296; 1.5510666; 215.40356|]
Performance testing
(Found here and updated for latest release of F#)
open System
open System.Diagnostics
module PerformanceTesting =
let Time func =
let stopwatch = new Stopwatch()
stopwatch.Start()
func()
stopwatch.Stop()
stopwatch.Elapsed.TotalMilliseconds
let GetAverageTime timesToRun func =
Seq.initInfinite (fun _ -> (Time func))
|> Seq.take timesToRun
|> Seq.average
let TimeOperation timesToRun =
GC.Collect()
GetAverageTime timesToRun
let TimeOperations funcsWithName =
let randomizer = new Random(int DateTime.Now.Ticks)
funcsWithName
|> Seq.sortBy (fun _ -> randomizer.Next())
|> Seq.map (fun (name, func) -> name, (TimeOperation 100000 func))
let TimeOperationsAFewTimes funcsWithName =
Seq.initInfinite (fun _ -> (TimeOperations funcsWithName))
|> Seq.take 50
|> Seq.concat
|> Seq.groupBy fst
|> Seq.map (fun (name, individualResults) -> name, (individualResults |> Seq.map snd |> Seq.average))
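A hypothetical usage, comparing two arbitrary functions (the names and bodies are just examples; note that with the defaults above each function is run 100,000 times, repeated 50 times over):
let results =
    [ "List.map",  (fun () -> [1 .. 1000]   |> List.map  ((+) 1) |> ignore)
      "Array.map", (fun () -> [|1 .. 1000|] |> Array.map ((+) 1) |> ignore) ]
    |> PerformanceTesting.TimeOperationsAFewTimes
    |> Seq.toList
// each entry is (name, average elapsed milliseconds)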
DataSetExtensions for F#, DataReaders
System.Data.DataSetExtensions.dll adds the ability to treat a DataTable as an IEnumerable<DataRow> as well as unboxing the values of individual cells in a way that gracefully handles DBNull by supporting System.Nullable. For example, in C# we can get the value of an integer column that contains nulls, and specify that DBNull should default to zero with a very concise syntax:
var total = myDataTable.AsEnumerable()
.Select(row => row.Field<int?>("MyColumn") ?? 0)
.Sum();
There are two areas where DataSetExtensions is lacking, however: first, it doesn't support IDataReader, and second, it doesn't support the F# option type. The following code does both: it allows an IDataReader to be treated as a seq<IDataRecord>, and it can unbox values from either a reader or a dataset, with support for F# options or System.Nullable. Combined with the option-coalescing operator in another answer, this allows for code such as the following when working with a DataReader:
let total =
myReader.AsSeq
|> Seq.map (fun row -> row.Field<int option>("MyColumn") |? 0)
|> Seq.sum
Perhaps a more idiomatic F# way of ignoring database nulls would be...
let total =
myReader.AsSeq
|> Seq.choose (fun row -> row.Field<int option>("MyColumn"))
|> Seq.sum
Further, the extension methods defined below are usable from both F# and from C#/VB.
open System
open System.Data
open System.Reflection
open System.Runtime.CompilerServices
open Microsoft.FSharp.Collections
/// Ported from System.Data.DatasetExtensions.dll to add support for the Option type.
[<AbstractClass; Sealed>]
type private UnboxT<'a> private () =
// This class generates a converter function based on the desired output type,
// and then re-uses the converter function forever. Because the class itself is generic,
// different output types get different cached converter functions.
static let referenceField (value:obj) =
if value = null || DBNull.Value.Equals(value) then
Unchecked.defaultof<'a>
else
unbox value
static let valueField (value:obj) =
if value = null || DBNull.Value.Equals(value) then
raise <| InvalidCastException("Null cannot be converted to " + typeof<'a>.Name)
else
unbox value
static let makeConverter (target:Type) methodName =
Delegate.CreateDelegate(typeof<Converter<obj,'a>>,
typeof<UnboxT<'a>>
.GetMethod(methodName, BindingFlags.NonPublic ||| BindingFlags.Static)
.MakeGenericMethod([| target.GetGenericArguments().[0] |]))
|> unbox<Converter<obj,'a>>
|> FSharpFunc.FromConverter
static let unboxFn =
let theType = typeof<'a>
if theType.IsGenericType && not theType.IsGenericTypeDefinition then
let genericType = theType.GetGenericTypeDefinition()
if typedefof<Nullable<_>> = genericType then
makeConverter theType "NullableField"
elif typedefof<option<_>> = genericType then
makeConverter theType "OptionField"
else
invalidOp "The only generic types supported are Option<T> and Nullable<T>."
elif theType.IsValueType then
valueField
else
referenceField
static member private NullableField<'b when 'b : struct and 'b :> ValueType and 'b:(new:unit -> 'b)> (value:obj) =
if value = null || DBNull.Value.Equals(value) then
Nullable<_>()
else
Nullable<_>(unbox<'b> value)
static member private OptionField<'b> (value:obj) =
if value = null || DBNull.Value.Equals(value) then
None
else
Some(unbox<'b> value)
static member inline Unbox =
unboxFn
/// F# data-related extension methods.
[<AutoOpen>]
module FsDataEx =
type System.Data.IDataReader with
/// Exposes a reader's current result set as seq<IDataRecord>.
/// Reader is closed when sequence is fully enumerated.
member this.AsSeq =
seq { use reader = this
while reader.Read() do yield reader :> IDataRecord }
/// Exposes all result sets in a reader as seq<seq<IDataRecord>>.
/// Reader is closed when sequence is fully enumerated.
member this.AsMultiSeq =
let rowSeq (reader:IDataReader) =
seq { while reader.Read() do yield reader :> IDataRecord }
seq {
use reader = this
yield rowSeq reader
while reader.NextResult() do
yield rowSeq reader
}
/// Populates a new DataSet with the contents of the reader. Closes the reader after completion.
member this.ToDataSet () =
use reader = this
let dataSet = new DataSet(RemotingFormat=SerializationFormat.Binary, EnforceConstraints=false)
dataSet.Load(reader, LoadOption.OverwriteChanges, [| "" |])
dataSet
type System.Data.IDataRecord with
/// Gets a value from the record by name.
/// DBNull and null are returned as the default value for the type.
/// Supports both nullable and option types.
member this.Field<'a> (fieldName:string) =
this.[fieldName] |> UnboxT<'a>.Unbox
/// Gets a value from the record by column index.
/// DBNull and null are returned as the default value for the type.
/// Supports both nullable and option types.
member this.Field<'a> (ordinal:int) =
this.GetValue(ordinal) |> UnboxT<'a>.Unbox
type System.Data.DataRow with
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (columnName:string) =
this.[columnName] |> UnboxT<'a>.Unbox
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (columnIndex:int) =
this.[columnIndex] |> UnboxT<'a>.Unbox
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (column:DataColumn) =
this.[column] |> UnboxT<'a>.Unbox
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (columnName:string, version:DataRowVersion) =
this.[columnName, version] |> UnboxT<'a>.Unbox
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (columnIndex:int, version:DataRowVersion) =
this.[columnIndex, version] |> UnboxT<'a>.Unbox
/// Identical to the Field method from DatasetExtensions, but supports the F# Option type.
member this.Field2<'a> (column:DataColumn, version:DataRowVersion) =
this.[column, version] |> UnboxT<'a>.Unbox
/// C# data-related extension methods.
[<Extension; AbstractClass; Sealed>]
type CsDataEx private () =
/// Populates a new DataSet with the contents of the reader. Closes the reader after completion.
[<Extension>]
static member ToDataSet(this:IDataReader) =
this.ToDataSet()
/// Exposes a reader's current result set as IEnumerable{IDataRecord}.
/// Reader is closed when sequence is fully enumerated.
[<Extension>]
static member AsEnumerable(this:IDataReader) =
this.AsSeq
/// Exposes all result sets in a reader as IEnumerable{IEnumerable{IDataRecord}}.
/// Reader is closed when sequence is fully enumerated.
[<Extension>]
static member AsMultipleEnumerable(this:IDataReader) =
this.AsMultiSeq
/// Gets a value from the record by name.
/// DBNull and null are returned as the default value for the type.
/// Supports both nullable and option types.
[<Extension>]
static member Field<'T> (this:IDataRecord, fieldName:string) =
this.Field<'T>(fieldName)
/// Gets a value from the record by column index.
/// DBNull and null are returned as the default value for the type.
/// Supports both nullable and option types.
[<Extension>]
static member Field<'T> (this:IDataRecord, ordinal:int) =
this.Field<'T>(ordinal)
Handling arguments in a command line application:
//We assume that the actual meat is already defined in function
// DoStuff (string -> string -> string -> unit)
let defaultOutOption = "N"
let defaultUsageOption = "Y"
let usage =
"Scans a folder for and outputs results.\n" +
"Usage:\n\t MyApplication.exe FolderPath [IncludeSubfolders (Y/N) : default=" +
defaultUsageOption + "] [OutputToFile (Y/N): default=" + defaultOutOption + "]"
let HandleArgs arr =
match arr with
| [|d;u;o|] -> DoStuff d u o
| [|d;u|] -> DoStuff d u defaultOutOption
| [|d|] -> DoStuff d defaultUsageOption defaultOutOption
| _ ->
printf "%s" usage
Console.ReadLine() |> ignore
[<EntryPoint>]
let main (args : string array) =
args |> HandleArgs
0
(I had a vague memory of this technique being inspired by Robert Pickering, but can't find a reference now)
A handy cache function that keeps up to max (key, reader(key)) pairs in a dictionary and uses a LinkedList to track the most recently used keys:
open System.Collections.Generic

let Cache (reader: 'key -> 'value) max =
let cache = new Dictionary<'key,LinkedListNode<'key * 'value>>()
let keys = new LinkedList<'key * 'value>()
fun (key : 'key) -> (
let found, value = cache.TryGetValue key
match found with
|true ->
keys.Remove value
keys.AddFirst value |> ignore
(snd value.Value)
|false ->
let newValue = key,reader key
let node = keys.AddFirst newValue
cache.[key] <- node
if (keys.Count > max) then
let lastNode = keys.Last
cache.Remove (fst lastNode.Value) |> ignore
keys.RemoveLast() |> ignore
(snd newValue))
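Hypothetical usage: cache up to 100 file reads keyed by path (any 'key -> 'value reader works).
let cachedRead = Cache (fun path -> System.IO.File.ReadAllText path) 100
let text = cachedRead "C:\\Test.txt"   // first call hits the disk, later calls hit the cache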
Creating XElements
Nothing amazing, but I keep getting caught out by the implicit conversion of XNames:
#r "System.Xml.Linq.dll"
open System.Xml.Linq
//No! ("type string not compatible with XName")
//let el = new XElement("MyElement", "text")
//better
let xn s = XName.op_Implicit s
let el = new XElement(xn "MyElement", "text")
//or even
let xEl s o = new XElement(xn s, o)
let el = xEl "MyElement" "text"
Pairwise and pairs
I always expect Seq.pairwise to give me [(1,2);(3,4)] and not [(1,2);(2,3);(3,4)]. Given that neither exists for List, and that I needed both, here's the code for future reference. I think they're tail recursive.
//converts to 'windowed tuples' ([1;2;3;4;5] -> [(1,2);(2,3);(3,4);(4,5)])
let pairwise lst =
let rec loop prev rem acc =
match rem with
| hd::tl -> loop hd tl ((prev,hd)::acc)
| _ -> List.rev acc
loop (List.head lst) (List.tail lst) []
//converts to 'paged tuples' ([1;2;3;4;5;6] -> [(1,2);(3,4);(5,6)])
let pairs lst =
let rec loop rem acc =
match rem with
| l::r::tl -> loop tl ((l,r)::acc)
| l::[] -> failwith "odd-numbered list"
| _ -> List.rev acc
loop lst []
Naive CSV reader (i.e., won't handle anything nasty)
(Using filereadlines and transpose from other answers here)
///Given a file path, returns a List of row lists
let ReadCSV =
filereadlines
>> Array.map ( fun line -> line.Split([|',';';'|]) |> List.ofArray )
>> Array.toList
///takes list of col ids and list of rows,
/// returns array of columns (in requested order)
let GetColumns cols rows =
//Create filter
let pick cols (row:list<'a>) = List.map (fun i -> row.[i]) cols
rows
|> transpose //change list of rows to list of columns
|> pick cols //pick out the columns we want
|> Array.ofList //an array output is easier to index for user
Example
"C:\MySampleCSV"
|> ReadCSV
|> List.tail //skip header line
|> GetColumns [0;3;1] //reorder columns as well, if needs be.
Date Range
simple but useful list of dates between fromDate and toDate
let getDateRange fromDate toDate =
let rec dates (fromDate:System.DateTime) (toDate:System.DateTime) =
seq {
if fromDate <= toDate then
yield fromDate
yield! dates (fromDate.AddDays(1.0)) toDate
}
dates fromDate toDate
|> List.ofSeq
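For example (both ends are inclusive):
getDateRange (System.DateTime(2010, 1, 1)) (System.DateTime(2010, 1, 7))
// seven dates, 1 Jan 2010 through 7 Jan 2010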
toggle code to sql
More trivial than most on this list, but handy nonetheless:
I'm always taking sql in and out of code to move it to a sql environment during development. Example:
let sql = "select a,b,c "
+ "from table "
+ "where a = 1"
needs to be 'stripped' to:
select a,b,c
from table
where a = 1
keeping the formatting. It's a pain to strip out the code symbols for the sql editor, then put them back again by hand when I've got the sql worked out. These two functions toggle the sql back and forth from code to stripped:
open System.IO
open System.Text.RegularExpressions

// reads the file with the code quoted sql, strips code symbols, dumps to FSI
let stripForSql fileName =
File.ReadAllText(fileName)
|> (fun s -> Regex.Replace(s, "\+(\s*)\"", ""))
|> (fun s -> s.Replace("\"", ""))
|> (fun s -> Regex.Replace(s, ";$", "")) // end of line semicolons
|> (fun s -> Regex.Replace(s, "//.+", "")) // get rid of any comments
|> (fun s -> printfn "%s" s)
then when you are ready to put it back into your code source file:
let prepFromSql fileName =
File.ReadAllText(fileName)
|> (fun s -> Regex.Replace(s, @"\r\n", " \"\r\n+\"")) // matches newline
|> (fun s -> Regex.Replace(s, @"\A", " \""))
|> (fun s -> Regex.Replace(s, @"\z", " \""))
|> (fun s -> printfn "%s" s)
I'd love to get rid of the input file but can't even begin to grok how to make that happen. anyone?
edit:
I figured out how to eliminate the requirement of a file for these functions by adding a windows forms dialog input/output. Too much code to show, but for those who would like to do such a thing, that's how I solved it.
Pascal's Triangle (hey, someone might find it useful)
So we want to create a something like this:
1
1 1
1 2 1
1 3 3 1
1 4 6 4 1
Easy enough:
let rec next = function
| [] -> []
| x::y::xs -> (x + y)::next (y::xs)
| x::xs -> x::next xs
let pascal n =
[ 1 .. n ]
|> List.scan (fun acc _ -> next (0::acc) ) [1]
The next function returns a new list in which each element is the sum of the original element and its right-hand neighbour (the last element is carried over unchanged).
Here's the output in fsi:
> pascal 10 |> Seq.iter (printfn "%A");;
[1]
[1; 1]
[1; 2; 1]
[1; 3; 3; 1]
[1; 4; 6; 4; 1]
[1; 5; 10; 10; 5; 1]
[1; 6; 15; 20; 15; 6; 1]
[1; 7; 21; 35; 35; 21; 7; 1]
[1; 8; 28; 56; 70; 56; 28; 8; 1]
[1; 9; 36; 84; 126; 126; 84; 36; 9; 1]
[1; 10; 45; 120; 210; 252; 210; 120; 45; 10; 1]
For the adventurous, here's a tail-recursive version:
let rec next2 cont = function
| [] -> cont []
| x::y::xs -> next2 (fun l -> cont <| (x + y)::l ) <| y::xs
| x::xs -> next2 (fun l -> cont <| x::l ) <| xs
let pascal2 n =
seq { 1 .. n }
|> Seq.scan (fun acc _ -> next2 id <| 0::acc) [1]
Flatten a List
if you have something like this:
let listList = [[1;2;3;];[4;5;6]]
and want to 'flatten' it down to a single list so the result is like this:
[1;2;3;4;5;6]
it can be done thusly:
let flatten (l: 'a list list) =
    seq {
        for a in l do yield! a
    }
    |> List.ofSeq
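Note that if all you need is plain one-level flattening, the standard library already covers it; the following (using the same listList as above) gives the same result:
let flattened = List.concat listList   // [1;2;3;4;5;6]
// or: listList |> List.collect id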
List comprehensions for float
This [23.0 .. 1.0 .. 40.0] was marked as deprecated a few versions back.
But apparently, this works:
let dl = 9.5 / 11.
let min = 21.5 + dl
let max = 40.5 - dl
let a = [ for z in min .. dl .. max -> z ]
let b = a.Length
(BTW, there's a floating point gotcha in there. Discovered at fssnip - the other place for F# snippets)
Parallel map
let pmap f s =
seq { for a in s -> async { return f a } }
|> Async.Parallel
|> Async.RunSynchronously
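A usage sketch (the work function is arbitrary); results come back as an array in input order:
let squares = pmap (fun x -> x * x) [1 .. 10]
// [|1; 4; 9; 16; 25; 36; 49; 64; 81; 100|]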
In F# code I have a tuple:
let myWife=("Tijana",32)
I want to access each member of the tuple separately. For instance, this is what I want to achieve, but I can't:
Console.WriteLine("My wife is {0} and her age is {1}",myWife[0],myWife[1])
This code obviously doesn't work, but I think you can gather what I want to achieve.
You want to prevent your wife from aging by making her age immutable? :)
For a tuple that contains only two members, you can use fst and snd to extract the members of the pair.
let wifeName = fst myWife;
let wifeAge = snd myWife;
For longer tuples, you'll have to unpack the tuple into other variables. For instance,
let _, age = myWife;;
let name, age = myWife;;
Another quite useful thing is that the same pattern matching (just as when extracting elements using a let binding) can be used in other situations, for example when declaring the parameters of a function:
let writePerson1 person =
let name, age = person
printfn "name = %s, age = %d" name age
// instead of deconstructing the tuple using 'let',
// we can do it in the declaration of parameters
let writePerson2 (name, age) =
printfn "name = %s, age = %d" name age
// in both cases, the call is the same
writePerson1 ("Joe", 20)
writePerson2 ("Joe", 20)
You can use the function fst to get the first element, and snd to get the second element. You can also write your own 'third' function:
let third (_, _, c) = c
Read more here: F# Language reference, Tuples
You can also write an unpack function for a certain length:
let unpack4 tup4 ind =
match ind, tup4 with
| 0, (a,_,_,_) -> a
| 1, (_,b,_,_) -> b
| 2, (_,_,c,_) -> c
| 3, (_,_,_,d) -> d
| _, _ -> failwith (sprintf "Trying to access item %i of tuple with 4 entries." ind)
or
let unpack4 tup4 ind =
let (a, b, c, d) = tup4
match ind with
| 0 -> a
| 1 -> b
| 2 -> c
| 3 -> d
| _ -> failwith (sprintf "Trying to access item %i of tuple with 4 entries." ind)