When I try to add up all the elements of a list of tuples, I get an error with the following code:
let rec addTupLst (xs: 'a * 'a list) =
    match xs with
    | (a, b) :: rst -> a + b + (addTupLst rst)
    | _ -> 0

addTupLst [(1, 2)]
I get the error
error FS0001: This expression was expected to have type
'a * 'a list
but here has type
'b list
Is it not possible to pattern match on a list of tuples this way, or is there another error?
You just forgot a pair of parens
let rec addTupLst (xs: ('a * 'a) list) =
    match xs with
    | (a, b) :: rst -> a + b + (addTupLst rst)
    | _ -> 0

addTupLst [(1, 2)]
The problem is that you declared the function as taking an 'a * 'a list, which F# parses as 'a * ('a list), but what you actually want to write is ('a * 'a) list.
This is one of the reasons why I don't really like the common but (IMO) inconsistent style of using prefix notation for type parameters for some built-in types and postfix notation for the rest. I prefer to write the type as list<'a * 'a>.
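For comparison, here is a small illustrative snippet of my own (not from the original answer) showing that the postfix and prefix notations denote the same type, and what the missing parentheses change:

// postfix notation: a list of int pairs
let xs : (int * int) list = [(1, 2); (3, 4)]

// prefix notation for the very same type
let ys : list<int * int> = [(1, 2); (3, 4)]

// without the parentheses, int * int list is read as int * (int list),
// i.e. a single tuple of a value and a list
let pair : int * int list = (1, [2; 3])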
I have a few utility functions like this that return different length tuples:
let get1And2Of4(v1, v2, v3, v4) = v1, v2
let get1And2And3Of4(v1, v2, v3, v4) = v1, v2, v3
I'd like to be able to pattern match on a variable and return the appropriate utility function.
For example,
let bar =
    match foo with
    | "A" -> get1And2Of4
    | "B" -> get1And2And3Of4
However, the compiler complains because the lengths of the tuples are different. Reading Variable length tuples in f# and Elegant pattern matching on nested tuples of arbitrary length explains that this isn't really possible.
Are there any other options to achieve this given that I need to work with tuples and not lists?
The only practicable, type-safe way I can imagine would be to spell it all out explicitly and wrap the functions as individual cases in a discriminated union. This is obviously awkward; on the plus side it helps you to keep track of the various type parameters that are involved, since in the end all code paths will need to return the same type.
type WhateverYouFeelNamingTheDU<'a,'b,'c,'d> =
    | Get1And2Of4 of ('a * 'b * 'c * 'd -> 'a * 'b)
    | Get1And2And3Of4 of ('a * 'b * 'c * 'd -> 'a * 'b * 'c)

let get1And2Of4 = Get1And2Of4(fun (v1, v2, _, _) -> v1, v2)
let get1And2And3Of4 = Get1And2And3Of4(fun (v1, v2, v3, _) -> v1, v2, v3)
match "foo" with
| "A" -> get1And2Of4
| _ -> get1And2And3Of4
|> function
| Get1And2Of4 f42 -> f42 (1,2,"a","b") |> string
| Get1And2And3Of4 f43 -> f43 (3,4,"c","d") |> string
// val it : string = "(3, 4, c)"
This project really is a source of questions for me.
I already learned about polymorphic recursion and I understand why it is a special case and therefore F# needs full type annotations.
For regular functions I might need some fiddling but usually get it right. Now I'm trying to adapt a (working) basic toSeq to a more specialized finger tree, but I can't.
My feeling is that the use of the computation expression has something to do with it. This is the condensed working version:
module ThisWorks =
    module Node =
        type Node<'a> =
            | Node2 of 'a * 'a
            | Node3 of 'a * 'a * 'a

        let toList = function
            | Node2(a, b) -> [a; b]
            | Node3(a, b, c) -> [a; b; c]

    module Digit =
        type Digit<'a> =
            | One of 'a
            | Two of 'a * 'a
            | Three of 'a * 'a * 'a
            | Four of 'a * 'a * 'a * 'a

        let toList = function
            | One a -> [a]
            | Two(a, b) -> [a; b]
            | Three(a, b, c) -> [a; b; c]
            | Four(a, b, c, d) -> [a; b; c; d]

    module FingerTree =
        open Node
        open Digit

        type FingerTree<'a> =
            | Empty
            | Single of 'a
            | Deep of Digit<'a> * Lazy<FingerTree<Node<'a>>> * Digit<'a>

        let rec toSeq<'a> (tree:FingerTree<'a>) : seq<'a> = seq {
            match tree with
            | Single single ->
                yield single
            | Deep(prefix, Lazy deeper, suffix) ->
                yield! prefix |> Digit.toList
                yield! deeper |> toSeq |> Seq.collect Node.toList
                yield! suffix |> Digit.toList
            | Empty -> ()
        }
The one I don't manage to get to compile is this:
module ThisDoesnt =
    module Monoids =
        type IMonoid<'m> =
            abstract Zero:'m
            abstract Plus:'m -> 'm

        type IMeasured<'m when 'm :> IMonoid<'m>> =
            abstract Measure:'m

        type Size(value) =
            new() = Size 0
            member __.Value = value
            interface IMonoid<Size> with
                member __.Zero = Size()
                member __.Plus rhs = Size(value + rhs.Value)

        type Value<'a> =
            | Value of 'a
            interface IMeasured<Size> with
                member __.Measure = Size 1

    open Monoids

    module Node =
        type Node<'m, 'a when 'm :> IMonoid<'m>> =
            | Node2 of 'm * 'a * 'a
            | Node3 of 'm * 'a * 'a * 'a

        let toList = function
            | Node2(_, a, b) -> [a; b]
            | Node3(_, a, b, c) -> [a; b; c]

    module Digit =
        type Digit<'m, 'a when 'm :> IMonoid<'m>> =
            | One of 'a
            | Two of 'a * 'a
            | Three of 'a * 'a * 'a
            | Four of 'a * 'a * 'a * 'a

        let toList = function
            | One a -> [a]
            | Two(a, b) -> [a; b]
            | Three(a, b, c) -> [a; b; c]
            | Four(a, b, c, d) -> [a; b; c; d]

    module FingerTree =
        open Node
        open Digit

        type FingerTree<'m, 'a when 'm :> IMonoid<'m>> =
            | Empty
            | Single of 'a
            | Deep of 'm * Digit<'m, 'a> * Lazy<FingerTree<'m, Node<'m, 'a>>> * Digit<'m, 'a>

        let unpack (Value v) = v

        let rec toSeq<'a> (tree:FingerTree<Size, Value<'a>>) : seq<'a> = seq {
            match tree with
            | Single(Value single) ->
                yield single
            | Deep(_, prefix, Lazy deeper, suffix) ->
                yield! prefix |> Digit.toList |> List.map unpack
#if ITERATE
                for (Value deep) in toSeq deeper do
                                    ^^^^^
                    yield deep
#else
                yield! deeper |> toSeq |> Seq.collect (Node.toList >> List.map unpack)
                                 ^^^^^
#endif
                yield! suffix |> Digit.toList |> List.map unpack
            | Empty -> ()
        }
The error message I get says
Error Type mismatch. Expecting a
FingerTree<Size,Node<Size,Value<'a>>> -> 'b
but given a
FingerTree<Size,Value<'c>> -> seq<'c>
The type 'Node<Size,Value<'a>>' does not match the type 'Value<'b>'
and the squiggles underline the recursive call of toSeq.
I know that the “deeper” type is encapsulated in a Node, and in the working code I just unpack it afterwards. But here the compiler already trips before I get the chance to unpack. Trying a for (Value deep) in toSeq deeper do yield deep has the same problem.
I thought I already had a way out, namely to use the toSeq of the “base” tree and Seq.map unpack afterwards, but that's not true: trying it yields a very similar error message.
I'm curious what makes this code break and how it could be fixed.
The compiler's error message seems clear to me: toSeq is applicable only to values of type FingerTree<Size, Value<'a>> for some 'a, but you're trying to call it on a value of type FingerTree<Size,Node<Size,Value<'a>>> instead, which is not compatible. There's nothing specific to polymorphic recursion or sequence expressions here; these types simply don't match.
Instead, it seems like it would be much simpler to make toSeq more generic by taking an input of type FingerTree<Size, 'a> (without any reference to Value), which would enable the recursive call you want. Then you can easily derive the more specific function you actually want by composing the more general toSeq with Seq.map unpack.
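A minimal sketch of that suggestion, reusing the types from the question; the names toSeqGeneric and toSeqValues are mine and the code is untested, but it mirrors the structure of the working ThisWorks version:

// Generic over the element type, so the polymorphically recursive call
// at FingerTree<Size, Node<Size, 'a>> type-checks (full annotation required).
let rec toSeqGeneric<'a> (tree: FingerTree<Size, 'a>) : seq<'a> = seq {
    match tree with
    | Single single -> yield single
    | Deep(_, prefix, Lazy deeper, suffix) ->
        yield! prefix |> Digit.toList
        yield! deeper |> toSeqGeneric |> Seq.collect Node.toList
        yield! suffix |> Digit.toList
    | Empty -> ()
}

// The Value-specific variant is then just a composition with Seq.map unpack.
let toSeqValues (tree: FingerTree<Size, Value<'a>>) : seq<'a> =
    tree |> toSeqGeneric |> Seq.map unpack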
In this video about functional programming at 35:14 Jim Weirich writes a function to compute factorial without using recursion, library functions or loops:
The code in Ruby
fx = ->(improver) {
  improver.(improver)
}.(
  ->(improver) {
    ->(n) { n.zero? ? 1 : n * improver.(improver).(n-1) }
  }
)
I'm trying to express this approach in F#:
let fx =
    (fun improver -> improver(improver))(
        fun improver ->
            fun n ->
                if n = 0 then 1
                else n * improver(improver(n - 1)))
I'm currently stuck at
Type mismatch. Expecting a 'a but given a 'a -> 'b
The resulting type would be infinite when unifying ''a' and ''a -> 'b'
I can't seem to find the right type annotation or another way of expressing the function.
Edit:
*without the rec keyword
Languages with ML-style type inference won't be able to infer a type for the term fun improver -> improver improver; they start by assuming the type 'a -> 'b for a lambda definition (for some undetermined types 'a and 'b), so the argument improver has type 'a; but then it's applied to itself to produce the result (of type 'b), so improver must simultaneously have type 'a -> 'b. In the F# type system there's no way to unify these types (and in the simply typed lambda calculus there's no way to give this term a type at all). My answer to the question that you linked to in your comment covers some workarounds. #desco has given one of those already. Another is:
let fx = (fun (improver:obj->_) -> improver improver)
             (fun improver n ->
                 if n = 0 then 1
                 else n * (improver :?> _) improver (n-1))
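As a quick sanity check of my own (assuming the snippet compiles as given), the resulting fx behaves like an ordinary factorial:

printfn "%d" (fx 5)   // expected output: 120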
This is cheating, but you can use types
type Self<'T> = delegate of Self<'T> -> 'T
let fx1 = (fun (x: Self<_>) -> x.Invoke(x)) (Self(fun x -> fun n -> if n = 0 then 1 else x.Invoke(x)(n - 1) * n))

type Rec<'T> = Rec of (Rec<'T> -> 'T)
let fx2 = (fun (Rec(f) as r) -> f r) (Rec(fun ((Rec f) as r) -> fun n -> if n = 0 then 1 else f(r)(n - 1) * n))
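Both wrappers should likewise yield a working int -> int factorial; a quick check of my own:

printfn "%d %d" (fx1 5) (fx2 5)   // expected output: 120 120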
I found similar things, but not exactly what I'm looking for.
Why does
let map x f = f x
return
val map : 'a -> ('a -> 'b) -> 'b
and
let rec merge f xs a =
    match xs with
    | [] -> a
    | y::ys -> f y (merge f ys a);;
return
val merge : f:('a -> 'b -> 'b) -> xs:'a list -> a:'b -> 'b
and
let rec merge2 f a = function
    | [] -> a
    | x::xs -> merge2 f (f a x) xs;;
return
val merge2: f:('a -> 'b -> 'a) -> a:'a -> _arg1:'b list -> 'a
Thank you for the clarification.
OK, I'll show you the first one; then you can check the others.
Notice how you write f x: this means that f is a function (and a function always has signature 'a -> 'b for some 'a and 'b).
Now, as x is the argument to f, it must have the same type as the input of f, which I named 'a here.
The result of f x is then of course of type 'b.
Now put it all together:
map takes two arguments:
x : 'a
and f : 'a -> 'b
and has result-type 'b
so it has signature map : 'a -> ('a -> 'b) -> 'b
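To see this concretely, here is a small usage example of my own (the inferred 'a is int and 'b is string):

let shown = map 42 string   // map feeds 42 to string, giving "42"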
Here is the same argument for your
let rec merge f xs a =
    match xs with
    | [] -> a
    | y::ys -> f y (merge f ys a);;
first look at the first case of the match:
it matches an empty list, so xs must be a list of some type 'a: xs : 'a list
it returns a, which has some unknown type 'b that is also the result type of the match and therefore of merge itself!
now to the second line:
As xs had type 'a list now you must have
y : 'a
ys : 'a list
therefore you plug an 'a and then a 'b into f and return its result to the match and to merge (which, as we have seen, has return type 'b)
You now see that f must have type f : 'a -> 'b -> 'b, and you are again done once you assemble the type of merge.
side-note
Remember how I claimed that every function has signature 'a -> 'b for some types, and now I write stuff like 'a -> 'b -> 'b?
This is indeed consistent if you read the latter as 'a -> ('b -> 'b), which is of course just what we call currying: you don't need functions with multiple arguments if you can return functions instead, and as long as you don't look too deep into the produced IL you should think of F# as doing exactly this ;)
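Here is a small illustration of my own, assuming the merge from above is in scope: applying merge to only its first argument already returns another function, which is exactly the curried reading of its type.

// merge (+) : int list -> int -> int, a function obtained by partial application
let sumWith = merge (+)
let total = sumWith [1; 2; 3] 0   // 1 + (2 + (3 + 0)) = 6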
I think you will manage the last one yourself. Try it, and if you run into problems, edit your question to indicate where you're stuck and we'll help ;)
I'm implementing a packrat parser in OCaml, as per the master's thesis by B. Ford. My parser should receive a data structure that represents the grammar of a language and parse given sequences of symbols.
I'm stuck with the memoization part. The original thesis uses Haskell's lazy evaluation to accomplish linear time complexity. I want to do this (memoization via laziness) in OCaml, but don't know how to do it.
So, how do you memoize functions by lazy evaluations in OCaml?
EDIT: I know what lazy evaluation is and how to exploit it in OCaml. The question is how to use it to memoize functions.
EDIT: The data structure I wrote that represents grammars is:
type ('a, 'b, 'c) expr =
| Empty of 'c
| Term of 'a * ('a -> 'c)
| NTerm of 'b
| Juxta of ('a, 'b, 'c) expr * ('a, 'b, 'c) expr * ('c -> 'c -> 'c)
| Alter of ('a, 'b, 'c) expr * ('a, 'b, 'c) expr
| Pred of ('a, 'b, 'c) expr * 'c
| NPred of ('a, 'b, 'c) expr * 'c
type ('a, 'b, 'c) grammar = ('a * ('a, 'b, 'c) expr) list
The (non-memoized) function that parses a list of symbols is:
let rec parse g v xs = parse' g (List.assoc v g) xs
and parse' g e xs =
  match e with
  | Empty y -> Parsed (y, xs)
  | Term (x, f) ->
    begin
      match xs with
      | x' :: xs when x = x' -> Parsed (f x, xs)
      | _ -> NoParse
    end
  | NTerm v' -> parse g v' xs
  | Juxta (e1, e2, f) ->
    begin
      match parse' g e1 xs with
      | Parsed (y, xs) ->
        begin
          match parse' g e2 xs with
          | Parsed (y', xs) -> Parsed (f y y', xs)
          | p -> p
        end
      | p -> p
    end
  (* and so on *)
where the type of the return value of parse is defined by
type ('a, 'c) result = Parsed of 'c * ('a list) | NoParse
For example, the grammar of basic arithmetic expressions can be specified as g, in:
type nt = Add | Mult | Prim | Dec | Expr
let zero _ = 0
let g =
[(Expr, Juxta (NTerm Add, Term ('$', zero), fun x _ -> x));
(Add, Alter (Juxta (NTerm Mult, Juxta (Term ('+', zero), NTerm Add, fun _ x -> x), (+)), NTerm Mult));
(Mult, Alter (Juxta (NTerm Prim, Juxta (Term ('*', zero), NTerm Mult, fun _ x -> x), ( * )), NTerm Prim));
(Prim, Alter (Juxta (Term ('<', zero), Juxta (NTerm Dec, Term ('>', zero), fun x _ -> x), fun _ x -> x), NTerm Dec));
(Dec, List.fold_left (fun acc d -> Alter (Term (d, (fun c -> int_of_char c - 48)), acc)) (Term ('0', zero)) ['1';'2';'3';])]
The idea behind using laziness for memoization is to memoize not functions but data structures. Laziness means that when you write let x = foo in some_expr, foo will not be evaluated immediately, but only as far as some_expr needs it; however, the different occurrences of x in some_expr will share the same thunk: as soon as one of them forces the computation, the result is available to all of them.
This does not work for functions: if you write let f x = foo in some_expr and call f several times in some_expr, each call will be evaluated independently; there is no shared thunk to store the results.
So you can get memoization by using a data structure instead of a function. Typically, this is done using an associative data structure: instead of computing an a -> b function, you compute a Table a b, where Table is some map from the arguments to the results. One example is this Haskell presentation of fibonacci:
fib n = fibTable !! n
fibTable = [0,1] ++ map (\n -> fib (n - 1) + fib (n - 2)) [2..]
(You can also write that with tail and zip, but this doesn't make the point clearer.)
See that you do not memoize a function, but a list: it is the list fibTable that does the memoization. You can write this in OCaml as well, for example using the LazyList module of the Batteries library:
open Batteries
module LL = LazyList
let from_2 = LL.seq 2 ((+) 1) (fun _ -> true)
let rec fib n = LL.at fib_table (n - 1) + LL.at fib_table (n - 2)
and fib_table = lazy (LL.Cons (0, LL.cons 1 <| LL.map fib from_2))
However, there is little interest in doing so: as you have seen in the example above, OCaml does not particularly favor call-by-need evaluation -- it's reasonable to use, but not terribly convenient as it was forced to be in Haskell. It is actually equally simple to directly write the cache structure by direct mutation:
open Batteries

let fib =
  let fib_table = DynArray.of_list [0; 1] in
  let get_fib n = DynArray.get fib_table n in
  fun n ->
    for i = DynArray.length fib_table to n do
      DynArray.add fib_table (get_fib (i - 1) + get_fib (i - 2))
    done;
    get_fib n
This example may be ill-chosen, because you need a dynamic structure to store the cache. In the packrat parser case, you're tabulating parsing on a known input text, so you can use plain arrays (indexed by the grammar rules): you would have, for each rule, an array of ('a, 'c) result option whose size is the input length, initialized to None. E.g. juxta.(n) represents the result of trying the rule Juxta from input position n, or None if this has not yet been tried.
Laziness is a nice way to present this kind of memoization, but it is not always expressive enough: if you need, say, to partially free some part of your result cache to lower memory usage, you will have difficulties if you started from a lazy presentation. See this blog post for a remark on this.
Why do you want to memoize functions? What you want to memoize is, I believe, the parsing result for a given (parsing) expression and a given position in the input stream. You could, for instance, use OCaml's Hashtbl module for that.
The lazy keyword.
Here you can find some great examples.
If it fits your use case, you can also use OCaml streams instead of manually generating thunks.