How do I define y-combinator without "let rec"? - f#

In almost all examples, a y-combinator in ML-type languages is written like this:
let rec y f x = f (y f) x
let factorial = y (fun f -> function 0 -> 1 | n -> n * f(n - 1))
This works as expected, but it feels like cheating to define the y-combinator using let rec ....
I want to define this combinator without using recursion, using the standard definition:
Y = λf·(λx·f (x x)) (λx·f (x x))
A direct translation is as follows:
let y = fun f -> (fun x -> f (x x)) (fun x -> f (x x));;
However, F# complains that it can't figure out the types:
let y = fun f -> (fun x -> f (x x)) (fun x -> f (x x));;
--------------------------------^
C:\Users\Juliet\AppData\Local\Temp\stdin(6,33): error FS0001: Type mismatch. Expecting a
'a
but given a
'a -> 'b
The resulting type would be infinite when unifying ''a' and ''a -> 'b'
How do I write the y-combinator in F# without using let rec ...?

As the compiler points out, there is no type that can be assigned to x so that the expression (x x) is well-typed (this isn't strictly true; you can explicitly type x as obj->_ - see my last paragraph). You can work around this issue by declaring a recursive type so that a very similar expression will work:
type 'a Rec = Rec of ('a Rec -> 'a)
Now the Y-combinator can be written as:
let y f =
    let f' (Rec x as rx) = f (x rx)
    f' (Rec f')
Unfortunately, you'll find that this isn't very useful because F# is a strict language,
so any function that you try to define using this combinator will cause a stack overflow.
Instead, you need to use the applicative-order version of the Y-combinator (\f.(\x.f(\y.(x x)y))(\x.f(\y.(x x)y))):
let y f =
    let f' (Rec x as rx) = f (fun y -> x rx y)
    f' (Rec f')
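With this version, the factorial from the question works as expected:
let factorial = y (fun f -> function 0 -> 1 | n -> n * f (n - 1))
factorial 5 // evaluates to 120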
Another option would be to use explicit laziness to define the normal-order Y-combinator:
type 'a Rec = Rec of ('a Rec -> 'a Lazy)
let y f =
    let f' (Rec x as rx) = lazy f (x rx)
    (f' (Rec f')).Value
This has the disadvantage that recursive function definitions now need an explicit force of the lazy value (using the Value property):
let factorial = y (fun f -> function | 0 -> 1 | n -> n * (f.Value (n - 1)))
However, it has the advantage that you can define non-function recursive values, just as you could in a lazy language:
let ones = y (fun ones -> LazyList.consf 1 (fun () -> ones.Value))
As a final alternative, you can try to better approximate the untyped lambda calculus by using boxing and downcasting. This would give you (again using the applicative-order version of the Y-combinator):
let y f =
    let f' (x:obj -> _) = f (fun y -> x x y)
    f' (fun x -> f' (x :?> _))
This has the obvious disadvantage that it will cause unneeded boxing and unboxing, but at least this is entirely internal to the implementation and will never actually lead to failure at runtime.

I would say it's impossible, and if asked why, I would handwave and invoke the fact that the simply typed lambda calculus has the normalization property. In short, all terms of the simply typed lambda calculus terminate (consequently, Y cannot be defined in the simply typed lambda calculus).
F#'s type system is not exactly the type system of simply typed lambda calculus, but it's close enough. F# without let rec comes really close to the simply typed lambda calculus -- and, to reiterate, in that language you cannot define a term that does not terminate, and that excludes defining Y too.
In other words, in F#, "let rec" needs to be a language primitive at the very least because even if you were able to define it from the other primitives, you would not be able to type this definition. Having it as a primitive allows you, among other things, to give a special type to that primitive.
EDIT: kvb shows in his answer that type definitions (one of the features absent from the simply typed lambda-calculus but present in let-rec-less F#) allow to get some sort of recursion. Very clever.

Case and let (rec) constructs are what make ML derivatives Turing complete. I believe they're based on System F rather than the simply typed lambda calculus, but the point is the same.
System F cannot assign a type to any fixed-point combinator; if it could, it would not be strongly normalizing.
Strongly normalizing means that every expression has exactly one normal form, where a normal form is an expression that cannot be reduced any further. This differs from the untyped calculus, where every expression has at most one normal form; it can also have no normal form at all.
If a typed lambda calculus could construct a fixed-point operator in any way whatsoever, it would be possible for an expression to have no normal form.
Another famous result, the halting problem, implies that strongly normalizing languages are not Turing complete: it is impossible to decide (which is different from proving) for a Turing-complete language which subset of its programs will halt on which inputs. If a language is strongly normalizing, halting is decidable: everything halts. Our algorithm to decide this is the program true;.
To overcome this, ML derivatives extend System F with case and let (rec). Functions can thus refer to themselves in their own definitions again, which strictly speaking makes them no longer lambda calculi at all: it is no longer possible to rely on anonymous functions alone for all computable functions. In exchange they can enter infinite loops again and regain their Turing completeness.

Short answer: You can't.
Long answer:
The simply typed lambda calculus is strongly normalizing. This means it's not Turing equivalent. The reason for this basically boils down to the fact that a Y combinator must either be primitive or defined recursively (as you've found). It simply cannot be expressed in System F (or simpler typed calculi). There's no way around this (it's been proven, after all). The Y combinator you can implement works exactly the way you want, though.
I would suggest you try Scheme if you want a real Church-style Y combinator. Use the applicative version given above, as other versions won't work unless you explicitly add laziness or use a lazy Scheme interpreter. (Scheme technically isn't completely untyped, but it's dynamically typed, which is good enough for this.)
See this for the proof of strong normalization:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.127.1794
After thinking some more, I'm pretty sure that adding a primitive Y combinator that behaves exactly the way the letrec defined one does makes System F Turing complete. All you need to do to simulate a Turing machine then is implement the tape as an integer (interpreted in binary) and a shift (to position the head).

Simply define the function so that it takes its own type, wrapped in a record, like in Swift (where it's a struct) :)
Here, Y (uppercase) is semantically a function that can be called with its own type. In F# terms it is defined as a record containing a function named call, so to call a y of this type you actually have to write y.call :)
type Y = { call: Y -> (int -> int) }
let fibonacci n =
    let makeF f: int -> int =
        fun x ->
            if x = 0 then 0 else if x = 1 then 1 else f(x - 1) + f(x - 2)
    let y = { call = fun y -> fun x -> (makeF (y.call y)) x }
    (y.call y) n
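A quick sanity check (55 is just the tenth Fibonacci number):
fibonacci 10 // evaluates to 55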
It's not supremely elegant to read but it doesn't resort to recursion for defining a y combinator that is supposed to provide recursion all by itself ^^

Related

Make Fish in F#

The Kleisli composition operator >=>, also known as the "fish" in Haskell circles, may come in handy in many situations where composition of specialized functions is needed. It works somewhat like the >> operator, but instead of composing simple functions 'a -> 'b it composes functions whose special properties are perhaps best expressed as 'a -> m<'b>, where m is either a monad-like type or some property of the function's return value.
Evidence of this practice in the wider F# community can be found e.g. in Scott Wlaschin's Railway oriented programming (part 2) as composition of functions returning the Result<'TSuccess,'TFailure> type.
Reasoning that where there's a bind, there must be also fish, I try to parametrize the canonical Kleisli operator's definition let (>=>) f g a = f a >>= g with the bind function itself:
let mkFish bind f g a = bind g (f a)
This works wonderfully with the caveat that generally one shouldn't unleash special operators on user-facing code. I can compose functions returning options...
module Option =
    let (>=>) f = mkFish Option.bind f

    let odd i = if i % 2 = 0 then None else Some i
    let small i = if abs i > 10 then None else Some i

    [0; -1; 9; -99] |> List.choose (odd >=> small)
    // val it : int list = [-1; 9]
... or I can devise a function application to the two topmost values of a stack and push the result back without having to reference the data structure I'm operating on explicitly:
module Stack =
    let (>=>) f = mkFish (<||) f

    type 'a Stack = Stack of 'a list

    let pop = function
        | Stack[] -> failwith "Empty Stack"
        | Stack(x::xs) -> x, Stack xs

    let push x (Stack xs) = Stack(x::xs)

    let apply2 f =
        pop >=> fun x ->
        pop >=> fun y ->
        push (f x y)
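For what it's worth, apply2 behaves as expected; a quick check with arbitrary sample values:
open Stack
Stack [5; 7; 11] |> apply2 (+)
// Stack [12; 11]: pops 5 and 7, pushes their sum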
But what bothers me is that the signature val mkFish : bind:('a -> 'b -> 'c) -> f:('d -> 'b) -> g:'a -> a:'d -> 'c makes no sense. Type variables are in confusing order, it's overly general ('a should be a function), and I'm not seeing a natural way to annotate it.
How can I abstract here in the absence of formal functors and monads, not having to define the Kleisli operator explicitly for each type?
You can't do it in a natural way without Higher Kinds.
The signature of fish should be something like:
let (>=>) (f:'T -> #Monad<'U>) (g:'U -> #Monad<'V>) (x:'T) : #Monad<'V> = bind (f x) g
which is unrepresentable in the current .NET type system, but you can replace #Monad with your specific monad, e.g. Async, and use its corresponding bind function in the implementation.
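For example, here is a minimal sketch specialized to Async, reusing mkFish from the question; asyncBind merely adapts the async builder's Bind method to mkFish's argument order, and the two small async functions are only for illustration:
let asyncBind g m = async.Bind(m, g)
let (>=>) f g = mkFish asyncBind f g

let getLength (s: string) = async { return s.Length }
let describe (n: int) = async { return sprintf "length = %d" n }

"hello" |> (getLength >=> describe) |> Async.RunSynchronously
// val it : string = "length = 5"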
Having said that, if you really want to use a generic fish operator you can use F#+ which has it already defined by using static constraints. If you look at the 5th code sample here you will see it in action over different types.
Of course you can also define your own, but there are a lot of things to code in order to make it behave properly in the most common scenarios. You can grab the code from the library, or if you want I can write a small (but limited) code sample.
The generic fish is defined in this line.
I think that when using operators you really feel the lack of generic functions, because as you discovered, you need to open and close modules. Unlike ordinary functions, which you can simply prefix with the module name, doing that with an operator (something like Option.(>=>)) defeats the whole purpose of using an operator: it's no longer used as an operator.

constrain generic type to inherit a generic type in f#

let mapTuple f (a,b) = (f a, f b)
I'm trying to create a function that applies a function f to both items in a tuple and returns the result as a tuple. F# type inference says that mapTuple returns a 'b*'b tuple. It also assumes that a and b are of the same type.
I want to be able to pass two different types as parameters. You would think that wouldn't work because they both have to be passed as parameters to f. So I thought if they inherited from the same base class, it might work.
Here is a less generic function for what I am trying to accomplish.
let mapTuple (f:Map<_,_> -> Map<'a,'b>) (a:Map<int,double>,b:Map<double, int>) = (f a, f b)
However, it gives a type mismatch error.
How do I do it? Is what I am trying to accomplish even possible in F#?
Gustavo is mostly right; what you're asking for requires higher-rank types. However:
1. .NET (and by extension F#) does support (an encoding of) higher-rank types.
2. Even in Haskell, which supports a "nice" way of expressing such types (once you've enabled the right extension), they wouldn't be inferred for your example.
Digging into point 2 may be valuable: given map f a b = (f a, f b), why doesn't Haskell infer a more general type than map :: (t1 -> t) -> t1 -> t1 -> (t, t)? The reason is that once you include higher-rank types, it's not typically possible to infer a single "most general" type for a given expression. Indeed, there are many possible higher-rank signatures for map given its simple definition above:
map :: (forall t. t -> t) -> x -> y -> (x, y)
map :: (forall t. t -> z) -> x -> y -> (z, z)
map :: (forall t. t -> [t]) -> x -> y -> ([x], [y])
(plus infinitely many more). But note that these are all incompatible with each other (none is more general than another). Given the first one you can call map id 1 'c', given the second one you can call map (\_ -> 1) 1 'c', and given the third one you can call map (\x -> [x]) 1 'c', but those arguments are only valid with each of those types, not with the other ones.
So even in Haskell you need to specify the particular polymorphic signature you want to use - this may be a bit of a surprise if you're coming from a more dynamic language. In Haskell, this is relatively clean (the syntax is what I've used above). However, in F# you'll have to jump through an additional hoop: there's no clean syntax for a "forall" type, so you'll have to create an additional nominal type instead. For example, to encode the first type above in F# I'd write something like this:
type Mapping = abstract Apply : 'a -> 'a
let map (m:Mapping) (a, b) = m.Apply a, m.Apply b
let x, y = map { new Mapping with member this.Apply x = x } (1, "test")
Note that in contrast to Gustavo's suggestion, you can define the first argument to map as an expression (rather than forcing it to be a member of some separate type). On the other hand, there's clearly a lot more boilerplate than would be ideal...
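The other signatures can be encoded the same way. For instance, a sketch (with names of my own choosing) of the third type above, (forall t. t -> [t]) -> x -> y -> ([x], [y]):
type Listing = abstract Apply : 'a -> 'a list
let map3 (m:Listing) (a, b) = m.Apply a, m.Apply b
let xs, ys = map3 { new Listing with member this.Apply x = [x] } (1, "test")
// xs : int list = [1], ys : string list = ["test"]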
This problem has to do with rank-n types, which are supported in Haskell (through extensions) but not in the .NET type system.
One way I found to work around this limitation is to pass a type with a single method instead of a function, and then define an inline map function with static constraints. For example, let's suppose I have some generic functions, toString and toOption, and I want to be able to map them over a tuple of different types:
type ToString = ToString with static member inline ($) (ToString, x) = string x
type ToOption = ToOption with static member ($) (ToOption, x) = Some x
let inline mapTuple f (x, y) = (f $ x, f $ y)
let tuple1 = mapTuple ToString (true, 42)
let tuple2 = mapTuple ToOption (true, 42)
// val tuple1 : string * string = ("True", "42")
// val tuple2 : bool option * int option = (Some true, Some 42)
ToString will return the same type for both elements while operating on arbitrary types. ToOption will return two generic options of different types.
By using a binary operator, type inference creates the static constraints for you. I use $ because in Haskell it means apply, so a nice detail is that for Haskellers f $ x already reads as "apply x to f".
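Adding support for another generic function is just a matter of defining another single-case type with a ($) member; for instance, a sketch following the same pattern:
type ToList = ToList with static member ($) (ToList, x) = [x]
let tuple3 = mapTuple ToList (true, 42)
// val tuple3 : bool list * int list = ([true], [42])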
At the risk of stating the obvious, a good enough solution might be to have a mapTuple that takes two functions instead of one:
let mapTuple fa fb (a, b) = (fa a, fb b)
If your original f is generic, passing it as fa and fb will give you two concrete instantiations of the function with the types you're looking for. At worst, you just need to pass the same function twice when a and b are of the same type.
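For example, the toString/toOption scenario from the answer above can be handled by passing the same generic function twice (a small illustration):
let strings = mapTuple string string (true, 42) // ("True", "42")
let options = mapTuple Some Some (true, 42)     // (Some true, Some 42)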

Composing 2 (or n) ('a -> unit) functions with same arg type

Is there some form of built-in / term I don't know that kinda-but-its-different 'composes' two 'a -> unit functions to yield a single one; e.g.:
let project event =
    event |> logDirections
    event |> stashDirections

let dispatch (batch:EncodedEventBatch) =
    batch.chooseOfUnion () |> Seq.iter project
might become:
let project = logDirections FOLLOWEDBY stashDirections

let dispatch (batch:EncodedEventBatch) =
    batch.chooseOfUnion () |> Seq.iter project
and then:
let dispatch (batch:EncodedEventBatch) =
    batch.chooseOfUnion () |> Seq.iter (logDirections FOLLOWEDBY stashDirections)
I guess one might compare it to tee (as alluded to in FSFFAP's Railway Oriented Programming series).
(it needs to pass the same arg to both and I'm seeking to run them sequentially without any exception handling trickery concerns etc.)
(I know I can do let project fs arg = fs |> Seq.iter (fun f -> f arg) but am wondering if there is something built-in and/or some form of composition lib I'm not aware of)
The apply function from Klark is the most straightforward way to solve the problem.
If you want to dig deeper and understand the concept more generally, then you can say that you are lifting the sequential composition operation from working on values to work on functions.
First of all, the ; construct in F# can be viewed as a sequential composition operator. Sadly, you cannot quite use it as one, e.g. (;), because it is special and lazy in the second argument, but we can define our own operator instead to explore the idea:
let ($) a b = a; b
So, printfn "hi" $ 1 is now a sequential composition of a side-effecting operation and some expression that evaluates to 1 and it does the same thing as printfn "hi"; 1.
The next step is to define a lifting operation that turns a binary operator working on values to a binary operator working on functions:
let lift op g h = (fun a -> op (g a) (h a))
Rather than writing e.g. fun x -> foo x + bar x, you can now write lift (+) foo bar. So you have a point-free way of writing the same thing - just using an operation that works on functions.
Now you can achieve what you want using the lift function and the sequential composition operator:
let seq2 a b = lift ($) a b
let seq3 a b c = lift ($) (lift ($) a b) c
let seqN l = Seq.reduce (lift ($)) l
The seq2 and seq3 functions compose just two operations, while seqN does the same thing as Klark's apply function.
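For instance, a quick check with two toy printing functions of my own (stand-ins for the question's logDirections and stashDirections):
let logIt (x: int) = printfn "log: %d" x
let stashIt (x: int) = printfn "stash: %d" x
let project = seq2 logIt stashIt
project 42
// prints "log: 42" followed by "stash: 42"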
It should be said that I'm writing this answer not because I think it is useful to implement things in F# in this way, but as you mentioned railway oriented programming and asked for deeper concepts behind this, it is interesting to see how things can be composed in functional languages.
Can you just apply a collection of functions to the given data?
E.g. you can define:
let apply (arg:'a) (fs:(('a->unit) seq)) = fs |> Seq.iter (fun f -> f arg)
Then you will be able to do something like this:
apply 1 [(fun x -> printfn "%d" (x + 1)); (fun y -> printfn "%d" (y + 2))]

Going more idiomatic with F#

A very simple example of what I'm trying to do: I know it's possible to write:
let myFunc = anotherFunc
instead of
let myFunc = fun x -> anotherFunc x
I've got two functions fDate1 and fDate2, both of type DateTime -> bool. I need to construct a function that takes a date and verifies whether either of fDate1, fDate2 returns true. For now I've invented the following expression:
let myDateFunc = fun x -> (fDate1 x) || (fDate2 x)
Is there a better way of doing this (e.g. using '>>' or higher-order functions)?
I don't think there is anything non-idiomatic with your code. In my opinion, one of the strong points about F# is that you can use it to write simple and easy-to-understand code. From that perspective, nothing could be simpler than writing just:
let myDateFunc x = fDate1 x || fDate2 x
If you had more functions than just two, then it might make sense to write something like:
let dateChecks = [ fDate1; fDate2 ]
let myDateFunc x = dateChecks |> Seq.exists (fun f -> f x)
But again, this only makes sense when you actually need to use a larger number of checks or when you are adding checks often. Unnecessary abstraction is also a bad thing.
You can define a choice combinator:
let (<|>) f g = fun x -> f x || g x
let myDateFunc = fDate1 <|> fDate2
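For example, with concrete toy predicates standing in for fDate1 and fDate2:
let isWeekend (d: System.DateTime) =
    d.DayOfWeek = System.DayOfWeek.Saturday || d.DayOfWeek = System.DayOfWeek.Sunday
let isFirstOfMonth (d: System.DateTime) = d.Day = 1

let isSpecialDay = isWeekend <|> isFirstOfMonth
isSpecialDay (System.DateTime(2020, 2, 1)) // true: the 1st of the month (and a Saturday)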
In general, you should use explicit function arguments. The elaborated form of myDateFunc can be written as:
let myDateFunc x = fDate1 x || fDate2 x
As other answers say, your current approach is fine. What is not said is that idiomatic style often produces less readable code. So if you are working in a real project and expect other developers to understand your code, it is not recommended to go too far with unnecessary function composition.
However, for purposes of self-education, you may consider the following trick, a bit in FORTH style:
// Define helper functions
let tup x = x,x
let untup f (x,y) = f x y
let call2 f g (x,y) = f x, g y
// Use
let myFunc =
    tup
    >> call2 fDate1 fDate2
    >> untup (||)
Here, you pass the original value x through a chain of transformations:
make a tuple of the same value;
apply each element of the tuple to the corresponding function, obtaining a tuple of results;
"fold" the tuple of booleans into a single value with the || operator.
There are many drawbacks to this approach, including that both fDate1 and fDate2 will be evaluated even when it isn't necessary, and that the extra tuples degrade performance.

F# and lisp-like apply function

For starters, I'm a novice in functional programming and F#, so I don't know if it's possible to do such a thing at all. So let's say we have this function:
let sum x y z = x + y + z
And for some reason, we want to invoke it using the elements from a list as an arguments. My first attempt was just to do it like this:
// Seq.fold (fun f arg -> f arg) sum [1;2;3]
let rec apply f args =
    match args with
    | h::hs -> apply (f h) hs
    | [] -> f
...which doesn't compile. It seems impossible to determine the type of f with a static type system. There's an identical question for Haskell, and the only solution uses Data.Dynamic to outfox the type system. I think the closest analogue to it in F# is Dynamitey, but I'm not sure if it fits. This code
let dynsum = Dynamitey.Dynamic.Curry(sum, System.Nullable<int>(3))
produces a dynsum variable of type obj, and objects of this type cannot be invoked; furthermore, sum is not a .NET Delegate. So the question is: how can this be done, with or without that library, in F#?
F# is a statically typed functional language and so the programming patterns that you use with F# are quite different than those that you'd use in LISP (and actually, they are also different from those you'd use in Haskell). So, working with functions in the way you suggested is not something that you'd do in normal F# programming.
If you had some scenario in mind for this function, then perhaps try asking about the original problem and someone will help you find an idiomatic F# approach!
That said, even though this is not recommended, you can implement the apply function using the powerful .NET reflection capabilities. This is slow and unsafe, but it is occasionally useful.
open Microsoft.FSharp.Reflection
let rec apply (f:obj) (args:obj list) =
let invokeFunc =
f.GetType().GetMethods()
|> Seq.find (fun m ->
m.Name = "Invoke" &&
m.GetParameters().Length = args.Length)
invokeFunc.Invoke(f, Array.ofSeq args)
The code looks at the runtime type of the function, finds Invoke method and calls it.
let sum x y z = x + y + z
let res = apply sum [1;2;3]
let resNum = int res
At the end, you need to convert the result to an int because this is not statically known.
