DRYer Generic types in Haskell - parsing

Say I have the following types:
newtype AddressID = AddressID { unAddressId :: UUID } deriving (Generic, Show, Eq)
newtype PersonID = PersonID { unPersonId :: UUID } deriving (Generic, Show, Eq)
data Address = Address { addressId :: AddressID } deriving (Generic, Show, Eq)
data Person = Person { personId :: PersonID } deriving (Generic, Show, Eq)
data AddressDto = AddressDto { addressDtoId :: !UUID } deriving (Generic, Show, Eq)
data PersonDto = PersonDto { personDtoId :: !UUID } deriving (Generic, Show, Eq)
type AddressListDto = HashMap UUID AddressDto
type PersonListDto = HashMap UUID PersonDto
instance FromJSON PersonDto where
  parseJSON = genericParseJSON $ apiOptions "personDto"

instance ToJSON PersonDto where
  toJSON = genericToJSON $ apiOptions "personDto"
  toEncoding = genericToEncoding $ apiOptions "personDto"

instance FromJSON AddressDto where
  parseJSON = genericParseJSON $ apiOptions "addressDto"

instance ToJSON AddressDto where
  toJSON = genericToJSON $ apiOptions "addressDto"
  toEncoding = genericToEncoding $ apiOptions "addressDto"
with the following utility functions:
fromAddress :: Address -> AddressDto
fromAddress Address{..} = AddressDto {addressDtoId = unAddressId addressId}
fromPerson :: Person -> PersonDto
fromPerson Person{..} = PersonDto {personDtoId = unPersonId personId}
appendPerson :: PersonListDto -> Person -> PersonListDto
appendPerson pld per = insert (personDtoId $ fromPerson per) (fromPerson per) pld
fromPersonList :: [Person] -> PersonListDto
fromPersonList = foldl appendPerson empty
appendAddress :: AddressListDto -> Address -> AddressListDto
appendAddress ald addr = insert (addressDtoId $ fromAddress addr) (fromAddress addr) ald
fromAddressList :: [Address] -> AddressListDto
fromAddressList = foldl appendAddress empty
This code works just fine; however, it is very repetitive and balloons with the number of these internal types. The functions are identical but operate on different types, with a naming convention on the helper functions being the only thing separating the implementations.
What is the Haskell approach to creating more generically reusable helper functions across these types? How do I go about creating fromEntity, fromDto, appendEntityToDtoList and fromEntityList functions? Is there a way I can encode the (virtually identical) To/FromJSON instances without repeating myself? Are typeclasses appropriate here? Is there some good material on how to use them for this purpose?

For the JSON instances, yes, you can definitely reduce the boilerplate, at the cost of some setup. The biggest trick is a relatively new GHC feature, DerivingVia. This lets you explain how to create an instance for one type in terms of the instance for another.
{-# language DeriveGeneric, DerivingVia, StandaloneDeriving
, TypeFamilies, UndecidableInstances, ScopedTypeVariables
, PolyKinds, TypeApplications #-}
-- A newtype wrapper for the type we want an instance for
newtype Optionish a = Optionish a

-- A FromJSON instance for Optionish
instance
  ( Generic a
  , Rep a ~ M1 i c f
  , Datatype c
  , GFromJSON Zero f
  )
  => FromJSON (Optionish a) where
  parseJSON v = Optionish <$> genericParseJSON (apiOptions (lower name)) v
    where
      name = datatypeName (Proxy3 @c)

data Proxy3 d f a = Proxy3

lower :: String -> String
lower "" = ""
lower (x:xs) = toLower x : xs
We can use this like so:
deriving via Optionish PersonDto
instance FromJSON PersonDto
You can do just the same sort of thing for ToJSON.
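For example, a sketch of the matching ToJSON instance might look like the following (untested; the constraint names below are the aeson 1.x spellings, newer aeson versions call them GToJSON' Value Zero and GToJSON' Encoding Zero, and apiOptions is the helper from the question):
-- A ToJSON instance for Optionish (sketch only)
instance
  ( Generic a
  , Rep a ~ M1 i c f
  , Datatype c
  , GToJSON Value Zero f
  , GToJSON Encoding Zero f
  )
  => ToJSON (Optionish a) where
  toJSON (Optionish a) =
    genericToJSON (apiOptions (lower (datatypeName (Proxy3 @c)))) a
  toEncoding (Optionish a) =
    genericToEncoding (apiOptions (lower (datatypeName (Proxy3 @c)))) a

deriving via Optionish PersonDto
  instance ToJSON PersonDto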

Related

Is it possible to match a changing JSON key to a sum type data constructor with aeson inside a larger record type?

So I have this data type ItemType which is decoded using its data constructor name (see the FromJSON instance).
import Data.Aeson
import Data.Aeson.Types
import Data.Char (toLower)
import GHC.Generics
data ItemType =
MkLogin Login
| MkCard Card
| MkIdentity Identity
| MkSecureNote Note
deriving (Generic, Show)
lowercase :: String -> String
lowercase "" = ""
lowercase (s:ss) = toLower s : ss
stripPrefix :: String -> String
stripPrefix ('M':'k':ss) = ss
stripPrefix str = str
-- | Decode value using ItemType data constructor names
instance FromJSON ItemType where
  parseJSON = genericParseJSON defaultOptions
    { constructorTagModifier = lowercase . stripPrefix
    , sumEncoding = ObjectWithSingleField }
and what I want to do is add this type as a field to a larger record type called Item
data Item =
Item { _object :: String
, _id :: String
, _organizationId :: Maybe Int
, _folderId :: Maybe Int
, _type :: Int
, _name :: String
, _notes :: String
, _favorite :: Bool
, ??? :: ItemType -- don't know how to add this without a different field name
, _collectionIds :: [Int]
, _revisionDate :: Maybe String
} deriving (Generic, Show)
instance FromJSON Item where
  parseJSON =
    genericParseJSON defaultOptions { fieldLabelModifier = stripUnderscore }
However I don't want to create a new field name for the type. Instead I want to use the data constructor which aeson matched on ItemType as the field name because the key of the ItemType field in the JSON object I'm trying to model changes depending upon what ItemType it is. So in this case the key is either "login", "card", "identity", "secureNote". Perhaps I should be using TaggedObject for the sumEncoding, but I'm not totally sure how it works.
Example JSON list of Item objects: https://i.imgur.com/xzHy9MU.png. Here you can see the ItemType field by the keys "login", "card", "identity" depending on what type they are.
You can use a rather ugly hack to pre-process the incoming JSON Value, so that actual JSON input like:
{
"id": "foo",
"bool": false
}
is parsed as if it had been:
{
"id": "foo",
"itemtype": {"bool" : false}
}
which can be handled directly by the generic parsers using the ObjectWithSingleField sum encoding method.
As a simplified example, given:
data ItemType =
MkInt Int
| MkBool Bool
deriving (Generic, Show)
instance FromJSON ItemType where
  parseJSON = genericParseJSON defaultOptions
    { constructorTagModifier = map toLower . \('M':'k':ss) -> ss
    , sumEncoding = ObjectWithSingleField }
and:
data Item =
Item { _id :: String
, _itemtype :: ItemType
}
deriving (Generic, Show)
you can write a FromJSON instance for Item that nests an "int" or "bool" field inside an "itemtype" field. (A duplicate of the original field is left in place but ignored by the generic parser.)
instance FromJSON Item where
  parseJSON v = do
    v' <- withObject "Item" nest v
    genericParseJSON defaultOptions { fieldLabelModifier = \('_':ss) -> ss } v'
    where
      nest o = Object <$> (HM.insert "itemtype" <$> item <*> pure o)
        where
          item = subObj "int" <|> subObj "bool" <|> fail "no item type field"
          subObj k = (\v -> object [(k,v)]) <$> o .: k
Full code:
{-# LANGUAGE DeriveGeneric #-}
{-# LANGUAGE OverloadedStrings #-}
{-# LANGUAGE TupleSections #-}
import Control.Applicative
import Data.Aeson
import Data.Aeson.Types
import Data.Char (toLower)
import GHC.Generics
import qualified Data.HashMap.Strict as HM
data ItemType =
MkInt Int
| MkBool Bool
deriving (Generic, Show)
instance FromJSON ItemType where
  parseJSON = genericParseJSON defaultOptions
    { constructorTagModifier = map toLower . \('M':'k':ss) -> ss
    , sumEncoding = ObjectWithSingleField }
data Item =
Item { _id :: String
, _itemtype :: ItemType
}
deriving (Generic, Show)
instance FromJSON Item where
  parseJSON v = do
    v' <- withObject "Item" nest v
    genericParseJSON defaultOptions { fieldLabelModifier = \('_':ss) -> ss } v'
    where
      nest o = Object <$> (HM.insert "itemtype" <$> item <*> pure o)
        where
          item = subObj "int" <|> subObj "bool" <|> fail "no item type field"
          subObj k = (\v -> object [(k,v)]) <$> o .: k
test1, test2, test3 :: Either String Item
test1 = eitherDecode "{\"id\":\"foo\",\"bool\":false}"
test2 = eitherDecode "{\"id\":\"foo\",\"int\":10}"
test3 = eitherDecode "{\"id\":\"foo\"}"
main = do
  print test1
  print test2
  print test3
Generally, though, unless you're doing this a lot, it's probably better for the sake of clarity and readability to just abandon the generics and write the necessary boilerplate. It's not that onerous, even for your original example. Yes, you have to keep the type and instance in sync, but a couple of simple tests should catch any problems. So, for example, something like:
instance FromJSON Item where
  parseJSON = withObject "Item" $ \o ->
    Item <$> o .: "object"
         <*> o .: "id"
         <*> o .:? "organizationId"
         <*> o .:? "folderId"
         <*> o .: "type"
         <*> o .: "name"
         <*> o .: "notes"
         <*> o .: "favorite"
         <*> parseItemType o
         <*> o .: "collectionIds"
         <*> o .:? "revisionDate"
    where
      parseItemType o =
            MkLogin <$> o .: "login"
        <|> MkCard <$> o .: "card"
        <|> MkIdentity <$> o .: "identity"
        <|> MkSecureNote <$> o .: "secureNote"
One way would be to have no field at all for ItemType in the Item data declaration. Then use either a tuple or a custom pair type to hold both pieces; so:
data ItemWithType = ItemWithType ItemType Item
instance FromJSON ItemWithType where
parseJSON v = liftA2 ItemWithType (parseJSON v) (parseJSON v)
You can also skip defining ItemWithType and just use
\o -> liftA2 (,) (parseJSON o) (parseJSON o)
directly to parse a tuple of the fields with a consistent name and the object under the variable key.
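For instance, a small helper along these lines would run both parsers against the same value (hypothetical names; liftA2 is from Control.Applicative, parseEither from Data.Aeson.Types, and this assumes Item no longer carries the ItemType field while both FromJSON instances stay as written):
-- Sketch only: decode the fixed fields and the variant field from the same JSON value.
itemWithType :: Value -> Parser (Item, ItemType)
itemWithType v = liftA2 (,) (parseJSON v) (parseJSON v)

-- BL is Data.ByteString.Lazy
decodeItemWithType :: BL.ByteString -> Either String (Item, ItemType)
decodeItemWithType bs = eitherDecode bs >>= parseEither itemWithType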

F# "Stateful" Computation Expression

I'm currently learning F# and hitting a few stumbling blocks; I think a lot of it is learning to think functionally.
One of the things I'm learning at the moment are computation expressions, and I want to be able to define a computation expression that handles some tracking state, e.g:
let myOptions = optionListBuilder {
let! opt1 = {name="a";value=10}
let! opt2 = {name="b";value=12}
}
I want myOptions to be an Option<'T> list, so that each let! bind operation effectively causes the builder to "track" the defined options as it goes along.
I don't want to have to do it using mutable state - e.g. having a list maintained by the builder and updated with each bind call.
Is there some way of having it so that this is possible?
Update: The resultant Option<'T> list type is just representative, in reality I'll likely have an OptionGroup<'T> type to contain a list as well as some additional information - so as Daniel mentioned below, I could use a list comprehension for a simple list.
I wrote a string builder computation expression here.
open System.Text
type StringBuilderUnion =
    | Builder of StringBuilder
    | StringItem of string

let build sbu =
    match sbu with
    | Builder sb -> sb.ToString()
    | StringItem s -> s

type StringBuilderCE () =
    member __.Yield (txt : string) = StringItem(txt)
    member __.Yield (c : char) = StringItem(c.ToString())
    member __.Combine (f, g) =
        Builder(match f, g with
                | Builder(F), Builder(G) -> F.Append(G.ToString())
                | Builder(F), StringItem(G) -> F.Append(G)
                | StringItem(F), Builder(G) -> G.Insert(0, F)
                | StringItem(F), StringItem(G) -> StringBuilder(F).Append(G))
    member __.Delay f = f()
    member __.Zero () = StringItem("")
    member __.For (xs : 'a seq, f : 'a -> StringBuilderUnion) =
        let sb = StringBuilder()
        for item in xs do
            match f item with
            | StringItem(s) -> sb.Append(s) |> ignore
            | Builder(b) -> sb.Append(b.ToString()) |> ignore
        Builder(sb)

// the builder instance shadows the built-in string function so the examples below read naturally
let string = StringBuilderCE ()
Notice that the underlying type is immutable (the contained StringBuilder is mutable, but it doesn't have to be). Instead of updating the existing data, each yield combines the current state and the incoming input, resulting in a new instance of StringBuilderUnion. You could do the same thing with an F# list, since adding an element to the head of a list is merely the construction of a new value rather than a mutation of existing values (a sketch of that variant follows the usage examples below).
Using the StringBuilderCE looks like this:
// Create a function which builds a string from an array of bytes
let bytes2hex (bytes : byte []) =
    string {
        for byte in bytes -> sprintf "%02x" byte
    } |> build

// Build a string from four strings
string {
    yield "one"
    yield "two"
    yield "three"
    yield "four"
} |> build
Notice the yield instead of let!, since I don't actually want to use the value inside the computation expression.
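As noted above, the same shape works over a plain immutable list; a minimal sketch of that variant (my own variation, not part of the original answer) could look like this:
// Sketch: the same builder pattern backed by an immutable string list.
type ListStringCE () =
    member __.Yield (txt : string) = [txt]
    member __.Combine (f : string list, g : string list) = f @ g
    member __.Delay f = f ()
    member __.Zero () : string list = []
    member __.For (xs : 'a seq, f : 'a -> string list) =
        xs |> Seq.collect f |> List.ofSeq

let listString = ListStringCE ()

// Usage: collect the parts, then join them at the end.
let greeting =
    listString {
        yield "one"
        yield "two"
        yield "three"
    } |> String.concat ""
Each Combine builds a new list instead of mutating anything, which is exactly the property the question asks for.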
SOLUTION
With the base-line StringBuilder CE builder provided by mydogisbox, I was able to produce the following solution that works a charm:
type Option<'T> = {Name:string;Item:'T}
type OptionBuilderUnion<'T> =
| OptionItems of Option<'T> list
| OptionItem of Option<'T>
type OptionBuilder () =
    member this.Yield (opt: Option<'t>) = OptionItem(opt)
    member this.Yield (tup: string * 't) = OptionItem({Name = fst tup; Item = snd tup})
    member this.Combine (f, g) =
        OptionItems(
            match f, g with
            | OptionItem(F), OptionItem(G) -> [F; G]
            | OptionItems(F), OptionItem(G) -> G :: F
            | OptionItem(F), OptionItems(G) -> F :: G
            | OptionItems(F), OptionItems(G) -> F @ G
        )
    member this.Delay f = f()
    member this.Run (f) = match f with | OptionItems items -> items | OptionItem item -> [item]
let options = OptionBuilder()
let opts = options {
    yield ("a",12)
    yield ("b",10)
    yield {Name = "k"; Item = 20}
}
opts |> Dump
F# supports list comprehensions out-of-the-box.
let myOptions =
[
yield computeOptionValue()
yield computeOptionValue()
]

F# Append custom table with custom list

I'm trying to make a function that extends a symbolTable with a list of Decl
Definitions:
type Typ = | Integer
           | Boolean
           | Ft of Typ list * Typ;;
type Decl = string * Typ;;
type symbolTable = Map<string, Typ>;;
I'm trying to do it the following way:
let extendST (st: symbolTable) (lst: Decl list) : symbolTable =
let mutable returnValue = st
for (x,y) in lst do
returnValue = returnValue.Add(x,y)
returnValue;;
But apparently nothing is being added to returnValue (the function returns the same symbolTable that was passed in).
I'm obviously new at F# but this is giving me a headache so I hope someone can help me out.
In F#, assignment (to change the value of a mutable variable) is written using <- rather than using = (which means just equality testing). Your code is comparing the two values and then ignoring the result (which is valid, but not what you wanted). The corrected version looks like this:
let extendST (st: symbolTable) (lst: Decl list) : symbolTable =
    let mutable returnValue = st
    for (x,y) in lst do
        returnValue <- returnValue.Add(x,y)
    returnValue
However, if you are looking for a functional solution, you can write this using fold:
let extendST (st: symbolTable) (lst: Decl list) : symbolTable =
lst |> List.fold (fun map (k, v) -> Map.add k v map) st
The fold function takes some state (your symbol table) and calculates a new state for each element of a given list lst. Here, the new state is a new map extended with the key-value pair k,v.
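For example (the declarations below are made up for illustration), both versions behave the same way:
let st : symbolTable = Map.ofList [ ("x", Integer) ]
let st' = extendST st [ ("flag", Boolean); ("f", Ft ([Integer], Boolean)) ]
// map [("f", Ft ([Integer], Boolean)); ("flag", Boolean); ("x", Integer)]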

Type unit does not have null as a proper value

Below I post a fragment of my F# program that causes problems.
...
match words with
| name :: "of" :: context :: "=" :: value :: _ when Double.TryParse(value) |> fst ->
    let var = new FuzzyVariable(name, context, Double.Parse value)
    fuzzyVars <- var :: fuzzyVars
In line:
fuzzyVars <- var :: fuzzyVars
I get "Type unit does not have null as a proper value" error. I am fairly new to F# programming and I don't quite know what might be causing this issue.
fuzzyVars is of type FuzzyVariable list. FuzzyVariable is a custom defined type.
EDIT.
As @Tomas Petricek pointed out, there was a line in my code that returned null:
| [] -> null
My intention was to ignore the value of the match. The proper way to do that is:
| [] -> ()
After that change everything works fine.
As already mentioned, the problem is not in the piece of code you posted - it looks like some other part of your program makes the F# compiler think that the expression fuzzyVars <- var :: fuzzyVars should have a type that admits null (but that's not the case, because it returns unit).
I was able to get the same error by writing:
open System
type FuzzyVariable(a:string, b:string, c:float) =
    member x.A = ""
let mutable fuzzyVars : (FuzzyVariable list) = []
let words = [null; "of"; "context"; "="; "5"]
And the main part:
null = (match words with
        | name :: "of" :: context :: "=" :: value :: _ when Double.TryParse(value) |> fst ->
            let var = new FuzzyVariable(name, context, Double.Parse value)
            fuzzyVars <- var :: fuzzyVars)
You probably do not have something like this in your code :-) but perhaps the error message might give you a pointer to where the null comes from. Here, I get:
error FS0001: The type 'unit' does not have 'null' as a proper value. See also C:\Users\Tomas\AppData\Local\Temp\~vs648E.fsx(8,0)-(8,4).
And the code on line 8 between character 0 and 4 is the null value! So perhaps check out whether the error message gives you some more information? Or try looking for null elsewhere in your code... (It might be also caused by some unexpected indentation.)
I tried this and it compiled just fine:
open System
type test = {Name:string}
let mutable fuzzyVars : (test list) = []

match [null; "of"; "context"; "="; "5"] with
| name :: "of" :: context :: "=" :: value :: _ when Double.TryParse(value) |> fst ->
    let var = {Name=name}
    fuzzyVars <- var :: fuzzyVars
| a -> a |> ignore
The problem isn't in this section of code.

How do I fix these errors generated by my computation expression that is using my custom workflow builder?

From the MSDN documentation I understand that if Run is implemented it will be called automatically at the end of the computation expression. It says that:
builder.Run(builder.Delay(fun () -> {| cexpr |}))
will be generated for the computation expression. Run and/or Delay will be omitted if they are not defined in the workflow builder. I was expecting my ReaderBuilder to return a list of MyItem objects when Run is called automatically, so I do not understand why I'm getting a type mismatch error. The errors are generated by the return statement inside the ProcBuilder expression foo at the end of my code listing here. Could someone please explain what I'm misunderstanding about workflow builders and what I have implemented incorrectly?
I'm getting the following errors:
The type ''a list' is not compatible with the type 'ReaderBuilder'
Type constraint mismatch. The type 'a list is not compatible with type ReaderBuilder The type ''a list' is not compatible with the type 'ReaderBuilder'
open System
open System.Data
open System.Data.Common
open System.Configuration
let config = ConfigurationManager.ConnectionStrings.Item("db")
let factory = DbProviderFactories.GetFactory(config.ProviderName)
type Direction =
| In
| Out
| Ref
| Return
type dbType =
| Int32
| String of int
type ReaderBuilder(cmd) =
let mutable items = []
member x.Foo = 2
member x.YieldFrom item =
items <- item::items
item
member x.Run item =
items
type ProcBuilder(procedureName:string) =
let name = procedureName
let mutable parameters = []
let mutable cmd:DbCommand = null
let mutable data = []
member x.Command with get() = cmd
member x.CreateCommand() =
factory.CreateCommand()
member x.AddParameter(p:string*dbType*Direction) =
parameters <- p::parameters
member x.Bind(v,f) =
f v
member x.Reader = ReaderBuilder(cmd)
member x.Return(rBuilder:ReaderBuilder) =
data
let (?<-) (builder:ProcBuilder) (prop:string) (value:'t) =
builder.Command.Parameters.[prop].Value <- value
type MyItem() =
let mutable _a = 0
let mutable _b = String.Empty
let mutable _c = DateTime.Now
member x.a
with get() = _a
and set n = _a <- n
member x.b
with get() = _b
and set n = _b <- n
member x.c
with get() = _c
and set n = _c <- n
let proc name = ProcBuilder(name)
let (%) (builder:ProcBuilder) (p:string*dbType*Direction) =
builder.AddParameter(p)
builder
let (?) (r:DbDataReader) (s:string) = r.GetOrdinal(s)
let foo x y =
let foo = proc "foo" % ("x", Int32, In) % ("y", String(15), In)
foo?x <- x
foo?y <- y
foo {
do! foo?x <- x
do! foo?y <- y
return foo.Reader {
let item = MyItem()
item.a <- r.GetInt32("a")
item.b <- r.GetString("b")
item.c <- r.GetDateTime("c")
yield! item
}
}
The problem in your example is that the foo.Reader { ... } block has a return type MyItem list (because this is what the Run member of the ReaderBuilder type returns). However, the Return member of ProcBuilder expects an argument of type ReaderBuilder.
The data field of ProcBuilder will always be an empty list, so this is also suspicious. I think you probably want to change the Return of ProcBuilder to take an argument of type MyItem list instead.
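A minimal sketch of that change (my reading of the suggestion, not tested against the rest of the listing):
// Sketch only: accept the MyItem list produced by ReaderBuilder.Run directly.
member x.Return (items : MyItem list) = items
With that signature the mutable data field becomes redundant and could be dropped.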
However, I think that using a custom computation builder for database access doesn't really give you much of an advantage. You're not creating a "non-standard computation" in any sense. Instead, you probably just want a nice syntax for calling commands and reading data. Using the dynamic operator can make this quite elegant even without computation builders - I wrote an article about this some time ago.
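For illustration, the dynamic-operator style looks roughly like this (a made-up sketch, and note it defines (?) differently from the GetOrdinal version in the question's listing):
// Sketch: read a named column from a DbDataReader and unbox it to the expected type.
let (?) (reader : DbDataReader) (name : string) : 'T =
    unbox<'T> reader.[name]

// Reading rows then becomes ordinary code rather than a computation expression:
//     while reader.Read() do
//         let item = MyItem()
//         item.a <- reader?a
//         item.b <- reader?b
//         item.c <- reader?c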
