I'm new to Scala, and have been trying to use its excellent combinator parser library. I've been trying to get this code to compile:
import scala.util.parsing.combinator._
...
val r:Parsers#ParseResult[Node] = parser.parseAll(parser.suite,reader)
r match {
case Success(r, n) => println(r)
case Failure(msg, n) => println(msg)
case Error(msg, n) => println(msg)
}
...
But I keep getting these errors:
TowelParser.scala:97: error: not found: value Success
case Success(r, n) => println(r)
^
TowelParser.scala:98: error: not found: value Failure
case Failure(msg, n) => println(msg)
TowelParser.scala:99: error: object Error is not a case class constructor, nor does it have an unapply/unapplySeq method
case Error(msg, n) => println(msg)
I've tried lots of different things like:
case Parsers#Success(r, n) => println(r)
and
case Parsers.Success(r, n) => println(r)
and
import scala.util.parsing.combinator.Parsers.Success
But I can't seem to get this to compile. I'm sure there's probably something obvious I'm missing, but I've been at it for a while, and google doesn't seem to have any good examples of this.
Thanks!
You need to qualify these types with the path to your Parsers instance: Success, Failure, and Error (like ParseResult itself) are inner classes of that instance, so the bare names won't resolve. For example:
import scala.util.parsing.combinator._
object parser extends RegexParsers { def digits = "\\d+".r ^^ (_.toInt) }
val res = parser.parseAll(parser.digits, "42")
res match {
case parser.Success(r, n) => println(r)
case parser.Failure(msg, n) => println(msg)
case parser.Error(msg, n) => println(msg)
}
Note that you could also import these if you want a little extra syntactic convenience:
import parser.{ Error, Failure, Success }
Now your original version will work as expected.
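For example, with the parser object and res value from the snippet above, a minimal sketch of that variant is:
import parser.{ Error, Failure, Success }
res match {
  case Success(r, _)   => println(r)    // prints 42
  case Failure(msg, _) => println(msg)
  case Error(msg, _)   => println(msg)
}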
function method ListMap<T,X>(f : (T -> X), l : List<T>) : List<X>
ensures ListMap(x => x + 1, Cons(1, Cons(2, Nil))) == Cons(2, Cons(3, Nil))
{
match l {
case Nil => Nil
case Cons(n, l') => Cons(f(n), ListMap(f, l'))
}
}
Dafny raises two complaints here:
about "case Nil": "A postcondition might not hold on this return path."
about the "ensures" clause: "This postcondition might not hold on a return path."
This snippet is from the book "Introducing Software Verification with Dafny Language: Proving Program Correctness", but I can't find the Errata for it.
There are two things here that you can solve at once:
The ensures calls ListMap itself, and because you provide it as an intrinsic postcondition, Dafny would have to evaluate that call (in an imaginary ghost environment) and prove it terminates, which gets you into trouble.
Dafny is fine with ensuring something about the output of the current call, but for every other call to the function itself it has to prove termination.
Besides, what you wrote is an example of the function's behaviour, not a fact that every user of your ListMap function ought to know.
The solution is to refactor your ensures into a lemma:
datatype List<T> = Nil | Cons(t: T, tail: List<T>)
function method ListMap<T,X>(f : (T -> X), l : List<T>) : List<X>
{
match l {
case Nil => Nil
case Cons(n, l') => Cons(f(n), ListMap(f, l'))
}
}
lemma ListMapExample()
ensures ListMap(x => x + 1, Cons(1, Cons(2, Nil))) == Cons(2, Cons(3, Nil))
{
}
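As a minimal usage sketch (the Main method below is my addition, not from the book), a caller can now bring the fact into scope by invoking the lemma:
method Main() {
  ListMapExample();  // makes the lemma's ensures available here
  assert ListMap(x => x + 1, Cons(1, Cons(2, Nil))) == Cons(2, Cons(3, Nil));
}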
Background: I am trying to write parser combinators in Dafny. This requires working on very long lists which I do not want to fully compute unless they are needed, so I am using an IList instead of a seq in order to simulate lazy evaluation. The problem which I am having is that I cannot find a way to express an equivalent to forall x in sequence when working with ILists.
I'm defining IList in the same way as Dafny's documentation and tests:
codatatype IList<T> = Nil | Cons(head: T, tail: IList<T>)
I want to define an fmap function over ILists which allows partial functions. Here is my initial (incorrect) implementation:
function method fmap<S, T>(list: IList<S>, fn: S --> T): IList<T>
{
match list
case Nil => Nil
case Cons(s, rest) => Cons(fn(s), fmap(rest, fn))
}
This does not work, because the precondition to fn might not hold, and this is the root problem I'm trying to solve.
I tried to define a copredicate to express the concept of "forall" on infinite lists, and use that:
greatest predicate IListForall<T>(list: IList<T>, fn: T -> bool) {
match list
case Nil => true
case Cons(head, rest) => fn(head) && IListForall(rest, fn)
}
function method fmap<S, T>(list: IList<S>, fn: S --> T): IList<T>
requires IListForall(list, fn.requires)
{ /* body unchanged */ }
This makes fmap verify, but when I try to use fmap I can't find a way to make this precondition satisfied. It comes up when I try to define a mapping function which works on lists containing a certain type:
datatype Container<T> = Container(value: T)
function method fmapContainers<T, U>(cs: IList<Container<T>>, fn: T -> U):
IList<Container<U>>
{
fmap(cs, (container: Container) => Container(fn(container.value)))
}
The invocation of fmap here gives me the error possible violation of function precondition. This doesn't seem right to me. fn is total, and so is the lambda that I'm passing to fmap, so I don't think there should be any precondition in play? I've attempted to write fmapContainers a few different ways with no success, which makes me think that I messed up from the beginning when I tried to express forall as a copredicate.
Is there a better way to express forall than what I did?
footnote: fmapContainers might sound useless, but it's the simplest form of my actual problem. To explain my motivation, here's the full implementation that I'm trying to get working:
datatype OneParse<T> = OneParse(parsed: T, remainder: string)
datatype Result<T> = Failure | Success(forest: IList<OneParse>)
type parser<T> = string -> Result<T>
function method fmapSuccess<S, T>(result: Result<S>, fn: S --> T): Result<T>
requires result.Success?
{
Success(fmap(result.forest,
(one: OneParse<S>) => OneParse(fn(one.parsed), one.remainder)))
}
function method fmapParser<T, U>(p: parser<T>, fn: T --> U): parser<U> {
s => var result := p(s); match result
case Failure => Failure
case Success(forest) => fmapSuccess(result, fn)
}
I think I can figure out how to make the full solution work on my own if someone provides tips for implementing fmap and fmapContents, so this is just for context.
The problem here is that the greatest predicate IListForall is not proved for the function (container: Container) => Container(fn(container.value)). This is trivial to prove:
greatest lemma IListForallLemma<T, U>(cs: IList<T>, fn: T -> U)
ensures IListForall(cs, fn.requires)
{}
Now the following snippet verifies. I have changed fmapContainers from a function method to a method so that it can call the lemma above.
codatatype IList<T> = Nil | Cons(head: T, tail: IList<T>)
greatest predicate IListForall<T>(list: IList<T>, fn: T -> bool)
{
match list
case Nil => true
case Cons(head, rest) => fn(head) && IListForall(rest, fn)
}
function method fmap<S, T>(list: IList<S>, fn: S --> T): IList<T>
requires IListForall(list, fn.requires)
{
match list
case Nil => Nil
case Cons(s, rest) => Cons(fn(s), fmap(rest, fn))
}
datatype Container<T> = Container(value: T)
greatest lemma IListForallLemma<T, U>(cs: IList<T>, fn: T -> U)
ensures IListForall(cs, fn.requires)
{}
method fmapContainers<T, U>(cs: IList<Container<T>>, fn: T -> U) returns (r: IList<Container<U>>)
{
IListForallLemma(cs, (container: Container) => Container(fn(container.value)));
r := fmap(cs, (container: Container) => Container(fn(container.value)));
}
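For illustration, a minimal calling sketch (Main and the literal list are my own additions):
method Main() {
  var xs := Cons(Container(1), Cons(Container(2), Nil));
  // maps n => n + 1 inside each Container; yields Cons(Container(2), Cons(Container(3), Nil))
  var ys := fmapContainers(xs, (n: int) => n + 1);
}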
Let's say that I want to create a simple directive to fetch a session identifier off the HTTP request but with a twist: The request might be using an old format but either should be considered valid for the moment because of backwards compatibility.
I sketched out something simple that basically composes two existing directives together with the resulting directive being a...
Directive[Option[String] :: Option[String] :: HNil]
Here's the meat of my question: is there a concise way to say "given a Directive that is an HList of multiple Options, take the first Some and use it, otherwise fall back to a default value"?
My current implementation works, but it seems a bit messy and isn't reusable at all:
def sessionIdentifier: Directive1[String] = {
(optionalHeaderValueByName("New-Session-Header") & parameter("oldsessionparam"?)).hmap {
case Some(x) :: _ :: HNil if x.nonEmpty => x
case _ :: Some(x) :: HNil if x.nonEmpty => x
case _ => getNewSessionId()
}
}
If all the element types are guaranteed to be of type Option[String] then the following will be a little terser,
# import shapeless._
import shapeless._
# val opts = Option.empty[String] :: Option("foo") :: Option("bar") :: HNil
opts: Option[String] :: Option[String] :: Option[String] :: HNil = ...
# opts.toList.flatten.headOption.getOrElse("default")
res0: String = "foo"
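Plugging that back into the directive itself, a rough sketch (reusing getNewSessionId and the two extractions from the question, plus the shapeless import above, and keeping the nonEmpty filtering) could look like:
def sessionIdentifier: Directive1[String] =
  (optionalHeaderValueByName("New-Session-Header") & parameter("oldsessionparam".?)).hmap {
    opts => opts.toList.flatten.filter(_.nonEmpty).headOption.getOrElse(getNewSessionId())
  }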
Your implementation is quite good actually. It may be refactored as
val sessionParameters =
optionalHeaderValueByName("New-Session-Header") & parameter("oldsessionparam".?)
val sessionIdentifier =
sessionParameters.hmap {
case newId :: oldId :: HNil =>
newId.filter(_.nonEmpty)
.orElse(oldId.filter(_.nonEmpty))
.getOrElse(getNewSessionId())
}
The following parser combinator snippet demonstrates an attempt to generalise binary comparison ops like > by using Ordered[T]. Gt seems to accomplish this at the AST level, but I'm having trouble extending the concept.
The intGt parser works, but is it possible to generalise this around Ordered[T] so that we don't need to write a second parser for floatGt (and hence one for every supported orderable type times every supported op - no thanks)?
object DSL extends JavaTokenParsers {
// AST
abstract class Expr[+T] { def eval: T }
case class Literal[T](t: T) extends Expr[T] { def eval = t }
case class Gt[T <% Ordered[T]](l: Expr[T], r: Expr[T]) extends Expr[Boolean] {
def eval = l.eval > r.eval // view-bound implicitly wraps eval result as Ordered[T]
}
// Parsers
lazy val intExpr: Parser[Expr[Int]] = wholeNumber ^^ { case x => Literal(x.toInt) }
lazy val floatExpr: Parser[Expr[Float]] = decimalNumber ^^ { case x => Literal(x.toFloat) }
lazy val intGt: Parser[Expr[Boolean]] = intExpr ~ (">" ~> intExpr) ^^ { case l ~ r => Gt(l, r) }
}
I tried playing around and this is the best I could come up with in the time I had:
import scala.util.parsing.combinator.JavaTokenParsers
object DSL extends JavaTokenParsers {
// AST
abstract class Expr[+T] { def eval: T }
case class Literal[T](t: T) extends Expr[T] { def eval = t }
case class BinOp[T,U](
val l : Expr[T],
val r : Expr[T],
val evalOp : (T, T) => U) extends Expr[U] {
def eval = evalOp(l.eval, r.eval)
}
case class OrderOp[O <% Ordered[O]](symbol : String, op : (O, O) => Boolean)
def gtOp[O <% Ordered[O]] = OrderOp[O](">", _ > _)
def gteOp[O <% Ordered[O]] = OrderOp[O](">=", _ >= _)
def ltOp[O <% Ordered[O]] = OrderOp[O]("<", _ < _)
def lteOp[O <% Ordered[O]] = OrderOp[O]("<=", _ <= _)
def eqOp[O <% Ordered[O]] = OrderOp[O]("==", _.compareTo(_) == 0)
def ops[O <% Ordered[O]] =
Seq(gtOp[O], gteOp[O], ltOp[O], lteOp[O], eqOp[O])
def orderExpr[O <% Ordered[O]](
subExpr : Parser[Expr[O]],
orderOp : OrderOp[O])
: Parser[Expr[Boolean]] =
subExpr ~ (orderOp.symbol ~> subExpr) ^^
{ case l ~ r => BinOp(l, r, orderOp.op) }
// Parsers
lazy val intExpr: Parser[Expr[Int]] =
wholeNumber ^^ { case x => Literal(x.toInt) }
lazy val floatExpr: Parser[Expr[Float]] =
decimalNumber ^^ { case x => Literal(x.toFloat) }
lazy val intOrderOps : Parser[Expr[Boolean]] =
ops[Int].map(orderExpr(intExpr, _)).reduce(_ | _)
lazy val floatOrderOps : Parser[Expr[Boolean]] =
ops[Float].map(orderExpr(floatExpr, _)).reduce(_ | _)
}
Essentially, I defined a small case class OrderOp that relates a string representing
an ordering operation to a function which will evaluate that operation. I then defined a function ops capable of creating a Seq[OrderOp] of all such ordering operations for a given Orderable type. These operations can then be turned into parsers using orderExpr, which takes the sub expression parser and the operation. This is mapped over all the ordering operations for your int and float types.
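As a quick usage sketch (not verified against a real build, and relying on parseAll and the result types inherited from JavaTokenParsers):
import DSL._
parseAll(intOrderOps, "3 > 2") match {
  case Success(expr, _) => println(expr.eval)  // prints true
  case noSuccess        => println(noSuccess)
}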
Some issues with this approach:
There is only one node type in the AST type hierarchy for all binary operations. This isn't a problem if all you are ever doing is evaluating, but if you ever wanted to do rewriting operations (eliminating obvious tautologies or contradictions, for instance) then there is insufficient information to do this with the current definition of BinOp.
I still needed to map orderExpr for each distinct type. There may be a way to fix this, but I ran out of time.
orderExpr expects the left and right subexpressions to be parsed with the same parser.
I'm trying to write a parser in Scala for SML with Tokens. It almost works the way I want it to work, except for the fact that this currently parses
let fun f x = r and fun g y in r end;
instead of
let fun f x = r and g y in r end;
How do I change my code so that it recognizes that it doesn't need a FunToken for the second function?
def parseDef:Def = {
currentToken match {
case ValToken => {
eat(ValToken);
val nme:String = currentToken match {
case IdToken(x) => {advance; x}
case _ => error("Expected a name after VAL.")
}
eat(EqualToken);
VAL(nme,parseExp)
}
case FunToken => {
eat(FunToken);
val fnme:String = currentToken match {
case IdToken(x) => {advance; x}
case _ => error("Expected a name after VAL.")
}
val xnme:String = currentToken match {
case IdToken(x) => {advance; x}
case _ => error("Expected a name after VAL.")
}
def parseAnd:Def = currentToken match {
case AndToken => {eat(AndToken); FUN(fnme,xnme,parseExp,parseAnd)}
case _ => NOFUN
}
FUN(fnme,xnme,parseExp,parseAnd)
}
case _ => error("Expected VAL or FUN.");
}
}
Just implement the right grammar. Instead of
def ::= "val" id "=" exp | fun
fun ::= "fun" id id "=" exp ["and" fun]
SML's grammar actually is
def ::= "val" id "=" exp | "fun" fun
fun ::= id id "=" exp ["and" fun]
Btw, I think there are other problems with your parsing of fun. AFAICS, you are not parsing any "=" in the fun case. Moreover, after an "and", you are not even parsing any identifiers, just the function body.
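For what it's worth, a rough sketch of that restructuring in the question's own style (reusing its eat, advance, error, parseExp, FUN and NOFUN; parseFun is a name I made up):
// called after "fun" or "and" has been consumed; parses: id id "=" exp ["and" fun]
def parseFun: Def = {
  val fnme: String = currentToken match {
    case IdToken(x) => { advance; x }
    case _ => error("Expected a function name.")
  }
  val xnme: String = currentToken match {
    case IdToken(x) => { advance; x }
    case _ => error("Expected a parameter name.")
  }
  eat(EqualToken)      // the "=" that the original code never consumes
  val body = parseExp
  currentToken match {
    case AndToken => { eat(AndToken); FUN(fnme, xnme, body, parseFun) }
    case _ => FUN(fnme, xnme, body, NOFUN)
  }
}
The FunToken case in parseDef then reduces to { eat(FunToken); parseFun }.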
You could inject the FunToken back into your input stream with an "uneat" function. This is not the most elegant solution, but it's the one that requires the least modification of your current code.
def parseAnd:Def = currentToken match {
case AndToken => { eat(AndToken);
uneat(FunToken);
FUN(fnme,xnme,parseExp,parseAnd) }
case _ => NOFUN
}
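One possible shape for uneat, assuming currentToken is mutable and that advance normally reads from some nextTokenFromLexer (the Token type, the pushedBack buffer, and nextTokenFromLexer are placeholders here, not part of the original code):
private var pushedBack: List[Token] = Nil
def uneat(t: Token): Unit = {
  pushedBack = t :: pushedBack           // re-deliver this token before reading new ones
}
def advance: Unit = pushedBack match {
  case t :: rest => { currentToken = t; pushedBack = rest }   // replay a pushed-back token
  case Nil       => currentToken = nextTokenFromLexer()       // otherwise read from the lexer as before
}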