Simplifying Expression: Z3 SMT Solver

Executing the following query with the Z3 solver:
(declare-const c0 Int)
(declare-const c1 Int)
(declare-const c2 Int)
(assert (exists ((c0_s Int) (c1_s Int) (c2_s Int))
  (and
    (= (+ c0 c1 c2) 5) (>= c0 0) (>= c1 1) (>= c2 1)
    (= c0_s c0) (= c1_s (- c1 1)) (= c2_s (+ c2 1))
    (= c2_s 3) (= (+ c0_s c1_s) 2)
  ))
)
(apply (then qe ctx-solver-simplify propagate-ineqs))
produces the following output:
(goals
(goal
(>= c0 0)
(<= c0 2)
(>= c1 1)
(<= c1 3)
(<= (+ (* (- 1) c0) (* (- 1) c1)) (- 3))
(<= (+ c1 c0) 3)
(= c2 2)
:precision precise :depth 3)
)
where I was expecting a result from the Z3 solver like this:
(goals
(goal
(>= c0 0)
(<= c0 2)
(>= c1 1)
(<= c1 3)
(= (+ c1 c0) 3)
(= c2 2)
:precision precise :depth 3)
)
Can anyone explain why Z3 is producing such a complex result instead of what I expected? Is there a way to get Z3 to simplify this output?

You may get a more detailed answer from a member of the core Z3 team, but from my experience working with Z3's integer solver at a low level, I can give a bit of intuition as to why this is happening.
Briefly, in order to solve integer equations, Z3's integer theory solver expects all of its constraints to appear in a very particular and restricted form. Expressions that do not follow this form must be rewritten before they are presented to the solver. Normally this rewriting is handled internally by a theory rewriter, so any expression can be used in the input constraint set without issue.
The restrictions that apply here (that I am aware of), which help explain why you are seeing this strange-looking output, are as follows:
The integer solver can represent an equality constraint (= a b) as two separate inequality constraints (<= a b) and (>= a b). This is why you're seeing two separate inequality constraints over your variables in the output goal instead of just one equality.
The integer solver rewrites subtractions, or negated terms, as multiplication by -1. This is why you are seeing the (* (- 1) ...) coefficients, and why the operator is addition instead of subtraction.
Arithmetic expressions are rewritten so that the second argument to a comparison operator is always a constant value.
In short, what you're seeing is likely an artifact of how the arithmetic theory solver represents constraints internally.
Since the output of your instance is a goal and not a model or proof, these expressions may not have been fully simplified yet, as I believe that intermediate goals are not always simplified (but I don't have experience with this part of the solver).
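As a small illustration (my own sketch, not part of the question or answer above), Z3's rewriter exposes options that mimic two of the normalizations described above; the exact printed forms vary between Z3 versions:
from z3 import *
a, b = Ints('a b')
# eq2ineq: expand an equality into a pair of inequalities
print(simplify(a + b == 3, eq2ineq=True))
# arith_lhs: move all monomials to the left-hand side so the right-hand side is a constant
print(simplify(a >= b + 3, arith_lhs=True))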

Related

Is it possible to use both bit-blast and soft-assert with the z3 solver?

I'm trying to use the Z3 SMT solver to allocate values to variables subject to constraints. As well as hard constraints I have some soft constraints (e.g. a != c). I expected to be able to specify the hard constraints with assert and the soft constraints with assert-soft, and this works if I solve with (check-sat).
Some of the files are large and complex and only solve in a reasonable time if I turn on bit-blasting using (check-sat-using (then simplify solve-eqs bit-blast sat)). When I do this, the soft asserts seem to be ignored (example below or at rise4fun). Is this expected? Is it possible to use both bit-blast solving and soft-assert at the same time?
The following SMT code defines 4 bitvectors, a, b, c and d, which should all be able to take unique values but are only encouraged to do so by soft asserts. Using (check-sat) works as expected, but (check-sat-using ...) assigns b and d to the same value.
(set-option :produce-models true)
(set-logic QF_BV)
;; Declaring all the variables
(declare-const a (_ BitVec 2))
(declare-const b (_ BitVec 2))
(declare-const c (_ BitVec 2))
(declare-const d (_ BitVec 2))
(assert (or (= a #b00)
            (= a #b01)
            (= a #b10)
            (= a #b11)))
(assert (or (= b #b00)
            (= b #b01)
            (= b #b10)
            (= b #b11)))
(assert (or (= c #b00)
            (= c #b01)
            (= c #b10)
            (= c #b11)))
(assert (or (= d #b00)
            (= d #b01)
            (= d #b10)
            (= d #b11)))
;; Soft constraints to limit reuse
(assert-soft (not (= a b)))
(assert-soft (not (= a c)))
(assert-soft (not (= a d)))
(assert-soft (not (= b c)))
(assert-soft (not (= b d)))
(assert-soft (not (= c d)))
(check-sat-using (then simplify solve-eqs bit-blast sat))
;;(check-sat)
(get-value (a b c d))
Great question! When you use assert-soft, the optimization engine kicks in by default. You can see this by running your program with the (check-sat) call and a higher verbosity level. I've put your program in a file called a.smt2:
$ z3 -v:3 a.smt2
(optimize:check-sat)
(sat.solver)
(optimize:sat)
(maxsmt)
(opt.maxres [0:6])
(sat.solver)
(opt.maxres [0:0])
found optimum
sat
((a #b01)
(b #b00)
(c #b11)
(d #b10))
So, we can see z3 is treating this as an optimization problem, which takes soft-constraints into account and gives you the "disjointness" you're seeking.
Let's do the same, but this time we'll use the check-sat call that specifies the tactics to use. We get:
$ z3 -v:3 a.smt2
(smt.searching)
sat
((a #b11)
(b #b11)
(c #b11)
(d #b10))
And this confirms your suspicion: When you tell z3 exactly what to do, it doesn't do the optimization pass. In hindsight, this is to be expected, but I do agree that it's rather surprising.
The question is then whether we can tell z3 to do the optimization explicitly. However, I'm not sure if this is even possible within the tactic language. I think this question is well worth asking at their issues site (https://github.com/Z3Prover/z3/issues) to see if there's a magic incantation you can use to kick off the maxres engine from the tactic language. (This may not be possible for a number of reasons, but there's no reason to speculate here.) Please report back here what you find out!
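As a rough sketch of a possible workaround (my own, not part of the answer above): if the goal is simply to have the soft constraints honoured, the Optimize engine can be driven directly from the API, although it does not accept a user-supplied tactic pipeline, so whether it is fast enough on the large instances is a separate question. Assuming the same a/b/c/d setup as in the question:
from z3 import *
a, b, c, d = [BitVec(name, 2) for name in 'abcd']
opt = Optimize()
for x in (a, b, c, d):
    opt.add(Or([x == BitVecVal(v, 2) for v in range(4)]))   # hard constraints
for x, y in [(a, b), (a, c), (a, d), (b, c), (b, d), (c, d)]:
    opt.add_soft(x != y)                                    # soft constraints
print(opt.check())
print(opt.model())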

Is it possible to detect inconsistent equations in Z3, or before passing to Z3?

I was working with z3 with the following example.
f=Function('f',IntSort(),IntSort())
n=Int('n')
c=Int('c')
s=Solver()
s.add(c>=0)
s.add(f(0)==0)
s.add(ForAll([n],Implies(n>=0, f(n+1)==f(n)+10/(n-c))))
The last equation is inconsistent (since n=c would make it indeterminate). But Z3 cannot detect this kind of inconsistency. Is there any way in which Z3 can be made to detect it, or any other tool that can detect it?
As far as I can tell, your assertion that the last equation is inconsistent does not match the documentation of the SMT-LIB standard. The page Theories: Reals says:
Since in SMT-LIB logic all function symbols are interpreted as total functions, terms of the form (/ t 0) are meaningful in every instance of Reals. However, the declaration imposes no constraints on their value. This means in particular that for every instance theory T, for every value v (as defined in the :values attribute), and for every closed term t of sort Real, there is a model of T that satisfies (= v (/ t 0)).
Similarly, the page Theories: Ints says:
See note in the Reals theory declaration about terms of the form (/ t 0). The same observation applies here to terms of the form (div t 0) and (mod t 0).
Therefore, it stands to reason that no SMT-LIB-compliant tool would ever print unsat for the given formula.
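This can be seen with a small z3py snippet (my own illustration, not from the SMT-LIB documentation): (div 10 0) is a well-formed but unconstrained term, so nothing stops a model from giving it the value 7.
from z3 import *
x = Int('x')
s = Solver()
s.add(x == IntVal(10) / IntVal(0))   # integer division by zero: unconstrained, not an error
s.add(x == 7)
print(s.check())                     # sat
print(s.model())                     # a model in which (div 10 0) happens to be 7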
Z3 does not check for division by zero because, as Patrick Trentin mentioned, the semantics of division by zero according to SMT-LIB are that it returns an unknown value.
You can manually ask Z3 to check for division by zero, to ensure that you never depend on division by zero. (This is important, for example, if you are modeling a language where division by zero has different semantics from SMT-LIB.)
For your example, this would look as follows:
(declare-fun f (Int) Int)
(declare-const c Int)
(assert (>= c 0))
(assert (= (f 0) 0))
; check for division by zero
(push)
(declare-const n Int)
(assert (>= n 0))
(assert (= (- n c) 0))
(check-sat) ; reports sat, meaning division by zero is possible
(get-model) ; an example model where division by zero would occur
(pop)
;; Supposing the check had passed (returned unsat) instead, we could
;; continue, safely knowing that division by zero could not happen in
;; the following.
(assert (forall ((n Int))
  (=> (>= n 0)
      (= (f (+ n 1))
         (+ (f n) (/ 10 (- n c)))))))
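Since the question was posed through the Python API, roughly the same check might look like this there (my own sketch of the answer's approach):
from z3 import *
f = Function('f', IntSort(), IntSort())
c = Int('c')
n = Int('n')
s = Solver()
s.add(c >= 0)
s.add(f(0) == 0)
# check for division by zero
s.push()
s.add(n >= 0, n - c == 0)    # can the divisor n - c become zero?
print(s.check())             # sat: division by zero is possible
print(s.model())             # a witness, e.g. n == c
s.pop()
# Had the check returned unsat instead, it would be safe to add the
# quantified definition of f from the question afterwards.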

What additional axioms do we need to add so that Z3 can verify the satisfiability of programs with recurrences?

As we know, Z3 has limitations with recurrences. Is there any way to get a result for the following program? What additional equations would help Z3 get the result?
from z3 import *
ackermann=Function('ackermann',IntSort(),IntSort(),IntSort())
m=Int('m')
n=Int('n')
s=Solver()
s.add(ForAll([n, m],
             Implies(And(n >= 0, m >= 0),
                     ackermann(m, n) ==
                     If(m != 0,
                        If(n != 0,
                           ackermann(m - 1, ackermann(m, n - 1)),
                           If(n == 0, ackermann(m - 1, 1), If(m == 0, n + 1, 0))),
                        If(m == 0, n + 1, 0)))))
s.add(n>=0)
s.add(m>=0)
s.add(Not(Implies(ackermann(m,n)>=0,ackermann(m+1,0)>=0)))
s.check()
With a nested recursive definition like Ackermann's function, I don't think there's much you can do to convince Z3 (or any other SMT solver) to actually do any interesting proofs. Such properties will require clever inductive arguments, and an SMT solver is just not the right tool for this sort of verification. A theorem prover like Isabelle, HOL, Coq, ... is the better choice here.
Having said that, the basic approach to establishing recursive function properties in SMT is to literally code up the inductive hypothesis as a quantified axiom, and arrange for the property you want proven to precisely line up with that axiom when the e-matching engine kicks in, so it can instantiate the quantifiers "correctly." I'm putting the word correctly in quotes here, because the matching engine will go ahead and keep instantiating the axiom in unproductive ways, especially for a function like Ackermann's. Theorem provers, on the other hand, give you precise control over the proof structure, so you can explicitly guide the prover through the proof-search space.
Here's an example you can look at: list concat in z3, which does an inductive proof of a much simpler inductive property than the one you are targeting, using the SMT-LIB interface. While it won't be easy to extend it to handle your particular example, it might provide some insight into how to go about it.
In the particular case of Z3, you can also utilize its fixed-point reasoning engine using the PDR algorithm to answer queries about certain recursive functions. See http://rise4fun.com/z3/tutorialcontent/fixedpoints#h22 for an example that shows how to model McCarthy's famous 91 function as an interesting case study.
Z3 will not try to do anything by induction for you, but (as Levent Erkok mentioned) you can give it the induction hypothesis and have it check that the result follows.
This works on your example as follows.
(declare-fun ackermann (Int Int) Int)
(assert (forall ((m Int) (n Int))
  (= (ackermann m n)
     (ite (= m 0) (+ n 1)
     (ite (= n 0) (ackermann (- m 1) 1)
                  (ackermann (- m 1) (ackermann m (- n 1))))))))
(declare-const m Int)
(declare-const n Int)
(assert (>= m 0))
(assert (>= n 0))
; Here's the induction hypothesis
(assert (forall ((ihm Int) (ihn Int))
  (=> (and (<= 0 ihm) (<= 0 ihn)
           (or (< ihm m) (and (= ihm m) (< ihn n))))
      (>= (ackermann ihm ihn) 0))))
(assert (not (>= (ackermann m n) 0)))
(check-sat) ; reports unsat as desired
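Since the question was written against the Python API, a rough translation of the same encoding might look like this (my own sketch; qm/qn and ihm/ihn are just bound-variable names):
from z3 import *
ackermann = Function('ackermann', IntSort(), IntSort(), IntSort())
m, n = Ints('m n')
qm, qn, ihm, ihn = Ints('qm qn ihm ihn')
s = Solver()
# Definition of the function (same case split as the SMT-LIB version above).
s.add(ForAll([qm, qn],
             ackermann(qm, qn) ==
             If(qm == 0, qn + 1,
                If(qn == 0, ackermann(qm - 1, 1),
                   ackermann(qm - 1, ackermann(qm, qn - 1))))))
s.add(m >= 0, n >= 0)
# Induction hypothesis: non-negativity for all lexicographically smaller pairs.
s.add(ForAll([ihm, ihn],
             Implies(And(0 <= ihm, 0 <= ihn,
                         Or(ihm < m, And(ihm == m, ihn < n))),
                     ackermann(ihm, ihn) >= 0)))
# Negated goal.
s.add(Not(ackermann(m, n) >= 0))
print(s.check())   # expected: unsat, matching the SMT-LIB version above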

simplification in Z3

(declare-datatypes () ((SE BROKEN ON OFF)))
(declare-const s SE)
(declare-const a Int)
(simplify (or (= s ON) (= s OFF) (= s BROKEN)))
(simplify (and (> a 0) (> a 1)))
The result is:
(or (= s ON) (= s OFF) (= s BROKEN))
(and (not (<= a 0)) (not (<= a 1)))
But the expected result was:
true
(> a 1)
Is it possible to simplify such expressions (the combinations of such expressions) in Z3?
Thank you!
The simplify command is just a bottom-up rewriter. It is fast, but will fail to simplify expressions such as the ones in your post. Z3 allows users to define their own simplification strategies using tactics. They are described in this article, and the Z3 tutorials (Python and SMT 2.0). The following posts also have additional information:
t>=1 or t>=2 => t>=1
Asymmetric behavior in ctx-solver-simplify
what's the difference between "simplify" and "ctx-solver-simplify" in z3
The first query in your example can be simplified using the tactic ctx-solver-simplify (also available online here).
(declare-datatypes () ((SE BROKEN ON OFF)))
(declare-const s SE)
(declare-const a Int)
(assert (or (= s ON) (= s OFF) (= s BROKEN)))
(assert (and (> a 0) (> a 1)))
(apply ctx-solver-simplify)
The apply command applies the tactic ctx-solver-simplify to the set of assertions and displays the resulting set of goals. Note that this tactic is much more expensive than the simplify command.
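For completeness, roughly the same thing can be done from the Python API (my own sketch, not part of the original answer):
from z3 import *
# Declare the enumeration sort and its constants.
SE, (BROKEN, ON, OFF) = EnumSort('SE', ['BROKEN', 'ON', 'OFF'])
s = Const('s', SE)
a = Int('a')
# Put both assertions in a goal and apply the contextual simplifier.
g = Goal()
g.add(Or(s == ON, s == OFF, s == BROKEN))
g.add(And(a > 0, a > 1))
print(Tactic('ctx-solver-simplify')(g))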

Simplifying an expression leads to time out

How can I simplify the following expression using Z3 Solver?
(declare-const c0 Int)
(declare-const c1 Int)
(declare-const c2 Int)
(assert (let ((a!1 (to_real (+ (* (* 2 c0) c2)
(* (* 2 c0) c1)
(* 2 c1 c2)
(* c0 (- c0 1))
(* c1 (- c1 1))))))
(let ((a!2 (/ (to_real (* (* 2 c0) c2)) a!1)))
(and (or (and (<= c2 1) (>= c2 1) (<= c0 2) (>= c0 2) (<= c1 3) (>= c1 3))
(and (<= c2 1) (>= c2 1) (<= c0 3) (>= c0 3) (<= c1 2) (>= c1 2)))
(= (/ 2.0 15.0) a!2))))
)
(apply (then qe propagate-values (repeat (then ctx-solver-simplify propagate-ineqs) 10)))
Link : http://rise4fun.com/Z3/u7F7
I tried all the tactics that I know about and still end up with a timeout from the solver. Is there a way I can avoid the timeout? Is it supposed to return false as a result in the Java API?
It's hard to tell what's going on just by looking at that code, but I'd think that to_real might be the problematic part, as conversions between domains tend to generate non-linear constraints that can cause complexity problems.
I'd give it a try using Reals only (i.e., declare c0, c1, etc. as Reals and remove the calls to to_real).
If you do need to mix integers and reals, make sure the mixing happens at the leaves (i.e., at constants) or at the very top, pushing the conversions around as much as you can, rather than at intermediate values.
But I'd guess that sticking to Reals would be the way to go here if your problem space allows for that.
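A sketch of what that pure-Real reformulation might look like through the Python API (my own transcription of the question's constraints; whether it actually avoids the timeout depends on the instance and the Z3 version):
from z3 import *
c0, c1, c2 = Reals('c0 c1 c2')
denom = 2*c0*c2 + 2*c0*c1 + 2*c1*c2 + c0*(c0 - 1) + c1*(c1 - 1)
g = Goal()
g.add(Or(And(c2 == 1, c0 == 2, c1 == 3),
         And(c2 == 1, c0 == 3, c1 == 2)))
g.add(2*c0*c2 / denom == Q(2, 15))   # no to_real needed: everything is Real
t = Then('qe', 'propagate-values',
         Repeat(Then('ctx-solver-simplify', 'propagate-ineqs'), 10))
print(t(g))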
The example uses non-linear integer arithmetic. Unfortunately, it is easy to produce examples in this domain where Z3 does not terminate. The ctx-solver-simplify routine calls the SMT solver multiple times and in each invocation has to check satisfiability of some combination of the non-linear constraints.
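One pragmatic way to avoid an apparent hang (my own suggestion, not part of the answers above) is to wrap the expensive tactic in a time budget with TryFor, so the application fails after the budget instead of running indefinitely; for example, in z3py (with placeholder constraints standing in for the real ones):
from z3 import *
x, y = Ints('x y')
g = Goal()
g.add(x*x + y*y == 25, x > 0, y > 0)   # placeholder non-linear constraints
# TryFor fails (here: raises Z3Exception) if the inner tactic exceeds the budget.
t = TryFor(Then('ctx-solver-simplify', 'propagate-ineqs'), 5000)   # 5-second budget
try:
    print(t(g))
except Z3Exception:
    print("tactic did not finish within the budget")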
