Boogie on Visual Studio - z3

My question is about Boogie, but since no Boogie tag was available, I used the dafny tag as one that is closely related to Boogie.
I have built Boogie in Visual Studio following the instructions in the documentation. What should I do next to write Boogie code, or equivalently, how can I run the .bpl files in the Test folder? From what I understand, Boogie, although an intermediate verification language, can also be used on its own.
Thank you.

There is no interactive VS mode for Boogie. If you want to write Boogie code and get design-time feedback in the editor, the best option available is the Emacs mode; see https://github.com/boogie-org/boogie-friends. Still, this Emacs mode does not give you error traces or verification-debugger information.

Boogie depends on Z3, so make sure that's available.
Then, from the command line:
boogie.exe /help
boogie.exe file_to_verify.bpl
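To try it out, here is a minimal .bpl file (the file name abs.bpl is just an example) that should verify:

// abs.bpl -- a tiny Boogie program: Abs computes an absolute value,
// and the ensures clauses are what the verifier checks
procedure Abs(x: int) returns (r: int)
  ensures r >= 0;
  ensures r == x || r == -x;
{
  if (x < 0) {
    r := -x;
  } else {
    r := x;
  }
}

Running boogie.exe abs.bpl should then report something like "Boogie program verifier finished with 1 verified, 0 errors"; if you break a postcondition, you get an error trace on the command line instead.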

Related

Built-in code analysers vs NuGet packages

Having just switched to VS2019 I’m exploring whether to use code analysis. In the project properties, “code analysis” tab, there are numerous built-in Microsoft rule sets, and I can see the editor squiggles when my code violates one of these rules. I can customise these rule sets and “save as” to create my own.
I have also seen code analyser NuGet packages such as “Roslynator” and “StyleCop.Analyzers”. What’s the difference between these and the built-in MS rules? Is it really just down to more comprehensive sets of rules/more choice?
If I wanted to stick with the built-in MS rules, are there any limitations? E.g. will they still get run and be reported on during a TFS/Azure DevOps build?
What's the difference between legacy FxCop and FxCop analyzers?
Legacy FxCop runs post-build analysis on a compiled assembly. It runs as a separate executable called FxCopCmd.exe. FxCopCmd.exe loads the compiled assembly, runs code analysis, and then reports the results (or diagnostics).
FxCop analyzers are based on the .NET Compiler Platform ("Roslyn"). You install them as a NuGet package that's referenced by the project or solution. FxCop analyzers run source-code based analysis during compiler execution. FxCop analyzers are hosted within the compiler process, either csc.exe or vbc.exe, and run analysis when the project is built. Analyzer results are reported along with compiler results.
Note
You can also install FxCop analyzers as a Visual Studio extension. In this case, the analyzers execute as you type in the code editor, but they don't execute at build time. If you want to run FxCop analyzers as part of continuous integration (CI), install them as a NuGet package instead.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-analyzers-faq?view=vs-2019
So, the built-in legacy FxCop and the NuGet analyzers only run at build time, while the extension-installed analyzers run live in the editor as you type (but not at build time). Also, you have to specifically turn on legacy code analysis for the build, whereas the NuGet analyzers run on build simply because they are installed. And analyzers installed as NuGet packages or extensions won't run when you use the menu option "Run Code Analysis".
At least, that's what I get out of that page.
There's a link near the bottom of that page that takes you to a list of which code analysis rules have moved over to the new analyzers, including rules that are now deprecated.
https://learn.microsoft.com/en-us/visualstudio/code-quality/fxcop-rule-port-status?view=vs-2019
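For reference, pulling the Roslyn-based FxCop analyzers into a project is just a NuGet package reference in the .csproj; a minimal sketch (the version number is only an example, use whatever is current) looks like this:

<!-- .csproj fragment: Roslyn-based FxCop analyzers as a NuGet package.
     PrivateAssets="all" keeps the analyzer from flowing to projects that reference this one. -->
<ItemGroup>
  <PackageReference Include="Microsoft.CodeAnalysis.FxCopAnalyzers" Version="2.9.8" PrivateAssets="all" />
</ItemGroup>

Because the analyzers then run as part of compilation, the same warnings show up in a TFS/Azure DevOps build as in the IDE.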
The different analyzers attempt to cover different coding styles and things Microsoft didn't cover when they built FxCop. With the little research I just did on this, there's a whole rabbit hole to follow, Alice, that would take more time than I have right now to devote to it. And it seems to be filled with lots of arcane knowledge and OCD style code nitpicks that make Wonderland seem normal. But that's just my opinion.
There's lots of personal and professional opinion about various rules in these and in the basic Microsoft rules, so there's plenty of room to use what you want and disable what you don't. For a beginner, I'd suggest turning on only a few rules at a time, e.g. via a custom rule set like the sketch below. That way you aren't inundated with more warnings and errors than lines of code you might have. Ok, so that might be a bit of an exaggeration, but there are so many rules that really are nitpicks, especially on legacy code, that they aren't worth enabling, since you likely won't have time to fix them all. You will also want to do basic research and use "common sense" when you decide what to enable. ("Do I really need to worry about variable-capitalization style consistency in an app that's been ported across 4 different languages over 15+ years and has 10k files?") This is both personal and professional opinion here, so follow it or not.
And don't forget the rules that contradict each other. Those are fun to deal with.......
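To make the "only a few rules at a time" idea concrete, here is a sketch of a pared-down custom rule set; the rule IDs are real FxCop-analyzer rules but were picked purely as examples:

<?xml version="1.0" encoding="utf-8"?>
<!-- MyRules.ruleset (a sketch): start with a handful of rules and add more over time.
     The rule IDs are examples; the AnalyzerId/RuleNamespace must match the analyzer
     assembly that actually ships each rule. -->
<RuleSet Name="My minimal rules" Description="Start small" ToolsVersion="16.0">
  <Rules AnalyzerId="Microsoft.CodeQuality.Analyzers" RuleNamespace="Microsoft.CodeQuality.Analyzers">
    <Rule Id="CA1062" Action="Warning" /> <!-- validate arguments of public methods -->
    <Rule Id="CA1707" Action="None" />    <!-- identifiers should not contain underscores: off for legacy code -->
  </Rules>
</RuleSet>

Point the project's "code analysis" tab (or the CodeAnalysisRuleSet project property) at a file like this and tighten it gradually.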

Detailed Explanation on "Re-Hosting" and "Retargeting" both for compilers and binary data (such as .exe or .obj)

Sometimes in software companies, customers provide data in multiple formats. There are linkable and executable files that are said to be "rehosted" and compiled object files that are said to be "retargeted". I am trying to understand what rehosting and retargeting mean in this area. Is it similar to bootstrapping in compiler theory? My understanding of the bootstrapping process is as follows (correct me if I'm wrong):
PROBLEM:
I need to write a compiler for a new language called "MyLang" to run on PowerPC
Solution:
1. I need to write a compiler for a language "MyLang-Mini", a subset of "MyLang", to run on PowerPC.
2. I need to write a compiler for "MyLang", written in "MyLang-Mini", to run on PowerPC.
3. I run the compiler obtained in step 2 through the compiler obtained in step 1 to obtain the compiler for MyLang that runs on PowerPC.
IN BESPOKE "T" DIAGRAM (...ISH):

  MyLang --> PowerPC       MyLangMini --> PowerPC        MyLang --> PowerPC
   (in MyLangMini)     +    (in PowerPC(instr.))    =    (in PowerPC(instr.))
What I am getting confused about is rehosting and retargeting. How are they connected to this concept? What am I rehosting and retargeting if I have some binary data such as .exe or .obj files? I would appreciate a detailed explanation if possible, please!
I know that this will touch on cross-compilers, but I would prefer expert opinions to be sure.
Thanks in advance.
I now know that in s/w engineering:
REHOSTING - If you have a third-party linkable/executable application that needs to be used on your host machine, you do rehosting. The target in this case is most often the same (OS platform, processor, etc.). In the worst case, virtualisation is required. The rehosted application will run as if it were one of the applications running on the host machine.
RETARGETING - If you have third-party source code, you might need to recompile it to match your target environment. It may also be that you have third-party compiled .o or .obj modules and you want to link them with your (retargeted) source code in order to host the result on the host machine. Just like a rehosted application, it will behave as if the application were installed on the host machine.
It would be good to know how this relates to compiler rehosting and retargeting. Sorry, I am a newbie in this area and will appreciate even a slap on the wrist.
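As a concrete illustration of the retargeting case (purely a sketch; the toolchain and file names are hypothetical, assuming a GCC cross-compiler such as powerpc-linux-gnu-gcc is installed), recompiling your own source for the new target and then linking it with a vendor-supplied object file would look like:

# recompile our own source for the PowerPC target with a cross-compiler
powerpc-linux-gnu-gcc -c mycode.c -o mycode.o
# link the retargeted object file with the third-party .o the customer supplied
powerpc-linux-gnu-gcc mycode.o thirdparty.o -o app.elf

Rehosting, by contrast, leaves the binary as it is and changes the environment it runs in (same target, possibly under virtualisation).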

z3 MaxSAT example error

I'm interested in playing around with the MaxSAT/MaxSMT C example (specifically, maxsat.c) provided on the z3 (Microsoft Research) website. Using Visual Studio 2010, I eventually got the example to compile (using a fresh install of z3 4.0). However, I can't get any of my (SMT 2.0) models to run with it. Further, I cannot get the example posted in this question to work either.
In the first case, my compiled program crashes when it tries to call Z3_get_smtlib_num_formulas in get_hard_constraints of the file. I don't know why; I just get the standard Windows 7 "this program has stopped working" popup.
In the second case, it reports unsupported ;benchmark.
To help me get this working, I was wondering:
(a) Has anyone had similar issues when compiling this code, and if so, how did you resolve them?
or
(b) How can I debug either compilation of the file (assuming it is correct)? Namely, can someone enumerate the correct libraries (and library versions - e.g., z3 4.0?) to include in the compiler options to get this example working?
In either case, information on the error reported in the second case would also be appreciated: what does it mean exactly? Is the keyword not valid? Is the SMT input the wrong version? Or something else?
Thanks.
The MaxSAT example has not been updated to SMTLIB 2.0 yet. It uses the function Z3_parse_smtlib_file to parse the input, which means that it supports only SMTLIB 1.0 at the moment.
This example is distributed alongside Z3, i.e., you should have received a copy in Z3-4.0/examples/maxsat/, which also contains compilation and execution scripts.
Compilation should be straightforward: run build.cmd in a Visual Studio Command Prompt, or build.sh on Linux.
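In practice this means the input files need the old (benchmark ...) syntax rather than SMT-LIB 2.0 commands like set-logic/assert/check-sat. A minimal SMT-LIB 1.x style input (an illustrative sketch, not a file from the Z3 distribution) looks roughly like this:

; example.smt -- SMT-LIB 1.x benchmark form, as accepted by Z3_parse_smtlib_file
; (illustrative only)
(benchmark example
  :logic QF_LIA
  :extrafuns ((x Int) (y Int))
  :assumption (>= x 0)
  :formula (> (+ x y) 2)
)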

Can I generate a .RSM file myself for the Delphi Debugger to use?

I wish to debug executables for which I have no code, using the Delphi Debugger.
WinDbg and other debuggers are not an option in this case, as the given executables all call into my DLL, for which I obviously do have code. My ultimate goal is to see a stack trace right down into the functions of the running executable.
I do have symbol information for these executables, so I was hoping I could write my own .RSM files for this purpose. Will this work? Will the Delphi debugger pick up any .RSM file that it can find? And would that mean that other debug information should be left out?
Do note that there are lots of executables I need to debug, and for all of them I detect the symbols inside them myself, using a moderately advanced function-detection algorithm. So my main problem is how to write .RSM files. For this I need to know the structure of the .RSM file format. Is there documentation or example code available somewhere that shows me how to create such a file?
Any help is appreciated!
PS: You might be wondering why I'm doing all this: it's all related to Dxbx - an open-source Xbox1 emulator. See SourceForge for details. New members are welcome!
I found a page that says the format is similar to CodeView (www.openwatcom.org/index.php/Debugging_Format_Interoperability)
There is a link on that page to Microsoft's CodeView format specs.
I doubt this fully answers your question, but maybe it will get you a little further?

FxCop/StyleCop for Delphi?

Does anyone know of an equivalent to FxCop/StyleCop for Delphi? I would really like to get the automatic checking of style, etc. into Continuous Integration.
There's Pascal Analyzer from Peganza: http://www.peganza.com/products_pal.htm
I don't know how the features compare to FxCop, since I haven't really used either one.
The closest I've seen is CodeHealer from SOCK software. We use it, and we have integrated it into our FinalBuilder build. It differs from FxCop in one important way: It analyzes the source code, rather than the produced executable. It also doesn't check quite as much as FxCop does. But I think it is the best thing which is available in this category for Delphi.
Delphi 2009 support isn't there just yet, but they say they're working on it.
Delphi Code Analyzer is another one that is open source.
The DGrok project started with something like FxCop some years ago. The parser and analysis parts are still available; read more at "DGrok 0.8.1: multithreading, default options, GPL". The parser is a .NET project, but:
DGrok is a set of tools for parsing Delphi source code and telling you stuff about it. Read more about it on the DGrok project page.
There is a new Delphi plugin for Sonar, which uses a Delphi grammar to run automatic tests over the source code.
I've heard of something called Delforex but haven't used it myself (yet).
Delforex is great for actually formatting the code. It does not do much more than that though. (we have/do use it).
I would second the votes for either Pascal Analyzer or Code Healer.
Doesn't Delphi output .NET-compatible IL code? I haven't used it in ages, but I thought newer versions output .NET assemblies.
If so, then I would have thought FxCop would work, and you could always add some of your own custom rules to it. StyleCop would not work, but you could at least get FxCop running.
