I lost the source code of my Grails project. Is there any way to retrieve the source code from the WAR file? Maybe a decompiler? I'm not sure.
Please help.
JD-GUI (http://jd.benow.ca/) is the best decompiler I've used, and it's pretty good with classes compiled from Groovy. But what you get back is far from the original source, since it includes a lot of extra code that the Groovy compiler adds, as well as code added by AST transformations.
It will likely take less time to rewrite the app from scratch.
I want to evaluate the performance of Rascal for a given rewrite system that I've written. I'm wondering if there's a good way of doing it?
Ideally, I'd generate some compiled Java classes from the system and then run them manually against my inputs. Is there an easy or recommended way to do it?
Cheers,
One way to do this is to use the functions in the library util::Benchmark. Typically, you could write something like
cpuTime( (){ call_the_function_I_want_to_observe(); } ). This will execute your function and give you the CPU time it used.
Note that Rascal can be executed in two ways: interpreted and compiled, which makes a big difference when measuring performance. We are working hard at the moment to fully integrate the compiler in the Eclipse IDE, but a stand-alone version is available as well. It can be invoked as java -Xss8m -jar rascal-0.8.4-SNAPSHOT.jar --compiledREPL, followed by at least values for the source (--src) and binary (--bin) directories. Here rascal-0.8.4-SNAPSHOT.jar (most likely named differently by now) is downloaded from https://update.rascal-mpl.org/console/rascal-shell-unstable.jar.
If you need more information, don't hesitate to ask for more details: this part of our tool chain is unfortunately still undocumented.
I'm thinking it might be easiest if I modify the Java syntax used in Rascal to better fit our Java-like language.
Is there a way I can build Rascal from the source? I've cloned the repo from Github and imported it as a project into Eclipse but there are some compilation errors regarding org.eclipse.imp. Before I head down the rabbit-hole of trying to get this all to work in Eclipse I thought I would post here to see if there is an easy way to handle this.
Thanks!
Sure, you can build Rascal from source by following the developer instructions at https://github.com/cwi-swat/rascal/wiki/Rascal-Developers-Setup---Step-by-Step.
On the other hand, if you wish to simply adapt the Java syntax definition, it would be better to clone it into your own files. Grammars may look modular, but in reality there are complex interactions between different parts of the grammar. Better to clone and manage the whole thing as your own than to depend on two co-evolving definitions.
If you clone the Java grammar, Rascal will generate new parsers for you on the fly. If this generation becomes cumbersome, a "cached parser" can help you optimize the deployment of your tools. Please contact us if you need help with that.
I have a few APIs of my own with around 2000 classes overall. Some of them use the new Path API from JDK 7. Most other classes, however, do not rely on any new JDK APIs or new language features, so most classes could be used in a JDK 6 environment (which is what I plan to do). Let's assume I've annotated all JDK7-only classes with @Java7Only.
What I need now is a way to create a JDK6-only subset of all my projects more or less automatically, without introducing new version branches or product lines (which would be too complicated to maintain).
All projects are created using NetBeans and thus build with Ant. Many projects depend on others.
Please help me evaluate which of the following ideas is most appropriate for my problem. Which problems could occur with each idea?
Common first step for all ideas
Let an annotation processor search for @Java7Only-annotated classes and store the list in a properties file.
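A minimal sketch of such a processor (the annotation name Java7Only and the output file name java7only.properties are assumptions for illustration, not part of the original setup) could look roughly like this:

import java.io.Writer;
import java.util.Properties;
import java.util.Set;
import javax.annotation.processing.AbstractProcessor;
import javax.annotation.processing.RoundEnvironment;
import javax.annotation.processing.SupportedAnnotationTypes;
import javax.annotation.processing.SupportedSourceVersion;
import javax.lang.model.SourceVersion;
import javax.lang.model.element.Element;
import javax.lang.model.element.TypeElement;
import javax.tools.Diagnostic;
import javax.tools.StandardLocation;

// Collects every @Java7Only-annotated class and writes the list to a
// properties file in the generated-sources output.
@SupportedAnnotationTypes("Java7Only")
@SupportedSourceVersion(SourceVersion.RELEASE_7)
public class Java7OnlyProcessor extends AbstractProcessor {
    private final Properties found = new Properties();

    @Override
    public boolean process(Set<? extends TypeElement> annotations, RoundEnvironment roundEnv) {
        for (TypeElement annotation : annotations) {
            for (Element e : roundEnv.getElementsAnnotatedWith(annotation)) {
                found.setProperty(e.toString(), "java7-only");
            }
        }
        if (roundEnv.processingOver()) {
            try {
                Writer w = processingEnv.getFiler()
                        .createResource(StandardLocation.SOURCE_OUTPUT, "", "java7only.properties")
                        .openWriter();
                found.store(w, "classes requiring JDK 7");
                w.close();
            } catch (Exception ex) {
                processingEnv.getMessager().printMessage(Diagnostic.Kind.ERROR, "could not write list: " + ex);
            }
        }
        return false;
    }
}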
Idea 1 (specific)
Write a tool which uses the properties file to recursively copy the whole project, excluding the JDK7-only files.
Build the copied project with JDK 6 by invoking Ant, thus getting a JDK6-compliant jar.
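The copying tool for Idea 1 could be quite small; here is a hypothetical sketch (the file layout, property-file format, and class names are my assumptions):

import java.io.File;
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.util.Properties;

// Copies a project tree, skipping source files whose class names appear in
// the properties file written by the annotation processor.
public class Jdk6ProjectCopier {

    public static void main(String[] args) throws IOException {
        Properties java7Only = new Properties();
        java7Only.load(new FileInputStream(args[0])); // e.g. java7only.properties
        copy(new File(args[1]), new File(args[2]), java7Only);
        // Afterwards, run "ant" in the copied project with JDK 6 to build the jar.
    }

    static void copy(File from, File to, Properties java7Only) throws IOException {
        if (from.isDirectory()) {
            to.mkdirs();
            for (File child : from.listFiles()) {
                copy(child, new File(to, child.getName()), java7Only);
            }
        } else if (!isJava7Only(from, java7Only)) {
            // Plain stream copy; java.nio.file.Files.copy would itself be JDK7-only.
            InputStream in = new FileInputStream(from);
            OutputStream out = new FileOutputStream(to);
            byte[] buf = new byte[8192];
            for (int n = in.read(buf); n > 0; n = in.read(buf)) {
                out.write(buf, 0, n);
            }
            in.close();
            out.close();
        }
    }

    static boolean isJava7Only(File f, Properties java7Only) {
        // Very rough mapping of "src/a/b/C.java" to class name "a.b.C".
        String path = f.getPath().replace(File.separatorChar, '.');
        for (String cls : java7Only.stringPropertyNames()) {
            if (path.endsWith(cls + ".java")) {
                return true;
            }
        }
        return false;
    }
}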
Idea 2 (specific)
Write a second annotation processor which uses the properties file to pass everything except the JDK7-only files to a JavaCompiler instance.
Either build a jar using the Java APIs or use the Ant API for that.
(This would be a pure-Java idea, but it is probably too complicated.)
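For what it's worth, a rough illustration of driving javax.tools.JavaCompiler directly (the file names, output directory, and options here are placeholders; a real version would derive the source list from the properties file):

import java.util.Arrays;
import javax.tools.JavaCompiler;
import javax.tools.JavaFileObject;
import javax.tools.StandardJavaFileManager;
import javax.tools.ToolProvider;

// Compiles a filtered set of sources at the JDK 6 language level.
public class Jdk6SubsetCompiler {
    public static void main(String[] args) throws Exception {
        JavaCompiler compiler = ToolProvider.getSystemJavaCompiler();
        StandardJavaFileManager fm = compiler.getStandardFileManager(null, null, null);
        Iterable<? extends JavaFileObject> units =
                fm.getJavaFileObjectsFromStrings(Arrays.asList("src/Foo.java", "src/Bar.java"));
        // Note: -source/-target alone do not block JDK 7 APIs; the JDK7-only
        // files must already have been excluded from 'units'.
        JavaCompiler.CompilationTask task = compiler.getTask(
                null, fm, null,
                Arrays.asList("-source", "1.6", "-target", "1.6", "-d", "classes6"),
                null, units);
        boolean ok = task.call();
        fm.close();
        System.out.println(ok ? "compiled" : "failed");
    }
}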
Idea X (abstract)
Somehow influence the Ant build process (by overriding some targets?) so that, for each JDK6-compliant class, Ant compiles two versions of it (once with the JDK 6 compiler, once with the JDK 7 compiler).
(JDK7-only classes would, of course, be compiled only once, using the JDK 7 compiler.)
Package each set into a separate jar.
Possible problems common to all ideas
Some projects depend on others, so some actions (such as packaging) need to take this into account.
Remember: the JDK 7 compiler generates class files that a JDK 6 runtime cannot load, which is why every idea has to work at the source level (before or during the build process, not afterwards).
My thoughts on Idea 2:
Essentially this is invoking a compiler within a compiler, since annotation processors run as part of compilation. Can this be done safely? Is there any static state in Sun's javac that would cause problems? (I don't know the answer, but from memory there might be some static state that could cause problems in this scenario.)
Idea 1 seems simpler and better to me.
But taking a step back, is it possible to separate out all the JDK 7 specific stuff into a separate module and compile it separately, into a different JAR?
Have the 'main' project compiled using JDK 6 (which JDK 7 would have no problem reading, because it is backwards compatible).
The JDK7-specific module(s), with source in a different directory and the 'main' JAR on the compilation classpath, could be built separately, with a different build.xml if necessary.
This only partially applies, but I thought I'd mention it anyway.
The problem with just using the -source 1.6 -target 1.6 options for validation is that you can still use the Java 7 API when compiling with JDK 7.
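To illustrate (hypothetical class name), the following compiles without error on JDK 7 even with -source 1.6 -target 1.6, yet fails on a Java 6 JVM because java.nio.file does not exist there:

// javac -source 1.6 -target 1.6 UsesJava7Api.java   (on JDK 7: compiles fine)
// java UsesJava7Api                                  (on JDK 6: NoClassDefFoundError)
import java.nio.file.Paths;

public class UsesJava7Api {
    public static void main(String[] args) {
        System.out.println(Paths.get("build.xml").toAbsolutePath());
    }
}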
I've used the Animal Sniffer Maven Plugin for a few projects now and it has proved quite useful. This plugin scans the byte code of your classes for JDK API usage. That is, you can tell it to fail the build if you attempt to use the JDK 7 API when you are targeting JDK 6. This won't help much for separating out classes as you need, but it could be useful as a final validation step combined with the -source 1.6 -target 1.6 compiler options.
There is also an Animal Sniffer Ant plugin, as mentioned on the Animal Sniffer main page.
The team I work for manages a large collection of technical documentation which is written in LaTeX.
Currently all the documentation we have is manually built by the editors and then checked into a version control system. Sometimes people forget to compile their documents, so we have a situation where the PDF and .tex files are often out of step. Unfortunately, when this happens, our users find themselves reading old versions of our documents.
I've managed to hack together a simple script to build the PDFs using Make, but it's rather clumsy.
I was wondering if there was a better way to do it? Most people in our department use Eclipse + PyDev for a Python project, which means we are all very familiar with this IDE. I know that Ant plays nicely with Eclipse, so might we be able to use this tool for our doc building?
So what's the best way of doing this? I hope I will not have to learn everything there is to know about a new build system in order to automate the building of some quite simple docs.
There is an external Ant task for LaTeX PDF generation, though the site is in German.
To use it, download the jar to a location on your machine, then define a taskdef as follows:
<taskdef name="latex"
         classname="de.dokutransdata.antlatex.LaTeX"
         classpath="/path/to/ant/lib/ant_latex.jar"/>
Then to use it, define a target like this:
<target name="doLaTeX">
<latex
latexfile="${ltx2.file}"
verbose="on"
clean="on"
pdftex="off"
workingDir="${basedir}"
/>
</target>
Where ltx2.file is the file to process.
This is a link to the howto page listing the parameters. If you need any more options, my German is just about passable enough to explain, maybe.
There is also a Maven plugin for LaTeX, but I can't find any documentation.
Haven't tried it, but I remember seeing a blog post about it.
If you know Python, this blog post might be interesting.
EDIT: Also, I would assume that you're using some kind of version control system. I can't say for sure, but I use git to manage all my LaTeX docs, and it might be possible to use some kind of post-commit hook to execute a script that rebuilds the document. This would depend on how your repository is structured... just thinking out loud, so to speak.
I went into great detail on a large number of build systems for LaTeX in this question, but it's slightly different in your case. I think you want rubber or latexmk. The latex-makefile seems a good idea, but it only supports building via PostScript, which might not match your build process.
In general, it's a good idea to keep generated files outside of version control for just this reason. A good exception is when specialist build tools are not widely available, and your situation sounds similar. You might do better with a commit hook to build automatically upon commit.
I guess I should also point out that committing something without first building it and checking it is a deadly sin, so a better solution might be to stamp that out.
Maven is a better alternative to Ant as a build system, so I would recommend a Maven plugin to generate PDFs from the LaTeX sources. Have a look at mathan-latex-maven-plugin.
Does anyone know of an equivalent to FxCop/StyleCop for Delphi? I would really like to get the automatic checking of style, etc. into Continuous Integration.
There's Pascal Analyzer from Peganza: http://www.peganza.com/products_pal.htm
I don't know how the features compare to FxCop, since I haven't really used either one.
The closest I've seen is CodeHealer from SOCK software. We use it, and we have integrated it into our FinalBuilder build. It differs from FxCop in one important way: it analyzes the source code rather than the produced executable. It also doesn't check quite as much as FxCop does. But I think it is the best thing available in this category for Delphi.
Delphi 2009 support isn't there just yet, but they say they're working on it.
Delphi Code Analyzer is another one that is open source.
The DGrok project started out as something like FxCop some years ago. The parser and analysis parts are still available; read more at "DGrok 0.8.1: multithreading, default options, GPL". The parser is a .NET project, but:
DGrok is a set of tools for parsing Delphi source code and telling you stuff about it. Read more about it on the DGrok project page.
There is a new Delphi plugin for Sonar, which uses a Delphi grammar to run automatic tests over the source code.
I've heard of something called Delforex but haven't used it myself (yet).
Delforex is great for actually formatting the code, but it does not do much more than that. (We have used it and still do.)
I would second the votes for either Pascal Analyzer or Code Healer.
Vaccano, doesn't Delphi output .NET-compatible IL code? I haven't used it in an age, but I thought newer versions output .NET assemblies.
If so, then I would have thought FxCop would work, and you could always add some of your own custom rules to it. StyleCop would not work, but you could at least get FxCop running.