Can EclEmma be used with an obfuscated JAR file - code-coverage

Can the EclEmma Eclipse plugin be used for code coverage with a JAR file that has been obfuscated with ProGuard?
Thanks.

I don't think that this makes sense, because the source files would have to be obfuscated too to generate an appropriate report.
A report is always generated from the source files of the project, and these must match the class files exactly.
I don't think that you want to look at a coverage report of obfuscated code.

Related

How can I define a third Java source folder for Maven which gets compiled into a third JAR?

By default, the Maven standard directory layout has two Java source folders:
src/main/java
src/test/java
For my purposes, I need a third one, src/junit/java, which should be packaged into a JAR with the classifier junit.
If possible, the new source folder should have its own classpath (compile + everything with scope junit).
My guess is that for this, I will have to modify at least the resource and compile plugins.
Or is there an easier way?
I have a workaround, as explained here, but for that I have to put things like Mockito and JUnit on the compile classpath, which violates my sense of purity.
For all people who doubt the wisdom of my approach: I have support code that helps to write unit tests when you work with code from src/main/java. Since I'm using the same support code in the tests for the project itself, this code needs to be compiled after src/main/java and before src/test/java.
Specifically, my support code needs to import code from src/main/java and the tests need to be able to import the support code.
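For illustration, here is a rough sketch of the kind of setup I have in mind, using the build-helper-maven-plugin to register the extra folder and a second maven-jar-plugin execution to produce the junit-classified JAR. The include pattern is just a placeholder, and as far as I can tell this compiles src/junit/java together with src/main/java, so it would not get its own classpath:
<build>
  <plugins>
    <!-- Register src/junit/java as an additional source folder.
         Note: it is compiled together with src/main/java, so it does
         not get a classpath of its own. -->
    <plugin>
      <groupId>org.codehaus.mojo</groupId>
      <artifactId>build-helper-maven-plugin</artifactId>
      <executions>
        <execution>
          <id>add-junit-sources</id>
          <phase>generate-sources</phase>
          <goals>
            <goal>add-source</goal>
          </goals>
          <configuration>
            <sources>
              <source>src/junit/java</source>
            </sources>
          </configuration>
        </execution>
      </executions>
    </plugin>
    <!-- Package the compiled support classes into a second JAR with the
         "junit" classifier; the package pattern below is a placeholder. -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-jar-plugin</artifactId>
      <executions>
        <execution>
          <id>junit-jar</id>
          <phase>package</phase>
          <goals>
            <goal>jar</goal>
          </goals>
          <configuration>
            <classifier>junit</classifier>
            <includes>
              <include>com/example/testsupport/**</include>
            </includes>
          </configuration>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>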
I've seen a couple of Maven setups which bundle the test code in a Maven module of its own. You could then create a simple main-module <- support-module <- test-module dependency chain with that. But then main-module would compile fine if you build it on its own without test-module. Of course, you could aggregate them with a reactor POM and just build the project via this POM.
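As a rough sketch of what that aggregation could look like (the coordinates and module names are just examples matching the chain above):
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.example</groupId>
  <artifactId>reactor-pom</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>pom</packaging>
  <!-- Build order follows the dependency chain:
       main-module <- support-module <- test-module -->
  <modules>
    <module>main-module</module>
    <module>support-module</module>
    <module>test-module</module>
  </modules>
</project>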
Edit:
If you have problems with this setup regarding code coverage, you can use the JaCoCo Maven plugin to aggregate the test coverage generated by test-module into main-module. See this for further information: http://www.petrikainulainen.net/programming/maven/creating-code-coverage-reports-for-unit-and-integration-tests-with-the-jacoco-maven-plugin/
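A minimal sketch of that aggregation, assuming a JaCoCo version that provides the report-aggregate goal and a dedicated reporting module that declares main-module, support-module and test-module as dependencies:
<!-- In the aggregator/reporting module's pom.xml -->
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <executions>
    <execution>
      <id>aggregate-coverage</id>
      <phase>verify</phase>
      <goals>
        <goal>report-aggregate</goal>
      </goals>
    </execution>
  </executions>
</plugin>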

How to retrieve source code from .war on grails

I lost the source code of my Grails project. Is there any way to retrieve the source code from the WAR file? Maybe a decompiler, I'm not sure. Please help.
JD-GUI (http://jd.benow.ca/) is the best decompiler I've used, and it's pretty good with classes compiled from Groovy. But what you get is far from the original code, since it includes a lot of extra code that the Groovy compiler adds, as well as code added by AST transformations.
It will likely take less time to rewrite the app from scratch.

With Ant/JUnit, what good is specifying .java files instead of .class files?

I'm writing a junit task in my Ant script, and I see that the documentation says that you can specify both .java and .class files.
Why would you specify .java files?
The junit task doesn't compile the source files, so there's no benefit there. And you already have to specify the class files on your classpath. I thought maybe specifying Java source files would cause my stack trace to show line numbers, but experimentation shows that that's not happening.
So I'm stumped. Is there any difference at all? In fact, it seems less convenient.
Is there a benefit to specifying .java files?
If you have a list of sources that you need to compile, you would pass that to a javac task, which will compile them. In that case it's a (small) convenience to be able to pass the same list to the junit task without having to map all the .java filenames to .class filenames.
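For example, a rough sketch of an Ant target where the same source tree feeds both javac and the junit batchtest (directory names and the JUnit jar location are placeholders); the junit task derives the test class names from the .java paths itself:
<target name="test">
  <mkdir dir="build/classes"/>
  <mkdir dir="build/test-reports"/>
  <javac srcdir="src" destdir="build/classes"
         classpath="lib/junit.jar" includeantruntime="false"/>
  <junit printsummary="yes">
    <classpath>
      <pathelement location="build/classes"/>
      <pathelement location="lib/junit.jar"/>
    </classpath>
    <formatter type="plain"/>
    <batchtest todir="build/test-reports">
      <!-- .java resources are accepted here; the task maps each path
           to a test class name, so no .java -> .class mapping is
           needed by hand -->
      <fileset dir="src" includes="**/*Test.java"/>
    </batchtest>
  </junit>
</target>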

Test code coverage without source code?

What tools are out there that can perform code coverage analysis at the machine code level rather than the source code level? I'm looking for a possible solution for fuzz testing software to which I do not have source code access.
I think the IBM Rational test coverage tools instrument object code.
Assuming you had such a tool but no access to the source, what exactly would code coverage mean, other than 100%?
If you didn't have 100% coverage, you'd know you hadn't exercised something, but you would have no way of knowing what.
For compiled code (not Java), try Valgrind.
Old post... but my two cents.
If you have a bunch of jars and if you know what classes/methods you are using, you can instrument the jars with Emma and run your sample application against those jars.
In my case, I have jars which are actually proprietary components (used to generate HTML code) that our company uses to build its web pages. We have a sample application that utilizes these components and a bunch of tests that are run against the sample app. I wrote an Ant task to copy the Maven dependencies to a directory, instrument them, and run the tests against these instrumented jars. This task is invoked from the Maven POM and is hence part of the build process.
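A rough sketch of what such an instrumentation step can look like with EMMA's Ant tasks (the jar locations, directories and the emma.lib path id are placeholders):
<path id="emma.lib">
  <pathelement location="lib/emma.jar"/>
  <pathelement location="lib/emma_ant.jar"/>
</path>
<taskdef resource="emma_ant.properties" classpathref="emma.lib"/>
<target name="instrument">
  <!-- Instrument the previously copied jars in place ("overwrite" mode)
       so the sample application and its tests run against them. -->
  <emma enabled="true">
    <instr instrpath="build/jars-to-instrument"
           mode="overwrite"
           metadatafile="build/coverage/metadata.emma"
           merge="true"/>
  </emma>
</target>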
Also, as part of the build process, we process the emma coverage data to produce a report. This report shows the classes and methods in the jar for which we do not have the source code! Hope this helps.
If you know the entry points (public methods), you can test the coverage for those; I don't know of any tool for that, though.
Otherwise you would have to test coverage of the assembly code, and I don't know if that is possible.

Generate Javadoc for just interfaces using Ant?

I am using Apache Ant to generate Javadoc for my program, which has many projects (or modules). However, I want to generate Javadoc for interfaces only, and I do not know how to check whether a file is a class or an interface in Ant. Someone suggested that I use <fileset> and specify a list of files to exclude or include. However, there are hundreds of files in my program, and specifying a list of class files to exclude is impossible.
Does anyone have any ideas, please?
I don't believe this is possible unless you write your own custom Ant task (which wouldn't be that hard, actually) and reference that in your Ant script.
Another (much uglier) way would be to generate the complete Javadoc and remove the non-interface files. These could, for instance, be identified by looking at allclasses-frame.html:
<A href="javax/swing/text/ComponentView.html" title="class in javax.swing.text">ComponentView</A>
<A href="java/awt/Composite.html" title="interface in java.awt"><I>Composite</I></A>
where you have both the type (in the title=...) and file (href=...) available.
Have you considered writing your own doclet? Instead of trying to get Ant to do the work, create a doclet that knows how to discard every non-interface. Then, using the javadoc task in Ant is simple.
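On the Ant side, wiring such a doclet into the javadoc task might look roughly like this; InterfaceOnlyDoclet and doclets.jar are hypothetical names for whatever you implement:
<target name="javadoc-interfaces">
  <!-- destdir is omitted here because output handling is left to the
       custom doclet; the doclet class and its jar are placeholders. -->
  <javadoc doclet="com.example.doclet.InterfaceOnlyDoclet"
           docletpath="lib/doclets.jar">
    <fileset dir="src" includes="**/*.java"/>
  </javadoc>
</target>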
Hope that helps.
