How can I generate a JavaScript test coverage report using JsTestDriver?
I am able to run JavaScript tests using JsTestDriver. I want to generate a coverage report to check what percentage of my JavaScript files I have tested.
I have read through this website http://code.google.com/p/js-test-driver/wiki/CodeCoverage and followed the instructions provided. However, I still cannot get a coverage report.
My folder structure right now, with the relevant files I have imported for the coverage report, is:
-trunk
  -app
  -test
    -lib
      -jstestdriver
        -javascript (includes coverage.js, CoverageTest.js, Instrumentable.js, plugin.js)
        -plugins (includes coverage.jar)
    -unit
      -controllerSpecs.js
    -config
      -coverage.conf
  -scripts
    -test-server.sh
    -test.sh
    -web-server.js
In my coverage.conf, I have the following content:
server: http://localhost:9876
load:
- test/lib/jstestdriver/javascript/coverage.js
- test/lib/jstestdriver/javascript/CoverageTest.js
- test/unit/*.js
plugin:
- name: "coverage"
jar: "test/lib/jstestdriver/plugins/coverage.jar"
module: "com.google.jstestdriver.coverage.CoverageModule"
exclude:
Thank you in advance for your help=)
It didn't work as expected for me either, but now it does. I'm not sure why, but I think all I changed was the whitespace in the config file:
plugin:
  - name: "coverage"
    jar: "coverage-1.3.4.b.jar"
    module: "com.google.jstestdriver.coverage.CoverageModule"
Note: the .dat file which was generated was empty the first time I ran this.
(I have the CodeCoverage jar in the same directory as the .conf file)
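For reference, here is roughly what a full coverage.conf looks like with working indentation, reusing the paths from the question (a sketch only; the point above is that the whitespace matters, and your jar path may differ):

server: http://localhost:9876

load:
  - test/lib/jstestdriver/javascript/coverage.js
  - test/lib/jstestdriver/javascript/CoverageTest.js
  - test/unit/*.js

plugin:
  - name: "coverage"
    jar: "test/lib/jstestdriver/plugins/coverage.jar"
    module: "com.google.jstestdriver.coverage.CoverageModule"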
The path to my project is /project/
My build file structure is src/main/package/subpackage/Class.java
My test file structure is src/test/package/subpackage/Test.java
I would like my compiled code to be in bin/main/package/subpackage/Class.class
Compiled test code in bin/test/package/subpackage/Test.class
My pom.xml has the following entry:
<build>
    <sourceDirectory>${project.basedir}/src/main</sourceDirectory>
    <testSourceDirectory>${project.basedir}/src/test</testSourceDirectory>
    <outputDirectory>${project.basedir}/bin/main</outputDirectory>
    <testOutputDirectory>${project.basedir}/bin/test</testOutputDirectory>
    <finalName>${project.artifactId}-${project.version}</finalName>
</build>
Running mvn clean install causes the following error:
Failed to execute goal org.apache.maven.plugins:maven-resources-plugin:2.6:resources (default-resources) on project Hello-Maven: Error loading property file '/project/': /project (Is a directory) -> [Help 1]
...
[Help 1] https://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
I've tried the link, but it suggests that it's an issue with the plugins.
However, commenting this block out and running mvn clean install again produces an almost empty jar file at /project/target/Hello-Maven-1.0-SNAPSHOT.jar, containing only the pom and the manifest. Additionally, there aren't any plugins, only dependencies: junit and javafx.
EDIT: I realize that plugins are specifically for running Maven, but finding information on why the error happens for "install" is difficult at best.
My goal is to translate a Java project (3 .java files) into .pas files with Embarcadero's Java2OP tool.
I'm using it via cmd, started in the java2op directory. After several unsuccessful tries and a lot of time spent googling, I am frustrated.
Commands in the documentation:
Input Options
-------------
-classes Space-separated list of names of classes or packages to export. -classes lets
you define a specific subset of the specified Java sources (-jar or -source options)
or the Android API.
-jar Space-separated list of input Java compiled libraries (.jar files).
-source Space-separated list of input folders containing Java source files (.java files).
Output Options
--------------
-unit File name of the output unit.
Default: Android.JNI.Interfaces
Examples
--------
Exporting some classes and packages from the Android API:
Java2OP.exe -classes android.net.ConnectivityManager android.location.*
Exporting all classes from mylib.jar:
Java2OP.exe -jar mylib.jar
Exporting a single class from mylib.jar:
Java2OP.exe -jar mylib.jar -classes com.mypackage.ClassName
Exporting all the classes from a folder of Java sources specifying the
file name of the output Delphi unit:
Java2OP.exe -source myproject/src -unit Android.JNI.UnitName
My commands:
java2op.exe -src A:\Tests -unit Android.JNI.TestFile
So it matches the guidelines:
Java2OP.exe -source myproject/src -unit Android.JNI.UnitName
If I try to change -src to the suggested -source, I get the following error:
Invalid option: -source
After executing the command with -src, one error occurs:
Command: javadoc -J-Xmx1024m -encoding "UTF-8" -sourcepath "A:\java2op\Java"
-subpackages "" -classpath "" -bootclasspath "A:\java2op\bootclasses.jar" -docletpath "A:\java2op\doclava.jar"
-doclet com.google.doclava.Doclava -nodocs -public -apixml "A:\java2op\temp.xml"
Output: javadoc: error - Illegal package name: "" 1 error
I read the following post about the same issue:
How can I extract a Delphi class from this JAVA file for use with Android?
but without success.
What am I missing or doing wrong?
When I run the jar on GCE, I get the following error:
java -jar mySimple.jar --project=myProjcet
Aug 13, 2015 1:22:26 AM com.google.cloud.dataflow.sdk.runners.DataflowPipelineRunner detectClassPathResourcesToStage
SEVERE: Unable to convert url (rsrc:./) to file.
Aug 13, 2015 1:22:26 AM simple.SimpleV1 main
SEVERE: Failed to construct instance from factory method com.google.cloud.dataflow.sdk.runners.BlockingDataflowPipelineRunner#fromOptions
I am working in Eclipse (Windows), and the pipeline runs successfully through Eclipse. I packaged the project as a runnable jar file, uploaded it to GCE (Ubuntu), and got these errors when running the jar file there.
The runner is BlockingDataflowPipelineRunner (batch mode). The other options are set in the source code.
The following is the manifest:
Manifest-Version: 1.0
Rsrc-Class-Path: ./ httpclient-4.3.6.jar httpcore-4.3.3.jar commons-lo
gging-1.1.3.jar commons-codec-1.6.jar mybatis-3.2.8.jar mysql-connect
or-java-5.1.34.jar ibatis2-common-2.1.7.597.jar ibatis2-dao-2.1.7.597
.jar ibatis2-sqlmap-2.1.7.597.jar geoip-api-1.2.14.jar google-api-cli
ent-java6-1.20.0.jar google-api-client-1.20.0.jar google-oauth-client
-1.20.0.jar guava-jdk5-13.0.jar google-oauth-client-java6-1.20.0.jar
google-oauth-client-jetty-1.20.0.jar jetty-6.1.26.jar jetty-util-6.1.
26.jar servlet-api-2.5-20081211.jar google-http-client-jackson2-1.20.
0.jar google-http-client-1.20.0.jar jsr305-1.3.9.jar joda-time-2.8.1.
jar slf4j-api-1.7.7.jar slf4j-jdk14-1.7.7.jar commons-csv-1.1.jar aws
-java-sdk-sqs-1.10.5.1.jar aws-java-sdk-core-1.10.5.1.jar google-clou
d-dataflow-java-sdk-all-0.4.150710.jar google-api-services-dataflow-v
1b3-rev4-1.19.1.jar google-cloud-dataflow-java-proto-library-all-0.4.
150612.jar protobuf-java-2.5.0.jar google-api-services-bigquery-v2-re
v187-1.19.1.jar google-api-services-compute-v1-rev46-1.19.1.jar googl
e-api-services-pubsub-v1beta2-rev1-1.19.1.jar google-api-services-sto
rage-v1-rev25-1.19.1.jar google-api-services-datastore-protobuf-v1bet
a2-rev1-2.1.2.jar google-http-client-protobuf-1.15.0-rc.jar google-ht
tp-client-jackson-1.15.0-rc.jar jackson-annotations-2.4.2.jar jackson
-databind-2.4.2.jar avro-1.7.7.jar jackson-core-asl-1.9.13.jar jackso
n-mapper-asl-1.9.13.jar paranamer-2.3.jar snappy-java-1.0.5.jar commo
ns-compress-1.9.jar jetty-server-9.2.10.v20150310.jar javax.servlet-a
pi-3.1.0.jar jetty-http-9.2.10.v20150310.jar jetty-io-9.2.10.v2015031
0.jar jetty-jmx-9.2.10.v20150310.jar jetty-util-9.2.10.v20150310.jar
jackson-core-2.6.0.jar
Class-Path: .
Rsrc-Main-Class: simple.SimpleV1
Main-Class: org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader
When exporting a Runnable JAR file using Eclipse, there are three ways to package your project:
Extract required libraries into generated JAR
Package required libraries into generated JAR
Copy required libraries into a sub-folder next to the generated JAR
All three options have the same usage pattern when executing, e.g.:
java -jar myrunnable.jar --myCommandLineOption1=...
Currently, only option 1 is compatible with how the Dataflow SDK for Java detects resources to stage, because it depends on them being file URIs obtained from a URLClassLoader.
For an explanation of how the runnable jars are created and more specific details of why this is problematic, read further below.
An alternative to using runnable jars is to execute your project using mvn exec.
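For example, from the project root you could launch the pipeline with the exec-maven-plugin rather than exporting a jar. This is only a sketch: the main class comes from the manifest in the question, while the project ID, staging bucket, and any other pipeline options are placeholders you should replace with your own:

mvn compile exec:java \
    -Dexec.mainClass=simple.SimpleV1 \
    -Dexec.args="--project=<your-project-id> --stagingLocation=gs://<your-bucket>/staging --runner=BlockingDataflowPipelineRunner"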
Option 1
This creates a jar which copies all the class files & resources from each individual jar into a single jar. This allows for a manifest where the entire classpath is composed of file based URIs:
Manifest-Version: 1.0
Main-Class: com.google.cloud.dataflow.starter.StarterPipeline
Class-Path: .
Option 2
This creates a jar file with additional jars embedded within it. It uses a custom main entry point (org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader) which knows how to read the custom manifest entries (Rsrc-Class-Path & Rsrc-Main-Class) and creates a classloader with non-file-based URIs. Since the Dataflow SDK for Java currently only knows how to handle file-based resources and doesn't know how to interpret the rsrc:... URIs, you get the exception that you're seeing.
Manifest-Version: 1.0
Rsrc-Class-Path: ./ httpclient-4.3.6.jar ...
Class-Path: .
Rsrc-Main-Class: simple.SimpleV1
Main-Class: org.eclipse.jdt.internal.jarinjarloader.JarRsrcLoader
Option 3
This creates a jar file which contains your project resources and then creates a folder alongside the runnable jar containing all your project's dependent jars. This allows for a more complex standard manifest listing all your project dependencies.
Manifest-Version: 1.0
Main-Class: com.google.cloud.dataflow.starter.StarterPipeline
Class-Path: . runnable_lib/google-cloud-dataflow-java-sdk-all-manual_build.jar ...
The Class-Path manifest entries are not returned as part of the URLClassLoader's URLs, and hence those classes are not discoverable. Furthermore, those jars are only meant to be loaded by classes from that jar, which can lead to a jar loading hierarchy. More details are available here: http://docs.oracle.com/javase/7/docs/technotes/tools/findingclasses.html
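To make the difference concrete, here is a rough sketch (not the actual SDK code) of the kind of classpath scan the Dataflow SDK performs: only URLs with the file: protocol can be turned into local files to stage, which is why the rsrc:./ entry produced by option 2 fails.

// Sketch only: illustrates why non-file URLs cannot be staged.
import java.io.File;
import java.net.URL;
import java.net.URLClassLoader;
import java.util.ArrayList;
import java.util.List;

public class ClasspathResourceSketch {
    public static List<String> filesToStage(ClassLoader loader) {
        List<String> files = new ArrayList<String>();
        // Only a URLClassLoader exposes its classpath as URLs.
        for (URL url : ((URLClassLoader) loader).getURLs()) {
            if ("file".equals(url.getProtocol())) {
                // file: URLs map directly to jars/directories that can be uploaded.
                files.add(new File(url.getPath()).getAbsolutePath());
            } else {
                // A rsrc:./ URL from the jar-in-jar loader lands here.
                throw new IllegalArgumentException(
                        "Unable to convert url (" + url + ") to file.");
            }
        }
        return files;
    }
}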
I am trying to get code coverage with the Jenkins and JaCoCo plugins.
I have a JaCoCo agent jar on the machine where my tests are executed. I then retrieve the dump and try to get the code coverage on Jenkins.
However, I keep getting the error below:
[JaCoCo plugin] Collecting JaCoCo coverage data...
[JaCoCo plugin] \**/coverage/jacoco.exec;\**/coverage/classes-cov;\**/application/; locations are configured
[JaCoCo plugin] Number of found exec files for pattern \**/coverage/jacoco.exec: 1
[JaCoCo plugin] Saving matched execfiles: /home/ec2-user/slave/workspace/Automation_Code_Coverage_POMS/coverage/jacoco.exec
[JaCoCo plugin] Saving matched class directories for class-pattern: \**/coverage/classes-cov: /home/ec2-user/slave/workspace/Automation_Code_Coverage_POMS/coverage/classes-cov
[JaCoCo plugin] Saving matched source directories for source-pattern: \**/application/:
[JaCoCo plugin] Loading inclusions files..
[JaCoCo plugin] inclusions: [\**/com/test/poms/\**]
[JaCoCo plugin] exclusions: [\**/poms/convertors/\**:\**/poms/scheduler/\**]
ERROR: Publisher 'Record JaCoCo coverage report' aborted due to exception:
java.io.IOException: Error while analyzing class /home/ec2-user/.jenkins/jobs/Automation_Code_Coverage_POMS/builds/43/jacoco/classes/com/test/poms/convertors/DtoToSroConverter.83f57acb46d004b5.class.
at org.jacoco.core.analysis.Analyzer.analyzerError(Analyzer.java:150)
at org.jacoco.core.analysis.Analyzer.analyzeClass(Analyzer.java:144)
at org.jacoco.core.analysis.Analyzer.analyzeAll(Analyzer.java:175)
at org.jacoco.core.analysis.Analyzer.analyzeAll(Analyzer.java:208)
at hudson.plugins.jacoco.ExecutionFileLoader.analyzeStructure(ExecutionFileLoader.java:126)
at hudson.plugins.jacoco.ExecutionFileLoader.loadBundleCoverage(ExecutionFileLoader.java:133)
at hudson.plugins.jacoco.JacocoReportDir.parse(JacocoReportDir.java:102)
at hudson.plugins.jacoco.JacocoBuildAction.loadRatios(JacocoBuildAction.java:291)
at hudson.plugins.jacoco.JacocoBuildAction.load(JacocoBuildAction.java:273)
at hudson.plugins.jacoco.JacocoPublisher.perform(JacocoPublisher.java:371)
at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:779)
at hudson.model.AbstractBuild$AbstractBuildExecution.performAllBuildSteps(AbstractBuild.java:726)
at hudson.model.Build$BuildExecution.post2(Build.java:185)
at hudson.model.AbstractBuild$AbstractBuildExecution.post(AbstractBuild.java:671)
at hudson.model.Run.execute(Run.java:1769)
at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
at hudson.model.ResourceController.execute(ResourceController.java:98)
at hudson.model.Executor.run(Executor.java:374)
Caused by: java.lang.IllegalStateException: Can't add different class with same name: com/test/poms/convertors/DtoToSroConverter
at org.jacoco.core.analysis.CoverageBuilder.visitCoverage(CoverageBuilder.java:106)
at org.jacoco.core.analysis.Analyzer$1.visitEnd(Analyzer.java:92)
at org.objectweb.asm.ClassVisitor.visitEnd(ClassVisitor.java:317)
at org.jacoco.core.internal.flow.ClassProbesAdapter.visitEnd(ClassProbesAdapter.java:98)
at org.objectweb.asm.ClassReader.accept(ClassReader.java:697)
at org.objectweb.asm.ClassReader.accept(ClassReader.java:506)
at org.jacoco.core.analysis.Analyzer.analyzeClass(Analyzer.java:107)
at org.jacoco.core.analysis.Analyzer.analyzeClass(Analyzer.java:142)
... 17 more
Notifying upstream projects of job completion
JaCoCo Can't add different class with same name: org/hamcrest/BaseDescription
The above link suggests excluding the files, but if you look at the logs above, they are already being excluded, yet I still see this issue.
In my case, I see this in the Jenkins logs while the Jenkins JaCoCo plugin performs the analysis:
12:35:12 [JaCoCo plugin] exclusions: [**/*koba*.class]
There are no backslashes, unlike in the output you are getting.
Secondly, the error that you are getting is due to one of two reasons:
You have .java/.groovy files and, after compilation, you create .class files. It seems like there's a class file (for example abc.java, or com/test/poms/convertors/DtoToSroConverter in your case) which is present in the source folder that you mentioned in the plugin's "source" field.
If you are creating any class files for which you don't have a .java/.groovy file in source (src/main/java, src/main/groovy, src/test/java, src/test/groovy, src/xxx/java or src/xxx/groovy folders), then the JaCoCo analysis will error out with the same error, i.e. it won't be able to find the respective source file (.java/.groovy) for the .class file it's analyzing.
Check how many files there are in your project whose names start with DtoToSroConverter.
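For example, on the build slave you could check with something like this (assuming a Unix-like machine, as in your workspace path; the class name is taken from your stack trace):

find . -name 'DtoToSroConverter*'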
Then, make sure the values that you set in the JaCoCo plugin in Jenkins look like the following. NOTE: the source path should NOT contain any test (unit or non-unit) source folders.
In my case, I'm saying: process all .exec files (anywhere in my project's workspace after the build/tests/jacoco process is complete), i.e. **/*.exec.
The Path to class directories field should mention only the MAIN source classes (not test classes of either unit or non-unit type). I only used "build/classes/main", as these classes are generated from my main source code (src/main/java or src/java). This value is the folder which contains your main source code classes only.
The Path to source directories field should always contain the folder where the actual main source code lives (rather than any test source code). I have used "src/java"; I could have used "src/main/java", which is the Gradle/Maven standard folder structure for main source code, but in my case the main source code is in src/java.
Check whether the Path to source directories and Path to class directories fields are set correctly. If they are, is there more than one file with the name DtoToSroConverter?
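If you drive the build from a Pipeline job instead of the freestyle UI, the same fields can be set with the JaCoCo plugin's jacoco step. This is only a sketch under the assumptions above (exec files anywhere in the workspace, main classes under build/classes/main, main sources under src/java); adjust the patterns to your own layout:

jacoco execPattern: '**/*.exec',
       classPattern: 'build/classes/main',
       sourcePattern: 'src/java',
       inclusionPattern: '**/com/test/poms/**',
       exclusionPattern: '**/poms/convertors/**,**/poms/scheduler/**'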
I am new to code coverage and the EMMA tool.
I am trying to:
1. Write a simple Java program, testClass1.java. I compile it and get a testClass1.class file.
2. Package this as a jar, myJar.jar.
3. Instrument this jar using the EMMA command below, which gave me a coverage.em file:
C:\Users\emahaboo\Desktop>java -cp emma-2.0.5312.jar emma instr -m overwrite -cp myJar.jar
EMMA: processing instrumentation path ...
EMMA: instrumentation path processed in 156 ms
EMMA: [1 class(es) instrumented, 0 resource(s) copied]
EMMA: metadata merged into [C:\Users\emahaboo\Desktop\coverage.em] {in 7 ms}
Now I want to execute this myJar.jar file.
I am not sure exactly what I should do here, because I get the error below:
C:\Users\emahaboo\Desktop>java -cp myJar.jar:emma-2.0.5312.jar myJar
Error: Could not find or load main class myJar
C:\Users\emahaboo\Desktop>java -cp myJar.jar:emma-2.0.5312.jar testClass1
Error: Could not find or load main class testClass1
Can someone help me proceed? I want to successfully run this program and get the EMMA code coverage report.
On Windows, items on the classpath need to be separated with semicolons. I would put the whole parameter in quotes:
-cp "myJar.jar;emma-2.0.5312.jar"
If you want to execute a jar file directly, there must be a main class as the entry point. The error "Error: Could not find or load main class..." just means that Java can't find the main class. I think the root cause is your second step ("I package this as a jar, myJar.jar"): when you packaged it, you needed to set testClass1 as the main class for the jar file. Hope this helps.
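Putting both answers together, a sketch of the full run on Windows might look like this (assuming testClass1 is in the default package and has a main method; the instrumented classes write coverage.ec to the current directory when the JVM exits):

REM Run the instrumented classes with EMMA's runtime on the classpath
java -cp "myJar.jar;emma-2.0.5312.jar" testClass1

REM Combine the metadata (coverage.em) and runtime data (coverage.ec) into an HTML report
java -cp emma-2.0.5312.jar emma report -r html -in coverage.em -in coverage.ec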