I followed the contribution guide for Apache Beam to set up my development environment. The project compiles fine; however, I always get the following failure when I run mvn verify:
[ERROR] Tests run: 2, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 60.315 s <<< FAILURE! - in org.apache.beam.runners.apex.examples.WordCountTest
[ERROR] testWordCountExample(org.apache.beam.runners.apex.examples.WordCountTest)  Time elapsed: 32.259 s  <<< FAILURE!
java.lang.AssertionError: result files exist
	at org.junit.Assert.fail(Assert.java:88)
When I run the test org.apache.beam.runners.apex.examples.WordCountTest via IntelliJ, all assertions pass. I have a feeling there is a race condition in the failing assertion. Am I missing something in setting up the environment?
We have smoke tests for our system running on Jenkins. In the log, I see:
[ERROR] Tests run: 170, Failures: 1, Errors: 0, Skipped: 17
Jenkins still reports the build as successful. What could be the problem?
I have found the problem. The Jenkins instance did not have enough memory to capture everything, so I reduced the debug logging to write less information. Now it runs without a problem.
(This question already has an answer here: Unable to use read('classpath:') when running tests with standalone karate.jar.)
Is there a way I can step through the test with a debugger, similar to the way I would debug a Java app run with Maven?
For example, I set a breakpoint in the implementation of a step and start a debug run with the following command line parameters:
mvn clean test -Dkarate.options="--tags ~#ignore" -Dtest=MainRunner -DforkCount=0
and I get the following errors:
[INFO] --- maven-surefire-plugin:2.22.2:test (default-test) @ karateSberApi ---
[WARNING] useSystemClassloader setting has no effect when not forking
[INFO] Running examples.MainRunner
before all
get credentialsMap by cmd
19:27:39.169 [main] INFO com.intuit.karate.RunnerOptions - found system property 'karate.options': --tags ~#ignore
after all
[ERROR] Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0.875 s <<< FAILURE! - in examples.MainRunner
[ERROR] testAll Time elapsed: 0.872 s <<< ERROR!
java.lang.NoClassDefFoundError: jdk/nashorn/api/scripting/ScriptObjectMirror
Caused by: java.lang.ClassNotFoundException: jdk.nashorn.api.scripting.ScriptObjectMirror
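(For context, a class like examples.MainRunner is normally just a standard Karate JUnit runner. Below is a minimal sketch in the Karate 0.9.x JUnit 4 style; apart from the package and class name taken from the log above, everything in it is an assumption, not the actual project code.)

package examples;

import com.intuit.karate.KarateOptions;
import com.intuit.karate.junit4.Karate;
import org.junit.runner.RunWith;

// Sketch of a typical Karate 0.9.x JUnit 4 runner; the tag filter can also be
// overridden at runtime via -Dkarate.options, as in the command line above.
@RunWith(Karate.class)
@KarateOptions(tags = "~@ignore")
public class MainRunner {
    // intentionally empty: Karate discovers the *.feature files
    // on the classpath next to this class
}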
To check Nashorn itself, I tried running the following, and it works:
import javax.script.*;

public class NashornCheck { // class name is just illustrative
    public static void main(String[] args) throws ScriptException {
        ScriptEngine engine = new ScriptEngineManager().getEngineByName("nashorn");
        engine.eval("print('Hello World!');");
    }
}
What's wrong?
env:
Intellij Idea 2019.3
Bundled (Maven 3) version 3.6.1
java version "1.8.0_201"
Java(TM) SE Runtime Environment (build 1.8.0_201-b09)
Java HotSpot(TM) 64-Bit Server VM (build 25.201-b09, mixed mode)
Thanks for your answers!
Just use the Visual Studio Code extension / Karate Runner: https://github.com/intuit/karate/wiki/IDE-Support#vs-code-karate-plugin
Video here: https://twitter.com/KarateDSL/status/1167533484560142336
Currently we are using IntelliJ, Scala, and SBT to kick off our tests in our local environment. From the SBT command line, we can specify specific tests, suites, and wildcards, as described here:
ScalaTest.org Page
Such as "test-only *RedSuite"
However, on our CI Jenkins server, specifying this through the SBT plugin gives an error.
In the Action field, the following value was used:
Action: compile test-only test.package.name
Using the following does work for ALL tests:
Action: compile test
[success] Total time: 240 s, completed Apr 28, 2014 12:29:36 PM
With the test-only action, however, the build fails with:
[error] Expected ID character
[error] Not a valid command: org (similar: export)
[error] Expected project ID
[error] Expected configuration
[error] Expected ':' (if selecting a configuration)
[error] Expected key
[error] Not a valid key: org (similar: fork, run, doc)
[error] org.company.scalatest.abc.regressionsuite
[error] ^
Build step 'Build using sbt' changed build result to FAILURE
Build step 'Build using sbt' marked build as failure
Recording test results
Does anyone know if there is a way we can pass these parameters through the Jenkins SBT plugin?
This is a quoting problem: your Action field is parsed as three commands:
compile
test-only
org.company.scalatest.abc.regressionsuite
And sbt chokes because org is not a valid command.
Using compile "test-only org.company.scalatest.abc.regressionsuite" (i.e. quoting the test-only invocation so it is passed as a single command) should fix that.
I'm running TestNG from Ant, using my own test listeners. I'm refactoring the code, and once in a while I get:
[testng] Total tests run: 7, Failures: 0, Skips: 7
[testng] Configuration Failures: 1, Skips: 2
What would be the best approach to fixing the configuration failures?
The HTML reports will tell you which configuration methods failed.
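For what it's worth, a configuration failure means one of the configuration methods (@BeforeSuite/@BeforeClass/@BeforeMethod, their @After counterparts, etc.) threw an exception, and TestNG then skips the tests that depend on it, which is why the skip count goes up at the same time. A minimal sketch (class and method names are made up):

import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;

public class ConfigFailureExample {

    // If a configuration method throws, TestNG counts it under
    // "Configuration Failures" and skips the tests that rely on it.
    @BeforeClass
    public void setUp() {
        throw new IllegalStateException("simulated setup failure");
    }

    @Test
    public void someTest() {
        // never executed; shows up under "Skips" in the summary
    }
}

So the usual approach is to open the report, see which configuration method failed and with what exception, and then fix or guard that setup code.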
We're currently using JUnit 4.4 and Ant 1.7.1 for our builds. I have some test cases that are annotated with @Ignore. When I run them in Eclipse, the JUnit test runner reports them as ignored. I would like to see them listed in the XML output from Ant (so I can report on them), but they do not seem to be there.
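For reference, the tests in question are ordinary JUnit 4 tests annotated with @Ignore, along these lines (class and method names here are just illustrative):

import org.junit.Ignore;
import org.junit.Test;

public class IgnoredTest {

    // Eclipse reports this method as ignored, but with Ant 1.7.1 it does
    // not show up in the <junit> task's XML output at all.
    @Ignore("pending rework")
    @Test
    public void skippedCase() {
        // intentionally never executed
    }
}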
Does anyone have this working? Is there a switch to turn them on? An upgrade I need to do?
It looks like this is a known Ant issue/bug.
This thread talks about the same issue, but provides some additional information: you can get data on ignored tests when running the tests with Maven Surefire, and Hudson is able to display that data.
http://jenkins.361315.n4.nabble.com/Is-it-possible-to-show-Ignore-JUnit-tests-td1565288.html
A fix for this issue has now been applied to the head of Ant core, scheduled for release as part of the upcoming version of Ant 1.9.0.
It should be possible to try this fix locally by replacing ant-junit.jar in your Ant distribution's lib directory with the version from the nightly builds, by running the full nightly Ant distribution, or by building the Ant sources directly. Since the Ant team are currently voting on preparing a new release, it may just be worth waiting for 1.9.0 to be officially packaged and pushed out for download.
Just tried Ant 1.9.0 with JUnit 4.11. If you use <junit printsummary="on"> you'll get output like:
[junit] Running com.example.IgnoredTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.01
[junit] Running com.example.PassingTest
[junit] Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01
[junit] Running com.example.FailingTest
[junit] Tests run: 1, Failures: 1, Errors: 0, Skipped: 0, Time elapsed: 0.01
I think it'd be preferable if we could get output like this with printsummary=off:
[junit] Test com.example.IgnoredTest SKIPPED
[junit] Test com.example.FailingTest FAILED
but it seems the more verbose output above is the best we can do, unless I'm missing some obscure trick with one of the junit task arguments.