I have E2E tests written in golang and java for my application.
I want to kick off all E2E tests together as part of one Jenkins job. Is it doable?
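Yes. As a minimal sketch of one way to do it, assuming a declarative Jenkinsfile and that the Go and Java suites live in separate directories (the names e2e-go and e2e-java and the Maven/Go commands below are placeholders for your actual layout), a single job can run both suites as parallel stages:

pipeline {
    agent any
    stages {
        stage('E2E tests') {
            parallel {
                stage('Go E2E') {
                    steps {
                        // placeholder directory for the Go end-to-end suite
                        dir('e2e-go') {
                            sh 'go test ./... -v'
                        }
                    }
                }
                stage('Java E2E') {
                    steps {
                        // placeholder directory for the Java end-to-end suite (Maven project assumed)
                        dir('e2e-java') {
                            sh 'mvn -B verify'
                        }
                    }
                }
            }
        }
    }
}

Both suites run inside the same job, so one job result covers them; drop the parallel block if the suites have to run sequentially.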
Related
I needed to set up a Jenkins pipeline and I wanted to know whether unit tests should run before the actual compilation of the project.
The Jenkins pipeline currently runs unit and integration tests before the bundling, and I wanted to know if that's OK.
I created a VueJs project with some unit tests (using Jest) and integration tests using Cypress.
I also have a Jenkins pipeline to build, test and deploy the application.
I integrated the unit tests as a test stage, but I would also like to integrate Cypress so the integration tests run in a dedicated pipeline step.
Is it possible to do this without installing any additional Cypress Jenkins plugin?
I mean, is it possible to use a Docker image to run the tests with Cypress?
Can you point me to some examples?
You should be able to use the Cypress Docker image with Jenkins' Docker capabilities. Also, here is another example you can refer to.
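As a rough sketch of that approach (the image tag and the test command are assumptions about your setup), a dedicated stage in the existing declarative pipeline can run inside the official cypress/included image, which bundles the Cypress binary and a browser, so no Cypress Jenkins plugin is required:

stage('Integration tests (Cypress)') {
    agent {
        docker {
            // cypress/included bundles Cypress and a browser; pick the tag matching your Cypress version
            image 'cypress/included:12.17.4'
            // clear the image's default entrypoint so Jenkins can run ordinary sh steps
            args '--entrypoint='
        }
    }
    steps {
        // assumes the usual headless run against the already-built application
        sh 'npx cypress run'
    }
}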
I am setting up the integration test framework for a Java REST API in our project, and we want to run the integration tests in the GitLab pipeline. Since these tests live in the same project as the API, we are wondering a couple of things:
We don't want to run the Karate tests during the Maven build process. We want to run them only at the integration test stage, after the application deployment stage is complete. How do we do that, given that the Maven build process runs both the JUnit unit tests and the Karate tests?
Since the API requires authentication, we need to run the Karate tests in a Docker container, because we can only inject our credentials into the container, as we are using HashiCorp Vault to store the credentials. How do we launch a container with a Karate client?
There are ways to run only a subset using Maven. What I do is define a different JUnit test, and call that from the command-line. Read the docs for more: https://github.com/karatelabs/karate#command-line
As long as you can pass environment variables (which you certainly can in Docker) you are good. Refer: https://stackoverflow.com/a/52821230/143475
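The question is about a GitLab pipeline, but the Maven invocation is the same from any runner. As a hedged sketch in Jenkins pipeline syntax (the runner class name, credential id, and karate.env value are all assumptions), the post-deployment stage boils down to selecting only the dedicated Karate runner with Surefire's -Dtest and passing the secrets as environment variables:

stage('Karate integration tests') {
    environment {
        // in the real setup these values come from HashiCorp Vault; the credential id is a placeholder
        API_CREDS = credentials('api-credentials')
    }
    steps {
        // KarateIntegrationTest is an assumed JUnit runner class that picks up only the Karate features;
        // -Dtest restricts Surefire to that class, so the ordinary unit tests are not re-run here
        sh 'mvn -B test -Dtest=KarateIntegrationTest -Dkarate.env=integration'
    }
}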
I'm trying to get a bunch of MSpec tests to run on multiple cores in TFS 2013. It doesn't appear to do this out of the box. It can run MSpec, but only sequentially, and it takes over an hour.
I am following this guide, but in step 4 he says to replace the ForEach XAML element with ParallelForEach to get the tests to run in parallel. I downloaded the default build template in TFS 2013. It is a lot simpler, but it doesn't have this tag.
It has:
<mtba:RunAgileTestRunner
  DisplayName="Run VS Test Runner"
  Enabled="[Not AdvancedTestSettings.GetValue(Of Boolean)("DisableTests", False)]"
  TestSpecs="[AutomatedTests]"
  ConfigurationsToTest="[ConfigurationsToBuild]" />
The default MSpec test runner cannot run tests in parallel. That's why you see the reimplementation of a parallel test runner.
I doubt that TFS is implementing an MSpec test runner from the framework source code (although that would be possible). That parallel test runner is using internal classes, like ISpecificationRunner, and running them in parallel.
Your only options, if you must stick with MSpec and TFS, are:
Split your tests into multiple projects/assemblies and feed them to a TFS parallel task that shell-executes the default test runner
Use a TFS shell-execute task to run your tests through the parallel runner
I am assuming that if you want to run tests in parallel they are integration tests that take a long time to run.
If that is the case, then you should move all non-unit tests out of the build and push them further down the pipeline.
http://nakedalm.com/execute-tests-release-management-visual-studio-2013/
You can use Release Management to deploy your application and run your integration tests. There you can run a larger number of long-running tests without tying up your build servers.
I am using Jenkins for integration testing.
Just to give some context: at the moment I have a separate build server which produces the build daily, and Jenkins is not used as the build server. The build server also executes the unit tests in my case.
When the build process is complete, it invokes the Jenkins job. In that job, Jenkins deploys the build to a virtual machine. I have a script for doing this.
Following that, my plan is to run several scripts for the end-to-end testing.
Now I have several questions in this regard:
How to parallelize the execution of the end-to-end tests?
As I am adding script after script, I am getting worried about how manageable it will be.
I am always using the web interface for adding and changing the scripts. How to do this from the command line?
Any ideas for a good tutorial? Any pointers from all of you? Thanks!
Looks like Build Flow Plugin is what I need.
https://github.com/jenkinsci/build-flow-plugin
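For reference, a minimal Build Flow DSL script for this setup could look like the following (the job names are placeholders for the existing deploy and end-to-end jobs):

// Build Flow DSL: deploy first, then fan out the end-to-end jobs in parallel
build("deploy-to-vm")
parallel (
    { build("e2e-suite-a") },
    { build("e2e-suite-b") }
)

Because the flow is defined as a script rather than web-interface configuration, it also helps with keeping the orchestration manageable outside the UI.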
You might want to try and see if you can use the Build Pipeline plugin before Build Flow. It gives a much better visualization of what is going on, with less scripting.
I link the build and deploy jobs in one sequence and then have the unit and integration test jobs linked separately off the build job. You can then use the Fail The Build plugin to have downstream jobs fail upstream ones.
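If you later move to pipeline-as-code, the same topology can be sketched as a small orchestration pipeline (all job names below are placeholders for the existing jobs):

// Scripted-pipeline sketch: build first, then trigger deploy and the test jobs off the build
stage('Build') {
    build job: 'app-build'
}
stage('Downstream') {
    parallel(
        'deploy':            { build job: 'app-deploy' },
        'unit tests':        { build job: 'app-unit-tests' },
        'integration tests': { build job: 'app-integration-tests' }
    )
}

With the build step's default failure propagation, a failing downstream job fails the orchestration run directly, which covers what the Fail The Build plugin does in the freestyle setup.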