We use a multi-configuration job to test various combinations of two separate components for compatibility. These are our compatibility acceptance tests. We perform the following steps:
Prepare the test environment using Docker and docker-compose, where we deploy these components (our pre-build step).
Run the multi-configuration job to test compatibility.
Clean up: stop the containers (post-build).
We have dedicated jobs for steps 1 and 3 and use a MultiJob for a consolidated view.
Is it possible to have one consolidated job which does all of that? Possibly:
Before starting the parent job of the multi-configuration build, we set up our environment.
Run all combinations of the multi-configuration job.
The parent job's post-build step does the cleanup.
You can use "Prepare an environment for the run" to set up your environment, run the combinations of the multi-configuration job using the "Trigger builds on other projects" build step, and, at the end of the same job, do the cleanup using post-build actions.
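As a rough sketch, the consolidated job's shell step could even wrap the whole sequence itself, with a trap standing in for the post-build cleanup. The compose file, test script, and matrix variable names below are hypothetical:

```shell
#!/bin/sh
# Sketch of a single "Execute shell" build step for the consolidated job.
set -e

# Stand-in for the post-build step: stop the containers even if tests fail.
cleanup() {
    docker-compose down
}
trap cleanup EXIT

# Stand-in for the pre-build step: deploy both components.
docker-compose up -d

# Run one combination of the compatibility tests; the two version
# variables would come from the multi-configuration axes.
./run-compat-tests.sh "$COMPONENT_A_VERSION" "$COMPONENT_B_VERSION"
```

The `trap ... EXIT` is what makes this safe to run as one step: the containers are stopped whether the test script passes or fails.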
We have set up a Jenkins instance as a remote testing resource for our developers. Every time a tag is created matching our refspec a job is triggered and the results emailed to the developer.
A job is defined as follows:
1 phase consisting of three jobs (frontend tests, integration tests,
unit tests)
All subjobs are executed, irrespective of success
Email the developer the test results
This setup mostly works except for two issues:
I cannot get the job to run in parallel. The subjobs run in
parallel, but only one instance of the job runs at a time. Is this
something I can configure differently somewhere, or is this inherent
in the way the plugin works?
The main job checks out and occupies one of our build servers for
the duration of the job. Is there a way to do git polling and then
just grab the hashref and release the build server on which the
polling was done before continuing building the subjobs?
In the multi job plugin, everything runs in parallel that is listed in the same "Phase", however the multijob itself needs somewhere to run. If you have a build followed by a test phase, you can add a "Build Phase" prior to the test phase, and only that phase will require a "build server".
There is an option called "Execute concurrent builds if necessary" that will allow multiple builds of the same job to run simultaneously. This option must be set on both the parent job and the subjobs, as the default behavior of Jenkins is to only allow one build of a project (job) to run at a time. Beware: this may have unintended side effects.
It's not clear what you mean about polling; however, if you're using git, you may want to use webhooks so that pushes to the repository invoke Jenkins directly. There is then no need to poll.
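For example, the Git plugin exposes a notifyCommit endpoint; a server-side post-receive hook can ping it so that Jenkins checks the repository immediately instead of polling on a schedule. A sketch, with a hypothetical Jenkins host and repository URL:

```shell
#!/bin/sh
# Server-side post-receive hook (sketch): notify Jenkins that the
# repository changed; any job configured against this URL re-polls at once.
curl -fsS "http://jenkins.example.com/git/notifyCommit?url=git@gitserver:project.git"
```

With this in place the jobs can keep "Poll SCM" enabled but with no schedule, so they only run when the hook fires.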
I have a CI pipeline configured on Jenkins. When the jobs execute successfully, I want a trigger to be passed on to XL Release so that it automatically starts the deployment process. Is this possible?
There's the Jenkins-XLR plugin that you can install straight from Jenkins. In Jenkins, go to Manage Jenkins > Manage Plugins and search for the XL Release plugin.
The plugin page is here: https://wiki.jenkins-ci.org/display/JENKINS/XL+Release+Plugin
Some more information can be found in this blog post.
One other approach you could consider is having XL Release drive your CI, by creating a Build / (Provision) / Deploy / Test template. This template polls your SCM, and when kicked off, executes the phases:
Build Phase: use the Jenkins plugin to run your Jenkins build and store the output in a variable
Provision Phase: Some customers have this phase, since they need to run Salt/Puppet/Chef/Ansible type provisioning as part of the overall deployment
Deploy Phase: XL Deploy plugin
Test Phase: Kick off any other tests you do as part of the deployment
Some benefits to this approach:
XL Release gives you visibility / information across the disparate tools used for delivery
If you include testing in each phase, you can make decisions about proceeding or not during each phase
You could automatically kick off subsequent releases (to QA, for example) if the entire release passed.
You can also see an example of this here: XLRelease Provision, Build, Deploy and Test
We have 1 build controller and 2 build agents. One build agent (tfsbuild01) is on the build controller machine (tfsbuild01). Another is on a separate machine (tfsbuild02).
We have 2 branches:
Main
8.0
and 3 build definitions
Main (Gated Checkin)
8.0 (Gated Checkin)
Main Coded UI Tests (Scheduled for 12PM daily)
If someone queues up an 8.0 build and a Main build at the same time, the builds are correctly distributed across the two build agents and they build at the same time.
If the Coded UI Tests are running (which take an hour) and someone tries to check in to Main (which is Gated), the Main build sits in the queue until the Coded UI Tests finish. How can I get concurrent builds in the same branch working?
The tags configuration on all build definitions looks like this:
This is really a horses-for-courses question. You have the build of your asset and your instance tests mixed up. I would recommend that you push your Coded UI tests off to an environment and only run the tests that can be executed from code (ideally unit tests only, with no instance on the build box).
To execute your Coded UI tests, you should create a release pipeline in Release Management and also set up your release environment as a Lab environment. This way you can have quick builds and push the longer validation out into that pipeline.
http://nakedalm.com/building-release-pipeline-release-management-visual-studio-2013/
http://nakedalm.com/execute-tests-release-management-visual-studio-2013/
This will free up your build agents and hopefully make your environment slicker.
I'm new to Jenkins. I'm trying to implement a specific scenario in a single job to build mobile applications using Jenkins.
In a single job I want to launch several tasks sequentially:
Task 1 (Windows) ---> Task 2 (Windows) ---> Task 3 (Windows) ---> Task 4 (Mac OSX)
Each job will be dedicated to a single project. Passing results from one task to another can be realised through the workspace, but it seems that a job's tasks must all run in the same environment. Is there any plugin that will let me run some tasks of the job on a particular slave?
Thanks in advance
You could enable the "Trigger builds remotely" option on your slave jobs.
Then from the master job you can trigger the slave builds using curl, like this:
curl --user "username:password" "http://jenkins.yourdomain.org/job/JOB-name/buildWithParameters?SOMEPARAMETER=$SOMEPARAMETER&token=TheSecretToken"
TheSecretToken is the token you specified in each slave job's "Trigger builds remotely" configuration.
And username:password is a valid user on your Jenkins. Don't use your own account here but rather a 'build trigger' account that only has permissions to start the specific jobs.
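One caveat: if the parameter value can contain spaces or other special characters, it should be URL-encoded before being interpolated into the query string. A small sketch (the host, job name, token, and value below are placeholders):

```shell
#!/bin/sh
# Placeholder values; substitute your own host, job name, token, and value.
JENKINS="http://jenkins.yourdomain.org"
JOB="JOB-name"
TOKEN="TheSecretToken"
SOMEPARAMETER="1.2 beta"

# Minimal percent-encoding of spaces; a fuller encoder is safer for
# arbitrary values.
ENCODED=$(printf '%s' "$SOMEPARAMETER" | sed 's/ /%20/g')

URL="$JENKINS/job/$JOB/buildWithParameters?SOMEPARAMETER=$ENCODED&token=$TOKEN"
echo "$URL"
# The actual trigger would then be:
#   curl --user "trigger-user:password" "$URL"
```

Without the encoding, a value containing a space would silently truncate the query string.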
Define a job for each task you have mentioned.
Have a slave on the remote machine(s) - presumably the Mac.
In each job, set the relevant host that will run it (you have a parameter for that).
Use the "trigger parameterized build" plugin to trigger the jobs in the correct sequence, and make sure you pass "Current build parameters" in that section.
This plugin will allow you to pass other values as well - read its help for more details.
Try one of these:
Build Flow Plugin
Multijob Plugin
I need to build and test on multiple configurations: linux, osx and
solaris. I have slave nodes labeled "linux", "osx" and "solaris". On
each configuration, I want to (a) build (b) run smoke tests
(c) if smoke tests pass, then run full tests, and perhaps more.
I thought that multi-configuration jobs might be the answer, so I set up a
multi-configuration build job, and it starts concurrent builds on each
OS. The build job will trigger a downstream smoke-test build, which, in
turn, triggers the full-test job.
I've run into the following issues
If one of the configurations fails, the job as a whole fails, and
Jenkins will not fire any downstream jobs (e.g., if the solaris build
fails, Jenkins will not run smoke tests or full tests for osx and
linux).
The solaris build takes about twice as long as the others (on the
order of an hour), and I'd prefer the linux and osx smoke tests not
wait for the solaris build to finish.
Does that mean I'm left with hand-crafting three pipelines of jobs, and
putting them behind a "start-all" job (i.e., creating and hand-chaining
the following jobs)?
build-linux smoke-test-linux full-test-linux
build-osx smoke-test-osx full-test-osx
build-solaris smoke-test-solaris full-test-solaris
Did I miss something obvious?
As far as I know, the answer is to create 3 matrix jobs, one for each system. Each would then have 3 subjobs (build, smoke-test, full-test), with the build job as a touchstone.
Have you thought about combining the build, smoke-test and full tests into a single multi-configuration job? Other than being a little messy, this should work for you.
To answer your first issue: to trigger a downstream job regardless of result, use trigger parameterized build to run when complete (always trigger) and then check "build w/o parameters"
To answer your second issue: either use an all-encompassing multi-configuration (matrix) job or use three separate job streams as you mentioned. UPDATE: you could run 3 sequential matrix jobs for each step (build, smoke-test, full tests), but it would mean that if any of the build steps failed, then none of the smoke-tests would be run.