I have a Jenkins pipeline for APIC (API Connect) which downloads the code from GitHub, validates it, deploys it to API Manager (on Cloud Pak), tests it using SoapUI, and finally stores the test report on Nexus.
I now have to containerize this entire pipeline so that it can be deployed on any server or machine and started as a self-sufficient service.
I understand that I would need to use Docker for this: build and push images for the tools used, and then have some sort of interconnection between the containers.
Please tell me whether my understanding is correct and what approach I should follow. Any reference links are appreciated.
Thanks in advance.
You can make use of the Jenkins Job DSL plugin to create a seed job for your pipeline, and then use Docker containers as build slaves for Jenkins.
That way, whenever you run the seed job, the generated child jobs will run on the Docker container slaves.
You can refer to this Stack Overflow post or Jobs as Code with Groovy DSL to learn more about the Jenkins Job DSL.
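For illustration, a seed job's Job DSL script can be as small as this (a minimal sketch; the job name, repository URL, and branch are hypothetical placeholders):

pipelineJob('apic-deploy-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://github.com/your-org/apic-pipeline.git') }    // placeholder repo
                    branch('*/main')
                }
            }
            scriptPath('Jenkinsfile')    // the pipeline definition lives in the repo
        }
    }
}

The Jenkinsfile that this job points to can then declare a Docker agent (for example, agent { docker { image '...' } }) so the generated job's builds run inside containers.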
Related
I am implementing JMeter/Taurus for performance testing of microservices. We are using the OpenShift PaaS solution to run all microservices. I am able to deploy JMeter/Taurus inside OpenShift using a Jenkins pipeline and generate the Taurus report from the JMX report in the container. My requirement is to publish the Taurus report to Jenkins rather than storing it in cloud storage or Nexus. Can someone advise what the best approach would be to publish the performance report for developers on Jenkins, or any other optimal way to publish it?
While googling I found an approach where a Jenkins agent was deployed inside OpenShift and the test suite's Git repo was checked out into the agent's workspace; I just want to make sure this is the best approach for my scenario. Our Jenkins master is running on Google Cloud Platform VMs with some dynamic slaves.
Thanks in advance!
According to the Dump Summary for Jenkins Plugins chapter of the Taurus User Manual, you just need to add a reporting module definition to your YAML configuration file, like:
reporting:
- module: final-stats
  dump-xml: stats.xml
And "feed" this stats.xml file to Jenkins Performance Plugin
That's it: you should get a Performance Report added to your build dashboard. Check out the How to Run Taurus with the Jenkins Performance Plugin article for more information if needed.
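If you run the test from a pipeline job, the publishing step could be sketched like this (assuming the Performance Plugin is installed and the bzt command is available on the agent; the file names are placeholders):

pipeline {
    agent any
    stages {
        stage('Performance test') {
            steps {
                // Run Taurus with the YAML config that defines the final-stats module
                sh 'bzt load.yml'
            }
        }
    }
    post {
        always {
            // Hand the Taurus dump over to the Performance Plugin
            perfReport sourceDataFiles: 'stats.xml'
        }
    }
}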
I'm new to Jenkins/Docker. So far I've found lots of official Jenkins documents recommending use with Docker, but the necessity and advantages of running Jenkins as a Docker container remain vague to me. In my case it's a Node/React app and the required environment is not complicated.
Disadvantages I've found running Jenkins as a Docker container:
High disk usage
Directory paths inside the Docker container are more complicated to deal with, especially when working with SSH in pipeline scripts
Without Docker I can easily achieve the same, and the Blue Ocean plugin is also available.
So, what are the main benefits of Docker with Jenkins/Jenkins Pipelines? Are there pitfalls for my Node application if I use Jenkins without Docker? Articles to help me dive in are also appreciated.
Jenkins as Code
The main advantage of Jenkins in Docker is that it helps you get Jenkins as Code.
The advantages of Jenkins as code are:
SCM: the code can be put under version control
History is transparent, and backup and roll-back become easy.
The code is the documentation of your Jenkins setup.
Jenkins becomes portable, so you can run Jenkins locally to try new plugins, etc.
Jenkins pipelines work really well with Docker. As @Ivthillo mentioned, there is no need to install additional tools; you just use images of those tools, and Jenkins will download them from the internet (Docker Hub) for you.
For each stage in the pipeline you can use a different image (i.e. tool). Essentially you get "micro Jenkins agents" which only exist temporarily. This makes your Jenkins setup much cleaner.
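For example, a declarative pipeline for a Node app could run each stage in its own temporary container, roughly like this (a sketch; the image tag and npm scripts are placeholders):

pipeline {
    agent none
    stages {
        stage('Build') {
            agent { docker { image 'node:18-alpine' } }    // temporary "micro agent"
            steps {
                sh 'npm ci && npm run build'
            }
        }
        stage('Test') {
            agent { docker { image 'node:18-alpine' } }
            steps {
                sh 'npm test'
            }
        }
    }
}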
The disadvantage is:
Jenkins' initial (Groovy) configuration is poorly documented on the web.
Simple Node setup
Most of these arguments also hold for a simple Node setup.
Changing the Node version, or running multiple jobs each with a different Node version, becomes easy.
Add your Jenkinsfile inside the Node repo, so everyone with a Jenkins+Docker setup can run your CI/CD.
And finally: gaining knowledge of running your app inside a container will enable you to run your production app in Docker in the future.
Getting started
A while ago I wrote a small blog on how to get started with Jenkins and Docker, i.e. creating a Jenkins image for development which you can launch and destroy in seconds.
I am trying to understand how best to deploy an instance of Jenkins, complete with plugins, users, and jobs, using Chef. I am currently using the Chef Jenkins Supermarket cookbook.
I am attempting to achieve automated deployment of our pipelines as part of the project. From what I have gathered, the best way to go about this is to have Chef configure a seed job during Jenkins' initial setup and configuration.
The seed job should specify, among other things, the Git repository in which to find the Jenkinsfile for a given job. I've found this resource by Daniel Spilker to be helpful in explaining seed jobs.
So the seed job would be run, and it would then generate the Jenkins job we scripted with it (in this case, the seed job would pull the Jenkinsfile from source control and configure a new Jenkins job, our pipeline, with the details from the Jenkinsfile).
Am I understanding correctly that this is the proper way not only to automate Jenkins job configuration, but also to always have an up-to-date configuration for any given job in the event the job configuration changes?
If we used a seed job to set up our pipeline, what are some possible solutions for having the initial seed job run automatically once Jenkins is fully configured by Chef?
As for job configuration changes that may occur over time, would we need to set up the seed job to poll source control periodically for changes to the Jenkinsfile, in the event it has been modified? (It may be helpful to note that we are currently using Bitbucket for source control.)
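For concreteness, the kind of seed-job script I have in mind might look like this (a sketch only; the job name, repository URL, and polling schedule are placeholders):

pipelineJob('our-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://bitbucket.org/our-team/our-app.git') }    // placeholder repo
                    branch('*/master')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
    triggers {
        scm('H/15 * * * *')    // poll Bitbucket roughly every 15 minutes for changes
    }
}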
Just getting started with pipeline as code. Thanks to everybody in advance for their patience and guidance.
I've mentioned this a bit in your other questions, but the least painful approach is to treat Jenkins as a database, not a web service. Have Chef do the basic install, but then configure the initial bits by hand. For DR, rely on your backups rather than Chef.
Does anyone know if it's possible to add a Jenkins pipeline build into a Jenkins Docker image? For example, I may have a Jenkinsfile that defines my pipeline in Groovy, and would like to ADD that into my image when building from the Jenkins image.
Something like:
FROM jenkins:latest
ADD ./jobs/Jenkinsfile-pipeline-example $JENKINS_HOME/${someplace}
And have that pipeline ready to go when I run it.
Thanks.
It's a lot cleaner to use a Jenkinsfile for this instead. That way, as your repositories develop, you can change the build process without needing to rebuild and redeploy your Jenkins instance every time (less work and less CI downtime). Also, having the Jenkinsfile in source control allows a simpler decoupling.
If you have any questions about extending Jenkins on Docker further to handle building NodeJS, Ruby or something else, I go into how to do all of that in an article.
You can create any job in Jenkins by passing in an XML file that describes the job. See https://support.cloudbees.com/hc/en-us/articles/220857567-How-to-create-a-job-using-the-REST-API-and-cURL
The way I've done this is to manually create the job I want in Jenkins, then append config.xml to the job's URL, which shows you the XML content needed to generate the pipeline job. Save that XML and you can deliver it to your newly deployed Jenkins instance.
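As a rough illustration, posting a saved config.xml to the createItem endpoint could look like this in Groovy (a sketch only; the URL and job name are placeholders, and a real server will also require authentication, e.g. a username and API token, and possibly a CSRF crumb):

// Create a Jenkins job by POSTing its config.xml to the REST API
def jenkinsUrl = 'http://localhost:8080'    // placeholder Jenkins URL
def jobName = 'pipeline-example'            // placeholder job name
def conn = new URL("${jenkinsUrl}/createItem?name=${jobName}").openConnection()
conn.requestMethod = 'POST'
conn.doOutput = true
conn.setRequestProperty('Content-Type', 'application/xml')
conn.outputStream.withStream { it << new File('config.xml').bytes }
println "HTTP ${conn.responseCode}"         // 200 means the job was created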
I use a system similar to this to generate several hundred jobs based on our external build specifications.
My development setup is such that for every SVN check-in the code is built, unit tested, packaged, and published to Artifactory. Now I want to automate my deployment process and run integration (Selenium) tests as part of it. I am thinking of using Puppet to manage the deployment.
Is Puppet the correct tool for this?
What process should I use to trigger the Puppet master to initiate a fresh installation on agents? I couldn't find any Jenkins plugin that would actually trigger Puppet. One option is to call
puppet apply ...
as a Jenkins post-build task.
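Something like the following pipeline sketch is what I have in mind (the build command and manifest path are hypothetical placeholders, and it assumes Puppet is installed on the build node):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn package'    // placeholder for the existing build step
            }
        }
    }
    post {
        success {
            // Apply a local Puppet manifest after a successful build
            sh 'puppet apply manifests/deploy.pp'    // placeholder manifest path
        }
    }
}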
Any suggestions welcome, thank you.
Have a look at this Selenium Jenkins article from Sauce Labs, a service that automates cross-browser testing. Though they are a vendor with a service to sell, the article covers how to do Selenium testing yourself with Jenkins. It also exposes common pain points you are likely to run into with this approach.
A Puppet master doesn't serve the function of orchestrating client convergences. Take a look at MCollective. This is a tool that will allow you to trigger Puppet runs on target systems from a Jenkins agent via script commands.
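For instance, a deploy stage could shell out to MCollective, roughly like this (a sketch assuming the mcollective-puppet agent plugin is installed; the node filter is a placeholder):

pipeline {
    agent any
    stages {
        stage('Deploy') {
            steps {
                // Trigger an immediate Puppet run on a target node via MCollective
                sh 'mco puppet runonce -I web01.example.com'    // placeholder node filter
            }
        }
    }
}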
Some MCollective getting-started material:
http://www.slideshare.net/PuppetLabs/presentation-16281121
http://puppetlabs.com/mcollective