I am working with Jenkins, and we have quite a few projects that all use the same tasks, i.e. we set a few variables, change the version, restore packages, start SonarQube, build the solution, run unit/integration tests, stop SonarQube, etc. The only difference is something like {Solution_Name}; everything else is exactly the same.
My question is: is there a way to create one 'shared' job that does all that work, while the job for building the project passes the variables down to that shared worker job? What I'm looking for is the ability to not have to create all the tasks for every one of our services/components. It would be really nice if each of our services/components could have only two tasks: one to set the variables, another to run the shared job.
Is this possible?
Thanks in advance.
You could potentially benefit from looking into the new Pipeline as Code feature.
https://jenkins.io/doc/book/pipeline/
Using this pattern, you define your build pipeline in a Groovy script rather than in the Jenkins UI. The script is kept in the codebase of the project it builds, in a file called Jenkinsfile.
By checking this pipeline into a Git repository, you can create a minimal configuration on the Jenkins side and simply point it at a specific repo; Jenkins then does whatever the pipeline says to do.
There are a few benefits to this approach if it works for your setup. The big one is that your build pipeline is fully versioned, just like the project it builds. The repository also becomes portable: it can be built on any Jenkins installation, across as many jobs as you like, as long as the Pipeline plugins are installed.
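To make that concrete, a minimal declarative Jenkinsfile for a build like the one in the question might look like the sketch below. The stage contents, tool invocations, and the SOLUTION_NAME value are assumptions on my part; the SonarQube start/stop and version-bump steps would slot in as additional stages.

pipeline {
    agent any
    environment {
        // The one thing that differs per project in the question's setup
        SOLUTION_NAME = 'MyService.sln'
    }
    stages {
        stage('Restore') {
            steps {
                bat 'nuget restore %SOLUTION_NAME%'
            }
        }
        stage('Build') {
            steps {
                bat 'msbuild %SOLUTION_NAME% /p:Configuration=Release'
            }
        }
        stage('Test') {
            steps {
                // Placeholder for your unit/integration test runner
                bat 'vstest.console.exe Tests\\bin\\Release\\Tests.dll'
            }
        }
    }
}

Each project then carries the same file with only SOLUTION_NAME differing, which is essentially the "set the variables, then run the shared logic" split the question asks for.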
Following the Jenkins Best Practices, I want to prevent Build Jobs/Pipelines from being executed on my Jenkins Master.
To do so, I've installed the Job Restrictions Plugin, using it to configure the Master to run only some Maintenance Pipelines.
The problem is that now, Build Pipelines that are configured to run on specific Agents are not executed anymore. I see that the Build Queue grows continuously, and the Pipelines are never run. I think this behaviour could be related to the Flyweight Executors of the Master.
So, the question is the following: how can I execute just a small subset of Maintenance Pipelines on the Master while executing Build Pipelines only on specific Agents?
You can configure the master node to be used only when explicitly named. Just click the master node, go to Configure, and change "Use this node as much as possible" to "Only build jobs with label expressions matching this node".
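With that set, each Build Pipeline targets its agents by label, for example (the label name here is an assumption about your agent configuration):

pipeline {
    // Runs only on nodes carrying this label, never on the restricted master
    agent { label 'build-agent' }
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}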
I found the solution that perfectly fits my needs here.
To quickly sum up the solution: I was able to exclude all user builds from the Master and run only the Jobs/Pipelines of a specific Jenkins folder (IuA in my case) on it, by configuring the Job Restrictions Plugin accordingly.
In order to better understand the logic behind this solution, I recommend taking a look at the link I posted above.
I was hoping someone could help with ideas in Jenkins for sharing a build increment across multibranch pipelines on multiple build machines. Looking through similar questions, I didn't see anything better than using a timestamp instead of the build number, but that's not quite what we were looking for.
I am using Jenkinsfiles to define multiple pipelines, then 'multibranch' to instantiate them across all branches. Currently these just call shared freestyle jobs to implement the stages. Every pipeline invokes the same job to do a build, across a pool of build machines, so we can simply use that job's build number to increment the version. For example, I have v1.2 being worked on by several branches, each having CI, Nightly, and Release pipelines. They all invoke the same build sub-job, so CI/BranchA might run sub-job #503 on buildVM1 and get version string 1.2.503, then Nightly/BranchB runs sub-job #504 on buildVM2 and gets version string 1.2.504. This works great as long as I am invoking the same build sub-job.
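For reference, the delegation described above is just the build step; a minimal sketch (the job and parameter names are hypothetical) of what each pipeline currently does:

// Every pipeline triggers the same downstream job and reuses its build number
def shared = build job: 'component-build',
        parameters: [string(name: 'BRANCH', value: env.BRANCH_NAME)]
// The downstream build number supplies the shared increment, e.g. 1.2.503
def version = "1.2.${shared.number}"
currentBuild.displayName = version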
The next step is to implement the builds as part of the Jenkinsfile pipeline, but then I lose my convenient build increment. How can I define a shared build increment for this component across all branches defining the same major.minor version?
A timestamp is a bit unwieldy, since the multiple branches/teams/build machines mean I need to go down to seconds. Do I really need versions like 1.2.20180118165007? There's got to be a better way.
How else can I manage this?
I'm searching for a way to automatically execute a globally configured script BEFORE a Jenkins job starts.
My use case is that all Jenkins jobs are only allowed to start if a specific environment variable is set.
If the variable is not set, the build should be aborted.
I found the Global Post Script Plugin (https://wiki.jenkins.io/display/JENKINS/Global+Post+Script+Plugin); I only need the opposite of what this plugin does.
Maybe there's another solution?
I needed to chmod my /data/jenkins/.npm and /data/jenkins/.sbt directories before running all my builds.
I could either add a pre-build step to every job (redundant and messy) or go under Manage Jenkins -> Configure System.
We have a Cloud -> Amazon EC2 configuration section there with an "Init script" field - you can add whatever you want to run on slave startup.
However, if you really want to run something for every job (and running it on slave startup is not enough), then you probably don't want to configure it manually for each job.
I suggest you look into the Jenkins Job DSL, as you can define a pre-build steps section on any/all jobs, which can then reference a common snippet (e.g. a shell script to run).
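A minimal Job DSL sketch of that idea; the job names and build commands are placeholders:

// Seed-job script: stamps the same prep snippet into every generated job
def prep = 'chmod -R u+rwX /data/jenkins/.npm /data/jenkins/.sbt'

['service-a', 'service-b'].each { name ->
    job("${name}-build") {
        steps {
            shell(prep)          // the common pre-build snippet
            shell('./build.sh')  // the job's own build logic
        }
    }
}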
Partial Solution:
Take a look at the Global Pre Script plugin. This plugin is less feature-rich than the Global Post Script plugin, but it should do at least part of what you want. It notably lacks the option to abort the build, but it can manipulate parameters or other preconditions that your jobs rely on. You may also be able to submit a PR to add some means of preventing the build from executing.
Some options:
Modify Global Pre Script to be able to cleanly abort the build from Groovy.
Change your existing jobs to check for a precondition (manually or via script). This is not the most scalable option.
Replace your existing jobs with Pipeline jobs and use Shared Libraries to bottleneck the logic. (This is what I do; see the sketch after this list.)
Generate your jobs using the Job DSL Plugin and enforce a pre-build step in every generated job. (This is what I also do.)
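For option 3, here is a minimal Shared Library sketch (the library, variable, and step names are hypothetical) that bottlenecks the precondition check in one place:

// vars/standardBuild.groovy in the shared library
def call(Closure body) {
    // Every consuming job aborts early unless the required variable is set
    if (!env.REQUIRED_VAR) {
        error 'REQUIRED_VAR is not set; aborting build.'
    }
    node {
        body()
    }
}

// ...and each job's Jenkinsfile shrinks to:
@Library('my-shared-lib') _
standardBuild {
    sh './build.sh'
}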
Limitations:
Something to keep in mind for both global plugins: neither provides a proper build step; the Groovy code executes on the master.
One use case that neither plugin will handle is a between-job slave cleanup/sanity check.
Does anyone know if it's possible to add a Jenkins Pipeline build into a Jenkins Docker image? For example, I may have a Jenkinsfile that defines my pipeline in Groovy, and I would like to ADD that into my image when building from the Jenkins image.
Something like:
FROM jenkins:latest
ADD ./jobs/Jenkinsfile-pipeline-example $JENKINS_HOME/${someplace}
And have that pipeline ready to go when I run it.
Thanks.
It's a lot cleaner to use a Jenkinsfile for this instead. That way, as your repositories develop, you can change the build process without needing to rebuild and redeploy your Jenkins instance every time (less work, and less CI downtime). Also, having the Jenkinsfile in source control allows a simpler decoupling.
If you have any questions about extending Jenkins on Docker further to handle building NodeJS, Ruby or something else, I go into how to do all that in an article.
You can create any job in Jenkins by passing in an XML file that describes the job. See https://support.cloudbees.com/hc/en-us/articles/220857567-How-to-create-a-job-using-the-REST-API-and-cURL
The way I've done this is to manually create the job I want in Jenkins, then append config.xml to the job's URL; that shows you the XML content needed to generate the pipeline job. Save that XML, and you can deliver it to your newly deployed Jenkins instance.
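As a rough sketch (the hosts, job name, and credentials are placeholders), that round trip with cURL looks something like:

curl -u USER:API_TOKEN 'https://old-jenkins.example.com/job/my-pipeline/config.xml' -o config.xml
curl -u USER:API_TOKEN -X POST 'https://new-jenkins.example.com/createItem?name=my-pipeline' \
    -H 'Content-Type: text/xml' --data-binary @config.xml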
I use a system similar to this to generate several hundred jobs based on our external build specifications.
I would like to be able to centrally configure something like "build profiles" that I can apply to multiple projects in Jenkins.
For instance, I want to set up a compile, email, deploy chain to be used by several projects. When I change something in this chain, I want the changes to be applied automatically to all linked projects.
Is there a convenient way to do this? I am also open to suggestions for other build systems, as long as they can deal with sbt projects.
I see there is an sbt plugin for Jenkins which looks popular - I haven't used it myself.
I have used the Jenkins Job DSL, which covers sbt out of the box. It works via a build step in a seed job that creates/regenerates other jobs (with an optional template).
The problem with having one generic job build separate projects is that all the job history gets merged together. I think it is better to use standalone jobs for each task, and the Job DSL will let you do that.
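To make that concrete, a seed-job script along these lines (the project list, repo URLs, sbt installation name, and mail recipient are all placeholders) generates one standalone job per project from a single template:

['projectA', 'projectB'].each { project ->
    job("${project}-build") {
        scm {
            git("https://example.com/${project}.git")  // placeholder repo
        }
        steps {
            // 'sbt 1.x' must match an sbt installation configured in Jenkins
            sbt('sbt 1.x', 'clean test package')
        }
        publishers {
            mailer('team@example.com')  // the "email" link of the chain
        }
    }
}

Changing the template and re-running the seed job regenerates every linked project, while each generated job keeps its own history.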
TeamCity supports build configuration templates out of the box and recently added basic sbt support.