Is there a way to share scripts in Jenkins (not using Pipeline)?

Is there a way to have a shared script repo/library in Jenkins?
For example, there are build steps that run scripts I wrote. Instead of copying these scripts into each repo, it would be nice to have a way to get them from one central place.
The only thing I found was:
https://jenkins.io/doc/book/pipeline/shared-libraries/
but it's for pipelines, which aren't an option for me.
What workaround do I have?

We use the Managed Scripts plugin to share scripts between jobs.
After installing the "Managed Scripts" plugin, you have a new option to create a managed script. These scripts can be reused in build steps in every job. Within the job, a user is able to select the required script from a drop-down menu.
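If you would rather keep the shared scripts in a central Git repository than in the plugin, a freestyle job can also fetch them in a build step. A rough Job DSL sketch of that alternative (the repository URL, script name, and job name are placeholders, not real values):

job('example-build') {
    steps {
        shell('''
            # pull the shared script from a central repo at build time
            curl -fsSL -o shared-build.sh https://git.example.com/raw/build-scripts/shared-build.sh
            chmod +x shared-build.sh
            ./shared-build.sh
        ''')
    }
}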

Related

Global Jenkins script that will be executed before a build is started

I'm searching for a way to automatically execute a globally configured script BEFORE a Jenkins job is started.
My use case is that all Jenkins jobs are only allowed to start if a specific environment variable is set.
If the variable is not set, the build should be aborted.
I found the Global Post Script Plugin https://wiki.jenkins.io/display/JENKINS/Global+Post+Script+Plugin, but I need the opposite of what this plugin does.
Maybe there's another solution?
I needed to chmod my /data/jenkins/.npm and /data/jenkins/.sbt directories before running all my builds.
I could either add a prebuild step to every job (redundant and messy) or I could go under Manage Jenkins -> Configure System.
We have a Cloud -> Amazon EC2 configuration section with "Init script" - you can add what you want to run there on slave startup.
However, if you really want to run something for every job (running it on slave startup isn't enough), then you probably don't want to configure it manually for each job.
I suggest you look into the Jenkins Job DSL, as you can define a preBuildSteps section on any/all jobs, which can then reference a common snippet (e.g. a shell script to run); a sketch follows below.
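A rough Job DSL sketch of that idea (preBuildSteps is available on Maven-style jobs; for freestyle jobs the same snippet can simply be the first entry in steps; the job names are made up):

// one shared snippet applied to every generated job
def fixPermissions = 'chmod -R u+rwX /data/jenkins/.npm /data/jenkins/.sbt'

['service-a', 'service-b'].each { name ->
    mavenJob("${name}-build") {
        preBuildSteps {
            shell(fixPermissions)
        }
    }
}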
Partial Solution:
Take a look at the Global Pre Script plugin. This plugin is less feature-rich than the Global Post Script plugin, but it should do at least part of what you want. It notably lacks the option to abort the build, but it can manipulate parameters or other preconditions that your jobs rely on. You may also be able to submit a PR to add some means of preventing the build from executing.
Some options:
Modify Global Pre Script to be able to cleanly abort the build from Groovy.
Change your existing jobs to check for a precondition (manually or via script); see the sketch after this list. This is not the most scalable option.
Replace your existing jobs with Pipeline jobs and use Shared Libraries to bottleneck the logic. (This is what I do).
Generate your jobs using the Job DSL Plugin and enforce a pre build step in every generated job. (This is what I also do)
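A rough sketch of option 2 as an "Execute system Groovy script" build step from the Groovy plugin (build and listener are variables that step binds for you; the variable name REQUIRED_FLAG is made up):

import hudson.model.Result

// abort the build when the required environment variable is missing
def env = build.getEnvironment(listener)
if (!env.get('REQUIRED_FLAG')) {
    listener.logger.println('REQUIRED_FLAG is not set; aborting this build.')
    build.setResult(Result.ABORTED)
    throw new InterruptedException('Missing required environment variable REQUIRED_FLAG')
}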
Limitations:
Something to keep in mind for both global plugins: neither plugin provides a proper build step; the Groovy code executes on the master.
One use case that neither plugin will handle is a between-job slave cleanup/sanity check.

Jenkins with Shared jobs

I am working with Jenkins, and we have quite a few projects that all use the same tasks, i.e. we set a few variables, change the version, restore packages, start SonarQube, build the solution, run unit/integration tests, stop SonarQube, etc. The only difference is something like {Solution_Name}; everything else is exactly the same.
My question is: is there a way to create one 'shared' job that does all that work, while the job for building the project passes the variables down to that shared worker job? What I'm looking for is the ability to not have to create all the tasks for all of our services/components. It would be really nice if each of our services/components could have only two tasks: one to set the variables, another to run the shared job.
Is this possible?
Thanks in advance.
You could potentially benefit from looking into the new Pipeline as Code feature.
https://jenkins.io/doc/book/pipeline/
Using this pattern, you define your build pipeline in a Groovy script rather than in the Jenkins UI. This script is then kept in the codebase of the project it builds, in a file called Jenkinsfile.
By checking this pipeline into a git repository, you can create a minimal configuration on the Jenkins side and simply tell it to look at a specific repo and do what that pipeline says to do.
There are a few benefits to this approach if it works for your setup. The big one is that your build pipeline is fully versioned, just like the project it builds. And the repository becomes portable: it can easily be built on any Jenkins installation, across as many jobs as you like, as long as the pipeline plugins are installed.
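A minimal scripted Jenkinsfile sketch of the kind of shared steps described in the question (the solution name, tool calls, and stage names are assumptions, not a working build):

node {
    stage('Checkout') {
        checkout scm
    }
    stage('Build') {
        withEnv(['SOLUTION_NAME=MyService.sln']) {
            // restore packages and build the solution
            bat 'nuget restore %SOLUTION_NAME%'
            bat 'msbuild %SOLUTION_NAME% /p:Configuration=Release'
        }
    }
    stage('Test') {
        // run unit/integration tests
        bat 'vstest.console.exe tests\\MyService.Tests.dll'
    }
}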

Adding Jenkins Pipelines on Build

Does anyone know if it's possible to add a Jenkins pipeline build into a Jenkins Docker image? For example, I may have a Jenkinsfile that defines my pipeline in Groovy, and I would like to ADD that into my image when building from the Jenkins image.
something like:
FROM jenkins:latest
ADD ./jobs/Jenkinsfile-pipeline-example $JENKINS_HOME/${someplace}
And have that pipeline ready to go when I run it.
Thanks.
It's a lot cleaner to use a Jenkinsfile for this instead. That way, as your repositories develop, you can change the build process without needing to rebuild and redeploy your Jenkins image every time (less work and less CI downtime). Also, having the Jenkinsfile in the source repository allows simpler decoupling.
If you have any questions about extending Jenkins on Docker further to handle building NodeJS, Ruby or something else, I go into how to do all that in an article.
You can create any job in Jenkins by passing in an XML file that describes the job. See https://support.cloudbees.com/hc/en-us/articles/220857567-How-to-create-a-job-using-the-REST-API-and-cURL
The way I've done this is to manually create the job I want in Jenkins, then append config.xml to the URL and it shows you the XML content needed to generate the pipeline job. Save that XML and you can deliver it to your newly deployed Jenkins instance.
I use a system similar to this to generate several hundred jobs based on our external build specifications.
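For the Docker use case above, one way to wire the delivered XML in is an init script under init.groovy.d that creates the job at startup. A rough sketch (the paths and job name are assumptions):

import jenkins.model.Jenkins

def jenkins = Jenkins.instance
def jobName = 'pipeline-example'
def configXml = new File('/usr/share/jenkins/ref/pipeline-example-config.xml')  // ADDed into the image

// create the job only if it does not exist yet
if (jenkins.getItem(jobName) == null && configXml.exists()) {
    jenkins.createProjectFromXML(jobName, new ByteArrayInputStream(configXml.bytes))
}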

Jenkins: Can I Generate groovy script from existing Job?

I have a job in Jenkins and want the Groovy script for it. Is there a way I can do that?
I created the job using the Jenkins UI (add a shell command, sync from the repo, etc.). Now I'd like the Groovy script for the same job, but I don't want to go to the trouble of writing the entire thing again. Is there something like Export to .groovy?
P.S. I am not sure of the correct tags.
It sounds like you want to copy your configuration using a Groovy script, either to create a new job or to persist it.
The links below are helpful for cloning/copying and creating jobs using Groovy scripts.
http://jenkins-ci.361315.n4.nabble.com/How-to-Copy-Clone-a-Job-via-Groovy-td4097412.html
Can I use Jenkins CLI or some groovy scripts to create a new job
https://wiki.jenkins-ci.org/display/JENKINS/Clone+all+projects+in+a+View
If you want to persist your job, always back up your resource files, such as config.xml, jenkins.xml, etc.
You can recreate a job from its config.xml, which holds all of the job's configuration.
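A quick way to get at that config.xml is the script console; a sketch (the job name is made up):

import jenkins.model.Jenkins

// print an existing job's config.xml so it can be backed up or used to recreate the job
def job = Jenkins.instance.getItemByFullName('my-existing-job')
println job.configFile.asString()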
Check out this plugin. It seems appropriate for your purpose, though it has some restrictions for plugins in your jobs that don't support Pipeline syntax. Anyway, it could be useful for code generation.

Centrally managed build profiles in Jenkins

I would like to be able to configure centrally something like "build profiles" which I can apply to multiple projects in Jenkins.
For instance, I want to set up a compile, email, deploy chain to be used by several projects. When I change something in this chain, I want the changes to be applied automatically to all linked projects.
Is there a convenient way to do this? I am also open to suggestions for other build systems, as long as they can deal with sbt projects.
I see there is an sbt plugin for Jenkins which looks popular, though I haven't used it.
I have used the Jenkins Job DSL, which covers sbt out of the box. It works via a build step in one job that creates/regenerates the other jobs (with an optional template).
The problem with having one generic job build separate projects is that all the job history gets merged together. I think it is better to use stand-alone jobs for each task, and the Job DSL will let you do that (see the sketch below).
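A rough Job DSL sketch of a shared "profile" applied to several sbt projects (repository URLs, job names, and the sbt invocation are assumptions):

['project-a', 'project-b'].each { name ->
    job("${name}-build") {
        scm {
            git("https://git.example.com/${name}.git")
        }
        steps {
            shell('sbt clean test package')   // the shared compile/test chain
        }
        publishers {
            mailer('team@example.com')        // shared email notification
        }
    }
}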
TeamCity supports build configuration templates out of the box and recently added basic sbt support.
