How can Jenkins read a polling text file checked in to Git to trigger a deployment?

Current scenario: the build and deployment happen in the development environment, the code is checked in to Git, and the JAR file is placed in Nexus. A change request (CR) is then raised to deploy the same artifact to the QA environments. The CR has two text files attached (one containing the Nexus path, the other the website URL), which act as input to the parameterized build, along with the selection of the environment; then the deploy is run.
Target scenario: we want to remove the CR step. Instead, a file containing the parameters that were previously attached to the CR would be pushed to Git; when that happens, its contents should be copied into the respective parameters of the parameterized Jenkins job, and the environment should be selected from the dropdown.
What is the best way to achieve this? Should we create another Jenkins job that can read the parameters from the file, or is there some other way?
P.S. We don't want to edit the existing parameterized Jenkins jobs.

Using the Jenkins GitHub Plugin, you can create a separate job with a GitHub build trigger. By adding the GitHub repo (where the parameter file is pushed) to this Jenkins job, you can process the file to get the parameters you want in order to kick off the appropriate Jenkins jobs.
For Jenkins to process the parameters, one option is to use the EnvInject Plugin. (As suggested in this answer.) Another suggestion: the Extended Choice Parameter Plugin (from this answer).
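As a rough illustration, the trigger job could be a small pipeline that checks out the repo, reads the pushed file, and fires the existing job. This is only a sketch: the job name deploy-to-qa, the file name deploy-params.properties, and the parameter names are placeholders for whatever your existing job uses, and readProperties comes from the Pipeline Utility Steps plugin.

    // Hypothetical trigger job (Jenkinsfile); all names below are placeholders.
    node {
        // Check out the repo that the parameter file is pushed to.
        checkout scm

        // Read the pushed file; assumed to be a simple key=value properties file.
        // readProperties requires the Pipeline Utility Steps plugin.
        def props = readProperties file: 'deploy-params.properties'

        // Kick off the existing parameterized job without modifying it.
        build job: 'deploy-to-qa', parameters: [
            string(name: 'NEXUS_PATH',  value: props['NEXUS_PATH']),
            string(name: 'SITE_URL',    value: props['SITE_URL']),
            string(name: 'ENVIRONMENT', value: props['ENVIRONMENT'])
        ]
    }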

Related

Jenkins: is there a simple way to trigger a Jenkins job when a specific file is modified in enterprise GitHub?

Is there a simple way to trigger a Jenkins job when a specific file is modified in enterprise GitHub?
For example:
github: https://example.git.lab.com
repo: testrepo
If a file named base.cfg in the repo is changed, then a freestyle job in Jenkins should be triggered and executed. Thanks.
I checked the Git and GitHub plugins, but didn't find this functionality in either.
In your Jenkins freestyle job configuration page:
Under Source Code Management, select Git.
Specify the repository URL, credentials, and branch.
Next to Additional Behaviours, click Add and then select Polling ignores commits in certain paths.
In the Included Regions textbox, specify the file path relative to the root of the repository. You can specify multiple file paths or regex patterns on separate lines.
Save the changes.
Assuming that you have already configured GitLab webhooks, this job will be triggered only when the specified file(s) have been updated; changes to the other files in the repository will be ignored.
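If the job is a pipeline rather than a freestyle project, the same behaviour can be expressed in the checkout step. Here is a minimal sketch; the repository URL and credentials ID are placeholders, and PathRestriction is the git plugin extension behind the "Polling ignores commits in certain paths" behaviour:

    // Pipeline equivalent of "Polling ignores commits in certain paths".
    checkout([
        $class: 'GitSCM',
        branches: [[name: '*/master']],
        userRemoteConfigs: [[
            url: 'https://example.git.lab.com/testrepo.git',   // placeholder
            credentialsId: 'my-credentials-id'                 // placeholder
        ]],
        extensions: [[
            // Polling only reacts to commits that touch the included regions.
            $class: 'PathRestriction',
            includedRegions: 'base.cfg'
        ]]
    ])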

Pipeline to use artifacts from 2 projects associated by the same git branch name

The company I work for is evaluating Jenkins 2.71, in particular the Pipeline and Blue Ocean plugins. We have also tested GoCD, and, as in GoCD, we need a way for a pipeline to automatically fetch the artifacts from two other pipelines (taking the last successful result of each of them). Here is our case.
We have these initial pipelines (build & run tests), which reflect two projects:
frontend, ~15 minutes
backend, ~10 minutes
I created a pipeline called configure (~1 minute) with, e.g., a parameter called customer-name; it takes the backend and frontend files, puts them together, applies customer-specific configurations and customizations, and produces deployable artifacts. Instead of using customer-name I could also parallelize this job to create the artifacts for all customers at once, separated into different directories.
The next pipeline would deploy them to different test servers, separated per customer. This could also be part of the same configure pipeline; we still have to see how to put things together in Jenkins...
Ideally, I need the configure pipeline to be triggered automatically (or on demand) after each successful frontend or backend build, taking as input the last successful artifacts from those two pipelines. But it is not enough to take just the last successful build of each: the builds have to match on the git branch name, which is our dependency.
E.g. we have:
backend branches:
master
release/2017.2
frontend branches:
master
release/2017.2
In the pipeline editor I found a Build Triggers option and set it as follows: Build after other projects are built > Projects to watch: frontend, backend. I would normally check "Trigger only if build is stable or better", but in my test environment, which is full of failures, I chose "Trigger even if the build is unstable".
Searching further, I found the Copy Artifact Plugin.
But now the big question: how do we fetch the last successful artifacts from these pipelines with the same git branch name?
We don't want to mix, e.g., a backend build of release/2017.2 with a frontend build of master; the configure pipeline has to find the last successful builds that share the same relationship, or parameter, or whatever you want to call it. In our case the association is the git branch name.
Is it possible to achieve this? If yes, how?
The copy artifact plugin seems to work in a freestyle project. Would it work in a pipeline? That's also a concern...
Thanks
Yes, the Copy Artifact plugin does work in both freestyle and pipeline projects; in a pipeline you use the copyArtifacts step that I referenced in my comment. Note that if you go to the Pipeline Syntax link, it's kind of hidden: you have to first select "step: General Build Step" from the drop-down, and then it will give you the Copy Artifact pipeline command builder.
I'm going to assume that your frontend and backend projects are built as multibranch pipelines, as that is probably the easiest to maintain, so that you don't have to keep creating new projects for every release. You can reference these projects from other projects as <project name>/<branch name> (sometimes I've had to replace the / with %2f instead, I think mostly in freestyle projects). You could then set up your configure project as a parameterized build (either pipeline or freestyle), say with a string parameter named PROJECT_BRANCH_NAME. Then put the following in your frontend/backend project pipeline scripts to trigger a build of your configure project:
build job: 'configure', parameters: [[$class: 'StringParameterValue', name: 'PROJECT_BRANCH_NAME', value: env.BRANCH_NAME]]
Then you should just be able to have your configure project reference frontend/%PROJECT_BRANCH_NAME% and backend/%PROJECT_BRANCH_NAME% (or ${env.PROJECT_BRANCH_NAME} in a pipeline script) when copying the artifacts.
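To make that concrete, here is a sketch of what the copy step in the configure project could look like, assuming a string parameter PROJECT_BRANCH_NAME and multibranch projects named frontend and backend (the target directories are placeholders):

    // Sketch of the copy step in the configure pipeline (Jenkinsfile).
    node {
        def branch = env.PROJECT_BRANCH_NAME
        // Branch names containing '/' may need URL-encoding (e.g. release%2F2017.2)
        // depending on the Jenkins and plugin versions.

        // Copy the last successful artifacts built from the same git branch.
        copyArtifacts projectName: "frontend/${branch}",
                      selector: lastSuccessful(),
                      target: 'frontend-artifacts'
        copyArtifacts projectName: "backend/${branch}",
                      selector: lastSuccessful(),
                      target: 'backend-artifacts'

        // ...apply the customer-specific configuration and package the result...
    }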
Also, is there a particular reason why you're evaluating specifically Jenkins 2.7? 2.7 is a year old now, and there have been a few new LTS releases since then. I'd recommend staying reasonably up-to-date unless you know there's a specific reason you want 2.7.

Jenkins Pipeline per branch environment variable configuration

I have several Jenkins Pipeline jobs set up on my Jenkins installation all of them with a Jenkinsfile inside the repository.
These pipelines run for all branches and contain all the steps necessary to build and deploy each branch. However, there are some differences between the branches with regard to building and deploying them, and I would like to be able to configure different environment variables for the different branches.
Is that possible with Jenkins, or do I need to reevaluate my approach or use another CI system?
@rednax's answer works if you're using a branch-per-environment git strategy. But if you're using git-flow (or any strategy where you assume that changes will be propagated up, possibly without human intervention, to master/production) you'll run into headaches where a merge overwrites scripts/variables.
We use a set of folders that match the environment names: infrastructure/Jenkinsfile contains the common steps, and infrastructure/test/Jenkinsfile contains the steps specific to the test environment (the folders also contain Dockerfiles and CloudFormation scripts). You could make that very complex with cascading includes or file merges, or simply keep almost-identical copies of each file in each folder.
When configuring the job, you can tell Jenkins to grab the script (the Jenkinsfile) from the branch on which you are running. This means that, technically, you can adjust the script on each of your branches to set up branch-specific parameters there. Alternatively, you can grab the script from the same source control location but commit a configuration file in each of your branches and have the script read that file after the checkout.
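A minimal sketch of the second option, assuming a key=value file named branch-config.properties committed on each branch (readProperties is from the Pipeline Utility Steps plugin; the keys are placeholders):

    // One shared Jenkinsfile; each branch commits its own branch-config.properties.
    node {
        checkout scm

        // Load the branch-specific settings from the checked-out branch.
        def cfg = readProperties file: 'branch-config.properties'

        // Expose the values as environment variables for the build steps.
        withEnv(["DEPLOY_TARGET=${cfg['DEPLOY_TARGET']}",
                 "BUILD_FLAVOR=${cfg['BUILD_FLAVOR']}"]) {
            sh './build.sh'
        }
    }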

Perform Jenkins build on local files

We have a Jenkins server where I have already defined my job. It uses Perforce as SCM.
I would like to replicate all the steps that Jenkins takes to build the project, but using the files in my local workspace instead. Basically, I would like to run a Jenkins build locally based on a job defined on another server.
How would I do the same?
Something like what I created for my Perforce users might work for you: I added a job in Jenkins that grabs shelved files (so the user needs to shelve their files), creates a build from them, and then lets the user know whether it was successful (they also have the option of running tests or creating a deployable build). The gist of it is to request the shelved changelist number, then run "p4 unshelve -s %SHELVEDCL%" and proceed as usual. They use it when they feel like it, and it's been useful. But it does require access to Jenkins.
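As a rough sketch, such a job could look like the following pipeline, assuming a string parameter SHELVEDCL, a p4 client already configured on the agent, and a placeholder build script:

    // Hypothetical parameterized job for building shelved Perforce changes.
    node {
        // Bring the workspace up to date, then overlay the user's shelved files.
        sh 'p4 sync'
        sh "p4 unshelve -s ${params.SHELVEDCL}"

        // Proceed with the usual build steps (placeholder).
        sh './build.sh'
    }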
1) Install Jenkins on your local workstation (if you have no already done so).
2) Copy the /Jenkins/jobs/ directory from the server to the /Jenkins/jobs/ directory on your local workstation.
3) Fire it up and edit the Perforce workspace (and any other settings) as necessary.
IMO, you should probably take these steps:
1) Create a new Jenkins job from the existing one.
2) Modify the job to be a string "parameterized" job where you pass the branch name as the parameter. You can do this using the "This build is parameterized" option in the job configuration.
3) In the job configuration, under the Source Code Management section, change the Branch Specifier to use the string parameter variable name (created in #2 above).
4) Create your feature branch in Perforce and make the intended changes there.
5) Run the newly created job with your branch as the parameter.
Hope this helps.

How to get the URL of a pipeline job in Jenkins

We are setting up a continuous delivery pipeline in Jenkins, using the build pipeline plugin.
Our deployment step uses a proprietary deploy tool (triggered by an HTTP request from Jenkins), but we need an additional Jenkins step for acceptance tests on the deployed project. So our deploy tool will need to trigger the last pipeline step.
The Jenkins setup for this is straightforward:
For a manually triggered downstream build step, to add a build step that will wait for a manual trigger:
Select the Build Pipeline Plugin's "Manually Execute Downstream Project" check-box.
Enter the name(s) of the downstream projects in the Downstream Project Names field. (N.B. multiple projects can be specified, separated by commas, like "abc, def".)
Source: Build Pipeline Plugin
The problem is: I can't seem to find a way to trigger this downstream build through a URL.
In fact I'd need the URL in the deploy job, so I can send it to the deploy tool as a callback URL. Can anybody help?
If I understand correctly, you want to use the remote access API, which to my knowledge is no different between a general project and a pipeline one.
Take a look here:
https://wiki.jenkins-ci.org/display/JENKINS/Remote+access+API
Submitting jobs
Jobs without parameters
You merely need to perform an HTTP POST on JENKINS_URL/job/JOBNAME/build?token=TOKEN where TOKEN is set up in the job configuration.
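So, in the deploy job, you can compute that trigger URL for the downstream job and hand it to the deploy tool as the callback. A minimal sketch, where the job name acceptance-tests, the token, and the deploy tool's endpoint are all placeholders:

    // Pass the downstream job's remote-trigger URL to the deploy tool as a callback.
    node {
        // env.JENKINS_URL normally ends with a trailing slash.
        def callbackUrl = "${env.JENKINS_URL}job/acceptance-tests/build?token=SECRET"

        // The deploy tool is expected to POST to callbackUrl once the deployment
        // finishes; this HTTP call is purely illustrative.
        sh "curl -X POST 'https://deploy.example.com/api/deploy' -d callback=${callbackUrl}"
    }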
As stated above by @Rafal S:
Read a file that lists the project names for which the build job has to be triggered, then do a curl HTTP POST to JENKINS_URL/job/${JOBNAME from the file}/build?token=TOKEN inside a for loop that iterates over all the project names read from the file.
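A minimal sketch of that loop, assuming a file named projects.txt with one job name per line and a shared trigger token (both placeholders):

    // Trigger one job per line of the file via the build-with-token URL.
    node {
        def projects = readFile('projects.txt').readLines()

        for (name in projects) {
            sh "curl -X POST '${env.JENKINS_URL}job/${name}/build?token=TOKEN'"
        }
    }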
