On a MultiJob project I have two phases:
PhaseA runs Build_job1 (project name Build_job1), which pulls sources from git into /var/lib/jenkins/workspace/Build_job1.
PhaseB runs Deploy_job2, which rsyncs /var/lib/jenkins/workspace/Build_job1/* to a bunch of servers.
For internal reasons I need to replicate the MultiJob, the build job and the deploy job across different environments (PROD, QA, Staging). In each case, the deploy job's rsync will need to copy files from a different build directory (Build_QA, Build_Prod, Build_whatever, etc.).
As Jenkins creates the dir per project name, I need the rsync command in the deploy job to get the project name as a parameter that is passed down from the build job.
help?
Are you trying to pass the current job's project name down to its children? If so, Jenkins already exposes the job name in the JOB_NAME environment variable, and you can pass it along via a predefined job parameter. For example, something like:
Param1=${JOB_NAME}
If the MultiJob job name is "QA", you can pass that down to both the build and deploy phase jobs via a predefined parameter and then construct the final "Build_QA" path with something like "Build_${Param1}" (shell) or "Build_%Param1%" (Windows batch).
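For example, a minimal sketch of the deploy job's rsync step, assuming the predefined parameter is named Param1 and the server list is purely illustrative:

#!/bin/bash
# Param1 is the predefined parameter passed down from the MultiJob (e.g. "QA"),
# so each environment's deploy job picks up the matching build workspace.
SRC_DIR="/var/lib/jenkins/workspace/Build_${Param1}"

# Illustrative target hosts; replace with your real server list.
for host in server1.example.com server2.example.com; do
    rsync -avz "${SRC_DIR}/" "deploy@${host}:/opt/app/"
done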
So this is my entire use case:
I have a parameterized build job which accepts file parameters. After the build I need to send a mail with that file and the size of the file. For this, I'm trying to add the name and size of the file as environment variables using the EnvInject plugin.
But EnvInject runs in the Build Environment section, and the file parameter only gets stored in the build's workspace during the Build step, not during Build Environment. So I get an error like "File not found".
Because of that, I'm trying a roundabout way of defining a properties file somewhere on my local system. I'm pointing "Properties File Path" of Inject environment variables at this properties file. In the build step I add FOO=BAR and other values to the file so that I can use those values as environment variables further down the line, for example when I configure my e-mail template in Post-build Actions.
Can this process be done more easily? I was initially creating the properties file in JENKINS_HOME. I just learned that I'm not allowed to do that because, in a master-agent architecture, JENKINS_HOME will be different and the build will fail.
P.S. 1. The workspace gets deleted after every build.
2. Are there any other plugins that can be used? If possible, please suggest something that doesn't require installing a new plugin, as I'm not a Jenkins admin.
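For reference, a rough sketch of the properties-file workaround described above, as a single shell build step (the parameter name, file name, and property names are illustrative):

#!/bin/bash
# By the time the Build step runs, the file parameter is in the workspace,
# so its name and size can be captured.
FILE_PARAM="RELEASE_FILE"                       # illustrative file parameter name
FILE_SIZE=$(stat -c%s "${WORKSPACE}/${FILE_PARAM}")

# Write key=value pairs into a properties file that a later
# "Inject environment variables" step (or the e-mail template) can read.
{
  echo "ATTACHMENT_NAME=${FILE_PARAM}"
  echo "ATTACHMENT_SIZE=${FILE_SIZE}"
} > "${WORKSPACE}/build.properties"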
One way of solving this problem is by creating 3 different jobs, as follows:
Job 1 --> should call the two jobs below (Job 2 & Job 3):
Trigger for Job 2:
Build --> Trigger/call builds on other projects --> Job 2 (block until the triggered projects finish their builds);
select "Build on the same node" from Add Parameters.
Trigger for Job 3:
Build --> Trigger/call builds on other projects --> Job 3 (block until the triggered projects finish their builds);
select "Build on the same node" from Add Parameters.
Job 2 (create this job as follows):
Execution API --> using the "GET_FILE" option you can download the required details into the current working directory of your job.
Execute shell -->
Within "Execute Shell", download the build's "consoleText" using wget.
Process the "consoleText" with standard Unix commands to prepare key-value pairs and store them under the /tmp folder, e.g. "/tmp/env.prop" (see the sketch after this list).
Job 3 (create this job as follows):
Bindings:
Select "Inject environment variables to the build process" and, under "Properties File Path", enter "/tmp/env.prop".
Now you can use the variables you created in Job 2 in the current job without any issue.
Please note that it is important to select "Build on the same node" in Job 1, because this keeps the data on one node and allows the other jobs to access it.
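A rough sketch of the Job 2 "Execute shell" step described above; the upstream job name and the KEY=VALUE pattern are assumptions that depend on what your jobs actually print:

#!/bin/bash
# Download the console log of the upstream build
# (JENKINS_URL is set by Jenkins; the job name here is illustrative).
wget -q -O consoleText "${JENKINS_URL}job/UpstreamJob/lastSuccessfulBuild/consoleText"

# Keep only lines that look like KEY=VALUE pairs and store them where
# Job 3's "Inject environment variables" step will look for them.
grep -E '^[A-Za-z_][A-Za-z0-9_]*=' consoleText > /tmp/env.prop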
Let me know if it's not clear.
We are trying to define a set of jobs on Jenkins that will do really specific actions. JobA1 will build a Maven project, while JobA2 will build .NET code, JobB will upload the result to Artifactory, JobC will download it from Artifactory, and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and I always get some output; whatever happens in between is something I don't care about. On the other hand, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package via JobC and then deploy to staging.
PipelineC will download the package via JobC and then deploy to production, based on some internal validations.
I have tried to get some variables from JobA1 (basic POM stuff such as artifactId or version) injected into JobB, but the information does not seem to be transferred.
The same happens when downloading files: I call JobC, but the file ends up in that job's workspace, not available to any other job, and I'm afraid the "External Workspace Manager" plugin adds too much complexity.
Is there any way, other than sharing the workspace, to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage('HostJob') {
    // Run the upstream job, then stash its output from its workspace.
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
stage('TargetJob') {
    // Unstash the files into the downstream job's workspace before building it.
    dir('/var/lib/jenkins/jobs/TargetJob/workspace/') {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This feature of the Pipeline plugin is better than archiving artifacts, as it only saves the data locally, and the stash is deleted after the build (which helps with data management).
You can also use the Copy Artifact Plugin.
There are two things to consider for copying an artifact: archiving it in the host project (with permissions) and copying it in the target project.
a) Archive the artifacts in the host project and assign permissions.
b) After creating the new job, under "Permission to copy artifact", set Projects to allow copy artifacts: *
c) Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
d) Copy the required artifacts from the host to the target project.
In the target project, create a build step → Copy artifacts from another project → Project name: the host project, Which build: e.g. "Latest successful build", Artifacts to copy: the path archived in the host project, Target directory: the desired location in the target project's workspace.
For the first part of your question (passing variables between jobs), you can use the snippet below as a post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results. Similarly, the post-build action could be triggered only on a particular build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by 'JobB' (provide the exact location of the job). Please note that a parameter with the same name must be defined in JobB.
Moreover, for sharing the workspace you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share and to define shared global variables.
I have a Jenkins job that runs some tests and promotes the build if all tests pass.
I then have a second job that has a 'Promoted build parameter', which is used for deployments.
The idea is that the deploy job should let you pick one of the promoted builds for deployment. The issue I'm having is that I can pick a promoted build, but I have no idea how to access information about that build.
I've named the build parameter
SELECTED_BUILD
And according to the docs this should then be available via the environment.
The problem is that it doesn't seem to be bound to anything.
If I run a build step to execute this shell script:
echo $SELECTED_BUILD
echo ${SELECTED_BUILD}
The values are not interpolated / set.
Any idea how I can access the values of this param?
Many Thanks,
Vackar
Inject the information as a variable using the Jenkins EnvInject plugin (inject environment variables).
Then, while triggering the parameterized build on the other job, pass that variable as a parameter to the next job.
Parameterized variables are not getting updated in Jenkins
I'm using the Conditional BuildStep plugin to update a Jenkins job parameter by executing a shell script. The script shows me the new value of the variable, but the change is not reflected in the job.
You can try the EnvInject Plugin. One of the features is a build step that allows you to "inject" parameters into the build job from a settings file.
Create a property for the email list in an env.properties file from a shell build step:
echo "variable=value" > env.properties
This will create the properties file in the job's workspace directory. Point the "Properties File Path" of the Inject environment variables step at env.properties.
You can then reference the value in a shell script as "$variable".
If I understand correctly, you are trying to change the value of a pre-defined parameter from within a script that is run by the job.
This will not work because of "scope" (or the "call stack"): a process (your script) cannot change the environment of its parent process (your Jenkins job).
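A quick shell illustration of the same limitation outside Jenkins: whatever the child process exports disappears when it exits:

#!/bin/bash
# The parent (think: the Jenkins job) sets a value.
export MY_PARAM=original

# The child (think: your script) tries to change it.
bash -c 'export MY_PARAM=changed; echo "inside child: $MY_PARAM"'

# Back in the parent, the value is untouched.
echo "back in parent: $MY_PARAM"    # prints "original"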
I have 2 jobs, "Helper" and "Main", and a single Jenkins instance (which is both the host and the executor).
The Helper job manages a 3rd-party resource and does the preparation for the Main job (to be precise, it creates the environment into which the application will be deployed for testing).
The only artifact of the Helper job is a single file with the IP of the environment prepared especially for the Main job.
How would I pass that back from the Helper job to the Main job in this case?
You are saying that you only need to pass a file with an IP to the "Main" job. If all you need is that IP, there are easier ways of doing it (without files), I will describe both.
To pass an artifact from one job to another
In the "Helper" job, you need to archive that file from your workspace.
In post-build actions, choose Archive the artifacts
Put a path relative to the workspace. You can use wildcards, or hardcode the name of the file if it is always the same.
Configure this job to automatically trigger your "Main" job using Trigger/Call builds on other projects build step. If you don't have this plugin, you can get it here
For Projects to build, enter the name of your "Main" job
Now, in the "Main" job, you need to copy this artifact from the previous ("Helper") job.
For the first build step, select Copy artifacts from another project build step. If you don't have this plugin, you can get it here
For the Project name, enter the name of your "Helper" job
For Which build, select Latest successful build
For Artifacts to copy, use **/yourartifactname*.* Your artifact name will be what you configured in "Helper" job. Using **/ in front makes sure it will ignore any directory structure before getting to the artifact
For Target directory, specify a location in your "Main" job's workspace where this file will be copied to.
Checkmark Flatten directories, so the file goes directly to the location specified in Step 5; otherwise it will retain the directory structure that it was archived under (in the "Helper" job).
Now, your "Main" job has the file from the "Helper" job in its workspace. Use it like you would any other file in your workspace.
To pass a variable from one job to another
Like I mentioned, if all you need is that one IP address, that you have as a variable at one point in time in "Helper" job, you just send it to "Main" job using the Trigger/Call builds on other projects step that you configured in steps 3 and 4 of the "Helper" job. In this case, you don't need any special configuration on "Main" job.
Configure "Helper" job to automatically trigger your "Main" job using Trigger/Call builds on other projects build step. If you don't have this plugin, you can get it here
For Projects to build, enter the name of your "Main" job
Click Add Parameters button
Select Predefined parameters
Type VarForMain=$VarFromHelper, where VarFromHelper is your environment variable from the "Helper" job that contains your IP address, and VarForMain is the environment variable that will be set in your "Main" job to this value. There is no reason why these can't have the same name.
Now, in your "Main" job, you can reference $VarForMain as you would any other environment variable
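For example, a shell build step in the "Main" job could use the injected value like this (the health check is just an illustration):

#!/bin/bash
# VarForMain was set via the predefined parameter VarForMain=$VarFromHelper
# in the "Helper" job's trigger step.
echo "Deploying to the test environment at ${VarForMain}"

# Illustrative check that the prepared environment is reachable.
ping -c 3 "${VarForMain}"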
The accepted answer wasn't helpful in my case, but I've just come up with a trick:
Create a main job with a shell command of
echo "PARAMS_FILE=${WORKSPACE}/build-${BUILD_NUMBER}.params" > "${WORKSPACE}/build-${BUILD_NUMBER}.params"
Create sub-jobs by adding them to the build steps (not post-build steps).
Pass the file as a parameters source to the sub-builds, and have the builds update the file with a line in their scripts like:
echo "MY_VAR=some_value" >> "$PARAMS_FILE"
That way, all subsequent jobs have their environment updated with the results of their predecessors.
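A sketch of how a later sub-job (or a final step of the main job) can consume the accumulated values, assuming everything runs on the same node so the file path stays valid:

#!/bin/bash
# PARAMS_FILE was handed down as the parameters source by the main job.
# Source it to pull in everything the earlier sub-jobs appended.
. "$PARAMS_FILE"

echo "Value produced by an earlier sub-job: ${MY_VAR}"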