I have two jobs, "Helper" and "Main", and a single Jenkins instance (which is both the host and the executor).
The Helper job manages a third-party resource and prepares the ground for the Main job (to be precise, it creates the environment into which the application will be deployed for testing).
The only artifact of the Helper job is a single file containing the IP of the environment prepared specifically for the Main job.
How would I pass this file back from the Helper to the Main job in this case?
You are saying that you only need to pass a file with an IP to the "Main" job. If all you need is that IP, there are easier ways of doing it (without files), so I will describe both approaches.
To pass an artifact from one job to another
In the "Helper" job, you need to archive that file from your workspace.
In post-build actions, choose Archive the artifacts
Put a path relative to the workspace. You can use wildcards, or hardcode the name of the file if it is always the same.
Configure this job to automatically trigger your "Main" job using the Trigger/Call builds on other projects build step. If you don't have this step, install the Parameterized Trigger plugin.
For Projects to build, enter the name of your "Main" job
Now, in the "Main" job, you need to copy this artifact from the previous ("Helper") job.
For the first build step, select the Copy artifacts from another project build step. If you don't have this step, install the Copy Artifact plugin.
For the Project name, enter the name of your "Helper" job
For Which build, select Latest successful build
For Artifacts to copy, use **/yourartifactname*.*, where the artifact name is what you configured in the "Helper" job. The leading **/ makes sure any directory structure in front of the artifact is ignored.
For Target directory, specify a location in your "Main" job's workspace where this file will be copied to.
Checkmark Flatten directories so the file goes directly to the location specified in the previous step; otherwise it will retain the directory structure it was archived under (in the "Helper" job).
Now your "Main" job has the file from the "Helper" job in its workspace. Use it like you would any other file in your workspace.
To pass a variable from one job to another
As I mentioned, if all you need is that one IP address, which you have as a variable at some point in time in the "Helper" job, you can just send it along using the same Trigger/Call builds on other projects step that you configured in the "Helper" job above. In this case, you don't need any special configuration on the "Main" job.
Configure "Helper" job to automatically trigger your "Main" job using Trigger/Call builds on other projects build step. If you don't have this plugin, you can get it here
For Projects to build, enter the name of your "Main" job
Click Add Parameters button
Select Predefined parameters
Type VarForMain=$VarFromHelper, where VarFromHelper is your environment variable from the "Helper" job that contains your IP address, and VarForMain is the environment variable that will be set in your "Main" job to this value. There is no reason why these can't have the same name.
Now, in your "Main" job, you can reference $VarForMain as you would any other environment variable
The accepted answer wasn't helpful in my case, but I've just come up with a trick:
Create a main job with a shell command of
echo "PARAMS_FILE=${WORKSPACE}/build-${BUILD_NUMBER}.params" > "${WORKSPACE}/build-${BUILD_NUMBER}.params"
Create sub-jobs by adding them to the build steps (not post-build steps)
Pass the file as a parameters source to the sub-builds, and have the builds update the file with a line like this in their scripts:
echo "MY_VAR=some_value" >> "$PARAMS_FILE"
That way, every subsequent job has its environment updated with the results of its predecessors.
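A minimal sketch of how a later build step might consume the accumulated file (using the variable names from the example above):
# Each sub-build appended KEY=VALUE lines to the file named in $PARAMS_FILE;
# source it to pull their results into the current shell
. "$PARAMS_FILE"
echo "MY_VAR from a sub-build: ${MY_VAR}"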
So this is my entire use case:
I have a parameterized build job which accepts file parameters. After the build I need to send a mail with that file and the size of the file. For this, I'm trying to add the name and size of the file as environment variables using the EnvInject plugin.
But EnvInject runs in the Build Environment phase, while the file parameter only gets stored in the build's workspace during the Build phase, not in the Build Environment phase. So there will be an error like "File not found".
Because of this, I'm trying a crooked way of defining a properties file somewhere on my local system. I'm mentioning this properties file in the "Properties File Path" of Inject environment variables. In the build step I'm adding FOO=BAR and other values to the file, so that I can use those values as environment variables down the line, for example when I configure my e-mail template in Post Build Actions.
Can this process be done more easily? I was initially creating the properties file in JENKINS_HOME. I just got to know that I'm not allowed to do that, because in a master-agent architecture JENKINS_HOME will be different and the build will fail.
PS: 1. The workspace gets deleted after every build.
2. Are there any other plugins which can be used? If possible, please suggest a way without installing a new plugin, as I'm not the Jenkins admin.
One way of solving this problem is by creating 3 different jobs as follows:
Job 1 --> should call the two jobs below (Job 2 & Job 3):
To trigger Job 2:
Build --> Trigger/call builds on other projects --> Job 2 (block until the triggered projects finish their builds)
Select "Build on the same node" from Add Parameters.
To trigger Job 3:
Build --> Trigger/call builds on other projects --> Job 3 (block until the triggered projects finish their builds)
Select "Build on the same node" from Add Parameters.
Job 2 (Create this job as follows):
Execution API --> using the "GET_FILE" option you can download the required details into the current working directory of your job.
Execute shell -->
Within "Execute Shell", download the consoleText using the wget command.
Process the consoleText with Unix commands to prepare key-value pairs, and store them under the /tmp folder, e.g. in "/tmp/env.prop". For example:
Job 3 (Create this job as follows):
Bindings:
Select "Inject environment variables to the build process" and under 'Properties File Path' enter "/tmp/env.prop".
Now you can use the variables created in Job 2 in the current job without any issue. For instance:
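An Execute Shell step in Job 3 could reference a key written by Job 2 like this (BUILD_IP is a hypothetical key name):
# BUILD_IP is one of the KEY=VALUE pairs Job 2 wrote into /tmp/env.prop
echo "Value prepared by Job 2: ${BUILD_IP}"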
Please note that it is important to select "Build on the same node" in Job 1, because this preserves the data and allows the other jobs to access it.
Let me know if it's not clear.
We are trying to define a set of jobs on Jenkins that will do really specific actions. JobA1 will build a Maven project, while JobA2 will build .NET code, JobB will upload it to Artifactory, JobC will download it from Artifactory and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and I always get some output; whatever happens in between is something I don't care about. On the other side, this allows us to improve each job separately, adding the required complexity, and instantly all products benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package via JobC and then deploy to staging.
PipelineC will download the package via JobC and then deploy to production based on some internal validations.
I have tried to get some variables from JobA1 (basic POM values such as ArtifactID or Version) injected into JobB, but the information does not seem to be transferred.
The same happens while downloading files: I call JobC, but the file stays in that job's workspace, unavailable to any other job, and I'm afraid the "External Workspace Manager" plugin adds too much complexity.
Is there any way other than sharing the workspace to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
// Build the host job, then stash the file it produced
stage ('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
// Unstash the file into the target job's workspace, then build it
stage ('TargetJob') {
    dir("/var/lib/jenkins/jobs/TargetJob/workspace/") {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This pipeline feature is preferable to archiving artifacts, as a stash only keeps the data locally and is deleted when the build finishes (which helps with data management).
You can also use the Copy Artifact plugin.
There are two things to consider for copying an artifact:
a) Archive the artifacts in the host project and assign permissions:
In the host project's configuration, enable 'Permission to copy artifact' and under Projects to allow copy artifacts enter the target project (or *).
Create a Post-build Action → Archive the artifacts → Files to archive: "select your files".
b) Copy the required artifacts from the host to the target project:
Create a Build step → Copy artifacts from another project → enter the host project's name under Project name, Which build (e.g. 'Latest successful build'), Artifacts to copy (the files or pattern archived by the host project), Target directory (a local folder location).
For the first part of your question (passing variables between jobs), please use the snippet below as a post-build section:
post {
always {
build job:'/Folder/JobB',parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
}
}
The above post-build action runs for all build results. Similarly, the post-build action could be triggered only on a specific build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by 'JobB' (provide the exact location of the job). Please note that a parameter of the same name should be defined in JobB.
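On the JobB side, consuming the value is then just a matter of referencing the parameter; a minimal sketch of a shell step in JobB (which must define the BRANCH parameter, as noted above):
# BRANCH was supplied by JobA's post-build trigger
echo "JobB is building branch: ${BRANCH}"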
Moreover, for sharing the workspace you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries shared by multiple pipelines and to define shared global variables.
On a Multijob project I have two phases:
PhaseA running Build_job1, with the project name Build_job1, pulling stuff from git to the directory /var/lib/jenkins/workspace/Build_job1
PhaseB running Deploy_job2, which rsyncs /var/lib/jenkins/workspace/Build_job1/* to a bunch of servers.
For internal reasons I need to replicate the multijob, the build job and the deploy job across different environments (PROD, QA, Staging). In each case, the deploy job's rsync will need to copy files from a different build directory (Build_QA, Build_Prod, Build_whatever, etc.).
As Jenkins creates the directory per project name, I need the rsync command in the deploy job to receive the project name as a parameter passed down from the build job.
help?
Are you wanting to pass the current job's project name down to its children? If so, you can pass this information via the Jenkins-set environment variable "JOB_NAME" in conjunction with a predefined job parameter. For example, something like:
Param1=${JOB_NAME}
If the Multijob's name is "QA", you can pass that down to both the build and deploy phase jobs via a predefined parameter, and then construct the final "Build_QA" path by doing something like "Build_${Param1}" or "Build_%Param1%".
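A minimal sketch of the deploy job's rsync step under that scheme (the destination host and path are placeholders):
# Param1 carries the parent Multijob's name, e.g. "QA"
SRC="/var/lib/jenkins/workspace/Build_${Param1}/"
rsync -av "${SRC}" deploy@target-server:/srv/app/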
I am using the Promoted Builds plugin in Jenkins.
I need to take the approver information from the user in Jenkins and grant them the approval rights.
Here is what I am trying to do:
Is it feasible?
I don't think you can use variables there. However, you could skip that condition and instead have an Execute Shell build step, and there check the variable $PROMOTED_USER_NAME. Parse the name, and make your decision based on that.
Parent parameters don't automatically get passed down to promoted builds. However, you can export them to a file, archive the file (it is important to archive it, as opposed to keeping it in the workspace), bring the file over in the promotion step, and either load it into environment variables with the EnvInject plugin or simply use the file as-is in a script.
On Parent Job
Configure parameter approverid
Have an Execute Shell build step with the following:
echo approverid=$approverid > myfile
At the end, make sure to Archive myfile
On Promotion Configuration
Skip the approval criteria
Add Copy Artifacts from another project step
For Project Name, use $PROMOTED_JOB_NAME
For Which build, use Specific Build, then provide $PROMOTED_NUMBER
For Artifacts to copy, use myfile
Add Inject Environment Variables build step
For Properties File Path, enter myfile
Add Execute Shell build step
In that shell, compare the values of $approverid and $PROMOTED_USER_NAME
If they match, continue; else abort/exit the promotion, as in the sketch below.
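A minimal sketch of that comparison (assuming myfile set approverid as configured on the parent job):
# approverid comes from myfile via the EnvInject step above;
# PROMOTED_USER_NAME is provided by the promotion process
if [ "${approverid}" = "${PROMOTED_USER_NAME}" ]; then
    echo "Promotion approved by designated approver ${PROMOTED_USER_NAME}"
else
    echo "${PROMOTED_USER_NAME} is not the designated approver (${approverid})" >&2
    exit 1
fi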
Of course, the history of execution (and any abort) will still be recorded.
Is it possible to have a Jenkins job which has been triggered by the Join plugin copy artifacts from multiple upstream jobs?
I'm trying to set up a Jenkins configuration with a "diamond" of jobs: my-trigger runs and spawns two jobs, my-fork1 and my-fork2, that can run concurrently and take varying amounts of time, and the Join plugin sets off the job my-join once both forks have completed.
Each of my-trigger, my-fork1 and my-fork2 creates and fingerprints artifacts (say, text files).
I want to copy the artifacts from each of the upstream jobs in my-join using the "Copy artifacts from another project" tool, with the "Which build" parameter set to "Upstream build that triggered this job". However, I see output like this in the console of my-join:
Building remotely on build-machine in workspace /path/to/workspace/my-join
Copied 1 artifact from "my-trigger" build number 63
Copied 1 artifact from "my-fork1" build number 63
Unable to find a build for artifact copy from: my-fork2
and the job fails. In this case, my-fork2 finished first, so my-fork1 triggered the join step. I believe that means my-join only has a record of my-trigger and my-fork1 as being upstream. If my-fork1 finishes first, then my-fork2 kicks off the join, and the job fails when trying to copy from my-fork1.
If I change the configuration to copy the artifact from the "Latest successful build", then the build succeeds, but my-trigger may run many times in succession, so there would be no guarantee that my-join is joining related artifacts.
How can I get the join step to copy artifacts from multiple forks upstream?
Note: the second point of this question seems to be asking the same thing, but the only answer there doesn't address it, despite having been accepted.
Thanks,
tensorproduct
If your builds are parameterized with a unique parameter for each run of the join-diamond, you can use that parameter in the CopyArtifact plugin to determine which build to copy from. You would want to specify "Latest successful build" and qualify it with the parameter and value.
We have a similar situation where I work; multiple simultaneous runs of a join-diamond. The parameter in the build allows the downstream jobs to get the correct artifacts from the upstream jobs.
Step-by-step settings for the solution provided by Jason Swager:
Project dependencies:
diamond->fork->diamond_ready
Project "fork":
String parameter "UNIQUE_ID" (only dummy not used inside)
(Creates an artifcat and Archive the artifacts)
Project "diamond_ready"
String parameter: UNIQUE_ID
Copy artifacts from another project
Project name: fork
Parameter filters: UNIQUE_ID=${UNIQUE_ID}
Project "diamond":
Trigger parameterized build on other project
Projects to build: fork
Predefinded parameters: UNIQUE_ID=${BUILD_TAG}
Join Trigger:
Post-Join Actions:
Trigger parameterized build on other projects
Projects to build: diamond_ready
Predefined Generator parameters: UNIQUE_ID=${BUILD_TAG}