I want to store build result variables for another job. My plan is to store the variables in a file, let's call it envfile, and read it using def props = readProperties file: '/jenkins/home/envfile'. This works, but the issue is: I have a master node and several different Jenkins worker nodes, and the variable file is stored on a worker node. The worker node can be different at different times, so the job fails because it cannot find the file. How can I solve this?
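A common way around this, sketched below (not part of the original question; the job name envfile-producer and the variable are assumptions, and the Copy Artifact and Pipeline Utility Steps plugins are required), is to archive the file on the Jenkins master as a build artifact and copy it back in the consuming job, instead of reading a node-local path:

// Producing job: write the file in the workspace and archive it,
// so it is stored on the master rather than on whichever worker ran the build.
node {
    sh 'echo "BUILD_RESULT=success" > envfile'
    archiveArtifacts artifacts: 'envfile'
}

// Consuming job: copy the artifact into this build's workspace;
// it no longer matters which worker node this build lands on.
node {
    copyArtifacts projectName: 'envfile-producer', selector: lastSuccessful(),
                  filter: 'envfile', flatten: true
    def props = readProperties file: 'envfile'
    echo "BUILD_RESULT is ${props.BUILD_RESULT}"
}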
In Jenkins you can set a job to build periodically using a cron-like time definition. e.g.
# switch timezone
TZ=Etc/GMT+6
# build once anywhere between 13:00 - 23:59
H H(13-23) * * *
Furthermore, you can configure parameterized jobs. That means you can set the value of predefined environment variables at build time, and those values are used in the configuration. For example, for a certain Git branch you can let the user set the value of the environment variable "BRANCH" and then access this value using
${BRANCH}
in the configuration.
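For example (an illustration, not from the original post), if BRANCH is a build parameter, the Git Branch Specifier field in the job configuration can be set to:
*/${BRANCH}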
But this seems not to work with the Schedule value of a periodically built job configuration.
My problem:
I'm using the Job-Generator Plugin. It basically creates new (not parameterized but periodically running) jobs, using its own configuration as a template.
In order to generate different jobs for different repositories, you use the parameterized build as described before, so in the generated job configuration the variable names ${...} are replaced by their values.
Now I don't want the Generator to run periodically but, of course, only on demand. Therefore I want to replace the cron rule mentioned before with a variable, so that the Generator itself isn't built overnight.
I tried to set CRON1 (TZ=Etc/GMT+6) and CRON2 (H H(13-23) * * *) as unchangeable Generator variables and use
Schedule
${CRON1}
${CRON2}
but this makes Jenkins throw an error when trying to save the generator config:
line 1:1: unexpected char: '$'
How can I set the schedule value using an environment variable?
(I'm not asking the same thing as "Jenkins scheduled build Triggers with environment variable?". I'm already using that, but it doesn't solve my problem of the Job-Generator itself running periodically when it shouldn't.)
You can't access variables in that block. Instead, you can use a plugin to schedule the jobs; the parameterized-scheduler-plugin can be helpful in your case. It also integrates well with pipeline scripting. Hope this helps.
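For illustration, a sketch of the plugin's parameterizedCron trigger in a declarative pipeline (the schedule and the BRANCH values are assumptions):

pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch to build')
    }
    triggers {
        // cron spec, then '%' followed by the parameter values for that run
        parameterizedCron('''
            H H(13-23) * * * %BRANCH=develop
        ''')
    }
    stages {
        stage('Build') {
            steps {
                echo "Building branch ${params.BRANCH}"
            }
        }
    }
}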
I have a Jenkins job with SCM from Bitbucket, two shell scripts, and a post-build action publishing the result to Slack.
Naively, I want to pass a variable computed in the first shell script to the second, add some information to that variable in the second shell script, and then append that variable to the Slack custom message.
I was expecting this to be a built-in feature, and have now spent a few days on and off at it. I've tried the EnvInject, Environment Inject, and Global Variable String Parameter plugins, but no configuration I've tried has worked.
In some cases I got this error:
21:01:08 [EnvInject] - [ERROR] - The given properties file path 'build.properties' doesn't exist.
I know this file does not exist. I expected the plugin to create it, so I could add new content to it in the first shell script and have it loaded in every other step of the job.
Am I missing something or misusing these plugins?
As I've seen happen all too often, after asking the question I was able to solve it, like this:
First we create a shell script to create the file; I've already added a value (the whole sequence is sketched after these steps):
Then we tell Jenkins to inject the variables from the build.properties file:
Then we change the value of the variable in the file:
Then AGAIN we tell Jenkins to inject the variables from the same file:
Then we can observe the value changes in the next shell:
Also in the post-build action.
And success!
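A sketch of the whole sequence (the variable name and values here are made up for illustration):

# Build step 1, Execute shell: create the properties file with an initial value
echo "MY_VAR=initial" > build.properties
# Build step 2, Inject environment variables: Properties File Path = build.properties
# Build step 3, Execute shell: change the value in the file
echo "MY_VAR=changed" > build.properties
# Build step 4, Inject environment variables: Properties File Path = build.properties
# Build step 5, Execute shell: the injected variable now carries the new value
echo "$MY_VAR"    # prints: changed
# The post-build Slack custom message can reference $MY_VAR the same way.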
On a multijob project I have two phases:
PhaseA running Build_job1, with a project name Build_job1, pulling stuff from git to dir: /var/lib/jenkins/workspace/Build_job1
PhaseB running Deploy_job2, which rsyncs /var/lib/jenkins/workspace/Build_job1/* to a bunch of servers.
For internal reasons I need to replicate the multijob, the build job, and the deploy job to different environments (PROD, QA, Staging). In each case, the deploy job's rsync will need to copy files from a different build directory (Build_QA, Build_Prod, Build_whatever, etc.).
As Jenkins creates the dir per project name, I need the rsync command in the deploy job to get the project name as a parameter that is passed down from the build job.
help?
Are you wanting to pass the current job's project name down to its children? If so, you can pass this information down via the Jenkins build environment variable "JOB_NAME" in conjunction with a predefined job parameter. For example, something like:
Param1=${JOB_NAME}
If the Multijob job name is "QA", you can pass that down to both the build and deploy phase jobs via a predefined parameter and then construct the final "Build_QA" path by doing something like "Build_${Param1}" or "Build_%Param1%".
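For instance (an illustrative sketch; the target host and paths are assumptions), the deploy job's rsync shell step could then be:

rsync -av "/var/lib/jenkins/workspace/Build_${Param1}/" deployuser@target-server:/opt/app/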
I want to run a Jenkins job on 4 different slaves (Windows, Linux, Solaris, Mac). Instead of making 4 different jobs, I want to have a single job. I can use a Node parameter to execute on different slaves. My job runs a script which uses the slave's Jenkins workspace and a few other scripts. The script is in a different folder on each slave, and the other required scripts are in different folders as well. So for now I have created 4 different jobs, one per slave, with the Jenkins workspace and required script paths hard-coded.
Is there any way to put all the paths in some JSON-like structure and have the job pick the right paths depending on the slave, so that I have only one job?
Please suggest. Thanks in advance!
My idea is to use e.g. the "Execute system Groovy script" build step to get the slave name, then use if statements to assign the proper path and create a parameter visible in the Environment Variables:
import hudson.model.Computer
import hudson.model.StringParameterValue
import hudson.model.ParametersAction

// get the slave name
def slaveName = Computer.currentComputer().getNode().name

// choose the path for this slave
def path
if (slaveName.equals("slave01")) {
    path = "C:"
} else if (slaveName.equals("slave02")) {
    path = "/root"
} else if (slaveName.equals("slave03")) {
    path = "D:"
}

// pass the path to the rest of the build as an environment variable
build.addAction(new ParametersAction(new StringParameterValue('path', path)))
Then you can use the path variable in a command (Windows batch syntax shown):
echo %path%
Or use the Conditional BuildStep Plugin to set up separate steps for each operating system and control when each step should be executed.
Jenkins is designed to check out files from a version control system (Subversion, Git, whatever) and run tasks. Instead of trying to manage separate files on separate slaves, you should put your scripts in some form of version control and let Jenkins check out the files in the workspace as part of its build process.
I have 2 jobs, "Helper" and "Main", and a single Jenkins instance (which is both the host and the executor).
The Helper manages a third-party resource and does the preparation for the Main job (to be precise, it creates the environment into which the application is deployed for testing).
The only artifact of the Helper job is a single file with the IP of the environment prepared specifically for the Main job.
How would I pass the build results back from the Helper to the Main job in this case?
You are saying that you only need to pass a file with an IP to the "Main" job. If all you need is that IP, there are easier ways of doing it (without files). I will describe both approaches.
To pass an artifact from one job to another
In the "Helper" job, you need to archive that file from your workspace.
In post-build actions, choose Archive the artifacts
Put a path relative to the workspace. You can use wildcards, or hardcode the name of the file if it is always the same.
Configure this job to automatically trigger your "Main" job using the Trigger/Call builds on other projects build step. If you don't have this step, it is provided by the Parameterized Trigger plugin.
For Projects to build, enter the name of your "Main" job
Now, in the "Main" job, you need to copy this artifact from the previous ("Helper") job.
For the first build step, select the Copy artifacts from another project build step. If you don't have this step, it is provided by the Copy Artifact plugin.
For the Project name, enter the name of your "Helper" job
For Which build, select Latest successful build
For Artifacts to copy, use **/yourartifactname*.* Your artifact name will be what you configured in the "Helper" job. Using **/ in front makes sure it will ignore any directory structure before getting to the artifact.
For Target directory, specify a location in your "Main" job's workspace where this file will be copied to.
Checkmark Flatten directories, so the file goes directly to the location specified in the previous step; otherwise it will retain the directory structure it was archived under (in the "Helper" job).
Now your "Main" job has the file from the "Helper" job in its workspace. Use it like you would any other file in your workspace.
To pass a variable from one job to another
Like I mentioned, if all you need is that one IP address, which you have as a variable at some point in the "Helper" job, you can just send it to the "Main" job using the Trigger/Call builds on other projects step configured in steps 3 and 4 above. In this case, you don't need any special configuration on the "Main" job.
Configure "Helper" job to automatically trigger your "Main" job using Trigger/Call builds on other projects build step. If you don't have this plugin, you can get it here
For Projects to build, enter the name of your "Main" job
Click Add Parameters button
Select Predefined parameters
Type VarForMain=$VarFromHelper, where VarFromHelper is your environment variable from the "Helper" job that contains your IP address, and VarForMain is the environment variable that will be set in your "Main" job to this value. There is no reason why these can't have the same name.
Now, in your "Main" job, you can reference $VarForMain as you would any other environment variable
The accepted answer wasn't helpful in my case, but I came up with a trick:
Create a main job with a shell command of
echo "PARAMS_FILE=${WORKSPACE}/build-${BUILD_NUMBER}.params" > "${WORKSPACE}/build-${BUILD_NUMBER}.params"
Add the sub-jobs as build steps (not post-build steps)
Pass the file as a parameter source to the sub-builds and have each build update the file with a line in its script like:
echo "MY_VAR=some_value" >> "$PARAMS_FILE"
That way every subsequent job has its environment updated with the results of its predecessors.
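To make the chain concrete (an illustrative sketch; MY_VAR is the made-up variable from above): a sub-job later in the sequence, receiving its parameters from the same file, sees the values appended by its predecessors:

# later sub-job: MY_VAR was injected from the shared params file
echo "A predecessor produced: $MY_VAR"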