I have the following problem:
I created a job called "job1", which contains four directories. How can I access one of those directories from another job, "job2"? I am using the declarative pipeline.
Example: let's say job1 has the following directories, and each directory has some files.
The directory has (FirstFile - SecondFile - ThirdFile)
The question is: when I create another job, "job2", how can I access one of the above files?
Is it possible to write a line like "dir('/workspace/job1/directory/FirstFile')" in the second job?
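In declarative pipeline syntax, the line being asked about would look roughly like the sketch below; the workspace path is only a placeholder, and reading it can only work when job2 runs on the same node that holds job1's workspace:

pipeline {
    agent any
    stages {
        stage('Read a file from job1') {
            steps {
                // placeholder path - adjust to wherever job1's workspace actually lives on this node
                dir('/var/lib/jenkins/workspace/job1/directory') {
                    sh 'cat FirstFile'
                }
            }
        }
    }
}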
I am using Jenkins pipeline. I created 4 jobs; each job has some functions, and there is a redundant function that exists in all of those jobs.
How can I put that redundant function in a shared place so that all of those jobs can call it?
You are looking for a Jenkins shared library.
As the name suggests, you create a library - a pipeline shared among Jenkins jobs - in an SCM (Git, SVN, ...), and in your project you create a simple Jenkinsfile calling the library.
So, every build will check out your project, read the Jenkinsfile, and then check out the library with the pipeline.
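A minimal sketch of such a library, assuming it is registered in Jenkins under the hypothetical name "my-shared-lib" (Manage Jenkins > Global Pipeline Libraries):

// In the library repository: vars/sayHello.groovy
def call(String name = 'world') {
    echo "Hello, ${name}!"
}

// In each project: Jenkinsfile
@Library('my-shared-lib') _
pipeline {
    agent any
    stages {
        stage('Greet') {
            steps {
                sayHello('job1')   // resolves to vars/sayHello.groovy from the library
            }
        }
    }
}

Every job that needs the shared function just references the library, so the redundant code lives in one place.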
I did it by:
Created a folder in the Jenkins working home
In that folder, I created file.groovy, which contains the functions I need
The end of that file should contain
return this
In the Jenkinsfile, add
node { shared_functionality = load "FilePath.groovy" }
That last step includes the functions from the .groovy file in your Jenkinsfile.
So you can add a node statement like this in your Jenkinsfiles to include the functions you need.
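A minimal sketch of that approach (file names and paths are just placeholders):

// shared.groovy - the file holding the common functions
def deployApp(String target) {
    echo "Deploying to ${target}"
}

return this   // required so the loaded script object exposes its methods

// Jenkinsfile of any job that needs those functions
node {
    def shared_functionality = load '/path/to/shared.groovy'   // adjust the path to where the file lives
    shared_functionality.deployApp('staging')
}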
So this is my entire use case:
I have a parameterized build job which accepts file parameters. After the build I need to send a mail with that file and the size of the file. For this, I'm trying to add the name and size of the file as environment variables using the EnvInject plugin.
But EnvInject runs in the Build Environment step, and the file parameter gets stored in the workspace only in the Build step, not in the build environment. So there will be an error like "file not found".
Because of this, I'm trying a roundabout way of defining a properties file somewhere on my local system. I'm pointing to this properties file in the "Properties File Path" of Inject environment variables. In the build step I'm adding FOO=BAR and other values to the file so that I can use those values as my env variables down the line, for example when I configure my e-mail template in Post-build Actions.
Can this process be done more easily? I was initially creating the properties file in JENKINS_HOME. I just got to know that I'm not allowed to do that, since in a master-agent architecture JENKINS_HOME will be different and the build will fail.
PS:
1. The workspace gets deleted after every build.
2. Are there any other plugins which can be used? If possible, please suggest something without installing a new plugin, as I'm not a Jenkins admin.
One way of solving this problem is by creating 3 different jobs as follows -
Job 1 --> should call the 2 jobs below (Job 2 & Job 3):
For Job 2:
Build --> Trigger/call builds on other projects --> Job 2 (block until the triggered projects finish their builds)
Select "Build on the same node" from Add Parameters.
For Job 3:
Build --> Trigger/call builds on other projects --> Job 3 (block until the triggered projects finish their builds)
Select "Build on the same node" from Add Parameters.
Job 2 (Create this job as follows):
Execution API --> using the "GET_FILE" option, you can download the required details into the current working directory of your job.
Execute shell -->
Within "Execute shell", download the "consoleText" using the wget command.
Process the "consoleText" with Unix commands to prepare key-value pairs and store them under the /tmp folder, e.g. "/tmp/env.prop" (a rough sketch of this processing follows below).
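If the Job 2 part were written as a pipeline instead of a freestyle "Execute shell" step, the processing might look roughly like this; the upstream build URL and the grep pattern are only placeholders:

node {
    // placeholder URL of the build whose log holds the values you need
    def upstream = 'http://jenkins.example.com/job/job1/lastBuild'
    sh """
        wget -q -O consoleText '${upstream}/consoleText'
        # keep only KEY=VALUE lines and store them as a properties file
        grep -E '^[A-Z_]+=' consoleText > /tmp/env.prop || true
    """
}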
Job 3 (Create this job as follows):
Bindings:
select "Inject environment variable to the build process" and under 'Properties File Path' enter "/tmp/env.prop"
now you can use the variable which you created in Job2 in the current job without any issue.
please note that it is important to select "Build on the same node" in Job 1, because this will preserve the data and it allows other jobs to access this information.
Let me know if its not clear.
On a MultiJob job I have two phases:
Phase A runs Build_job1, with the project name Build_job1, pulling stuff from Git into the directory /var/lib/jenkins/workspace/Build_job1
Phase B runs Deploy_job2, which rsyncs /var/lib/jenkins/workspace/Build_job1/* to a bunch of servers.
For internal reasons I need to replicate the MultiJob, the build job, and the deploy job to different environments (PROD, QA, Staging). In each case, the deploy job's rsync will need to copy files from a different build directory (Build_QA, Build_Prod, Build_whatever, etc.).
As Jenkins creates the directory per project name, I need the rsync command in the deploy job to get the project name as a parameter that is passed down from the build job.
help?
Are you wanting to pass the current job's project name down to its children? If so, you can pass this information down via the "JOB_NAME" environment variable that Jenkins sets for every build, in conjunction with a predefined job parameter. For example, something like:
Param1=${JOB_NAME}
If the Multijob job name is "QA", you can pass that down to both the build and deploy phase jobs via a predefined parameter and then construct the final "Build_QA" path by doing something like "Build_${Param1}" or "Build_%Param1%".
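If the phases were pipeline jobs, the same hand-off could be sketched roughly like this (the rsync destination is a placeholder):

// In the orchestrating job: pass the current job name down
build job: 'Deploy_job2', parameters: [
    string(name: 'Param1', value: env.JOB_NAME)
]

// In Deploy_job2: rebuild the source directory name from the parameter
node {
    def buildDir = "/var/lib/jenkins/workspace/Build_${params.Param1}"
    sh "rsync -a '${buildDir}/' deploy@server:/some/target/"   // placeholder destination
}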
I want to know how we can parameterize a build in Jenkins. I need to create a Jenkins job which will contain 5 sub-jobs. I need to create a drop-down, select any of the modules, and build it, but the script used is different for every sub-build. Can anyone guide me on this? Is it possible?
String parameters in Jenkins result in environment variables of the same name.
So, you could write a wrapper script in bash with a series of if-elif statements that checks which environment variables were set as a result of the parameterized build (i.e. your 5 sub-jobs), and within each branch invoke the necessary build script.
The build script that you would have Jenkins run would be the wrapper script.
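A sketch of the same idea expressed as a declarative pipeline with a drop-down parameter, instead of a bash wrapper (module and script names are hypothetical):

pipeline {
    agent any
    parameters {
        choice(name: 'MODULE', choices: ['module1', 'module2', 'module3', 'module4', 'module5'],
               description: 'Which sub-build to run')
    }
    stages {
        stage('Build selected module') {
            steps {
                // each module has its own build script; the naming scheme here is a placeholder
                sh "./build_${params.MODULE}.sh"
            }
        }
    }
}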
I have 2 jobs, "Helper" and "Main", and a single Jenkins instance (which is both the host and the executor).
The Helper job manages a 3rd-party resource and does the preparation for the Main job (to be precise, it creates the environment for the application to be deployed to for testing).
The only artifact of the Helper job is a single file with the IP of the environment prepared especially for the Main job.
How would I pass that file back from the Helper job to the Main job in this case?
You are saying that you only need to pass a file with an IP to the "Main" job. If all you need is that IP, there are easier ways of doing it (without files). I will describe both approaches.
To pass an artifact from one job to another
In the "Helper" job, you need to archive that file from your workspace.
In post-build actions, choose Archive the artifacts
Put a path relative to the workspace. You can use wildcards, or hardcode the name of the file if it is always the same.
Configure this job to automatically trigger your "Main" job using the Trigger/Call builds on other projects build step. This step comes from the Parameterized Trigger plugin; install it if you don't have it.
For Projects to build, enter the name of your "Main" job
Now, in the "Main" job, you need to copy this artifact from the previous ("Helper") job.
For the first build step, select the Copy artifacts from another project build step. This step comes from the Copy Artifact plugin; install it if you don't have it.
For the Project name, enter the name of your "Helper" job
For Which build, select Latest successful build
For Artifacts to copy, use **/yourartifactname*.* (the artifact name is what you configured in the "Helper" job). Using **/ in front makes sure it will ignore any directory structure before getting to the artifact.
For Target directory, specify a location in your "Main" job's workspace where this file will be copied to.
Checkmark Flatten directories, so the file goes directly to the location specified in the previous step; otherwise it will retain the directory structure that it was archived under (in the "Helper" job).
Now your "Main" job has the file from the "Helper" job in its workspace. Use it like you would any other file in your workspace (a pipeline equivalent of these steps is sketched below).
To pass a variable from one job to another
Like I mentioned, if all you need is that one IP address, which you have as a variable at some point in the "Helper" job, you can just send it to the "Main" job using the same Trigger/Call builds on other projects step described above. In this case, you don't need any special configuration on the "Main" job.
Configure the "Helper" job to automatically trigger your "Main" job using the Trigger/Call builds on other projects build step (from the Parameterized Trigger plugin, as above).
For Projects to build, enter the name of your "Main" job
Click Add Parameters button
Select Predefined parameters
Type VarForMain=$VarFromHelper, where VarFromHelper is your environment variable from the "Helper" job that contains your IP address, and VarForMain is the environment variable that will be set in your "Main" job to this value. There is no reason why these can't have the same name.
Now, in your "Main" job, you can reference $VarForMain as you would any other environment variable
The accepted answer wasn't helpful in my case, but I've just come up with a trick:
Create a main job with a shell command of
echo "PARAMS_FILE=${WORKSPACE}/build-${BUILD_NUMBER}.params" > "${WORKSPACE}/build-${BUILD_NUMBER}.params"
Create sub-jobs by adding them to the build steps (not the post-build steps)
Pass the file as a parameters source to the sub-builds and have those builds update the file with a line in their scripts like:
echo "MY_VAR=some_value" >> "$PARAMS_FILE"
That way all subsequent jobs have their environment updated with the results of their predecessors.