Jenkins pipeline: can I share stashes between pipelines / workspaces?

I have a declarative pipeline_a executing pipeline_b via the build job step. The problem is that pipeline_b needs to use some files generated by pipeline_a. stash/unstash works for me to share data between stages, but stashes saved in pipeline_a do not seem to be visible in pipeline_b.
Is this by design?
Should I be using some other Jenkins trick to share files between different jobs/pipelines?

To share files between jobs you can use the Copy Artifacts plugin, or archive the artifacts of pipeline_a and download them in pipeline_b:
Pipeline_a:
archive('artifactName')
Pipeline_b:
sh("wget ${env.JENKINS_URL}/job/$jobName/$buildNumber/artifact/$artifactName")

Related

Jenkins declarative pipeline: How to fingerprint a file without archiving it?

I have a Jenkins declarative pipeline job that has the end result of creating some very large output files ( > 2 GB in size ). I don't want to archive these files in Jenkins as artifacts.
However, I would like to fingerprint these large files so that I can associate them with other builds.
How can I do this, preferably in the post action of the pipeline?
In your pipeline script add: fingerprint 'module/dist/**/*.zip'
where 'module/dist/**/*.zip' is an Ant FileSet pattern matching the files you wish to fingerprint.
In the console log you should see:
Recording fingerprints
[Pipeline] ...
While users have commented on the Jenkins documentation that files also need to be archived for the build not to fail, this works for me without archiving on Jenkins ver. 2.180.
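Since the question asks for the post section specifically, a minimal declarative sketch could look like this (the build command and dist path are assumptions for illustration):

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './build.sh'   // produces module/dist/**/*.zip (example)
            }
        }
    }
    post {
        success {
            // record fingerprints without archiving the large files
            fingerprint 'module/dist/**/*.zip'
        }
    }
}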

Jenkins: stash vs archiveArtifacts

What are the use cases and pros/cons for using stash vs archiveArtifacts?
The documentation mentions each:
i.e. https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#stash-stash-some-files-to-be-used-later-in-the-build
and
https://jenkins.io/doc/pipeline/tour/tests-and-artifacts/
but doesn't do a comparison.
stash is used to "save" some files in a pipeline stage and reuse them on a different slave (unstash). Stash is only useful for a small set of files; it becomes very slow when you stash a large amount of data. If you need to stash a lot of files, it's recommended to use a shared filesystem between your slaves so the content of your workspace can be used by multiple slaves.
Archiving artifacts saves them on the Jenkins master. You can specify whether you want to keep only the artifacts from the last build or from more builds. This is useful when you have a deploy job on your master to deploy the artifacts after a successful run, or to make them available from the Jenkins UI.
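A minimal scripted sketch contrasting the two (node labels and paths are only examples):

node('builder') {
    sh 'make'                                                  // example build producing bin/app
    stash name: 'app-bin', includes: 'bin/**'                  // kept only for the duration of this build
    archiveArtifacts artifacts: 'bin/app', fingerprint: true   // kept on the master per the retention policy
}
node('tester') {
    unstash 'app-bin'                                          // restore the stashed files on another agent
    sh 'bin/app --self-test'
}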
From the latest Pipeline Syntax documentation and Options directive:
https://jenkins.io/doc/book/pipeline/syntax/#options
preserveStashes
Preserve stashes from completed builds, for use with stage restarting. For example: options { preserveStashes() } to preserve the stashes from the most recent completed build, or options { preserveStashes(buildCount: 5) } to preserve the stashes from the five most recent completed builds.
In theory this seems much the same as using archiveArtifacts together with the buildDiscarder option to apply an artifact retention policy.

How to manage multiple Jenkins pipelines from a single repository?

At this moment we use JJB to compile Jenkins jobs (mostly pipelines already) in order to configure about 700 jobs, but JJB2 does not seem to scale well for building pipelines, and I am looking for a way to drop it from the equation.
Mainly I would like to be able to have all these pipelines stored in a single centralized repository.
Please note that keeping the CI config (Jenkinsfile) inside each repository and branch is not possible in our use case, we need to keep all pipelines in a single "jenkins-jobs.git" repo.
As far as I know this is not possible yet, but in progress. See: https://issues.jenkins-ci.org/browse/JENKINS-43749
I think this is the purpose of Jenkins shared libraries.
I didn't develop such a library myself, but I am using some. Basically:
Develop the "shared code" of the Jenkins pipeline in a shared library
it can contain the whole pipeline (sequence of steps)
Add this library to the Jenkins server
In each project, add a Jenkinsfile that imports the library using @Library
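A minimal sketch of such a per-project Jenkinsfile, assuming the library was registered on the Jenkins server under the name 'jenkins-jobs' and exposes a custom step buildMyProduct in its vars directory (both names are made up for illustration):

// Jenkinsfile kept in each product repository: it only delegates to the shared library
@Library('jenkins-jobs') _

// buildMyProduct is defined in vars/buildMyProduct.groovy of the library
buildMyProduct(product: 'productA')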
As @Juh_ said, you can use Jenkins shared libraries. Here are the complete steps. Suppose that we have three branches:
master
develop
stage
and we want to create a single Jenkinsfile so that we only have to change things in one place. All you need is to create a new branch, e.g. common. This branch MUST have the standard shared-library directory structure. What we are interested in for now is adding a new Groovy file in the vars directory, e.g. common.groovy. Here we can put the common pipeline code that you wish to use across all branches.
Here is a sample:
def call() {
    node {
        stage("Install Stage from common file") {
            if (env.BRANCH_NAME.equals('master')) {
                echo "npm install from common files master branch"
            }
            else if (env.BRANCH_NAME.equals('develop')) {
                echo "npm install from common files develop branch"
            }
        }
        stage("Test") {
            echo "npm test from common files"
        }
    }
}
You must wrap your code in a call function so that it can be used from other branches. Now that we have finished the work in the common branch, we need to use it in our other branches. Go to any branch where you wish to use this pipeline, e.g. master, create a Jenkinsfile, and put this one line of code:
common()
This will call the common function that you created before in the common branch and will execute the pipeline.
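Note that the bare common() call assumes the library is configured as an implicitly loaded global library; if it is not, my understanding is that the Jenkinsfile also needs an explicit @Library annotation pointing at the common branch, along these lines (the library name is an assumption):

// Jenkinsfile in master/develop/stage when the library is not loaded implicitly
@Library('my-shared-library@common') _   // 'my-shared-library' is the name registered in Jenkins

common()   // runs the pipeline defined in vars/common.groovy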

Jenkins pipeline share information between jobs

We are trying to define a set of jobs on Jenkins that will do really specific actions. JobA1 will build a Maven project, while JobA2 will build .NET code, JobB will upload it to Artifactory, JobC will download it from Artifactory and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and I always get some output; whatever happens in between is something I don't care about. On the other side, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to Artifactory.
PipelineB will download the package via JobC and then deploy to staging.
PipelineC will download the package via JobC and then deploy to production based on some internal validations.
I have tried to get some variables from JobA1 (basic POM information such as the artifactId or version) injected into JobB, but the information does not seem to be transferred.
The same happens when downloading files: I call JobC, but the file ends up in that job's workspace and is not available to any other job, and I'm afraid that the "External Workspace Manager" plugin adds too much complexity.
Is there any way other than sharing the workspace to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
stage('TargetJob') {
    dir('/var/lib/jenkins/jobs/TargetJob/workspace/') {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This Pipeline feature is preferable to archiving artifacts when the data is only needed temporarily, since the stash is kept locally and discarded after the build (which helps with data management).
You can also use Copy Artifact Plugin.
There are a few steps to consider for copying an artifact:
a) Archive the artifacts in the host project and assign permissions.
b) After building a new job, select the 'Permission to copy artifact' → Projects to allow copy artifacts: *
c) Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
d) Copy the artifacts required from host to target project.
Create a Build step → Copy artifacts from another project → enter the '$Project name - Host project', which build (e.g. 'Latest successful build'), Artifacts to copy ('$host project folder'), Target directory ('$localfolder location').
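In a pipeline job, the same copy (step d) can be written as a single step from the Copy Artifact plugin; the project name, filter and target directory below are placeholders:

// copy artifacts archived by the host project into the current job's workspace
copyArtifacts projectName: 'host-project',    // job that archived the files
              selector: lastSuccessful(),     // which build to copy from
              filter: 'build/libs/**',        // Ant-style pattern of artifacts to copy
              target: 'copied-artifacts'      // directory inside the current workspace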
For the first part of your question (passing variables between jobs), you can use the snippet below as a post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results; similar post conditions can be used to trigger it only on a specific build status. I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by JobB (provide the exact location of the job). Please note that a matching parameter must be defined in JobB, as sketched below.
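A minimal sketch of the receiving side, assuming JobB is itself a declarative pipeline (the parameter name must match what JobA passes):

// JobB: declares the BRANCH parameter so the value passed from JobA is available
pipeline {
    agent any
    parameters {
        string(name: 'BRANCH', defaultValue: 'master', description: 'Branch passed in by the upstream job')
    }
    stages {
        stage('Checkout') {
            steps {
                echo "Building branch ${params.BRANCH}"
            }
        }
    }
}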
Moreover, for sharing the workspace you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share, and to define shared global variables.
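For example, a global variable defined in the library's vars directory is visible to every pipeline that loads the library, which is handy for centralizing values such as the Artifactory URL used by the upload and download jobs (the names below are assumptions):

// vars/ciConfig.groovy in the shared library
def artifactoryUrl() {
    return 'https://artifactory.example.com/libs-release-local'
}

// any Jenkinsfile that loads the library
@Library('my-shared-library') _
echo "Uploading to ${ciConfig.artifactoryUrl()}"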

Passing s3 artifacts from parallel builds to a single build in Jenkins Workflow

I am attempting to build a Windows installer through Jenkins.
I have a number of Jenkins projects that build individual modules and then save these artifacts in S3 via the S3 artifact plugin.
I'd like to run these in parallel and copy the artifacts to a final "build-installer" job that takes all of them and builds an installer image. I figured out how to run jobs in parallel with Jenkins Workflow, but I don't know where to look to figure out how to extract job result details, ensure they're all from the same changeset, and pass them to the 'build-installer' job.
So far I have workflow script like this:
def packageBuilds = [:]
// these save artifacts to s3:
packageBuilds['moduleA'] = { a_job = build 'a_job' }
packageBuilds['moduleB'] = { b_job = build 'b_job' }
parallel packageBuilds
// pass artifacts from another jobs to below??
build job:'build-installer', parameters:????
Is this the right way? Or should I just have a mega build job that builds the modules and installer in one job?
A single job that does all the steps would be easier to manage.
I know file parameters are not yet supported for sending files to a Workflow job: JENKINS-27413. I have not tried sending files from a Workflow job using file parameters; that probably cannot work without some special support. (Not sure you can even send file parameters between freestyle builds, for that matter.)
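If you go the single-job route, a rough Workflow sketch could run the module builds in parallel on separate nodes, stash their outputs, and then unstash everything in a final installer stage (node labels, stash names, scripts and paths are all placeholders):

// single Workflow/Pipeline job: build modules in parallel, then assemble the installer
def moduleBuilds = [:]
moduleBuilds['moduleA'] = {
    node('builder') {
        sh './build-moduleA.sh'                         // example build command
        stash name: 'moduleA-out', includes: 'out/A/**'
    }
}
moduleBuilds['moduleB'] = {
    node('builder') {
        sh './build-moduleB.sh'
        stash name: 'moduleB-out', includes: 'out/B/**'
    }
}
parallel moduleBuilds

node('installer') {
    unstash 'moduleA-out'         // collect everything built above
    unstash 'moduleB-out'
    sh './build-installer.sh'     // assemble the Windows installer
}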
