Loading a JSON file from another job's archive into a Jenkins pipeline

I'd like to store a JSON file as the output of one job, and read that JSON in and parse it for use in a pipeline in a different job. I'm having trouble getting the JSON from the first job into my workspace so I can read it.
This question mentions reading in a JSON file, but not how to get it into the workspace: Pass Jenkins Pipeline parameters from a Jenkins job?
I see some suggestions that involve adding build steps (URL SCM plugin), but adding build steps doesn't seem to be available in my pipeline job.

You should have a look at archiveArtifacts and copyArtifacts. You would archive the JSON file in one job and then copy it into the workspace of the other.
Edit:
In a pipeline you would do something like:
copyArtifacts(projectName: 'sourceproject')
or
copyArtifacts(projectName: 'downstream', selector: lastSuccessful())
You can look it up here: https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
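Putting the two halves together, a minimal sketch (the job name 'sourceproject', the file name output.json, and the version field are placeholders; readJSON also assumes the Pipeline Utility Steps plugin is installed):
// In the producing job (e.g. 'sourceproject'):
node {
    writeFile file: 'output.json', text: '{"version": "1.2.3"}'
    archiveArtifacts artifacts: 'output.json'
}
// In the consuming pipeline:
node {
    // copy the archived file from the last successful build into this workspace
    copyArtifacts(projectName: 'sourceproject', filter: 'output.json', selector: lastSuccessful())
    def data = readJSON file: 'output.json'   // Pipeline Utility Steps plugin
    echo "version is ${data.version}"
}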

Related

How to access json file in bitbucket pipeline?

I have a bitbucket pipeline that runs Google Lighthouse. I want to access the json output that is generated at the end of the pipeline and have it echo one of the variables. I understand that I can use artifacts, but I am unsure how to access them.
Here is my bitbucket-pipelines.yml file:
script:
  - lhci collect
  - lhci upload
  - echo "===== Lighthouse has completed running ====="
artifacts: # defining the artifacts to be passed to each future step.
  - .lighthouseci/*.json
Quoting the official docs:
Artifacts are files that are produced by a step. Once you've defined them in your pipeline configuration, you can share them with a following step or export them to keep the artifacts after a step completes. For example, you might want to use reports or JAR files generated by a build step in a later deployment step. Or you might like to download an artifact generated by a step, or upload it to external storage.
If your json file is generated right after the echo "===== Lighthouse has completed running =====" line, you don't have to define a separate step just to echo its contents; do it right there in the same step. You don't even need artifacts if that's the only thing you want to do with your json.
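For example (a sketch only; the exact file names under .lighthouseci/ depend on your Lighthouse configuration, and you would need a tool such as jq in the build image to pull out a single field):
script:
  - lhci collect
  - lhci upload
  - echo "===== Lighthouse has completed running ====="
  # print the generated report(s) in the same step; pipe through jq if you only want one field
  - cat .lighthouseci/*.json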

Upload files to userContent of jenkins server

I have a multi-branch pipeline job from which I would like to upload a file to the Jenkins userContent location using a Groovy script. I tried the job-dsl-plugin's userContent method, but it throws the error shown below:
java.lang.NoSuchMethodError: No such DSL method 'userContent' found among steps
Reference: https://github.com/jenkinsci/job-dsl-plugin/wiki/Job-DSL-Commands
Do I need to configure anything in order to upload a file to userContent? Is there any other way to upload a file to the userContent location?
You cannot simply mix Pipeline DSL and Job DSL. See Use Job DSL in Pipeline scripts for instructions on using the Job DSL build step as a Pipeline step:
node {
    jobDsl scriptText: 'userContent("test.txt", new ByteArrayInputStream("test".bytes))'
}

How to pass output from one pipeline to another in jenkins

I'm new to Jenkins and I've been given the simple task of passing the output from one pipeline to another.
Let's say the first pipeline has a script that does echo HelloWorld; how would I pass this output to another pipeline so it displays the same thing?
I've looked at parameterized triggers and a couple of other answers, but I was hoping someone could lay out the procedure step by step for me.
If you want to implement it purely with Jenkins pipeline code, what I do is have an orchestrator pipeline job that builds all the pipeline jobs in my process, waits for them to complete, and then gets the build number:
Orchestrator job
def result = build job: 'jobA'
def buildNumber = result.getNumber()
echo "jobA build number : ${buildNumber}"
In each job, say 'jobA', I arrange to write the output to a known file (a properties file, for example), which is then archived:
jobA
writeFile encoding: 'utf-8', file: 'results.properties', text: 'a=123\r\nb=foo'
archiveArtifacts 'results.properties'
Then, after each job such as jobA has been built, use its build number with the Copy Artifact plugin to get the file back into your orchestrator job and process it however you want:
Orchestrator job
step([$class: 'CopyArtifact',
      filter: 'results.properties',
      flatten: true,
      projectName: 'jobA',
      selector: [$class: 'SpecificBuildSelector',
                 buildNumber: buildNumber.toString()]])
You will find these plugins useful to look at:
Copy Artifact Plugin
Pipeline Utility Steps Plugin
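For example, the Pipeline Utility Steps plugin provides readProperties, which turns the copied file straight into a map (a sketch, continuing with the results.properties file written above):
// back in the orchestrator job, after the CopyArtifact step
def props = readProperties file: 'results.properties'
echo "a = ${props['a']}, b = ${props['b']}"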
If you are chaining jobs instead of using an orchestrator - say jobA builds jobB, which builds jobC, and so on - then you can use a similar method: CopyArtifact can copy from the upstream job, or you can pass parameters with the build number and name of the upstream job. I chose to use an orchestrator job after changing from chained jobs because I need some jobs to be built in parallel.
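For the chained variant, a sketch of the downstream side, assuming the upstream job passes its name and build number as string parameters named UPSTREAM_JOB and UPSTREAM_BUILD (both names are placeholders):
// copy results.properties from the specific upstream build that triggered us
copyArtifacts(projectName: params.UPSTREAM_JOB,
              filter: 'results.properties',
              selector: specific(params.UPSTREAM_BUILD))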

jenkins pipeline unable to read files

I have a simple Jenkinsfile where I want to load some data from the workspace. I am using the pipeline plugin to leverage the Jenkinsfile inside of the repository. The build is farmed off to a matching Jenkins agent. When I try to use "readFile" I get the following message:
java.io.FileNotFoundException: /path/to/jenkins/workspace/XXXXX/project/data.json (No such file or directory)
I also get the same message when trying to load a Groovy file from the workspace.
My Jenkinsfile looks like:
node('master') {
    stage "Start"
    echo "Starting"
    stage "Load File"
    def myJson = readFile "data.json"
}
Any ideas why I can't read these files?
Thanks,
Tim
When Jenkins processes a Jenkinsfile, it does not automatically pull down the entire source repository. You need to execute checkout scm to pull down the contents of the repository; if you fail to do so, no other files will be available to the pipeline script.
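A minimal sketch of the Jenkinsfile from the question with the checkout added (still assuming data.json sits at the repository root):
node('master') {
    stage "Start"
    echo "Starting"
    stage "Load File"
    checkout scm                       // pull the repository contents into the workspace
    def myJson = readFile "data.json"  // the file now exists on disk
}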

Passing s3 artifacts from parallel builds to a single build in Jenkins Workflow

I am attempting to build a Windows installer through Jenkins.
I have a number of jenkins projects that build individual modules and then save these artifacts in s3 via the s3 artifact plugin.
I'd like to run these in parallel and copy the artifacts to a final "build-installer" job that takes all of them and builds an installer image. I figured out how to run jobs in parallel with Jenkins Workflow, but I don't know where to look to learn how to extract job result details, ensure they are all from the same changeset, and pass them to the 'build-installer' job.
So far I have workflow script like this:
def packageBuilds = [:]
// these save artifacts to s3:
packageBuilds['moduleA'] = { a_job = build 'a_job' }
packageBuilds['moduleB'] = { b_job = build 'b_job' }
parallel packageBuilds
// pass artifacts from the jobs above to the job below??
build job:'build-installer', parameters:????
Is this the right way? Or should I just have a mega build job that builds the modules and installer in one job?
A single job that does all the steps would be easier to manage.
I know file parameters are not yet supported for sending files to a Workflow job: JENKINS-27413. I have not tried sending files from a Workflow job using file parameters; it probably cannot work without some special support. (I'm not sure you can even send file parameters between freestyle builds, for that matter.)
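If you do keep separate jobs, one workaround (a sketch only; it assumes 'build-installer' is parameterized with string parameters named MODULE_A_BUILD and MODULE_B_BUILD and fetches the matching artifacts itself) is to pass the upstream build numbers rather than the files:
def packageBuilds = [:]
def aResult, bResult
packageBuilds['moduleA'] = { aResult = build 'a_job' }
packageBuilds['moduleB'] = { bResult = build 'b_job' }
parallel packageBuilds
// hand the build numbers to the installer job, which can then fetch the
// matching artifacts from S3 (or via Copy Artifact) itself
build job: 'build-installer', parameters: [
    string(name: 'MODULE_A_BUILD', value: aResult.getNumber().toString()),
    string(name: 'MODULE_B_BUILD', value: bResult.getNumber().toString())
]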
