I am trying to figure out a way to have one list of parameters and have Jenkins create a job or run a build for each item in the list.
The parameter is a directory, so I have a list of directories, and for each of them the build should run several steps: git pull, then several ant commands (one taking the directory name as a parameter), then publish the test results, then move on to the next directory's build.
I have looked at a bunch of plugins, but I can't figure out how to make it move on to the next item in the list until they're all done.
If I understand correctly, you have one job? You can trigger it multiple times with different parameters (the directory) by using the Build Flow plugin. Create a build flow job, and inside it trigger your job with different parameters:
build("AntJob", parDirectory: "C:\src1")
build("AntJob", parDirectory: "C:\src2")
You can also write a smarter DSL and run the builds in parallel:
def dirTable = ["C:\\src1", "C:\\src2", "C:\\src3"]
def builds = []
dirTable.each { d ->
    // wrap each trigger in a closure so the flow can run them in parallel
    def clr = { build("AntJob", parDirectory: d) }
    builds.add(clr)
}
parallel(builds)
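If the builds should instead run one at a time, a minimal sequential sketch (with the same hypothetical AntJob) calls build directly inside the loop:

def dirTable = ["C:\\src1", "C:\\src2", "C:\\src3"]
// build() blocks until the triggered run finishes, so these execute in order
dirTable.each { d ->
    build("AntJob", parDirectory: d)
}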
I have two Jenkins jobs, A and B. Job A builds a bunch of files for my project. In job B I want to execute a command to run a file from the most recent build of job A.
My execution works fine, but only because I have hard-coded the build number and am picking the file from the files Jenkins stores in my C:\JenkinsData directory; I would rather have it called from the workspace instead.
See the image for clarification:
[Image: Jenkins build steps illustration]
For example, my most recent build right now is 70. How can I always execute those same files, but from the most recent build?
Or, if it's even better that way, can I execute those same files from job A, since the built files are in its workspace?
You could get the last build's number using this API endpoint and its "number" property:
/lastBuild/api/json
For example:
http://localhost:8080/job/yourJobName/lastBuild/api/json
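Job B could then use that number instead of the hard-coded 70. A minimal Groovy sketch (host and job name are placeholders, and authentication is ignored) that reads the "number" property from that endpoint:

import groovy.json.JsonSlurper

// Placeholder host and job name; add credentials if your Jenkins requires them
def url = 'http://localhost:8080/job/yourJobName/lastBuild/api/json'
def lastBuild = new JsonSlurper().parse(new URL(url))
println "Last build number: ${lastBuild.number}"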
This could help:
Jenkins - Get last completed build status
When I run a Jenkins job that executes a DSL script as a build step to generate a folder in Jenkins, it shows unreferenced items, which contain the job details from previous builds of the same job. Is there any way to get rid of the unreferenced items?
DSL script:
folder('project-e') {
    displayName('project-e')
    description('Folder for project e')
}
Console output:
Processing provided DSL script
Added items:
GeneratedJob{name='project-e'}
Unreferenced items:
GeneratedJob{name='project-b'}
No, you cannot get rid of the "Unreferenced items" output. But you can ignore it if you set "Action for removed jobs" to "Ignore" in the "Process Job DSLs" build step.
Job DSL scripts are not meant to be one-time generator scripts. A script should describe a part of your Jenkins configuration. If you want to add a folder, you add a folder to the script. If you want to delete a folder, you delete the folder from the script. Then you run the Job DSL seed job to apply the changes.
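For example, if the project-b folder from an earlier run of the seed job should still exist, the script would describe both folders and nothing is reported as unreferenced (the project-b details below are assumed for illustration):

folder('project-b') {
    displayName('project-b')
    description('Folder for project b')
}

folder('project-e') {
    displayName('project-e')
    description('Folder for project e')
}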
I have a script to which I give multiple parameters in a loop; it takes each parameter, completes the cycle, and then moves on to the next one.
I need to run this on Jenkins. Is there any option to run multiple builds on a single job? I mean each parameter should produce a single build, and all the builds should run queued, not in parallel.
There are two options:
1.) In my case I use a shell script with a CSV file as its input, so I wrote a simple Groovy script to read the inputs from the CSV file, then used the parameterized build plugin to run the script.
2.) If you want to see the workflow in the GUI, you can go for a pipeline, as sketched below.
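As a sketch of option 2, a pipeline can trigger a parameterized job once per value; the build step waits for each triggered run to finish, so the builds execute one after another rather than in parallel. The job name, parameter name, and CSV path here are hypothetical:

node {
    // Hypothetical CSV: one parameter value per line
    def values = readFile('params.csv').split('\n')
    for (v in values) {
        // build() blocks until the downstream run completes, so runs queue up
        build job: 'yourJob', parameters: [string(name: 'PARAM', value: v.trim())]
    }
}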
I am attempting to build a Windows installer through Jenkins.
I have a number of Jenkins projects that build individual modules and then save these artifacts in S3 via the S3 artifact plugin.
I'd like to run these in parallel and copy the artifacts to a final "build-installer" job that takes all of them and builds an installer image. I figured out how to run jobs in parallel with Jenkins Workflow, but I don't know where to look to figure out how to extract the job result details, ensure they're all from the same changeset, and pass them to the "build-installer" job.
So far I have a workflow script like this:
def packageBuilds = [:]
// these save artifacts to s3:
packageBuilds['moduleA'] = { a_job = build 'a_job' }
packageBuilds['moduleB'] = { b_job = build 'b_job' }
parallel packageBuilds
// pass artifacts from the jobs above to the one below??
build job:'build-installer', parameters:????
Is this the right way? Or should I just have a mega build job that builds the modules and installer in one job?
A single job that does all the steps would be easier to manage.
I know file parameters are not yet supported for sending files to a Workflow job: JENKINS-27413. I have not tried sending files from a Workflow job using file parameters; that probably cannot work without some special support. (I am not sure you can even send file parameters between freestyle builds, for that matter.)
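One workaround, sketched below with current Pipeline syntax and made-up parameter names, is to pass the upstream build numbers instead of files and let build-installer fetch those exact artifacts from S3. The build step returns a run object whose number identifies the build:

def packageBuilds = [:]
def results = [:]
packageBuilds['moduleA'] = { results['a'] = build 'a_job' }
packageBuilds['moduleB'] = { results['b'] = build 'b_job' }
parallel packageBuilds

// Hand the exact upstream build numbers to the installer job, which can then
// resolve the matching artifacts in S3
build job: 'build-installer', parameters: [
    string(name: 'A_BUILD_NUMBER', value: "${results['a'].number}"),
    string(name: 'B_BUILD_NUMBER', value: "${results['b'].number}")
]

The returned run objects also expose change data (changeSets), which could help verify that all module builds used the same changeset.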
I'm using the Scriptler plugin for Jenkins and am having a hard time finding any information on how to share the Scriptler scripts I'm writing between scripts. I've tried using the ScriptHelper from the Scriptler API, but have run into issues when passing arguments to the script.
Has anyone else come across this and solved it? Is there a standard way to do this (without calling the Jenkins REST API) to execute a script?
More Details
We have a full-build MultiJob that contains many phase jobs, each with their own artifacts, with a 3-day time-to-live on them. When this full build job is promoted, a Scriptler script runs against it, pulling each of the phase jobs' artifacts into the full build job. By doing so, we can keep the full build alive forever without changing the lifetime of the artifacts for each phase job (essentially "keep this build forever" on the full build, ignoring the lifetimes set in the phase jobs).
We also want to pull these artifacts into a deploy job. The idea is that we can point a deploy job at a full build, and it will pull out the artifacts we specify. If the full build is promoted, this script will pull the artifacts directly from the full build job; otherwise, it will pull them from the internal phase jobs. Since we have two scripts that work with MultiJobs, I would like to be able to share this code between them.
The script would take a MultiJob name and build number, and return the individual phase jobs' build numbers, build statuses, and artifact information.
This is possible using Groovy capabilities, though I don't know whether Scriptler supports it directly. If you are running on the master node, you can use Groovy's evaluate. Scriptler scripts are stored as Groovy files on the file system of the master node, in the $JENKINS_HOME/scriptler/scripts directory; the Scriptler script ID is the file name within that directory.
Here is a very simple example. It uses two files. The first is the parameterized function, findByScm.groovy, which finds jobs using a given source-control type. The second script, findByGitScm.groovy, evaluates the first function for Git SCMs and prints the results.
findByScm.groovy
import jenkins.model.*

// No 'def' here: assignments go into the script binding, so a caller that
// evaluate()s this file shares these variables
jenkins = Jenkins.instance

// Notice that myScmType is not defined in this function; the caller supplies it
scmJobs = jenkins.items
    .findAll { job -> job.scm != null && job.scm.type == myScmType }
findByGitScm.groovy
// This is supplying the argument to findByScm.groovy
myScmType = 'hudson.plugins.git.GitSCM'
// Now we evaluate the script; resolving the Jenkins home from the running
// instance is more reliable than a system property, which may not be set
evaluate(new File(jenkins.model.Jenkins.instance.rootDir, 'scriptler/scripts/findByScm.groovy'))
// scmJobs is a variable which was introduced in findByScm.groovy
scmJobs.each { println it }