Bamboo: Override a runtime variable - build working directory?

I'd like to override the build working directory for my Bamboo plan.
I noticed that it's always something like <HOME>\<BUILD_PROJECT>-<BUILD-PLAN>-<JOB-KEY>.
I'd like to override the build directory so that all the stages and jobs use the same one.
Current Setup:
STAGE 1
JOB 1: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-S101
STAGE 2
JOB 1: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-S201
JOB 2: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-S202
Desired Setup:
STAGE 1
JOB 1: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-FOO
STAGE 2
JOB 1: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-FOO
JOB 2: build dir = C:\data\bamboo\agent5_1\xml-data\build-dir\PROJ-PLAN-FOO
How can I achieve this?

I do not think you can, or would want to, use the same folder, as it violates the multi-stage and concurrent-job philosophy of Bamboo.
Multiple stages are separated by folder so that each build stage is isolated from the previous one. If you want to share files between stages, you will want to use an artifact.
Multiple jobs are separated by folder so that they can run concurrently. If the jobs were all in the same folder, this would not be possible due to permission errors (especially on Windows). If you don't care about the jobs running concurrently, the two jobs in the second stage could be combined.
Since you want to build in the same folder on the same system, it sounds like this pipeline could be simplified to one stage with one job.
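If you do keep separate stages, here is a minimal sketch of the artifact approach using Bamboo Specs (written in Groovy, since Specs is a plain Java API). The keys, names and copy pattern are illustrative, and the builder methods are from memory of the Specs docs, so verify them against your Bamboo version:
import com.atlassian.bamboo.specs.api.builders.plan.Job
import com.atlassian.bamboo.specs.api.builders.plan.Plan
import com.atlassian.bamboo.specs.api.builders.plan.Stage
import com.atlassian.bamboo.specs.api.builders.plan.artifact.Artifact
import com.atlassian.bamboo.specs.api.builders.plan.artifact.ArtifactSubscription
import com.atlassian.bamboo.specs.api.builders.project.Project

// Stage 1 publishes its build output as a shared artifact; the Stage 2 job
// subscribes to it, so the files travel between stages without a shared
// working directory.
def plan = new Plan(new Project().key('PROJ').name('Project'), 'Plan', 'PLAN')
    .stages(
        new Stage('Stage 1').jobs(
            new Job('Build', 'S101')
                .artifacts(new Artifact('build-output')
                    .copyPattern('**/*')
                    .location('build')
                    .shared(true))),            // shared = visible to later stages
        new Stage('Stage 2').jobs(
            new Job('Package', 'S201')
                .artifactSubscriptions(new ArtifactSubscription()
                    .artifact('build-output')
                    .destination('build'))))    // unpacked into this job's own dir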

Related

How to change the # and #tmp directory creation in Jenkins workspace

Every time my build runs, I get 2 or 3 folders with # and #tmp suffixes.
Example: if my build name is test and I run the build, it fetches some code from git and stores it in the Jenkins workspace under the names test, test#2, test#2tmp and test#tmp. But the original folder is test. I only want the test folder and need to remove the other folders. How can I do this?
My present working directory is automatically chosen as test#2.
I want the default pwd to be /var/lib/jenkins/workspace/
I want to delete the #2 and #tmp folders and have my working directory changed back after the build runs.
Sample output is:
pwd
/var/lib/jenkins/workspace/test#2
You can use customWorkspace in your Jenkins pipelines:
Example:
agent {
    node {
        label 'my-label'
        customWorkspace '/my/path/to/workspace'
    }
}
Note that Jenkins uses different directories to support concurrent builds:
If concurrent builds ask for the same workspace, a directory with a suffix such as #2 may be locked instead.
Since you don't want this behaviour, I advise you to use the disableConcurrentBuilds option:
Disallow concurrent executions of the Pipeline. Can be useful for preventing simultaneous accesses to shared resources, etc. For example: options { disableConcurrentBuilds() }
References on customWorkspace and disableConcurrentBuilds: https://jenkins.io/doc/book/pipeline/syntax/
Look for the agent { } block in your pipeline Groovy script; it should be at the pipeline block level only, not inside any stage block.
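Putting the two together, a minimal declarative sketch (the label is the example one from above, and the workspace path follows the question's example; adjust both to your setup):
pipeline {
    agent {
        node {
            label 'my-label'
            customWorkspace '/var/lib/jenkins/workspace/test'
        }
    }
    options {
        disableConcurrentBuilds()   // prevents the #2-style duplicate workspaces
    }
    stages {
        stage('Build') {
            steps {
                sh 'pwd'            // should print /var/lib/jenkins/workspace/test
            }
        }
    }
}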

Jenkins how to create pipeline manual step

Prior to Jenkins 2, I was using the Build Pipeline Plugin to build and manually deploy the application to a server.
Old configuration:
That works great, but I want to use the new Jenkins pipeline, generated from a Groovy script (Jenkinsfile), to create a manual step.
So far I came up with the Jenkins input step.
Jenkinsfile script used:
node {
    stage 'Checkout'
    // Get some code from repository
    stage 'Build'
    // Run the build
}
stage 'deployment'
input 'Do you approve deployment?'
node {
    // deploy things
}
But this waits for user input, and the build shows as not completed. I could add a timeout to the input, but this won't allow me to pick/trigger a build and deploy it later on.
How can I achieve the same/similar result for a manual step/trigger with the new Jenkins pipeline as I had with the Build Pipeline Plugin?
This is a huge gap in the Jenkins Pipeline capabilities IMO. It's definitely hard to provide, due to the fact that a pipeline is a single job. One solution might be to "archive" the workspace as an "artifact" (tar it and archive **/* as 'workspace.tar.gz'), and then have another pipeline copy the artifact and untar it into its new workspace. This allows the second pipeline to pick up where the previous one left off. Of course there is no way to guarantee that the second pipeline cannot be executed out of turn or more than once, which is too bad. The Delivery Pipeline Plugin really shines here: you execute a new pipeline right from the view, instead of from the first job. Anyway, not much of an answer, but it's the path I'm going to try.
EDIT: This plugin looks promising:
https://github.com/jenkinsci/external-workspace-manager-plugin/blob/master/doc/PIPELINE_EXAMPLES.md
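A minimal sketch of that archive/untar hand-off, assuming a recent Pipeline and the Copy Artifact plugin (the job name 'build-pipeline' is a placeholder):
First pipeline, at the end of the build:
node {
    // ... checkout and build ...
    sh 'tar czf workspace.tar.gz --exclude=workspace.tar.gz .'   // snapshot the workspace
    archiveArtifacts artifacts: 'workspace.tar.gz'
}
Second pipeline, triggered manually whenever you decide to deploy:
node {
    copyArtifacts projectName: 'build-pipeline', filter: 'workspace.tar.gz'
    sh 'tar xzf workspace.tar.gz'                                // restore the workspace
    // ... deploy things ...
}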

Passing s3 artifacts from parallel builds to a single build in Jenkins Workflow

I am attempting to build a Windows installer through Jenkins.
I have a number of Jenkins projects that build individual modules and then save these artifacts in S3 via the S3 artifact plugin.
I'd like to run these in parallel and copy the artifacts to a final "build-installer" job that takes all of them and builds an installer image. I figured out how to run jobs in parallel with Jenkins Workflow, but I don't know where to look to figure out how to extract job result details, ensure they all built the same changeset, and pass that to the 'build-installer' job.
So far I have workflow script like this:
def packageBuilds = [:]
// these save artifacts to s3:
packageBuilds['moduleA'] = { a_job = build 'a_job' }
packageBuilds['moduleB'] = { b_job = build 'b_job' }
parallel packageBuilds
// pass artifacts from the other jobs to the job below??
build job:'build-installer', parameters:????
Is this the right way? Or should I just have a mega build job that builds the modules and installer in one job?
A single job that does all the steps would be easier to manage.
I know file parameters are not yet supported for sending files to a Workflow job: JENKINS-27413. I have not tried sending files from a Workflow job using file parameters; it probably cannot work without some special support. (Not sure if you can even send file parameters between freestyle builds, for that matter.)
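As a sketch of that single-job approach, assuming the module builds can be expressed as steps inside one Workflow script, stash/unstash can replace the S3 hand-off between jobs (the branch names and the dist/** pattern are illustrative):
def branches = [:]
branches['moduleA'] = {
    node {
        // ... build module A ...
        stash name: 'moduleA', includes: 'dist/**'
    }
}
branches['moduleB'] = {
    node {
        // ... build module B ...
        stash name: 'moduleB', includes: 'dist/**'
    }
}
parallel branches
node {
    unstash 'moduleA'   // both module outputs land in this workspace
    unstash 'moduleB'
    // ... build the installer image from dist/ ...
}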

Jenkins builds loop list

I am trying to figure out a way to have one list of parameters and have Jenkins create a job, or run a build, for each of the items in the list.
The parameter is a directory, so I have a list of directories, and I want the build to run several steps for each of them: git pull, then several ant commands with the directory name, then publish test results, then move on to the next build.
I have looked at a bunch of plugins, but I can't figure out how to make it go on to the next item in the list until they're all done.
If I understand correctly, you have one job? You can trigger it multiple times with different parameters (the directory) by using the Build Flow Plugin. Create a build flow job and, inside this job, call your job with different parameters:
build("AntJob", parDirectory: "C:\\src1")
build("AntJob", parDirectory: "C:\\src2")
You can also create a smarter DSL and run these builds in parallel:
def dirTable = ["C:\\src1", "C:\\src2", "C:\\src3"]
def builds = []
dirTable.each { d ->
    def clr = { build("AntJob", parDirectory: d) }
    builds.add(clr)
}
parallel(builds)
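If you are on Jenkins 2 with the Pipeline plugin rather than Build Flow, here is a sketch of the same fan-out using the Pipeline build step (it assumes AntJob declares a string parameter named parDirectory):
def dirTable = ['C:\\src1', 'C:\\src2', 'C:\\src3']
def builds = [:]
dirTable.each { d ->
    builds[d] = {
        build job: 'AntJob', parameters: [string(name: 'parDirectory', value: d)]
    }
}
parallel builds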

Is it possible to run part of Job on master and the other part on slave?

I'm new to Jenkins. I have a requirement where I need to run part of a job on the master node and the rest on a slave node.
I tried searching the forums but couldn't find anything related to that. Is it possible to do this?
If not, I'll have to break it into two separate jobs.
EDIT
Basically I have a job that checks out source code from SVN, then compiles and builds jar files. After that, it builds a Wise installer for this application. I'd like to do the source code checkout and compilation on the master (Linux) and delegate the Wise installer setup to a Windows slave.
It's definitely easier to do this with two separate jobs; you can make the master job trigger the slave job (or vice versa).
If you publish the files that need to be bundled into the installer as build artifacts from the master build, you can pull them onto the slave via a Jenkins URL and create the installer there. Use the "Archive artifacts" post-build step in the master build to do this.
The Pipeline Plugin allows you to write jobs that run on multiple slave nodes. You don't even have to go create other separate jobs in Jenkins -- just write another node statement in the Pipeline script and that block will just run on an assigned node. You can specify labels if you want to restrict the type of node it runs on.
For example, this Pipeline script will execute parts of it on two different nodes:
node('linux') {
    git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
    sh "make"
    step([$class: 'ArtifactArchiver', artifacts: 'build/program', fingerprint: true])
}
node('windows && amd64') {
    git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
    bat "mytest.exe"
}
Some more information at the Pipeline plugin tutorial. (Note that it was previously called the Workflow Plugin.)
You can use the Multijob plugin, which adds the idea of a build phase that runs other jobs in parallel as a build step. You can still continue to use the regular freestyle job build and post-build options as well.
