How to get the project root directory in a Jenkins pipeline?

I use Jenkins v2.222.1 and have created a simple pipeline for a project. For the deployment I want to set the release/tag name manually. The plan for how I want it to work is:
1. I change the release name/number in a defined file, e.g. /root/of/my/project/release.txt.
2. I start the deployment by clicking Build Now.
3. Jenkins reads the release name/number from the file.
4. Jenkins checks out the corresponding tag (and creates and pushes the matching branch).
pipeline {
    agent {
        label 'dev'
    }
    stages {
        stage('build') {
            environment {
                APP_VERSION = sh (
                    script: 'eval "cat $REMOTE_ROOT_DIRECTORY/release.txt"', returnStdout: true
                )
            }
            steps {
                sh 'git fetch --all --tags'
                sh 'git checkout ${APP_VERSION} -b v${APP_VERSION}'
                ...
            }
        }
        ...
    }
}
Since I haven't found out how to get the remote root directory path, I use a workaround: the $WORKSPACE is placed in the folder /workspace/$project_name (where $project_name is the project name/title defined in Jenkins). So I use this knowledge and pass ../../release.txt to cat. It works, but it is a bit dirty because of the hard-coded relative path in the Jenkinsfile.
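Spelled out, the workaround looks roughly like this (a minimal sketch, assuming the workspace really does sit two levels below the directory that holds release.txt):
pipeline {
    agent { label 'dev' }
    stages {
        stage('build') {
            environment {
                // Relies on the workspace being /workspace/<project name>, so
                // release.txt is two directories up; trim() strips the trailing
                // newline from the shell output.
                APP_VERSION = sh(script: 'cat ../../release.txt', returnStdout: true).trim()
            }
            steps {
                sh 'git fetch --all --tags'
                sh 'git checkout ${APP_VERSION} -b v${APP_VERSION}'
            }
        }
    }
}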
How to get / retrieve the project root directory dynamically in a Jenkins pipeline?

Related

Jenkins build via SSH, artifacts on host but not in workspace?

I've started with a new company and a very convoluted build methodology. The best way I've found to build projects is to have Jenkins SSH into the build server and execute a string of commands there.
Building by executing shell commands over SSH works well, but the build artifacts don't show up in the Jenkins workspace, so I can't archive them directly in Jenkins.
Is there a way to correct this or work around it? Can Jenkins be set up to archive files from outside the workspace?
Assuming that you've added your build server as a Jenkins node (slave), here's a stash/unstash example.
Stash on the build node
stage ('Stash file on node 1') {
    node ('some_build_node') {
        dir ('/tmp') {
            stash name: 'TestTransfer', includes: 'foo.jar'
        }
    }
}
Unstash on the master
stage ('Unstash file on node 2') {
    node ('master') {
        // If you don't use a 'dir' block, it'll unstash in the workspace
        // which is handy if you then want to archive the artifacts
        dir ('/tmp') {
            unstash 'TestTransfer'
        }
    }
}
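Since the original question was about archiving, the same unstash can feed archiveArtifacts directly if you skip the dir block (a rough sketch along the same lines):
stage ('Unstash and archive on master') {
    node ('master') {
        // Without a dir block the file lands in the workspace,
        // where archiveArtifacts can pick it up.
        unstash 'TestTransfer'
        archiveArtifacts artifacts: 'foo.jar', fingerprint: true
    }
}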

Jenkins build failing in CI machine

I have installed Jenkins on my CI server machine and I am creating a pipeline to build my project. The Jenkins build fails saying it couldn't find the path, even though I have specified my workspace path in my pipeline code. I am using SVN as my version control system. Kindly assist.
The error I am getting is:
+ cd var/lib/jenkins/workspace/ProjectDemo/target
/var/lib/jenkins/workspace/ProjectDemo#tmp/durable-a40648b0/script.sh: line 1: cd: var/lib/jenkins/workspace/ProjectDemo/target: No such file or directory
pipeline {
    agent any
    stages {
        stage('Code Checkout') {
            steps {
                checkout scm
            }
        }
        stage('Build') {
            steps {
                sh 'cd var/lib/jenkins/workspace/ProjectDemo/target; mvn clean package'
            }
        }
    }
}
The better solution is to use Jenkins' built-in environment variable WORKSPACE, which points to the job's workspace where the source code resides after being checked out from SVN or Git.
stage('Build') {
    steps {
        sh '''
            pwd
            ls -l
            cd ${WORKSPACE}
            echo $PATH
            mvn clean package
        '''
    }
}
Above, pwd prints the path of the job's workspace folder and ls -l lists the files and folders under it. You can remove both once you've confirmed the workspace is the working directory you expect.
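The same idea can also be written with the dir step instead of cd (a sketch; the target subfolder is only kept because the original script changed into it):
stage('Build') {
    steps {
        // Steps already start in the job's workspace, so a relative dir()
        // avoids hard-coding /var/lib/jenkins/... in the Jenkinsfile.
        dir('target') {
            sh 'mvn clean package'
        }
    }
}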

How to make subsequent checkout scm stages use local repo in a Jenkins pipeline?

We use the Jenkins ECS plugin to spawn Docker containers for "each" job we build. So our pipelines look like this:
node ('linux') {
    stage('comp-0') {
        checkout scm
    }
    parallel(
        "comp-1": {
            node('linux') {
                checkout scm
                ...
            }
        },
        "comp-2": {
            node('linux') {
                checkout scm
                ...
            }
        }
    )
}
The above pipeline will spawn 3 containers, one for each node('linux') call.
We set up a 'linux' node in our Jenkins configuration page to tell Jenkins which Docker repo/image to spawn. Its setup has a notion of 'Container mount points', which I assume are mounts on the host that the container can access.
So in the above pipeline, I want the "first" checkout scm to clone our repo onto a host path mounted by our containers, say /tmp/git. I then want the succeeding checkout scm calls to use the repo already cloned at my host's /tmp/git path.
I'm looking at "How to mount Jenkins workspace in docker container using Jenkins pipeline" to see how to mount a local path into my Docker container.
Is this possible?
You can stash the code from your checkout scm step and then unstash it in subsequent steps. Here's an example from the Jenkins pipeline documentation.
// First we'll generate a text file in a subdirectory on one node and stash it.
stage "first step on first node"
// Run on a node with the "first-node" label.
node('first-node') {
    // Make the output directory.
    sh "mkdir -p output"
    // Write a text file there.
    writeFile file: "output/somefile", text: "Hey look, some text."
    // Stash that directory and file.
    // Note that the includes could be "output/", "output/*" as below, or even
    // "output/**/*" - it all works out basically the same.
    stash name: "first-stash", includes: "output/*"
}
// Next, we'll make a new directory on a second node, and unstash the original
// into that new directory, rather than into the root of the build.
stage "second step on second node"
// Run on a node with the "second-node" label.
node('second-node') {
    // Run the unstash from within that directory!
    dir("first-stash") {
        unstash "first-stash"
    }
    // Look, no output directory under the root!
    // pwd() outputs the current directory Pipeline is running in.
    sh "ls -la ${pwd()}"
    // And look, output directory is there under first-stash!
    sh "ls -la ${pwd()}/first-stash"
}
Jenkins Documentation on Stash/Unstash
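Applied to your checkout scm pipeline, the pattern would look roughly like this (a sketch; the stash name 'sources' is arbitrary, and stash's default excludes mean directories like .git are not carried over):
node('linux') {
    stage('comp-0') {
        checkout scm
        // Stash the checked-out sources once on the first container...
        stash name: 'sources', includes: '**/*'
    }
    parallel(
        "comp-1": {
            node('linux') {
                // ...and reuse them here instead of cloning again.
                unstash 'sources'
            }
        },
        "comp-2": {
            node('linux') {
                unstash 'sources'
            }
        }
    )
}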

How do I retrieve the remote FS root from declarative Jenkinsfile?

I have a heavyweight base project that rarely changes over time and has a lot of header files. I've set it up to be built in a custom workspace with a relative path base. So on a remote node with FS root set to C:\Jenkins, the resulting path on that particular node will be C:\Jenkins\base.
The reason for this setup is that I don't want to copy or unpack the whole base project for every dependent project, to save build time. Also, I don't want to use absolute paths because I like the idea of a self-contained Jenkins installation.
Now I have a second project, project, that uses base. I need to pass the path of base to the build system of project so that it can find the base headers it needs.
Is there any way to retrieve the remote FS root through the environment? I've tried using ${env.JENKINS_HOME} but this always resolves to the home folder of the Jenkins master. My build system expects to find the path to the base project in the PATH_TO_BASE environment variable:
pipeline {
    agent none
    stages {
        stage('Build') {
            parallel {
                stage('Build Linux x64') {
                    agent {
                        label "Debian9_x64"
                    }
                    steps {
                        withEnv(["PATH_TO_BASE=${env.JENKINS_HOME}/base"]) {
                            sh '''mkdir -p _build
                                  cd _build
                                  cmake ..
                                  cmake --build .'''
                        }
                    }
                }
                stage('Build Windows10') {
                    agent {
                        label "Windows10"
                    }
                    steps {
                        withEnv(["PATH_TO_BASE=${env.JENKINS_HOME}/base"]) {
                            bat '''if not exist _build mkdir _build
                                   cd _build
                                   cmake ..
                                   cmake --build .'''
                        }
                    }
                }
            }
        }
    }
}
I usually configure each slave with a set of environment variables:
Go to the slave configuration.
Set the environment variable under the "Node Properties" section.
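For example, if every node gets a variable named BASE_DIR pointing at its FS root (the name here is just an illustration), the stages above can use it instead of JENKINS_HOME. A sketch of one of the stages:
stage('Build Linux x64') {
    agent {
        label "Debian9_x64"
    }
    steps {
        // BASE_DIR is assumed to be set per node under "Node Properties".
        withEnv(["PATH_TO_BASE=${env.BASE_DIR}/base"]) {
            sh '''mkdir -p _build
                  cd _build
                  cmake ..
                  cmake --build .'''
        }
    }
}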

How to set specific workspace folder for jenkins multibranch pipeline projects

I have an external tool that should be called as a build step in one of my Jenkins jobs. Unfortunately, this tool has some issues with quoting commands to avoid problems with whitespace in the path it is called from.
Jenkins is installed in C:\Program Files (x86)\Jenkins. Hence I'm having trouble with Jenkins calling the external tool.
What I tried is to set "Workspace Root Directory" in Jenkins->configuration to C:\jenkins_workspace in order to avoid any whitespace. This works for Freestyle Projects, but my Multibranch Pipeline Project is still checked out and built under C:\Program Files (x86)\Jenkins\workspace.
One solution would be to move the whole Jenkins installation to e.g. C:\jenkins, which I would like to avoid. Is there a proper way to tell Jenkins Pipeline jobs to use the "Workspace Root Directory" as well?
Thanks for any help
The ws instruction sets the workspace for the commands inside it. For declarative pipelines, it looks like this (note the escaped backslash, which Groovy strings require):
ws("C:\\jenkins") {
    echo "awesome commands here instead of echo"
}
You can also call a script to build the customWorkspace to use:
// if the current branch is master, this helpfully sets your workspace to /tmp/ma
// (trim() strips the trailing newline from the shell output)
partOfBranch = sh(returnStdout: true, script: 'echo $BRANCH_NAME | sed -e "s/ster//g"').trim()
path = "/tmp/${partOfBranch}"
sh "mkdir -p ${path}"
ws(path) {
    sh "pwd"
}
You can also set it globally by using the agent block (generally at the top of the pipeline block) and applying it to a node at that level:
pipeline {
    agent {
        node {
            label 'my-defined-label'
            customWorkspace '/some/other/path'
        }
    }
    stages {
        stage('Example Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}
Another node instruction later on might override it. Search for customWorkspace at https://jenkins.io/doc/book/pipeline/syntax/. You can also use it with the docker and dockerfile instructions.
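For the docker case that might look something like this (a sketch; the image is just an example):
pipeline {
    agent {
        docker {
            image 'maven:3-jdk-8'
            // Same option as above, applied to the container's workspace.
            customWorkspace '/some/other/path'
        }
    }
    stages {
        stage('Example Build') {
            steps {
                sh 'mvn -B clean verify'
            }
        }
    }
}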
Try this syntax instead:
pipeline {
    agent {
        label {
            label 'EB_TEST_SEL'
            customWorkspace "/home/jenkins/workspace/ReleaseBuild/${BUILD_NUMBER}/"
        }
    }
}
