Jenkins job separate workspace by build

We have a job which generates some HTML files in the workspace folder during the build.
Our goal is to collect those files after the build is completed and zip them.
The first step of the job is to clean the workspace, to be sure that there are no files left over from previous builds.
Our problem appears when one build is running and someone starts a second one: the workspace gets wiped. Both builds generate those HTML files, and the content from the different builds gets mixed together.
If somebody has an idea how I can give every build its own workspace, I would be very glad if you shared it. I want this to apply to only ONE job. Other jobs must keep the shared workspace.

If you have a pipeline job, you can just add a post action before the termination of your job, something like:
if (currentBuild.result == "SUCCESS") {
    sh '''
        tar czf myArchive.tgz *.html
        scp myArchive.tgz xxx@xxxx:
    '''
} else {
    step([$class: 'Mailer', recipients: 'xxx@xxx.com'])
}
cleanWs cleanWhenFailure: false
I'll do some research if you really want to manipulate the workspace. Maybe you can do something by redeclaring the path in the env.WORKSPACE variable, but that doesn't seem great to me.
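If you do want each run in its own directory from a scripted pipeline, the built-in ws step allocates a separate workspace for the enclosing block; a minimal sketch (the BUILD_NUMBER suffix is my own choice, not from this answer):
node {
    // ws allocates (and locks) the given directory as the workspace for this block;
    // suffixing the build number keeps concurrent runs separate.
    ws("${env.WORKSPACE}-${env.BUILD_NUMBER}") {
        checkout scm
        // ... build steps that generate the HTML files ...
    }
}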

This is the solution that I was looking for:
pipeline {
    agent {
        node {
            label 'master'
            customWorkspace "${JENKINS_HOME}/workspace/${JOB_NAME}/${BUILD_NUMBER}"
        }
    }
}
In the end I use a cleanup post condition to remove the generated folders for each build, like this:
post {
    cleanup {
        deleteDir()
        dir("${env.WORKSPACE}@tmp") {
            deleteDir()
        }
        dir("${env.WORKSPACE}@script") {
            deleteDir()
        }
    }
}
Thanks, guys!

Under the job configuration, check 'Use custom workspace' and specify where you want this workspace to be created. I usually have the workspace directories created per build by including the $BUILD_NUMBER variable in the directory path.
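For example, a hypothetical value for the custom workspace directory field (resolved relative to the node's workspace root):
workspace/${JOB_NAME}/${BUILD_NUMBER}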

Related

How to access file from git repo in jenkins pipeline

I'm very new to Jenkins. I have a monorepo where I need to check what needs to be built. I created a pipeline item in Jenkins, named it Test, and copy-pasted a script to check which files changed:
def changedFiles = currentBuild.changeSets
    .collect { it.getItems() }
    .flatten() // Ensures that we look through each commit, not just the first.
    .collect { it.getAffectedPaths() }
    .flatten()
    .toSet() // Ensures uniqueness.
echo("Changed files: ${changedFiles}")
def changedDirectories = changedFiles.collect {
    if (it.contains('folder/in/monorepo/project/foo')) { // Code cut for simplicity
        return "Foo"
    }
    return ''
}.unique()
echo("Changed directories: ${changedDirectories}")
if (changedDirectories.contains("Foo")) {
    stage('Foo Build') {
        node {
            env.NODEJS_HOME = "${tool 'NodeJS'}"
            env.PATH = "${env.NODEJS_HOME}/bin:${env.PATH}"
            dir("folder/in/monorepo/buildscripts/") {
                sh 'mybuildscript.sh'
            }
        }
    }
} else {
    echo("No Foo Build")
}
The code for loading NodeJS is documented in the NodeJS plugin.
I installed Jenkins, tried some things, and then installed the NodeJS plugin and updates for other plugins.
Now, with the updated plugins, there is some trouble which is basically related to this: Why does Jenkins create a subfolder within the workspace@script folder to check out git code instead of using workspace@script itself?
Instead of being cloned into the workspace folder, the git repo gets cloned into a folder with a unique ID.
This means that the folder in my dir("folder/in/monorepo/buildscripts/") command is not found, or to be more precise: it is created empty and is not the one from the git repo, because the step is executed relative to my empty workspace instead of relative to the cloned repo.
My questions are:
Is there a much easier way to achieve what I'm trying to do? AFAIK I need to implement the search for the folder as in the referenced SO question.
Doesn't this security fix from the referenced SO question break all builds / Jenkinsfiles? I mean: eventually in a pipeline someone wants to call a build script, a build tool, or a simple make with the makefile in the repo.
EDIT: I just realised that the only reason the repo is cloned at all is because I use the option "Pipeline script from SCM" with "Lightweight checkout" unchecked. I assume that the intended way is to use a "Lightweight checkout" to fetch just the Jenkinsfile and then clone the repo as part of the script. A "Pipeline script" written in the GUI would also never get a clone of the repository.
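That assumption fits the behaviour described above: with a lightweight checkout, an explicit checkout scm inside a node clones the repo into the regular workspace, so relative dir() paths resolve against the clone. A minimal sketch (not from the question; it assumes the same folder layout):
node {
    // With "Lightweight checkout" only the Jenkinsfile was fetched so far,
    // so clone the repository into this node's regular workspace.
    checkout scm
    dir('folder/in/monorepo/buildscripts/') {
        // Now resolves inside the clone instead of an empty workspace.
        sh './mybuildscript.sh'
    }
}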

How to get the file path from the workspace in Jenkins

My Jenkins instance is located at myjenkins:8080/
After I run my job, it generates one file and I would like to get its complete path to give to the user.
Example:
myjenkins:8080/job/FUTURE/job/GetFullPath/2/execution/node/3/ws/
I want to give this path to the user, so the user can see the file generated there.
pipeline {
    agent { label env.Machine }
    stages {
        stage('PREPARE') {
            steps {
                script {
                    env.custom_stage_name = "PREPARE"
                    bat '%FOLDER%\\CreateFile.bat'
                }
            }
        }
        stage('BUILD') {
            steps {
                ws("c:\\jenkins\\workspace\\Test") {
                    bat 'xcopy ' + '%FOLDER%\\File.txt ' + "c:\\jenkins\\workspace\\Test /I /s /e /h /y"
                }
            }
        }
    }
}
Based on your title it's not clear to me whether you just need a way to get the current working directory or you want to expose the directory so you can access it using a browser.
If you meant the second case: I think this is not possible without a workaround, and there are some problems you need to work around!
But first of all, let me show you how to get the path to the workspace (in case it is not obvious).
You can get the path to your workspace by using the variable ${env.WORKSPACE}.
Example:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo "${env.WORKSPACE}"
            }
        }
    }
}
To get the full path of a file, concatenate it onto the workspace path:
def my_file = "${env.WORKSPACE}/my.file"
To your actual problem:
Unless your filename differs on every run, the file will get overwritten. There is no guarantee that a file in your workspace folder is preserved. To keep it, you should tell Jenkins to archive your artifacts (see point 2 below).
I can't imagine any good reason to expose your workspace the way you want to.
So, let me first give you some alternative examples:
1. You want to give or send Jenkins's log files to someone: you can do this using the email-ext plugin, which allows attaching the log.
This is a neat way to report the build state and send the log out to people.
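A minimal sketch of such a notification (the recipient and subject are placeholders of mine; emailext is the pipeline step provided by the email-ext plugin):
post {
    failure {
        // attachLog attaches the build log to the mail
        emailext to: 'team@example.com',
                 subject: "${env.JOB_NAME} #${env.BUILD_NUMBER}: ${currentBuild.currentResult}",
                 body: 'See the attached build log for details.',
                 attachLog: true
    }
}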
2. You want to save your build artifacts - and I think this is what you actually want.
In this case you should archive them. See the Jenkins docs on how to do this.
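For example (a sketch, reusing the my.file name from the snippet above):
post {
    success {
        // Keeps the file with the build, downloadable from the build page
        // under myjenkins:8080/job/<job>/<build>/artifact/
        archiveArtifacts artifacts: 'my.file', fingerprint: true
    }
}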
If the "archive artifacts" also do not fit your needs: You can use a separate Freestyle project. A Freestyle projects allows viewing and downloading (as a zip-file) its workspace content. But: This is by far not the best solution!
Basically you copy your files from your pipeline project to this freestyle project. You can access it using the freestyle project's URL.
If you really want to expose the workspace using a URL:
This involves a separate web server, since I can't think of a way to tell Jenkins to expose its working directories as a web service.
If you just want to gain access to your workspace folder, you can expose it using a separate web server. The idea is to run a very simple web server that serves the current workspace directory.
Here is one example of how to do exactly that using Python.
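A minimal sketch of that idea (my own, not the linked example), using Python's built-in http.server module from a pipeline step; note that the step blocks while the server runs:
stage('Serve workspace') {
    steps {
        // Serves the workspace over HTTP on port 8000 until aborted.
        // --directory requires Python 3.7+.
        sh 'python3 -m http.server 8000 --directory "$WORKSPACE"'
    }
}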

Find artifacts in post-build actions

In a nutshell:
How can I access the location of the produced artifacts within a shell script started in a build or post-build action?
The longer story:
I'm trying to set up a Jenkins job to automate the building and propagation of Debian packages.
So far, I have been successful in using the debian-pbuilder plugin to perform the build process, such that Jenkins presents the final artifacts after the job finishes successfully:
mypackage_1+020200224114528.NOREV.4_all.deb
mypackage_1+020200224114528.NOREV.4_amd64.buildinfo
mypackage_1+020200224114528.NOREV.4_amd64.changes
mypackage_1+020200224114528.NOREV.4.dsc
mypackage_1+020200224114528.NOREV.4.tar.xz
Now I would like to also automate the deployment process into the local reprepro repository, which would actually just require invoking a simple shell script I've put together.
My problem: I find no way to determine the artifact location for that deployment script to operate on. The debian-pbuilder plugin generates the artifacts in a temporary directory ($WORKSPACE/binaries.tmp15567690749093469649) whose name changes with every build.
Since the artifacts are listed properly in the finished job status view, I would expect the artifact details to be provided to the script (e.g. via environment variables). But that is obviously not the case.
I've already searched extensively for a solution, but didn't find anything helpful.
Or is it me (still somewhat of a rookie in Jenkins) following a wrong approach here?
You can use archiveArtifacts. You have the binaries.tmp* directory in the workspace and you can use it, but clear the workspace with deleteDir() at the start of the build, so that only the current build's directory is matched.
Pipeline example:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                deleteDir()
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'binaries*/**', fingerprint: true
        }
    }
}
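If the deployment script needs the files on disk rather than as archived artifacts, a shell glob can resolve the changing directory name, since it always starts with binaries.tmp (a sketch; the reprepro base directory and codename are placeholders):
steps {
    // The temporary directory name varies per build, so let the shell expand it.
    sh 'reprepro -b /srv/reprepro includedeb stable binaries.tmp*/*.deb'
}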
You can also check https://plugins.jenkins.io/copyartifact/

How to avoid that Jenkins reuses a workspace?

I have several parallel stages that share a node, and clean up their workspace after they are done. The issue I have is that when the stage fails, I want the workspace NOT cleaned up, so I can inspect it.
What happens instead is:
failing stage fails, leaves the workspace as I want it
second stage reuses the workspace, succeeds
second stage cleans up the workspace
How can I avoid this?
Jenkins has a post section for this. Depending on the result of your pipeline, a different branch of code is executed. So let's say your pipeline is successful: then your cleanup script or cleanup plugin is called. If your pipeline fails, you can archive your results or simply skip the cleanup of the workspace.
Check the official Jenkins documentation for more information (search for 'post'): https://jenkins.io/doc/book/pipeline/syntax/
pipeline {
    agent any
    stages {
        stage('PostExample') {
            steps {
                // do something here
            }
        }
    }
    post { // is called after your stages
        failure {
            // pipeline failed - do not clear the workspace
        }
        success {
            // pipeline succeeded - clear the workspace
        }
    }
}
On the other hand, if you want to keep your results, you could think about archiving them so they are independent of your workspace, since you can access them anytime from the Jenkins GUI.
You can also use try/finally (the finally block executes irrespective of the stage outcome) when you are writing scripted Jenkinsfiles. Refer to: How to perform actions for failed builds in Jenkinsfile
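A minimal sketch of that pattern in a scripted pipeline (the success-only cleanup is my own illustration):
node {
    def failed = false
    try {
        stage('Build') {
            sh './build.sh' // hypothetical build step
        }
    } catch (e) {
        failed = true
        throw e
    } finally {
        // finally executes irrespective of the stage outcome.
        if (!failed) {
            deleteDir() // clean up only on success; keep failed workspaces for inspection
        }
    }
}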

Is there a way to run a pre-checkout step in declarative Jenkins pipelines?

Jenkins declarative pipelines offer a post directive to execute code after the stages have finished. Is there a similar thing to run code before the stages run and, most importantly, before the SCM checkout?
For example something along the lines of:
pre {
    always {
        rm -rf ./*
    }
}
This would then clean the workspace of my build before the source code is checked out.
pre is a cool feature idea, but doesn't exist yet. skipDefaultCheckout and checkout scm (which is the same as the default checkout) are the keys:
pipeline {
    agent { label 'docker' }
    options {
        skipDefaultCheckout true
    }
    stages {
        stage('clean_workspace_and_checkout_source') {
            steps {
                deleteDir()
                checkout scm
            }
        }
        stage('build') {
            steps {
                echo 'i build therefore i am'
            }
        }
    }
}
For the moment there are no pre-build steps, but for what you are looking for, it can be done in the pipeline job configuration (and also in multibranch pipeline jobs): where you define the location of your Jenkinsfile, choose Additional Behaviours -> Wipe out repository & force clone.
Delete the contents of the workspace before building, ensuring a fully fresh workspace.
If you do not really want to delete everything and want to save some network usage, you can use this other option instead: Additional Behaviours -> Clean before checkout.
Clean up the workspace before every checkout by deleting all untracked files and directories, including those which are specified in .gitignore. It also resets all tracked files to their versioned state. This ensures that the workspace is in the same state as if you had cloned and checked out into a brand-new empty directory, and ensures that your build is not affected by the files generated by the previous build.
This one will not delete the workspace, but just resets the repository to its original state and pulls new changes if there are any.
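The same behaviours are also available to pipeline code as checkout extensions; a minimal sketch (the repository URL and branch are placeholders):
checkout([$class: 'GitSCM',
    branches: [[name: '*/main']],
    userRemoteConfigs: [[url: 'https://example.com/repo.git']],
    // CleanBeforeCheckout behaves like "Clean before checkout" in the UI;
    // use [$class: 'WipeWorkspace'] for "Wipe out repository & force clone".
    extensions: [[$class: 'CleanBeforeCheckout']]])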
I use "Prepare an environment for the run / Script Content"
