How to get the file path from the workspace in Jenkins

My jenkins is located at myjenkins:8080/
After I run my Job it generates one file and I would like to get the complete path of it to give it to the user.
Example:
myjenkins:8080/job/FUTURE/job/GetFullPath/2/execution/node/3/ws/
I want to give this path to the user and the user would see the file generated there.
pipeline {
    agent { label env.Machine }
    stages {
        stage('PREPARE') {
            steps {
                script {
                    env.custom_stage_name = "PREPARE"
                    bat '%FOLDER%\\CreateFile.bat'
                }
            }
        }
        stage('BUILD') {
            steps {
                ws("c:\\jenkins\\workspace\\Test") {
                    bat 'xcopy ' + '%FOLDER%\\File.txt ' + "c:\\jenkins\\workspace\\Test /I /s /e /h /y"
                }
            }
        }
    }
}

Based on your title it's not clear to me whether you just need a way to get the current working directory or whether you want to expose that directory so it can be accessed with a browser.
If you meant the second case: first of all, I don't think this is possible without a workaround, and there are some problems you need to work around!
First, let me show you how to get the path to the workspace (in case it is not obvious).
You can get the path to your workspace by using the variable ${env.WORKSPACE}.
Example:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                echo "${env.WORKSPACE}"
            }
        }
    }
}
Concatenate the file path:
def my_file = "${env.WORKSPACE}/my.file"
Now to your actual problem
Unless your filename differs every run, it will get overwritten. There is no guarantee that the file in your workspace folder is preserved. To keep it, you should tell Jenkins to archive your artifacts (see point 2).
I can't imagine any good reason to expose your workspace the way you want to.
So let me first give you some alternatives:
You want to give or send Jenkins's log files to someone: you can do this using the email-ext plugin, which allows attaching the log.
This is a neat way to get information about the build state and to send the log out to people.
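A minimal sketch of such a notification from a declarative pipeline, assuming the email-ext plugin is installed (the recipient address is a placeholder):
post {
    failure {
        // emailext is provided by the email-ext plugin; attachLog attaches the build log
        emailext subject: "Build failed: ${currentBuild.fullDisplayName}",
                 body: "See the attached build log for details.",
                 to: 'someone@example.com',  // placeholder recipient
                 attachLog: true
    }
}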
You want to save your build artifacts - and I think this is what you actually want.
In this case you should archive them. See jenkins-docs on how to do this.
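A minimal sketch, assuming the generated File.txt from the question ends up in the job's workspace:
post {
    always {
        // Archived files stay attached to the build and can be downloaded from the build page
        archiveArtifacts artifacts: 'File.txt', fingerprint: true
    }
}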
If the "archive artifacts" also do not fit your needs: You can use a separate Freestyle project. A Freestyle projects allows viewing and downloading (as a zip-file) its workspace content. But: This is by far not the best solution!
Basically you copy your files from your pipeline project to this freestyle project. You can access it using the freestyle project's URL.
If you really want to expose the workspace using a URL
This involves a separate webserver, since I can't think of a way to tell Jenkins to expose its working directories as a web service.
If you just want to gain access to your workspace folder, you can expose it using a separate webserver. The idea is to run a very simple web server that serves the current workspace directory.
Here is one example of how to do exactly this using Python:
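A minimal sketch, run directly on the Jenkins machine rather than inside the pipeline (assumes Python 3.7+ is installed; the workspace path and port are placeholders, and anyone who can reach that port can read every file in the served directory):
python -m http.server 8000 --directory c:\jenkins\workspace\Test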

Related

Find artifacts in post-build actions

In a nutshell:
How can I access the location of the produced artifacts within a shell script started in a build or post-build action?
The longer story:
I'm trying to set up a Jenkins job to automate the building and propagation of Debian packages.
So far, I have already been successful in using the debian-pbuilder plugin to perform the build process, so that Jenkins presents the final artifacts after the job finishes successfully:
mypackage_1+020200224114528.NOREV.4_all.deb
mypackage_1+020200224114528.NOREV.4_amd64.buildinfo
mypackage_1+020200224114528.NOREV.4_amd64.changes
mypackage_1+020200224114528.NOREV.4.dsc
mypackage_1+020200224114528.NOREV.4.tar.xz
Now I would like to also automate the deployment process into the local reprepro repository, which would actually just require invoking a simple shell script I've put together.
My problem: I find no way to determine the artifact location for that deployment script to operate on. The "debian-pbuilder" plugin generates the artifacts in a temporary directory ($WORKSPACE/binaries.tmp15567690749093469649), which changes with every build.
Since the artifacts are listed properly in the finished job status view, I would expect the artifact details to be provided to the script (e.g. by environment variables). But that is obviously not the case.
I've already searched extensively for a solution, but didn't find anything helpful.
Or is it me (still somewhat of a rookie in Jenkins), following a wrong approach here?
You can use archiveArtifacts. You have the binaries.tmp* directory in the workspace and you can use it, but clear the workspace using deleteDir() before the build runs.
Pipeline example:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                deleteDir()
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'binaries*/**', fingerprint: true
        }
    }
}
You can also check https://plugins.jenkins.io/copyartifact/

How to manage multiple Jenkins pipelines from a single repository?

At this moment we use JJB to compile Jenkins jobs (mostly pipelines already) in order to configure about 700 jobs, but JJB2 does not seem to scale well for building pipelines, and I am looking for a way to drop it from the equation.
Mainly, I would like to be able to have all these pipelines stored in a single centralized repository.
Please note that keeping the CI config (Jenkinsfile) inside each repository and branch is not possible in our use case, we need to keep all pipelines in a single "jenkins-jobs.git" repo.
As far as I know this is not possible yet, but in progress. See: https://issues.jenkins-ci.org/browse/JENKINS-43749
I think this is the purpose of Jenkins shared libraries.
I didn't develop such a library myself, but I am using some. Basically:
Develop the "shared code" of the Jenkins pipeline in a shared library
it can contain the whole pipeline (sequence of steps)
Add this library to the Jenkins server
In each project, add a Jenkinsfile that imports it using @Library (see the sketch below)
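A minimal Jenkinsfile sketch for that last step, assuming the library was registered on the Jenkins server under the name 'my-shared-lib' and defines a global step in vars/buildPipeline.groovy (both names are placeholders):
@Library('my-shared-lib') _  // loads the configured shared library; the trailing '_' is required

buildPipeline()  // global step defined in vars/buildPipeline.groovy of the library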
As @Juh_ said, you can use Jenkins shared libraries. Here are the complete steps. Suppose that we have three branches:
master
develop
stage
and we want to create a single Jenkinsfile so that we only have to change things in one place. All you need to do is create a new branch, e.g. common. This branch MUST have the standard shared-library structure. What we are interested in for now is adding a new Groovy file in the vars directory, e.g. common.groovy. Here we can put the common Jenkinsfile logic that you wish to be used across all branches.
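For reference, a shared-library branch typically has this layout (only vars/common.groovy is needed for this example; the other directories are optional):
vars/common.groovy      (global step used below)
src/                    (optional: Groovy helper classes)
resources/              (optional: static files)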
Here is a sample:
def call() {
    node {
        stage("Install Stage from common file") {
            if (env.BRANCH_NAME.equals('master')) {
                echo "npm install from common files master branch"
            }
            else if (env.BRANCH_NAME.equals('develop')) {
                echo "npm install from common files develop branch"
            }
        }
        stage("Test") {
            echo "npm test from common files"
        }
    }
}
You must wrap your code in a call function in order for it to be usable from other branches. Now that we have finished the work in the common branch, we need to use it in our branches. Go to any branch in which you wish to use this pipeline, e.g. master, create a Jenkinsfile, and put this one line of code in it:
common()
This will call the common function that you created earlier in the common branch and will execute the pipeline.
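Note that the one-liner above assumes the common branch is configured as an implicitly loaded shared library on the Jenkins server. If it is not loaded implicitly, the Jenkinsfile has to reference the library explicitly; a sketch, assuming it was registered under the placeholder name 'common-lib' with the common branch as its version:
@Library('common-lib@common') _  // 'common-lib' is the configured library name, 'common' the branch to load

common()  // runs the pipeline defined in vars/common.groovy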

Jenkins pipeline share information between jobs

We are trying to define a set of jobs on Jenkins that will do really specific actions. JobA1 will build a Maven project, while JobA2 will build .NET code; JobB will upload it to Artifactory, JobC will download it from Artifactory, and JobD will deploy it.
Every job will have a set of parameters so we can reuse the same job for any product (around 100).
The idea behind this is to create black boxes: I call a job with some input and I always get some output; whatever happens in between is something I don't care about. On the other side, this allows us to improve each job separately, adding the required complexity, and all products instantly benefit.
We want to use Jenkins Pipeline to orchestrate the execution of actions. We are going to have a pipeline per environment/usage.
PipelineA will call JobA1, then JobB to upload to artifactory.
PipelineB will download the package (JobC) and then deploy it to staging.
PipelineC will download the package (JobC) and then deploy it to production based on some internal validations.
I have tried to get some variables from JobA1 (basic POM stuff such as ArtifactID or Version) injected into JobB, but the information does not seem to be transferred.
The same happens when downloading files: I call JobC, but the file ends up in that job's workspace and is not available to any other job, and I'm afraid that the "External Workspace Manager" plugin adds too much complexity.
Is there any way other than sharing the workspace to achieve my purpose? I understand that sharing the workspace would make it impossible to run two pipelines at the same time.
Am I following the right path or am I doing something weird?
There are two ways to share info between jobs:
You can use stash/unstash to share the files/data between multiple jobs in a single pipeline.
stage('HostJob') {
    build 'HostJob'
    dir('/var/lib/jenkins/jobs/Hostjob/workspace/') {
        sh 'pwd'
        stash includes: '**/build/fiblib-test', name: 'app'
    }
}
stage('TargetJob') {
    dir("/var/lib/jenkins/jobs/TargetJob/workspace/") {
        unstash 'app'
        build 'Targetjob'
    }
}
In this manner, you can always copy files/executables/data from one job to the other. This feature of the Pipeline plugin is better than archiving artifacts, as it keeps the data only locally; the stash is deleted after the build (which helps with data management).
You can also use Copy Artifact Plugin.
There are a few things to consider for copying an artifact:
a) Archive the artifacts in the host project and assign permissions.
b) After building a new job, select the 'Permission to copy artifact' → Projects to allow copy artifacts: *
c) Create a Post-build Action → Archive the artifacts → Files to archive: "select your files"
d) Copy the artifacts required from host to target project.
Create a Build action → Copy artifacts from another project → Enter the '$Project name - Host project', which build 'e.g. Latest successful build', Artifacts to copy '$host project folder', Target directory '$localfolder location'.
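For a pipeline job, step d) can also be expressed with the copyArtifacts step provided by the Copy Artifact plugin; a minimal sketch, where 'host-project' is a placeholder for the archiving job's name:
// Copies the artifacts archived by the last successful build of 'host-project'
// into the current job's workspace.
copyArtifacts projectName: 'host-project', selector: lastSuccessful()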
For the first part of your question (passing variables between jobs), please use the command below in a post-build section:
post {
    always {
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}
The above post-build action runs for all build results. Similarly, the post-build action could be triggered based on the current build status (see the sketch below). I have used the BRANCH parameter from the current build (JobA) as a parameter to be consumed by JobB (provide the exact location of the job). Please note that a matching parameter should be defined in JobB.
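For example, a sketch that triggers the downstream job only when the current build succeeds (same placeholder job path and parameter as above):
post {
    success {
        // Only trigger JobB when this build finished successfully
        build job: '/Folder/JobB', parameters: [string(name: 'BRANCH', value: "${params.BRANCH}")], propagate: false
    }
}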
Moreover, for sharing the workspace you can refer to this link and share the workspace between the jobs.
You could use the Pipeline Shared Groovy Libraries plugin. Have a look at its documentation to implement libraries that multiple pipelines share and to define shared global variables.
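A minimal sketch of such a shared global variable, assuming a file vars/buildConfig.groovy in the library (the name and values are placeholders):
// vars/buildConfig.groovy
def artifactoryUrl() {
    return 'https://artifactory.example.com'  // placeholder URL shared by all pipelines
}

def deployTargets() {
    return ['staging', 'production']  // placeholder list of environments
}
Any pipeline that loads the library can then call buildConfig.artifactoryUrl() instead of repeating the value in every job.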

Multi-branch configuration with externally-defined Jenkinsfile

I have an open-source project, that resides in GitHub and is built using a build farm, controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, there is information that the Jenkinsfile would include that is not intrinsic to the project build itself, but rather is specific to the build environment instance, which may change.
The way I accomplished this is:
Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
Tell Jenkins via the Jenkinsfile to call a function defined in a single global library
Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings. For example, using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                sh 'python build.py'  // run the project's build script
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much a part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "Pipeline script from SCM" option and then check out the actual code, but this will force you to take care of the branch builds manually.
Another option would be to have a very basic Jenkinsfile that merely checks out an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm repository"
    flow.runFlow()
}
Where the pipeline.groovy file, which contains the actual pipeline, would look like this:
def runFlow() {
    // your pipeline code
}
// Has to exit with 'return this;' in order to be used as library
return this;

How to re-use groovy script in Jenkins Groovy Post Build plugin?

I have some Groovy code which I am planning to re-use in the Jenkins Groovy Post Build plugin of multiple jobs. How can I achieve this? Is there a place where I can store the script in a global variable and call it in the jobs wherever I need it?
You can load any Groovy file living on the Jenkins master from within the Groovy Postbuild step and execute it. For example, you could have a special directory on the C drive where all the common scripts live. I'll update my answer later with some code that shows you how to load the script in.
Update
Assuming you have a test.groovy file on your C: drive, it should be as simple as the following in Groovy Postbuild:
evaluate(new File("C:\\test.groovy"))
Please view the comment section of the Groovy Postbuild for more examples and possibly other ways.
Here is the solution that worked for me:
Installed the Scriptler plugin for Jenkins and saved the Groovy script in it. The script is now available in the JENKINS_HOME/scriptler/scripts directory. This way we can avoid the manual step of copying files to the Jenkins master.
Used the Groovy file in the post-build step:
def env = manager.build.getEnvironment(manager.listener)
evaluate(new File(env['JENKINS_HOME'] + "\\scriptler\\scripts\\GroovyForPostBuild.groovy"))
This is a copy of my answer to this similar question on StackOverflow:
If you wish to have the Groovy script in your Code Repository, and loaded onto the Build / Test Slave in the workspace, then you need to be aware that Groovy Postbuild runs on the Master.
For us, the master is a Unix Server, while the Build/Test Slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the Slave, and use a FilePath to the file.
The following worked for us:
// Get an Instance of the Build object, and from there
// the channel from the Master to the Workspace
build = Thread.currentThread().executable
channel = build.workspace.channel;
// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")
// Some have suggested that the "Not NULL" check is redundant
// I've kept it for completeness
if(fp != null)
{
// 'Evaluate' requires a string, so read the file contents to a String
script = fp.readToString();
// Execute the script
evaluate(script);
}
