Best way to configure a Jenkins job running on different slaves

I want to run a Jenkins job on 4 different slaves (Windows, Linux, Solaris, Mac). Instead of making 4 different jobs I want to have a single job, and I can use a Node parameter to execute on different slaves. My job runs a script which uses the slave's Jenkins workspace and a few other scripts. My script is in a different folder on each slave, and the other required scripts are in different folders too. So for now I have created 4 different jobs, one per slave, with the Jenkins workspace and the other script paths hard-coded.
Is there any way to put all the paths in some JSON-like structure so that the job picks the right paths depending on the slave? That way I would need only 1 job.
Please suggest. Thanks in advance!

My idea is to use e.g. "Execute system Groovy script" to get the slave name, then use if statements to assign the proper path and create a parameter visible in the environment variables:
import hudson.model.Computer
import hudson.model.StringParameterValue
import hudson.model.ParametersAction

// get slave name
def slaveName = Computer.currentComputer().getNode().name
def path

// choose path
if (slaveName.equals("slave01")) {
    path = "C:"
} else if (slaveName.equals("slave02")) {
    path = "/root"
} else if (slaveName.equals("slave03")) {
    path = "D:"
}

// pass path as env. variable
build.addAction(new ParametersAction(new StringParameterValue('path', path)))
Then you can use the variable path in a command:
echo %path%
Or use the Conditional BuildStep Plugin to set up separate steps for each operating system and control when each step should be executed.
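If you prefer the JSON-like structure mentioned in the question, a map keyed by slave name keeps the same system Groovy approach compact. A minimal sketch, using the same placeholder slave names and paths as above:
import hudson.model.Computer
import hudson.model.StringParameterValue
import hudson.model.ParametersAction

// map each slave name to its script path (placeholder values)
def paths = [
    slave01: 'C:',
    slave02: '/root',
    slave03: 'D:'
]

def slaveName = Computer.currentComputer().getNode().name
def path = paths[slaveName]

// pass the chosen path as an environment variable, as before
build.addAction(new ParametersAction(new StringParameterValue('path', path)))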

Jenkins is designed to check out files from a version control system (Subversion, Git, whatever) and run tasks. Instead of trying to manage separate files on separate slaves, put your scripts under version control and let Jenkins check them out into the workspace as part of its build process.
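As a rough illustration, a Pipeline job can check the scripts out and pick the right one per OS. A minimal sketch; the repository URL and script names are hypothetical placeholders:
node {
    // check the scripts out of version control into the workspace
    git url: 'https://example.com/your/scripts-repo.git'

    // run the script that matches the node's OS
    if (isUnix()) {
        sh './build.sh'
    } else {
        bat 'build.bat'
    }
}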

Related

How to use the "Extensible Choice Parameter" in a Jenkins freestyle job located on the slave to show files

I have a freestyle job and a parameterized build.
I want to populate an Extensible Choice with all XML file names inside my workspace.
Both the job and the workspace are on the slave.
The Base Directory textbox says that every relative path has JENKINS_HOME as its root, and that is the Jenkins location on the master. Something like C:/ has the same outcome.
I have the same problem with the Active Choices parameter.
I don't know how to get access to my workspace from the Groovy script for my parameter.
I've tried the following:
import groovy.io.FileType

def list = []
def dir = new File("C:/<path>")
dir.eachFileRecurse(FileType.FILES) { file ->
    list << file
}
It results in a FileNotFoundException. When I input a path on the master, it works fine.
I labeled the slave and the job correctly (the job is only executed on the slave).
Does anyone have a solution?

How to re-use groovy script in Jenkins Groovy Post Build plugin?

I have some Groovy code which I am planning to re-use in the Jenkins Groovy Post Build plugin of multiple jobs. How can I achieve this? Is there a place I can store the script in a global variable and call that in the jobs wherever I need it?
You can load any Groovy file living on the Jenkins master within the Groovy Postbuild and execute it. For example, you could have a special directory on the C: drive where all the common scripts live. I'll update my answer later with some code that shows you how to load the script in.
Update
Assuming you have a test.groovy file on your C: drive, it should be as simple as the following in Groovy Postbuild:
evaluate(new File("C:\\test.groovy"))
Please view the comments section of the Groovy Postbuild plugin page for more examples and possibly other ways.
Here is the solution that worked for me:
Installed the Scriptler plugin for Jenkins and saved the Groovy script in it. Now the script is available in the JENKINS_HOME/scriptler/scripts directory. This way we avoid the manual step of copying files to the Jenkins master.
Used the Groovy file in the post-build step:
def env = manager.build.getEnvironment(manager.listener)
evaluate(new File(env['JENKINS_HOME'] + "\\scriptler\\scripts\\GroovyForPostBuild.groovy"))
This is a copy of my answer to this similar question on StackOverflow:
If you wish to keep the Groovy script in your code repository and have it loaded onto the build/test slave in the workspace, then you need to be aware that Groovy Postbuild runs on the master.
For us, the master is a Unix server, while the build/test slaves are Windows PCs on the local network. As a result, prior to using the script, we must open a channel from the master to the slave and use a FilePath to the file.
The following worked for us:
import hudson.FilePath

// Get an instance of the build object, and from there
// the channel from the master to the workspace
build = Thread.currentThread().executable
channel = build.workspace.channel

// Open a FilePath to the script
fp = new FilePath(channel, build.workspace.toString() + "<relative path to the script in Unix notation>")

// Some have suggested that the "not null" check is redundant;
// I've kept it for completeness
if (fp != null) {
    // 'evaluate' requires a string, so read the file contents into a String
    script = fp.readToString()
    // execute the script
    evaluate(script)
}

Jenkins Multijob - pass project name parameter from Build_job1 to Deploy_job2

On a MultiJob I have two phases:
PhaseA running Build_job1, with project name Build_job1, pulling stuff from Git into dir /var/lib/jenkins/workspace/Build_job1
PhaseB running Deploy_job2, which rsyncs /var/lib/jenkins/workspace/Build_job1/* to a bunch of servers.
For internal reasons I need to replicate the MultiJob, the build job, and the deploy job to different environments (PROD, QA, Staging). In each case, the deploy job's rsync will need to copy files from a different build directory (Build_QA, Build_Prod, Build_whatever, etc.).
As Jenkins creates the dir per project name, I need the rsync command in the deploy job to get the project name as a parameter that is passed down from the build job.
Help?
Do you want to pass the current job's project name down to its children? If so, you can pass this information down via the Jenkins-set environment variable "JOB_NAME" in conjunction with a predefined job parameter. For example, something like:
Param1=${JOB_NAME}
If the MultiJob's name is "QA", you can pass that down to both the build and deploy phase jobs via a predefined parameter and then construct the final "Build_QA" path by doing something like "Build_${Param1}" or "Build_%Param1%".

Get absolute path to workspace directory in Jenkins Pipeline plugin

I'm currently doing some evaluation of the Jenkins Pipeline plugin (formerly known as the Workflow plugin).
Reading the documentation, I found out that I currently cannot retrieve the workspace path using env.WORKSPACE:
The following variables are currently unavailable inside a workflow script:
NODE_LABELS
WORKSPACE
SCM-specific variables such as SVN_REVISION
Is there any other way to get the absolute path to the current workspace? I need this to run some tests which in turn get some parameter (the absolute path to some executable file).
I already tried using new File("").absolutePath() inside a @NonCPS section, but it looks like the non-CPS stuff always gets executed on the master.
Does anybody have a clue how to get this path without running some batch script which stores the path into a file that can later be read in again?
Since version 2.5 of the Pipeline Nodes and Processes Plugin (a component of the Pipeline plugin, installed by default), the WORKSPACE environment variable is available again. This version was released on 2016-09-23, so it should be available on all up-to-date Jenkins instances.
Example
node('label') {
    // now you are on the slave labeled with 'label'
    def workspace = WORKSPACE
    // ${workspace} will now contain an absolute path to the job workspace on the slave

    workspace = env.WORKSPACE
    // ${workspace} will still contain an absolute path to the job workspace on the slave

    // when using a GString, at least later Jenkins versions could only handle the env.WORKSPACE variant:
    echo "Current workspace is ${env.WORKSPACE}"

    // current Jenkins instances support the short syntax, too:
    echo "Current workspace is $WORKSPACE"
}
Note: the pwd() solution below works only if the slaves have the same directory structure as the master, since pwd() will return the workspace directory on the master due to JENKINS-33511.
I used to do it using the pwd() functionality of the Pipeline plugin. So, if you need to get the workspace on a slave, you may do something like this:
node('label') {
    // now you are on the slave labeled with 'label'
    def workspace = pwd()
    // ${workspace} will now contain an absolute path to the job workspace on the slave
}
"WORKSPACE" environment variable works for the latest version of Jenkins Pipeline. You can use this in your Jenkins file: "${env.WORKSPACE}"
Sample use below:
def files = findFiles glob: '**/reports/*.json'
for (def i = 0; i < files.length; i++) {
    jsonFilePath = "${files[i].path}"
    jsonPath = "${env.WORKSPACE}" + "/" + jsonFilePath
    echo jsonPath
}
Hope that helps!
For me, WORKSPACE was a valid property of the pipeline itself. So when I handed the pipeline script over to a Groovy method as a parameter named context, I was able to access the correct value using "... ${context.WORKSPACE} ..."
(on Jenkins 2.222.3, Build Pipeline Plugin 1.5.8, Pipeline: Nodes and Processes 2.35)
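A minimal sketch of that pattern, assuming a hypothetical helper method printWorkspace that receives the pipeline script as its context:
// hypothetical helper; 'context' is the pipeline script, passed in as 'this'
def printWorkspace(def context) {
    echo "Current workspace is ${context.WORKSPACE}"
}

node {
    printWorkspace(this)
}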

Calling a Scriptler script within another Scriptler script

I'm using the Scriptler plugin for Jenkins, and am having a hard time finding any information on how to share the Scriptler scripts I'm writing between scripts. I've tried using the ScriptHelper from the Scriptler API, but have run into issues when passing arguments to the script.
Has anyone else come across this and solved it? Is there a standard way to do this (without calling the Jenkins REST API) to execute a script?
More Details
We have a full build MultiJob that contains many phase jobs, each with their own artifacts, with a 3-day time-to-live on them. When this full build job is promoted, a Scriptler script runs against it, pulling each of the phase jobs' artifacts into the full build job. By doing so, we can keep the full build alive forever without changing the lifetime of the artifacts for each phase job (essentially 'keep this build forever' on the full build, ignoring the lifetimes set in the phase jobs).
We also want to pull these artifacts into a deploy job. The idea is that we can point a deploy job at a full build, and it will pull out the artifacts we specify. If the full build is promoted, this script will pull the artifacts directly from the full build job; otherwise, it will pull them from the internal phase jobs. Since we have 2 scripts that work with MultiJobs, I would like to be able to share this code between them.
The script would take a MultiJob name and build number, and return the individual phase jobs' build numbers, build statuses, and artifact information.
This is possible using Groovy capabilities, though I don't know if Scriptler supports it directly. If you are running on the master node, you can use Groovy's evaluate. Scriptler scripts are stored as Groovy files on the file system of the master node in the $JENKINS_HOME/scriptler/scripts directory. The Scriptler ID is the file name within that directory.
Here is a very simple example. It uses two files. The first is the parameterized function, findByScm.groovy, which finds jobs using a given source control type. The second script, findByGitScm.groovy, will evaluate the first function for Git SCMs and print the results. This works because evaluate runs the file with the calling script's binding, so variables assigned without def (such as myScmType and scmJobs) are shared between the two scripts.
findByScm.groovy
import jenkins.model.*

jenkins = Jenkins.instance
// Notice that myScmType is not defined in this function
scmJobs = jenkins.items
    .findAll { job -> job.scm != null && job.scm.type == myScmType }
findByGitScm.groovy
import jenkins.model.Jenkins

// This is supplying the argument to findByScm.groovy
myScmType = 'hudson.plugins.git.GitSCM'

// Now we are evaluating the script; Jenkins.instance.rootDir resolves
// JENKINS_HOME reliably even when it is not set as a system property
evaluate(new File(Jenkins.instance.rootDir, 'scriptler/scripts/findByScm.groovy'))

// scmJobs is a variable which was introduced in findByScm.groovy
scmJobs.each { println it }
