Run script on multiple hosts using jenkins pipeline groovy script - jenkins

I am new to Jenkins pipeline groovy script.
I have a job parameter named 'HOSTNAMES' which takes host name values separated by commas as input. I need to execute some scripts on all these hosts in parallel. How can I achieve this using a Jenkins Groovy script? Please guide me.

Assuming all these nodes (hostnames) are connected to the Jenkins server, and the HOSTNAMES parameter is a list of the node names (as they appear in the Jenkins server), you can use the parallel step to achieve what you want.
You will have to transform each node name into a map entry representing a parallel execution branch and then run them using the parallel keyword.
Something like:
def hosts = HOSTNAMES.split(',').collect { it.trim() }  // trim in case of spaces after commas
def executions = hosts.collectEntries { host ->
    ["Running on ${host}": {
        node(host) {
            // The code to run on each node
            stage("Execution") {
                echo "example code on ${host}"
                // ...
            }
        }
    }]
}
parallel executions
You can also run it in one line if you want:
parallel hosts.collectEntries { ...
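Spelled out fully, the one-line form could look something like this (the same sketch as above, with a hypothetical echo standing in for the real work):

```groovy
parallel HOSTNAMES.split(',').collectEntries { host ->
    ["Running on ${host.trim()}": {
        node(host.trim()) {
            stage("Execution") {
                echo "example code on ${host.trim()}"
            }
        }
    }]
}
```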

Related

Run multiple stages of Jenkins declarative pipe line in same docker slave

Objective
The objective here is to migrate a scripted Jenkins pipeline to declarative. The scripted pipeline runs on a Docker slave managed by Kubernetes, and the working syntax is as below:
slave = 'dtr#tes.com/namespace/image:1.0'
dockerNode(image: slave) {
    stage('1') { echo "1" }
    stage('2') { echo "2" }
}
The scripted pipeline works perfectly.
Concerns
I am trying to use dockerNode in a declarative pipeline, but in declarative syntax dockerNode is only allowed inside the steps of a stage.
e.g.:
pipeline {
    agent any
    stages {
        stage('1and2') {
            steps {
                dockerNode(image: slave) {
                    echo "1"
                    echo "2"
                }
            }
        }
    }
}
This forces us to club bulky steps into one stage rather than splitting them across multiple stages. So we would like your help to understand how we can better structure this and have multiple stages that always run in the same container. The container images are managed by Kubernetes (a kube pod with Docker images).
To use one container for all steps you need to specify it in the agent section:
pipeline {
    agent {
        label 'docker-agent-label'
    }
}
To use it like this, you need to configure a pod template under 'Manage Jenkins' -> 'Manage Nodes and Clouds' -> 'Configure Clouds' -> 'Add new cloud', or use an existing one.
It must be a Kubernetes cloud if your Jenkins host is integrated with k8s.
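A sketch of how this could look with the Kubernetes plugin, assuming a hypothetical inline pod template (the image name and container name are placeholders); every stage then runs in the same pod and container:

```groovy
pipeline {
    agent {
        kubernetes {
            // hypothetical pod template; adjust the image to your registry
            yaml '''
apiVersion: v1
kind: Pod
spec:
  containers:
  - name: build
    image: namespace/image:1.0
    command: ['sleep', 'infinity']
'''
            defaultContainer 'build'
        }
    }
    stages {
        stage('1') { steps { echo "1" } }
        stage('2') { steps { echo "2" } }
    }
}
```

Because the agent is declared at the pipeline level, both stages share the same pod, which avoids clubbing everything into one bulky stage.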

Running Groovy scripts in Pipeline on Slaves

I'm currently attempting to run Groovy code using Jenkins Pipeline, but I'm finding that my scripts run the Groovy part of the code on the Master rather than the Slaves despite me noting the agent.
I have a Repo in Git named JenkinsFiles containing the below JenkinsFile, as well as a repo named CommonLibrary, which contains the code run in the Stages.
Here's my JenkinsFile :
@Library(['CommonLibrary']) _
pipeline {
    agent { label 'Slave' }
    stages {
        stage("Preparation") {
            agent { label 'Slave && Windows' }
            steps {
                Preparation()
            }
        }
    }
}
Here's the Preparation.groovy file :
def call() {
    println("Running on : " + InetAddress.localHost.canonicalHostName)
}
Unfortunately, I always seem to get the Master returned when I run the Pipeline in Jenkins. I've tried manually installing Groovy on the Slave, and have also removed the Executors on the Master. Any Powershell that gets run triggers correctly on the Slave, and the $NODE_NAME value returns as the Slave, but it's just the Groovy commands that seem to run on the Master.
Any help would be greatly appreciated. Thanks! - Tor
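One hedged workaround sketch: plain Groovy such as InetAddress always executes on the master, while steps like powershell run on the allocated agent, so the library step could shell out instead (assuming a Windows agent, hence the powershell step):

```groovy
// Preparation.groovy — sketch: use a step (runs on the agent)
// instead of plain Groovy (runs on the master)
def call() {
    def host = powershell(script: '[System.Net.Dns]::GetHostName()', returnStdout: true).trim()
    echo "Running on : ${host}"
}
```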

Running a groovy script through jenkinsfile which runs on a remote linux box

I have an abc.groovy script which takes an argument. Locally I run it as
$ groovy abc.groovy <argumentValue>
I have stored this abc.groovy in a remote Linux box under the path "/home/path/to a directory/" and I have a Jenkins pipeline job with a Jenkinsfile. How can I call abc.groovy from the Jenkinsfile?
You can use GroovyShell to evaluate your script.
GroovyShell shell = new GroovyShell()
def execute = shell.parse(new File('/path/to/abc.groovy'))
execute.method()
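If the script reads its command-line argument from args (as it would when run via groovy abc.groovy <argumentValue>), one sketch is to emulate that through the script's binding; the path and argument value below are placeholders:

```groovy
// sketch: pass an "args" array to a parsed script, emulating the CLI
GroovyShell shell = new GroovyShell()
def script = shell.parse(new File('/home/path/to a directory/abc.groovy'))
script.binding.setVariable('args', ['argumentValue'] as String[])
script.run()
```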
You'll want to use the load step in your Jenkinsfile like this:
pipeline {
    agent { label 'slave' }
    stages {
        stage('Load Groovy Script') {
            steps {
                script {
                    load 'path/to/abc.groovy'
                }
            }
        }
    }
}
(This example uses the declarative pipeline syntax, but is easily ported to scripted)
Note: you can't pass parameters to the groovy script in the load step, however this isn't hard to work around.
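One common workaround (a sketch, assuming abc.groovy is reworked to expose a call method and to end with return this) is to use the object that the load step returns:

```groovy
// abc.groovy — reworked so it can be loaded and called with an argument:
//     def call(String argumentValue) {
//         echo "abc.groovy received: ${argumentValue}"
//     }
//     return this

// In the Jenkinsfile (inside a node / script context):
def abc = load '/home/path/to a directory/abc.groovy'
abc('someValue')   // invokes call(String) with the desired argument
```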

How to execute groovy/java code in the context of a jenkins-pipeline slave node block?

In this snippet:
stage('build') {
    node('myslave') {
        git(url: 'git@hostname:project.git')
        println(InetAddress.getLocalHost().getHostName())
    }
}
The git step is executed correctly and checks out code into node's workspace.
But why do I get Masters' hostname when executing the second command?
For example, this is also not working in the context of a node() {}:
new File("${WORKSPACE}").listFiles()
It does not actually iterate the ${WORKSPACE} folder.
All Groovy code in a Pipeline script is executed on the master. I've been unable to find any way to execute generic Groovy code on the slave; this is not due to a lack of functionality in the Jenkins core, but to problems with Pipeline Groovy and the serialisation of objects. Found this related question which addresses remoting in Groovy.
It is however possible to do file operations on the slave side, see this answer for example how you can access files on the slave.
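As a sketch of slave-side file access: Pipeline steps (unlike plain Groovy such as new File(...)) execute on the node that holds the workspace. The findFiles step below assumes the Pipeline Utility Steps plugin is installed:

```groovy
node('myslave') {
    // sh runs on the slave, so this prints the slave's hostname
    def host = sh(script: 'hostname', returnStdout: true).trim()
    echo "Step ran on: ${host}"

    // list workspace files on the slave (Pipeline Utility Steps plugin)
    def files = findFiles(glob: '**/*')
    files.each { f -> echo f.path }
}
```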

Jenkins - Running a single job in master as well as slave

I have a master (linux) and a windows slave set up, and would like to build a single job both on the master and the slave. The "Restrict where this project can be run" option allows us to bind the job to a particular slave but is it possible to bind one job to the master as well as slave? How would one configure the "Build Step" since running it on Windows requires a build with Windows batch command and Linux requires shell command. For example even if the job tries to run on master and slave, wouldn't it fail at one point since both the build options (with batch and shell command) will be executed?
Well, in Jenkins you can create groups of machines (either master or slaves). To do this:
- click on the machine name on the first page of Jenkins
- enter the node configuration menu
- then, you can enter some labels in the Labels field. Let's add a multi_platform label for example
- go back to the first page of Jenkins
- do it for each machine on which you need to run the job
- go back to the first page of Jenkins
- click on the job you want to run on multiple nodes
- go into the configuration menu
- check the Restrict where this project can be run option and put multi_platform in it
Then, your build will be able to run on the multi_platform label.
For the second part, the multi-platform script, you can use Ant builds or Python builds (with the Python plugin).
EDIT: If you need to build on the 2 (or more) platforms, you should use a Matrix Job. You will be able to create a job and force it to run on every slave you need.
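In more recent Pipeline versions, the same idea can be sketched with a declarative matrix (requires Declarative Pipeline 1.3.0 or later; the 'linux' and 'windows' agent labels below are assumptions):

```groovy
pipeline {
    agent none
    stages {
        stage('Build') {
            matrix {
                axes {
                    axis {
                        name 'PLATFORM'
                        values 'linux', 'windows'   // assumed agent labels
                    }
                }
                agent { label "${PLATFORM}" }
                stages {
                    stage('build') {
                        steps {
                            // pick the right shell per platform
                            script {
                                if (isUnix()) {
                                    sh './build.sh'
                                } else {
                                    bat 'build.bat'
                                }
                            }
                        }
                    }
                }
            }
        }
    }
}
```

Each axis value gets its own agent, so the batch-vs-shell conflict from the question disappears: every platform runs only its own build step.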
This is how you should do it:
import groovy.json.JsonSlurperClassic

def requestNodes() {
    def response = httpRequest url: '<your-master-url>/computer/api/json', authentication: '<configured-authentication>'
    println("Status: " + response.status)
    return new JsonSlurperClassic().parseText(response.content)
}

def Executor(node_name) {
    return {
        stage("Clean ${node_name}") {
            node(node_name) {
                //agent {node node_name}
                echo "ON NODE: ${node_name}."
            }
        }
    }
}

def makeAgentMaintainer() {
    def nodes = requestNodes()
    def agent_list = []
    for (e in nodes['computer']) {
        echo e.displayName
        if (!e.offline) {
            if (e.displayName != "master") {
                agent_list.add(e.displayName)
            }
        }
    }
    def CleanAgentsMap = agent_list.collectEntries {
        ["${it}": Executor(it)]
    }
    return CleanAgentsMap
}

node {
    parallel makeAgentMaintainer()
}
You will need the HTTP Request plugin and some script approvals.
In the Executor function you can define the commands that you want to execute on every agent.
