I have the following code as part of my declarative pipeline:
String CRON_SETTINGS = BRANCH_NAME ==~ /(master|.*release.*)/ ? '''30 23 * * * % param1=value1''' : ""
pipeline {
    parameters {
        choice(name: 'param1', choices: ['value1', 'value2'], description: 'param')
    }
    triggers {
        parameterizedCron(CRON_SETTINGS)
    }
}
Currently the cron behaves in the following way:
every night at 23:30, a build of the job is triggered if my branch name is master or contains the string 'release', always with the value of param1 set to value1.
What I would like to achieve is this:
in case the branch name is master, run the cron with value1 set for the param1 parameter;
however, if the branch name contains 'release', run the cron with value2 set for the param1 parameter.
Would appreciate your help to achieve this.
Thanks.
Late answer and I haven't tested this, but how about using a switch statement?
switch (BRANCH_NAME) {
    case 'master':
        cronSettings = '30 23 * * * % param1=value1'
        break
    case ~/.*release.*/:    // a plain String case label is compared with equals(), so use a regex pattern here
        cronSettings = '30 23 * * * % param1=value2'
        break
    default:
        cronSettings = ""
        break
}
pipeline {
    parameters {
        choice(name: 'param1', choices: ['value1', 'value2'], description: 'param')
    }
    triggers {
        parameterizedCron(cronSettings)
    }
}
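The switch dispatch itself can be checked outside Jenkins. A minimal plain-Groovy sketch (branch names are made up); note that a String case label is compared with equals(), so the release wildcard must be a regex pattern (`~/.../`), which matches against the whole branch name:

```groovy
// Plain-Groovy sketch of the branch-to-cron mapping used above.
def cronFor = { branch ->
    switch (branch) {
        case 'master':        return '30 23 * * * % param1=value1'
        case ~/.*release.*/:  return '30 23 * * * % param1=value2'   // Pattern case: full-string regex match
        default:              return ''
    }
}

assert cronFor('master')         == '30 23 * * * % param1=value1'
assert cronFor('my-release-1.0') == '30 23 * * * % param1=value2'
assert cronFor('feature/foo')    == ''
```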
I'm running the following code, where the idea is to run as many "TestRunner" builds as possible during the night. I've removed some unnecessary code for easier reading, so some variables may not be defined here.
I want to know for each job whether it was successful, aborted, or failed, and when I'm using parallel I'm not able to see it.
How can I modify my code so I can print each job's state when it's done? A variable I add gets wiped out to the last element in the for loop.
Thanks a lot
def numberOfRuns = 0
def availableExecutors = 5
def parallelRuns = [:]

// building executors for later use in parallel
for (int i = 0; i < availableExecutors; i++) {
    parallelRuns[i] = {
        waitUntil {
            build job: 'TestRunner', parameters: [
                string(name: 'randomBit', value: "${randomBit}"),
            ], propagate: false
            numberOfRuns++
            def now = new Date()
            return (now > workDayMorning)
        }
    }
}

// Parallel stage - running all executors
parallel parallelRuns
I've tried to introduce a variable to track job progress, and I tried to use parallelRuns as an object, but I didn't manage to get the result (passed or not) of each test.
A solution I've found is:
def Job = build job: 'TestRunner', parameters: [
    string(name: 'randomBit', value: "${randomBit}")
], propagate: false
echo "Runner: ${Job.getDisplayName()} has ${Job.getResult()} with duration: ${Job.getDuration()}"
This prints data of the job.
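To keep every runner's result instead of only the last one, one approach (an untested sketch; the map names and the stubbed build result are illustrative) is to store each outcome in a shared map keyed per branch, taking a per-iteration copy of the loop variable so the closures don't all share the final value of `i`:

```groovy
// Sketch: collect each parallel branch's outcome in a shared map.
// In Jenkins the closure body would call:
//   def run = build job: 'TestRunner', propagate: false
// and store run.getResult(); here that call is stubbed out.
def results = [:]
def parallelRuns = [:]

for (int i = 0; i < 5; i++) {
    def idx = i                              // per-iteration copy: closures must not share 'i'
    def key = "runner-${idx}".toString()     // convert GString to String before using as a map key
    parallelRuns[key] = {
        results[key] = "SUCCESS"             // stand-in for run.getResult()
    }
}

// parallel parallelRuns                     // the Jenkins step; invoked directly here for illustration
parallelRuns.each { name, body -> body() }

results.each { name, res -> println "${name} finished with ${res}" }
assert results.size() == 5
```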
I want to perform a sum of integers (c = a + b) in Jenkins. Initially I defined the value of a with def a = 5, but I want to take the value of b from parameters, so I added a string parameter. However, it is not treated as an integer; instead the two values are just concatenated. Is there any way I can take the input for b from a parameter and perform the addition?
The pipeline is as follows:
pipeline {
    agent any
    stages {
        stage('Stage 1') {
            steps {
                script {
                    def a = 5;
                    def b = "${params.inputvalue}";
                    c = "${a + b}";
                    echo "value of c is ${c}"
                }
            }
        }
    }
}
If I give the value of b as 2 in the parameter, the output is 25, but the expected output is 7, i.e. 2 + 5.
I could solve this by converting the string parameter to an integer:
int a = 10;
stage('arithmetic stage') {
    int b = params.Value.toInteger();   // params.Value is a String, so convert it explicitly
    c = a + b;
    echo "${c}"
}
Here "Value" is the name of the string parameter.
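The underlying behaviour can be seen in plain Groovy (a minimal sketch; the values are made up). Jenkins string parameters always arrive as Strings, and `+` on a String concatenates:

```groovy
def a = 5
def bParam = "2"               // a Jenkins string parameter is always a String
assert bParam + a == "25"      // '+' on a String concatenates, hence the "25"-style output
def b = bParam.toInteger()     // explicit conversion (params.Value as Integer also works)
assert a + b == 7              // now it is integer addition
```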
I have a Shared Library containing a declarative pipeline which numerous jobs use to build.
However, I want to be able to pass the triggers block from the Jenkinsfile into the Shared Library, as I want to trigger some jobs via cron, others via SNS, and others with an upstream job trigger.
Is there a way I can do this? Everything I have tried so far fails.
I have tried
// Jenkinsfile
@Library('build') _
buildAmi {
    OS = "amazonlinux"
    owners = "amazon"
    filters = "\"Name=name,Values=amzn-ami-hvm-*-x86_64-gp2\""
    template = "linux_build_template.json"
    trigger = triggers {
        cron('0 H(06-07) * * *')
    }
}

// Shared Lib
pipeline {
    $buildArgs.trigger
which fails with
Not a valid section definition
Have also tried passing just the cron schedule into the Shared lib e.g.
triggers {
    cron("${buildArgs.cron}")
}
but that gives the error
Method calls on objects not allowed outside "script" blocks
I have tried various other things, but it seems the declarative style requires a triggers block with just the triggers inside.
Does anyone know of a way to achieve what I am trying to do?
Too much code for a comment:
We wanted to merge some preset properties with jenkinsfile properties.
This is the way we got it to work, since the properties method can only be called once.
We used an additional pipeline parameter with the trigger info:
Jenkinsfile:
mypipeline(
    pipelineTriggers: [pollSCM(scmpoll_spec: '0 5,11,15,22 * * 1-5')],
)
then in the shared library
import groovy.transform.Field

@Field def pipelineProperties = [
    disableConcurrentBuilds(abortPrevious: true),
    buildDiscarder(logRotator(numToKeepStr: '10', artifactNumToKeepStr: '0'))
]

def pipelineTriggersArgs = args.pipelineTriggers

def setJobProperties() {
    def props = pipelineProperties
    def str = ""
    pipelineProperties.each { str += (blue + it.getClass().toString() + it.toString() + "\n") }
    if (pipelineTriggersArgs) {
        if (debugLevel > 29) {
            def plTrig = pipelineTriggers(pipelineTriggersArgs)
            str += (mag + "args: " + pipelineTriggersArgs.getClass().toString() + "\t\t" + pipelineTriggersArgs.toString() + "\n" + off)
            str += (red + "expr: " + plTrig.getClass().toString() + "\t\t" + plTrig.toString() + "\n" + off)
            echo str
        }
        props?.add(pipelineTriggers(pipelineTriggersArgs)) // pass param with list as the arg
    }
    properties(props)
}
That was the only way to merge presets and parameters, since the properties cannot be overwritten. Not exactly the answer, but an approach to solving it, I think.
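Stripped of the debug output, the core of the approach can be sketched like this (illustrative names; `mypipeline` and the preset property list are assumptions, not the exact library code):

```groovy
// Jenkinsfile: the caller passes its trigger list as an ordinary argument
mypipeline(
    pipelineTriggers: [cron('H 6 * * *')]
)

// vars/mypipeline.groovy in the shared library
def call(Map args) {
    // preset properties every job gets
    def props = [
        buildDiscarder(logRotator(numToKeepStr: '10'))
    ]
    // merge in caller-supplied triggers, if any
    if (args.pipelineTriggers) {
        props << pipelineTriggers(args.pipelineTriggers)
    }
    properties(props)   // properties() may only be called once, so merge first
    // ... rest of the pipeline
}
```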
I have a pipeline job which runs a sequence of actions (e.g. Build >> Run >> Report). I have put this sequence in a for loop, as I get a parameter containing a list of values for which I should repeat the same sequence. Please find the sample code I have written:
param_val = params.param_1
param_list = param_val.split()
def builds = [:]
for (int i = 0; i < param_list.size(); ++i) {
    def item_value = param_list[i].trim()   // local copy so each closure keeps its own value
    builds[i] = {
        stage('Build') {
            build 'Build', parameters: [string(name: 'item', value: item_value)]
        }
        stage('Run') {
            build 'Run', parameters: [string(name: 'item', value: item_value)]
        }
        stage('Reporting') {
            build 'Reporting', parameters: [string(name: 'item', value: item_value)]
        }
    }
}
parallel builds
I can get more than 100 values for the iteration, so it is difficult to track them in the pipeline stage view; it just grows to the right.
Is there any other plugin which can be used for the status view? Some tree-like view would be good.
I would like to dynamically generate my parameters on an input step from a loop inside my Jenkins pipeline.
This is my code:
def var = input message: 'Tags a saisir', ok: 'Build!',
    parameters: [
        choice(name: 'name1', choices: file1),
        choice(name: 'name2', choices: file2),
        choice(name: 'name3', choices: file3),
        choice(name: 'name4', choices: file4)
    ]
I would like to know if it is possible to generate each parameter from a loop like :
for (int i = 0; i < myList.size(); i++) {
    theChoice = "choice(name: " + myList.get(i) + ", choices: file" + i + ")"
}
and generate the input step from those lines.
Is that kind of approach possible ?
The main goal is to generate an input step with modular variables depending on the Jenkinsfile cloned from the Git SCM.
Regards,
Guillaume
You can do it like this:
def parameterNames = fileMap.keySet().toList()
def choices = []
for (int i = 0; i < parameterNames.size(); i++) {
    choices += choice(name: parameterNames[i], choices: fileMap[parameterNames[i]])
}
def var = input message: 'Tags a saisir', ok: 'Build!', parameters: choices
where fileMap is a Map which contains the possible choices for each parameter name (the key is a parameter name, the value is a String of possible choices).
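The same list can also be built with collect. A sketch, assuming a hypothetical fileMap of newline-separated choices (the keys and values here are made up):

```groovy
// Hypothetical fileMap: key = parameter name, value = newline-separated choice options
def fileMap = [
    name1: 'dev\nstage\nprod',
    name2: 'v1\nv2'
]

// build one choice parameter per map entry
def choices = fileMap.collect { name, opts -> choice(name: name, choices: opts) }

def var = input message: 'Tags a saisir', ok: 'Build!', parameters: choices
```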