How to throttle matrix configurations in Jenkins declarative pipeline

I need to throttle Matrix configurations in a Jenkins declarative pipeline. For now I have a simple matrix like this:
matrix {
    axes {
        axis {
            name 'Test'
            values 'Example1', 'Example2', 'Example3'
        }
    }
}
And I need to run only five configurations in parallel. First of all, I created a throttle category, ForTestMatrix, in the global Jenkins configuration.
Now I'm trying to use the throttle job property, and that's where I'm stuck. As I see here, in order to throttle matrix configurations we need to pass a matrixOptions parameter that contains two properties: throttleMatrixBuilds and throttleMatrixConfigurations.
options {
    throttleJobProperty(
        categories: ['ForTestMatrix'],
        throttleEnabled: true,
        throttleOption: 'category',
        matrixOptions: ???
    )
}
Could someone tell me how to pass an object with two properties as a parameter there?
UPD
I managed to run this code like this:
options {
    throttleJobProperty(
        categories: ['ForTestMatrix'],
        throttleEnabled: true,
        throttleOption: 'category',
        matrixOptions: new hudson.plugins.throttleconcurrents.ThrottleMatrixProjectOptions(false, true)
    )
}
But when I start this job, I see in Blue Ocean that all the matrix configurations start at once. Does anyone have ideas why the throttling doesn't work correctly?

See Throttle Concurrent Builds:
Unsupported use cases
This section contains links to the use cases which are not supported.
Throttling of code blocks without a node() definition
...
Throttling Pipeline via job properties
Starting in throttle-concurrents-2.0, using this option is not recommended. Use the throttle() step instead.
The following pipeline:
pipeline {
    agent any
    stages {
        stage('Matrix') {
            matrix {
                axes {
                    axis {
                        name 'Example'
                        values 'Example 1', 'Example 2', 'Example 3'
                    }
                }
                stages {
                    stage('Test') {
                        steps {
                            throttle(['ForTestMatrix']) {
                                node('master') {
                                    sh 'set +x; date; sleep 5; date'
                                }
                            } // throttle
                        } // steps
                    } // stage 'Test'
                } // stages
            } // matrix
        } // stage 'Matrix'
    } // stages
} // pipeline
gives this output:
...
[Matrix - Example = 'Example 1'] + set +x
Fri Oct 22 15:54:01 GMT 2021
Fri Oct 22 15:54:06 GMT 2021
...
[Matrix - Example = 'Example 2'] + set +x
Fri Oct 22 15:54:06 GMT 2021
Fri Oct 22 15:54:12 GMT 2021
...
[Matrix - Example = 'Example 3'] + set +x
Fri Oct 22 15:54:12 GMT 2021
Fri Oct 22 15:54:17 GMT 2021
...

Related

I have a single pipeline with two stages; each stage must run on a different day of the week

My Jenkins scripted pipeline has two different stages: one must be built on Friday and the other on Sunday...
I tried a cron trigger for each stage, but I want to filter based on the day of the week. Can I use a when condition for this?
node {
    stage('build on friday') {
        echo "Hai its friday"
    }
    stage('build on sunday') {
        echo "Hai its sunday"
    }
}
Solution
I included an imperative (scripted) example using Calendar.
Imperative Pipeline Example
node {
    def today = Calendar.getInstance()
    def dayOfWeek = today.get(Calendar.DAY_OF_WEEK)
    if (dayOfWeek == Calendar.FRIDAY) {
        stage('build on friday') {
            echo "Hai its friday"
        }
    }
    if (dayOfWeek == Calendar.SUNDAY) {
        stage('build on sunday') {
            echo "Hai its sunday"
        }
    }
}
Obviously a run would need to occur on Friday and Sunday, so your cron would need to reflect that. For example: 0 1 * * 0,5
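For completeness, the same idea can be sketched in declarative syntax, using when to gate each stage on the weekday. This is untested, and in a sandboxed pipeline the Calendar calls may need the same in-process script approval as the scripted variant:
pipeline {
    agent any
    // run daily at 01:00 on Sundays and Fridays; each stage gates itself on the weekday
    triggers { cron('0 1 * * 0,5') }
    stages {
        stage('build on friday') {
            when {
                expression { Calendar.getInstance().get(Calendar.DAY_OF_WEEK) == Calendar.FRIDAY }
            }
            steps {
                echo "Hai its friday"
            }
        }
        stage('build on sunday') {
            when {
                expression { Calendar.getInstance().get(Calendar.DAY_OF_WEEK) == Calendar.SUNDAY }
            }
            steps {
                echo "Hai its sunday"
            }
        }
    }
}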

What is JCasc / Job DSL option for "Build when a change is pushed to BitBucket" flag?

I am trying to use the Jenkins Helm chart, which uses JCasC and Job DSL for job configuration.
I have configured an organizationFolder with Bitbucket config inside that traverses my Bitbucket repos and creates multibranchPipeline jobs. It looks good and receives webhooks from Bitbucket, but does not trigger jobs.
[Tue Nov 02 14:49:53 UTC 2021] Received com.cloudbees.jenkins.plugins.bitbucket.hooks.PushHookProcessor$1 UPDATED event from 10.0.0.112 ⇒ http://zzzzzzzzz:8080/bitbucket-scmsource-hook/notify with timestamp Tue Nov 02 14:49:53 UTC 2021
Connecting to https://bitbucket.org using zzzzzzzzzz/******
Repository type: Git
Looking up bddevteam83/infra-system for branches
Checking branch master from zzzzzzzzzzz/zzzzzz-system
'Jenkinsfile' found
Met criteria
Changes detected: master (null → d57b19309533ffd5133f88eb3809d1ad3896bfc8)
Did not schedule build for branch: master
I suspect this happens due to the unchecked "Build when a change is pushed to BitBucket" flag in the multibranchPipeline config.
QUESTION: which JobDSL option enables "Build when a change is pushed to BitBucket" flag?
My config is based on this example and looks like this:
organizationFolder('bitbucket') {
    description("Bitbucket organization folder configured with JCasC")
    displayName('bitbucket')
    properties {
        noTriggerOrganizationFolderProperty {
            branches('develop')
        }
    }
    // "Projects"
    organizations {
        bitbucket {
            serverUrl("https://bitbucket.org/")
            repoOwner("zzzzzzzzzzz")
            credentialsId("bitbucket-http")
            // "Traits" ("Behaviours" in the GUI) that are "declarative-compatible"
            traits {
                webhookRegistrationTrait {
                    mode('ITEM')
                }
                submoduleOptionTrait {
                    extension {
                        disableSubmodules(false)
                        recursiveSubmodules(true)
                        trackingSubmodules(false)
                        reference(null)
                        timeout(null)
                        parentCredentials(true)
                    }
                }
            }
        }
    }
    // "Traits" ("Behaviours" in the GUI) that are NOT "declarative-compatible".
    // For some traits, we need to configure this stuff by hand until Job DSL handles it:
    // https://issues.jenkins.io/browse/JENKINS-45504
    configure { node ->
        def traits = node / navigators / 'com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMNavigator' / traits
        // Discover branches
        traits << 'com.cloudbees.jenkins.plugins.bitbucket.BranchDiscoveryTrait' {
            strategyId('1')
            // Values
            // 1 : Exclude branches that are also filed as PRs
            // 2 : Only branches that are also filed as PRs
            // 3 : All branches
        }
        traits << 'com.cloudbees.jenkins.plugins.bitbucket.SSHCheckoutTrait' {
            credentialsId('bitbucket-git-ssh-key')
        }
        // Filter by name (with regular expression)
        traits << 'jenkins.scm.impl.trait.RegexSCMSourceFilterTrait' {
            regex('.*')
        }
        // Discover pull requests from origin
        traits << 'com.cloudbees.jenkins.plugins.bitbucket.OriginPullRequestDiscoveryTrait' {
            strategyId('1')
            // Values
            // 1 : Merging the pull request with the current target branch revision
            // 2 : The current pull request revision
            // 3 : Both the current pull request revision and the pull request merged with the current target branch revision
        }
        // Discover pull requests from forks
        traits << 'com.cloudbees.jenkins.plugins.bitbucket.ForkPullRequestDiscoveryTrait' {
            strategyId('1')
            // Values
            // 1 : Merging the pull request with the current target branch revision
            // 2 : The current pull request revision
            // 3 : Both the current pull request revision and the pull request merged with the current target branch revision
            trustID('1')
            // Values
            // 0 : Everyone
            // 1 : Forks in the same account
            // 2 : Nobody
        }
        traits << 'jenkins.scm.impl.trait.RegexSCMHeadFilterTrait' {
            regex('(master)|(develop)|(development)|(integration)|(release.*)')
        }
    }
    // "Project Recognizers"
    projectFactories {
        workflowMultiBranchProjectFactory {
            scriptPath 'Jenkinsfile'
        }
    }
    // "Orphaned Item Strategy"
    orphanedItemStrategy {
        discardOldItems {
            daysToKeep(30)
            numToKeep(100)
        }
    }
    // "Scan Organization Folder Triggers" : 1 day
    // We need to configure this by hand because Job DSL only allows 'periodic(int min)' for now
    configure { node ->
        node / triggers / 'com.cloudbees.hudson.plugins.folder.computed.PeriodicFolderTrigger' {
            spec('H H * * *')
            interval(86400000)
        }
    }
    // The config below, which I added, seems to have no effect on "Build when a change is pushed to BitBucket"
    configure { node ->
        node / triggers / 'com.cloudbees.jenkins.plugins.BitBucketMultibranchTrigger' {
            overrideUrl('')
        }
    }
    configure { node ->
        node / triggers / 'com.cloudbees.jenkins.plugins.BitBucketTrigger' {
            overrideUrl('')
        }
    }
}
UPD: configuring anything via the Web UI is not an option; everything must be managed as code.
It's all about using the branches method within properties.noTriggerOrganizationFolderProperty, which "Allows you to control the SCM commit trigger coming from branch indexing".
organizationFolder(...) {
    ...
    properties {
        noTriggerOrganizationFolderProperty {
            // automatically trigger builds for branches matching this regex at index time
            branches('PR-\\d+')
        }
    }
}
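Applied to the config in the question, branches('develop') is exactly why the log said "Did not schedule build for branch: master": only develop was allowed to trigger. Widening the regex (a sketch; adjust the pattern as needed) lets the other branches build on webhook events:
properties {
    noTriggerOrganizationFolderProperty {
        // allow webhook/indexing triggers for master, develop and PR branches
        branches('master|develop|PR-\\d+')
    }
}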

Jenkins Triggering of a Build Step/Stage(not the entire job) at a given interval

I am trying to build a pipeline where I need to chain multiple jobs, and some of them have to start at a certain time.
Ex: Job1 (starts at midnight) -> Job2 -> Job3 -> Job4 (starts at 4 PM)
Using Declarative Syntax:
pipeline {
    agent any
    stages {
        stage('Fetch Latest Code-1') {
            steps {
                build job: 'Get Latest - All Nodes', quietPeriod: 60
            }
        }
        stage('CI Day - 1') {
            parallel {
                stage('ANZ CI') {
                    steps {
                        build job: 'ANZ - CI', quietPeriod: 120
                    }
                }
                stage('BRZ CI') {
                    steps {
                        build job: 'BRZ_CI', quietPeriod: 120
                    }
                }
                stage('NAM CI') {
                    steps {
                        build job: 'NAM - CI', quietPeriod: 120
                    }
                }
            }
        }
        stage('BEP Part 2') {
            steps {
                build job: 'BEP_CI_Verification_Job', quietPeriod: 180
            }
        }
        stage('Intermediate Results') {
            steps {
                build job: 'CI Automation Results', parameters: [string(name: 'Files', value: '_CI_')], quietPeriod: 300
            }
        }
    }
}
When I created this job, I configured it to start at 12 midnight, so the first job automatically gets started at midnight.
But I would also need the second job (CI Day - 1) to begin at 1 AM and the last job, 'Intermediate Results', to start at 6 PM.
These are Multi-Configuration jobs (I already tried setting the desired timings on them individually, but those get overwritten when the jobs are called through the pipeline).
Also, I did try triggers { cron(0 1 * * 6) } within the stage/steps. No luck!
Here is a quick idea for launching another job at a given time of day. Using Groovy code, calculate the difference in seconds between the desired launch time and the current time, and use it as the argument for the quietPeriod parameter.
If you get an error "Scripts not permitted to use method", you have to approve the methods using "Manage Jenkins" > "In-process script approval".
import groovy.time.*

pipeline {
    agent any
    stages {
        stage('Stage 1') {
            steps {
                script {
                    def secs = secondsUntil hourOfDay: 15, minute: 30, second: 0
                    echo "anotherjob will be triggered in $secs seconds"
                    build job: 'anotherjob', quietPeriod: secs
                }
            }
        }
    }
}

long secondsUntil(Map dateProperties) {
    def now = new Date()
    def to = now.clone()
    to.set(dateProperties)
    long duration = TimeCategory.minus(to, now).toMilliseconds() / 1000
    return duration > 0 ? duration : 0
}
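To have the whole pipeline itself start at midnight (the first requirement in the question), the pipeline job can also carry a cron trigger; a minimal sketch reusing the first stage from the question:
pipeline {
    agent any
    // start the whole pipeline at midnight every day
    triggers { cron('0 0 * * *') }
    stages {
        stage('Fetch Latest Code-1') {
            steps {
                build job: 'Get Latest - All Nodes', quietPeriod: 60
            }
        }
    }
}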

How to Inject Stages or steps in Jenkins pipeline

The output of this Python one-liner looks like it could be stages in a Jenkins pipeline:
$ python3 -c 'print("\n".join(["stage({val}) {{ do something with {val} }}".format(val=i) for i in range(3)]))'
stage(0) { do something with 0 }
stage(1) { do something with 1 }
stage(2) { do something with 2 }
Is it possible for Jenkins to use output like this to create steps or stages in a pipeline, so that the running Python script is able to update Jenkins? The point of this would be to have the Blue Ocean pipeline show a stage dot that was made by an external script running separate jobs.
To elaborate on the example: consider this demo.py script, which outputs the uptime wrapped in a stage block,
#!/bin/env python3.6
import subprocess, time

def uptime():
    return (subprocess.run('uptime', stdout=subprocess.PIPE, encoding='utf8')).stdout.strip()

for i in range(3):
    print("stage({val}) {{\n echo \"{output}\" \n}}".format(val=i, output=uptime()))
    time.sleep(1)
which would be set up in a Jenkins pipeline like this:
node {
    stage("start demo") {
        sh "/tmp/demo.py"
    }
}
As is, this demo just outputs the text and does not create any stages in Blue Ocean:
[Pipeline] sh
+ /tmp/demo.py
stage(0) {
echo "03:17:16 up 182 days, 12:17, 8 users, load average: 0.00, 0.03, 0.05"
}
stage(1) {
echo "03:17:17 up 182 days, 12:17, 8 users, load average: 0.00, 0.03, 0.05"
}
stage(2) {
echo "03:17:18 up 182 days, 12:17, 8 users, load average: 0.00, 0.03, 0.05"
}
Again, the point of this would be to have the Blue Ocean pipeline show a stage dot with a log.
You can evaluate an expression and then call it.
node('') {
    Closure x = evaluate("{it -> evaluate(it)}")
    x(" stage('test'){ script { echo 'hi'}}")
}
Since Jenkins converts your Groovy script into Java, compiles it, and then executes the result, it would be quite hard to use an external program to generate more Groovy to execute, since that additional Groovy code would also need to be converted. But the generated code only exists as a result of running the pipeline, by which time that conversion is already done.
Instead, you may want to programmatically build your stages in Groovy.
some_array = ["/tmp/demo.py", "sleep 10", "uptime"]

def getBuilders() {
    def builders = [:]
    // use an explicit closure parameter: in the nested closures below, the implicit 'it' would be shadowed
    some_array.eachWithIndex { cmd, index ->
        // name the stage
        def name = 'Stage #' + (index + 1)
        builders[name] = {
            stage(name) {
                def my_label = "jenkins_label" // can choose programmatically if needed
                node(my_label) {
                    try {
                        doSomething(cmd)
                    }
                    catch (err) { println "Failed to run ${cmd}"; throw err }
                    finally { }
                }
            }
        }
    }
    return builders
}

def doSomething(something) {
    sh "${something}"
}
And later in your main pipeline
stage('Do it all') {
    steps {
        script {
            def builders = getBuilders()
            parallel builders
        }
    }
}
This will run three parallel stages, where one would be running /tmp/demo.py, the second sleep 10, and the third uptime.
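If you want sequential stage dots driven by the external script's output instead, here is a minimal scripted sketch; it assumes demo.py is changed to print one plain message per line (the stage names are made up):
node {
    // run the script once, then turn each line of its output into its own stage
    def lines = sh(script: '/tmp/demo.py', returnStdout: true).trim().split('\n')
    lines.eachWithIndex { line, i ->
        stage("demo ${i}") {
            echo line
        }
    }
}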

How to force jenkins to reload a jenkinsfile?

My Jenkinsfile has several parameters. Every time I make an update to the parameters (e.g. remove or add an input) and commit the change to my SCM, I do not see the job input screen updated accordingly in Jenkins; I have to run an execution, cancel it, and only then do I see my updated fields in:
properties([
    parameters([
        string(name: 'a', defaultValue: 'aa', description: '*'),
        string(name: 'b', description: '*'),
        string(name: 'c', description: '*'),
    ])
])
Any clues?
One of the ugliest things I've done to get around this is to create a Refresh parameter that basically exits the pipeline right away. This way I can run the pipeline just to update the properties.
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return parameters.Refresh == true }
            }
            steps {
                echo("Ended pipeline early.")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return parameters.Refresh == false }
            }
            stage('Build') {
                // steps
            }
            stage('Test') {
                // steps
            }
            stage('Deploy') {
                // steps
            }
        }
    }
}
There really must be a better way, but I'm yet to find it :(
Unfortunately the answer of TomDotTom was not working for me - I had the same issue, and my Jenkins required another stages block under 'Run Jenkinsfile' because of the following error:
Unknown stage section "stage". Starting with version 0.5, steps in a stage must be in a ‘steps’ block.
Also, I am using params instead of parameters as the variable to check the condition (as described in the Jenkins syntax documentation).
pipeline {
    agent any
    parameters {
        booleanParam(name: 'Refresh',
            defaultValue: false,
            description: 'Read Jenkinsfile and exit.')
    }
    stages {
        stage('Read Jenkinsfile') {
            when {
                expression { return params.Refresh == true }
            }
            steps {
                echo("stop")
            }
        }
        stage('Run Jenkinsfile') {
            when {
                expression { return params.Refresh == false }
            }
            stages {
                stage('Build') {
                    steps {
                        echo("build")
                    }
                }
                stage('Test') {
                    steps {
                        echo("test")
                    }
                }
                stage('Deploy') {
                    steps {
                        echo("deploy")
                    }
                }
            }
        }
    }
}
Applied to Jenkins 2.233.
The Jenkinsfile needs to be executed in order to update the job properties, so you need to start a build with the new file.
Apparently it is a known Jenkins "issue" or "hidden secret": https://issues.jenkins.io/browse/JENKINS-41929.
I overcame this automatically using the Jenkins Job DSL plugin.
I have a Job DSL seed job for my pipelines that checks for changes in the git repository containing my pipeline.
pipelineJob('myJobName') {
    // sets RELOAD=true for when the job is 'queued' below
    parameters {
        booleanParam('RELOAD', true)
    }
    definition {
        cps {
            script(readFileFromWorkspace('Jenkinsfile'))
            sandbox()
        }
    }
    // queue the job to run so it re-downloads its Jenkinsfile
    queue('myJobName')
}
Upon changes, the seed job runs and re-generates the pipeline's configuration, including params. After the pipeline is created/updated, Job DSL will queue the pipeline with the special param RELOAD.
The pipeline then reacts to it in the first stage and aborts early. (Apparently there is no way in Jenkins to stop a pipeline early without an error at the end of the stage, which causes a "red" pipeline.)
As the parameters in the Jenkinsfile are in properties, they will be set over anything set by the seed job, such as RELOAD. At this stage the pipeline is ready with the actual params, without any sign of RELOAD to confuse users.
properties([
    parameters([
        string(name: 'PARAM1', description: 'my Param1'),
        string(name: 'PARAM2', description: 'my Param2'),
    ])
])

pipeline {
    agent any
    stages {
        stage('Preparations') {
            when { expression { return params.RELOAD == true } }
            // Because of this: https://issues.jenkins-ci.org/browse/JENKINS-41929
            steps {
                script {
                    if (currentBuild.getBuildCauses('hudson.model.Cause') != null) {
                        currentBuild.displayName = 'Parameter Initialization'
                        currentBuild.description = 'On first build we just load the parameters as they are not available on the first run on new branches. A second run has been triggered automatically.'
                        currentBuild.result = 'ABORTED'
                        error('Stopping initial build as we only want to get the parameters')
                    }
                }
            }
        }
        stage('Parameters') {
            steps {
                echo 'Running real job steps...'
            }
        }
    }
}
The end result is that every time I update anything in the pipeline repository, all jobs generated by the seed job are updated and run to get the updated params list. There will be a "Parameter Initialization" message to indicate such a run.
There is potentially a way to improve this and only update the affected pipelines, but I haven't explored that, as all my pipelines are in one repository and I'm happy always updating them all.
Another upgrade could be that, if someone doesn't like aborting with error(), you could add a when condition to every other stage to skip it when the RELOAD parameter is set, but I find adding when to every other stage cumbersome.
I initially tried TomDotTom's answer, but then I didn't like the manual effort.
Scripted pipeline workaround - can probably make it work in declarative as well.
Since you are using SCM, you can check which files have changed since the last build (see here), and then decide what to do based on that.
Note that Poll SCM must be enabled on the job so the Jenkinsfile changes are detected automatically.
node('master') {
    checkout scm
    if (checkJenkinsfileChanges()) {
        return // exit the build immediately
    }
    echo "build" // build stuff
}

private Boolean checkJenkinsfileChanges() {
    filesChanged = getChangedFilesList()
    jenkinsfileChanged = filesChanged.contains("Jenkinsfile")
    if (jenkinsfileChanged) {
        if (filesChanged.size() == 1) {
            echo "Only Jenkinsfile changed, quitting"
        } else {
            echo "Rescheduling job with updated Jenkinsfile"
            build job: env.JOB_NAME
        }
    }
    return jenkinsfileChanged
}

// returns a list of changed files
private String[] getChangedFilesList() {
    changedFiles = []
    for (changeLogSet in currentBuild.changeSets) {
        for (entry in changeLogSet.getItems()) { // for each commit in the detected changes
            for (file in entry.getAffectedFiles()) {
                changedFiles.add(file.getPath()) // add changed file to list
            }
        }
    }
    return changedFiles
}
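The "Poll SCM" requirement noted above can itself be kept in code rather than in the job configuration; a minimal sketch (the polling schedule here is just an example):
// enable SCM polling from the Jenkinsfile itself
properties([
    pipelineTriggers([pollSCM('H/5 * * * *')])
])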
I solve this by using the Jenkins Job Builder Python package. The main goal of this package is to achieve Jenkins Job as Code.
To solve your problem, I simply keep a definition like the one below in SCM, together with a Jenkins pipeline that listens for changes to the jobs.yaml file and rebuilds the job for me, so that whenever I trigger my job all the needed parameters are ready.
jobs.yaml
- job:
    name: 'job-name'
    description: 'deploy template'
    concurrent: true
    properties:
      - build-discarder:
          days-to-keep: 7
      - rebuild:
          rebuild-disabled: false
    parameters:
      - choice:
          name: debug
          choices:
            - Y
            - N
          description: 'debug flag'
      - string:
          name: deploy_tag
          description: "tag to deploy, default to latest"
      - choice:
          name: deploy_env
          choices:
            - dev
            - test
            - preprod
            - prod
          description: "Environment"
    project-type: pipeline
    # you can use either DSL or pipeline SCM
    dsl: |
      node() {
        stage('info') {
          print params
        }
      }
    # pipeline-scm:
    #   script-path: Jenkinsfile
    #   scm:
    #     - git:
    #         branches:
    #           - master
    #         url: 'https://repository.url.net/x.git'
    #         credentials-id: 'jenkinsautomation'
    #         skip-tag: true
    #         wipe-workspace: false
    #         lightweight-checkout: true
config.ini
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
query_plugins_info = False
url = http://localhost:8080
Command to load / update the job:
jenkins-jobs --conf config.ini -u $JENKINS_USER -p $JENKINS_PASSWORD update jobs.yaml
Note - to use the jenkins-jobs command, you need to install the jenkins-job-builder Python package.
This package has a lot of features, like creating (free-style, pipeline, or multibranch), updating, deleting, and validating Jenkins job configurations. It supports templates, meaning that with one generic template you can build any number of similar jobs, dynamically generate parameters, etc.
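As an illustration of the template feature, a hedged sketch (the job and project names are made up; note that literal braces inside a job-template must be doubled, because JJB treats single braces as substitution placeholders):
- job-template:
    name: '{name}-deploy'
    project-type: pipeline
    dsl: |
      node() {{
        stage('deploy') {{
          echo 'deploying {name}'
        }}
      }}

- project:
    name: service-a
    jobs:
      - '{name}-deploy'

- project:
    name: service-b
    jobs:
      - '{name}-deploy'
This expands into two similar pipeline jobs, service-a-deploy and service-b-deploy, from the one template.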
