Build Periodically Syntax for Jenkins Configuration as Code Plug-In (JCasC)

I'm trying to use the configuration as code (JCasC) plug-in to create a pipeline job that builds periodically, but I can't find the syntax for this anywhere online. I'm writing the configuration in YAML.
The "Build Periodically" field is under Build Triggers in the pipeline jobs and has a text field called Schedule. My schedule is 0 6-19 * * *
Is this even possible to do?
This is the YAML file that I am trying to edit:
jobs:
  - script: >
      folder('test1') {
        pipelineJob('test1/seedJobTest') {
          description 'seedJobTest'
          logRotator {
            daysToKeep 10
          }
          definition {
            cpsScm {
              scm {
                git {
                  remote {
                    credentials "xxx"
                    url 'xxx'
                  }
                  branches 'refs/head/master'
                  scriptPath 'Jenkinsfile'
                  extensions { }
                }
              }
            }
          }
          configure { project ->
            project / 'properties' / 'EnvInjectJobProperty' {
              'on'('true')
              'info' {
                'propertiesContent'('BRANCH=master')
              }
            }
            project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.DisableConcurrentBuildsJobProperty' {}
          }
        }
      }

If you are using JCasC to define your pipeline job:
To build periodically, regardless of SCM changes, add this block:

triggers {
  cron('0 6-19 * * *')
}
To build periodically, but only when there are SCM changes (polling), use this block:

triggers {
  scm('0 6-19 * * *')
}
Here is a fuller example showing the trigger in context:
jobs:
  - script: |
      job('PROJ-unit-tests') {
        scm {
          git(gitUrl)
        }
        triggers {
          cron('0 6-19 * * *')
        }
        steps {
          maven('-e clean test')
        }
      }
Snippet taken and adjusted from: https://github.com/jenkinsci/configuration-as-code-plugin/issues/876
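Applied to the seed job from the question, the trigger sits directly inside the pipelineJob block. A sketch based on the question's config, trimmed for brevity and not tested against your exact setup:

folder('test1') {
  pipelineJob('test1/seedJobTest') {
    description 'seedJobTest'
    // build every hour between 06:00 and 19:00, regardless of SCM changes
    triggers {
      cron('0 6-19 * * *')
    }
    // ... logRotator, definition, configure, etc. as in the question ...
  }
}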

Related

Jenkins YAML JCasC triggers and poll SCM to add jobs automatically

I have the following Jenkins YAML, which works and adds the jobs automatically. However, it won't add the credentials unless I go into the UI and select the same ID "github", and it won't let me add polling or triggering.
I have tried a number of combinations that either crash the deploy or do not add the jobs at all.
triggers {
  pollSCM 'H/10 * * * *'
}

triggers {
  cron (H/10 * * * *)
}
I would like to add both cron and SCM polling, so that once the job has been run manually it picks up the Jenkinsfile from the repo.
jenkins:
  systemMessage: "Jenkins: configured automatically with JCasC plugin\n\n"
tool:
  git:
    installations:
      - home: "git"
        name: "Default"
jobs:
  - script: >
      pipelineJob('my_pipleline_build') {
        definition {
          cpsScm {
            scriptPath 'Jenkinsfile'
            scm {
              git {
                remote { url 'https://github.com/my_pipleline_build.git' }
                branch '*/master'
                credentials: ('github')
                extensions {}
              }
            }
          }
        }
      }
  - script: >
      pipelineJob('my_other_pipleline_build') {
        definition {
          cpsScm {
            scriptPath 'Jenkinsfile'
            scm {
              git {
                remote { url 'https://github.com/cloud/my_other_pipleline_build.git' }
                branch '*/my_pipleline_build'
                credentials: ('github')
                extensions {}
              }
            }
          }
        }
      }
I was able to achieve this using the configuration below:
- script: >
    pipelineJob('my_other_pipleline_build') {
      definition {
        cpsScm {
          scriptPath 'Jenkinsfile'
          scm {
            git {
              remote {
                url 'https://github.com/cloud/my_other_pipleline_build.git'
                credentials('github')
              }
              branch '*/my_pipleline_build'
              extensions {}
            }
            triggers {
              cron("H 12 * * 6")
            }
          }
        }
      }
    }
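If you also want SCM polling alongside the cron trigger, as the question mentions, the Job DSL triggers context accepts both cron and scm entries; note that the schedule must be quoted, which is one reason the cron (H/10 * * * *) attempt above fails. A sketch with the triggers block declared at the pipelineJob level (the answer above nests it inside scm, so use whichever placement your setup accepts):

pipelineJob('my_other_pipleline_build') {
  definition {
    cpsScm {
      scriptPath 'Jenkinsfile'
      scm {
        git {
          remote {
            url 'https://github.com/cloud/my_other_pipleline_build.git'
            credentials('github')
          }
          branch '*/my_pipleline_build'
          extensions {}
        }
      }
    }
  }
  triggers {
    cron('H 12 * * 6')    // periodic build every Saturday around noon
    scm('H/10 * * * *')   // poll the repository roughly every 10 minutes
  }
}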

Parameterized Build Syntax for Jenkins Configuration as Code Plug-in (JCasC)

I'm trying to use the configuration as code (JCasC) plug-in to create a pipeline job that takes in build parameters, but I can't find the syntax for this anywhere online. I'm writing the configuration in YAML.
In the GUI, the field is called "This build is parameterized" and it is under the 'General' heading. I need to define two string parameters: CLUSTER_ID=cluster_id and OPENSHIFT_ADMINSTRATION_BRANCH=develop.
This is the YAML file that I am trying to edit:
jobs:
  - script: >
      folder('test1') {
        pipelineJob('test1/seedJobTest') {
          description 'seedJobTest'
          logRotator {
            daysToKeep 10
          }
          definition {
            cpsScm {
              scm {
                git {
                  remote {
                    credentials "xxx"
                    url 'xxx'
                  }
                  branches 'refs/head/master'
                  scriptPath 'Jenkinsfile'
                  extensions { }
                }
              }
            }
          }
          configure { project ->
            project / 'properties' / 'EnvInjectJobProperty' {
              'on'('true')
            }
            project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.DisableConcurrentBuildsJobProperty' {}
          }
        }
      }
Thanks for your help!
Solution
jobs:
  - script: >
      folder('test1') {
        pipelineJob('test1/seedJobTest') {
          description 'seedJobTest'
          logRotator {
            daysToKeep 10
          }
          parameters {
            stringParam("CLUSTER_ID", "cluster_id", "your description here")
            stringParam("OPENSHIFT_ADMINSTRATION_BRANCH", "develop", "your description here")
          }
          definition {
            cpsScm {
              scm {
                git {
                  remote {
                    credentials "xxx"
                    url 'xxx'
                  }
                  branches 'refs/head/master'
                  scriptPath 'Jenkinsfile'
                  extensions { }
                }
              }
            }
          }
          configure { project ->
            project / 'properties' / 'EnvInjectJobProperty' {
              'on'('true')
            }
            project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.DisableConcurrentBuildsJobProperty' {}
          }
        }
      }
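Once the seed job has created the pipeline, the two string parameters are exposed to the Jenkinsfile through params. A minimal declarative-pipeline sketch, not part of the original question, just to show how the values surface:

pipeline {
  agent any
  stages {
    stage('Show parameters') {
      steps {
        // values come from the stringParam definitions in the seed job
        echo "Cluster: ${params.CLUSTER_ID}"
        echo "Branch: ${params.OPENSHIFT_ADMINSTRATION_BRANCH}"
      }
    }
  }
}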
How to Figure this Stuff Out in the Future - XML Job To DSL (Jenkins Plugin)
Here's how I would go about figuring this kind of thing out:
1. Manually create a temporary pipeline job with the things you want in your seed job (the one you want to automate).
2. Install (if only temporarily) the "XML Job To DSL" Jenkins plugin.
3. Go to the main Jenkins Dashboard.
4. In the left navigation, you'll find "XML Job To DSL." Click it.
5. Select the temporary job you created and click "Convert selected to DSL".
When I went about getting the params snippet for this answer, I did as I described above, but simply created two parameters. I ended up with this:
pipelineJob("test") {
description()
keepDependencies(false)
parameters {
stringParam("CLUSTER_ID", "cluster_id", "your description here")
stringParam("OPENSHIFT_ADMINSTRATION_BRANCH", "develop", "your description here")
}
definition {
cpsScm {
"" }
}
disabled(false)
}
Read-Only Parameter Option
One more thing, in case it's useful to you (as it was to me). If you want to create a parameterized seed job but you don't want those parameters to be editable at build time, you can install the "Readonly Parameter" Jenkins plugin; then you'll be able to do this kind of thing:
jobs:
  - script: >
      pipelineJob("Param Example") {
        description()
        keepDependencies(false)
        parameters {
          wHideParameterDefinition {
            name('AGENT')
            defaultValue('docker-host')
            description('Node on which to run.')
          }
          wHideParameterDefinition {
            name('ENV_FILE_DIR')
            defaultValue('local2')
            description('Name of environment directory which houses .env')
          }
          booleanParam("include_search_stack", false, "Build/run the local Fess, Elasticsearch, and Kibana containers.")
          booleanParam("SKIP_404_GENERATION", false, "Helpful sometimes during local development.")
        }
        definition {
          cpsScm {
            scm {
              git {
                remote {
                  url("https://myrepo/blah.git")
                  credentials("scm")
                }
                branch("master")
              }
            }
            scriptPath("pipeline/main/Jenkinsfile")
          }
        }
        disabled(false)
      }
In this example, the top two params, AGENT and ENV_FILE_DIR, are effectively "hard-coded" from CasC, because those parameters are not editable at build time. However, the include_search_stack and SKIP_404_GENERATION parameters are editable. I used this mixed example to show that both kinds can be used in the same job.
Read-only parameters have been useful in some of my use cases.

DSL Seed Job for Multibranch pipeline with Bitbucket branch plugin suppress auto build of branches

I have a DSL job to create multibranch pipeline jobs in Jenkins, running Jenkins 2.107.1 with the plugins 'Branch API Plugin' 2.0.18 and 'Bitbucket Branch Source Plugin' 2.2.10.
I'm unable to find the proper configuration method to enable the "Suppress automatic SCM triggering" property. Please help.
Here is my job, which works, but it triggers a build as soon as it scans for branches:
multibranchPipelineJob("job") {
configure {
it / sources / data / 'jenkins.branch.BranchSource' / source(class: 'com.cloudbees.jenkins.plugins.bitbucket.BitbucketSCMSource') {
credentialsId('..')
id("..")
checkoutCredentialsId("..")
repoOwner("owner")
repository("my-repo")
includes()
excludes("PR-*")
}
}
}
This is how it works now... with the help of the following source code:
https://github.com/jenkinsci/bitbucket-branch-source-plugin
multibranchPipelineJob("job") {
branchSources {
branchSource {
source {
bitbucket {
credentialsId("myid")
repoOwner("iam")
repository("job")
traits {
headWildcardFilter {
includes("branchestoinclude")
excludes("toexclude")
}
}
}
}
strategy {
defaultBranchPropertyStrategy {
props {
// keep only the last 8 builds
buildRetentionBranchProperty {
buildDiscarder {
logRotator {
daysToKeepStr("-1")
numToKeepStr("8")
artifactDaysToKeepStr("-1")
artifactNumToKeepStr("-1")
}
}
}
}
}
}
}
}
// Branch behaviour
configure {
def traits = it / sources / data / 'jenkins.branch.BranchSource' / source / traits
traits << 'com.cloudbees.jenkins.plugins.bitbucket.BranchDiscoveryTrait' {
strategyId(3) // detect all branches -refer the plugin source code for various options
}
}
orphanedItemStrategy {
discardOldItems {
numToKeep(8)
}
}
}
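To get the "Suppress automatic SCM triggering" behaviour the original question asks about, the usual route is to add a NoTriggerBranchProperty to the branch property strategy. A sketch of how the strategy block above could be extended, assuming your Job DSL / Branch API versions expose the dynamically generated noTriggerBranchProperty() method (if not, the jenkins.branch.NoTriggerBranchProperty element can be injected with a configure block instead):

strategy {
  defaultBranchPropertyStrategy {
    props {
      // equivalent of ticking "Suppress automatic SCM triggering"
      noTriggerBranchProperty()
      // keep only the last 8 builds, as in the job above
      buildRetentionBranchProperty {
        buildDiscarder {
          logRotator {
            daysToKeepStr("-1")
            numToKeepStr("8")
            artifactDaysToKeepStr("-1")
            artifactNumToKeepStr("-1")
          }
        }
      }
    }
  }
}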

Job DSL to create "Pipeline" type job

I have installed the Pipeline plugin, which was previously called the Workflow plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
  definition {
    cps {
      script('logic-here')
      sandbox()
    }
  }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
  definition {
    cps {
      script('''
        pipeline {
          agent any
          stages {
            stage('Stage 1') {
              steps {
                echo 'logic'
              }
            }
            stage('Stage 2') {
              steps {
                echo 'logic'
              }
            }
          }
        }
      '''.stripIndent())
      sandbox()
    }
  }
}
or load it from a file located in workspace:
pipelineJob('job-name') {
  definition {
    cps {
      script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
      sandbox()
    }
  }
}
Example:
Seed-job file structure:
jobs
  \- productJob.groovy
logic
  \- productPipeline.jenkinsfile
then productJob.groovy content:
pipelineJob('product-job') {
  definition {
    cps {
      script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
      sandbox()
    }
  }
}
I believe this question is asking how to use the Job DSL to create a pipeline job which references the Jenkinsfile for the project, and doesn't combine the job creation with the detailed step definitions as the answers to date have done. This makes sense: the Jenkins job creation and metadata configuration (description, triggers, etc.) could belong to Jenkins admins, but the dev team should have control over what the job actually does.
#meallhour, is the below what you're after? (works as at Job DSL 1.64)
pipelineJob('DSL_Pipeline') {
  def repo = 'https://github.com/path/to/your/repo.git'

  triggers {
    scm('H/5 * * * *')
  }
  description("Pipeline for $repo")

  definition {
    cpsScm {
      scm {
        git {
          remote { url(repo) }
          branches('master', '**/feature*')
          scriptPath('misc/Jenkinsfile.v2')
          extensions { }  // required as otherwise it may try to tag the repo, which you may not want
        }
        // the single line below also works, but it
        // only covers the 'master' branch and may not give you
        // enough control.
        // git(repo, 'master', { node -> node / 'extensions' << '' } )
      }
    }
  }
}
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
The example above worked for me. Here's another one based on my setup:
pipelineJob('Your App Pipeline') {
  def repo = 'https://github.com/user/yourApp.git'
  def sshRepo = 'git#git.company.com:user/yourApp.git'
  description("Your App Pipeline")
  keepDependencies(false)
  properties {
    githubProjectUrl(repo)
    rebuild {
      autoRebuild(false)
    }
  }
  definition {
    cpsScm {
      scm {
        git {
          remote { url(sshRepo) }
          branches('master')
          scriptPath('Jenkinsfile')
          extensions { }  // required as otherwise it may try to tag the repo, which you may not want
        }
      }
    }
  }
}
If you build the pipeline first through the UI, you can use the config.xml file and the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
In Job DSL, pipeline is still called workflow, see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
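For those older releases, the shape was the same as pipelineJob, just under the old name. A sketch; on current Job DSL versions use pipelineJob instead:

workflowJob('job-name') {
  definition {
    cps {
      // scripted pipeline body, inlined for brevity
      script('node { echo "logic here" }')
      sandbox()
    }
  }
}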
First you need to install the Job DSL plugin, then create a freestyle project in Jenkins and select "Process Job DSLs" from the dropdown in the Build section.
Select "Use the provided DSL script" and provide the following script.
pipelineJob('job-name') {
  definition {
    cps {
      script('''
        pipeline {
          agent any
          stages {
            stage('Stage name 1') {
              steps {
                // your logic here
              }
            }
            stage('Stage name 2') {
              steps {
                // your logic here
              }
            }
          }
        }
      ''')
    }
  }
}
Or you can create your job by pointing to a Jenkinsfile located in a remote Git repository.
pipelineJob("job-name") {
definition {
cpsScm {
scm {
git {
remote {
url("<REPO_URL>")
credentials("<CREDENTIAL_ID>")
}
branch('<BRANCH>')
}
}
scriptPath("<JENKINS_FILE_PATH>")
}
}
}
If you are using a Git repo, add a file called Jenkinsfile at the root directory of your repo. It should contain your pipeline definition.

How to use Job DSL with Accurev SCM?

I am using the following Groovy script to create a Job DSL job that uses Accurev as the SCM.
Please let me know what the correct script should look like.
job('payer-server') {
  scm {
    accurev {
      /** What to insert here **/
    }
  }
  triggers {
    scm('H/15 * * * *')
  }
  steps {
    maven {
      goals('-e clean install')
      mavenOpts('-Xms256m')
      mavenOpts('-Xmx512m')
      properties skipTests: true
      mavenInstallation('Maven 3.3.3')
    }
  }
}
Currently there is no built-in support for Accurev SCM. Someone already filed a feature request as JENKINS-22138.
But you can use a Configure Block to generate the necessary config XML. There is an example for configuring Subversion, which can be adapted to Accurev.
job('example') {
  configure { project ->
    project.remove(project / scm) // remove the existing 'scm' element
    project / scm(class: 'hudson.plugins.accurev.AccurevSCM') {
      serverName('foo')
      // ...
    }
  }
  triggers {
    // ...
  }
  steps {
    // ...
  }
}
Please leave a comment on the feature request to describe which options of Accurev SCM you need to configure initially.
