I have a pipelineJob defined in Job DSL.
It runs a pipeline/Jenkinsfile which it checks out of git.
I want people to be able to type in the Git branch from which to pull the Jenkinsfile (i.e. in a stringParam), or, if they have not typed in a branch, to default to a branch which I have set in a choiceParam (i.e. this will be 'develop' or 'master').
This does not work:
pipelineJob('some-job') {
    parameters {
        choiceParam('gitCreds', [gitCreds], 'Stash credential')
        stringParam('gitUrl', 'https://some-repo.git', 'URL for the Stash repo')
        stringParam('gitBranchOverride', '', 'Type in some feature branch here if you wish')
        choiceParam('gitBranch', ['develop'], '...otherwise the job should default to a branch here')
    }
    definition {
        cpsScm {
            scm {
                git {
                    branch('$gitBranchOverride' ?: '$gitBranch')
                    extensions {
                        wipeOutWorkspace()
                    }
                    remote {
                        credentials(gitCreds)
                        url('$gitUrl')
                    }
                }
            }
        }
    }
}
It works if I enter a value into gitBranchOverride, but if I don't, it seems to enumerate all the branches and check out an arbitrary one, i.e. it's not honouring the value in gitBranch.
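A likely explanation (my reading, not stated in the question): the Elvis expression is evaluated when the seed job runs, and at that point '$gitBranchOverride' is a non-empty literal string, so the fallback to '$gitBranch' never fires. Jenkins only expands the variable later, at build time; if the parameter is empty, the git plugin is left with an empty branch spec, which matches any branch. A minimal Groovy illustration:

    // Seed-time evaluation: the single-quoted string is a non-empty literal,
    // so the Elvis operator always returns it, never the fallback.
    assert ('$gitBranchOverride' ?: '$gitBranch') == '$gitBranchOverride'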
I don't know if I'm understanding your problem correctly, but this is how I have my code for creating pipeline jobs:
def git_branch = getBinding().getVariable('GIT_BRANCH')
def gitrepo = "ssh://git@some.git.repo/somerepo.git"
def credential_id = "awesomecredentials"

pipelineJob("MyAwesomeJob") {
    description("""This job is awesome\n\n__input__:\n* My parameter\n* Branch\n\n__branch__: ${git_branch}""")
    parameters {
        stringParam('MyParameter', '', 'AwesomeParameterHere')
        stringParam('branch', 'origin/develop', 'Branch to build')
    }
    definition {
        cpsScm {
            scm {
                git {
                    branch('$branch')
                    remote {
                        url(gitrepo) // use the variable, not the literal string "gitrepo"
                        credentials(credential_id)
                    }
                }
            }
            scriptPath("jenkins/my_awesome_pipeline/Jenkinsfile")
        }
    }
}
With this, my job is created with a branch parameter that has a default value preselected.
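If you do need the override-or-default behaviour from the question, one option is to resolve the fallback inside the Jenkinsfile itself, where parameter values are actually available at run time. A sketch, reusing the question's parameter names (this is an assumption about your setup, not something the answer above shows):

    // Jenkinsfile sketch: use the override if non-empty, else the default choice.
    def effectiveBranch = params.gitBranchOverride?.trim() ? params.gitBranchOverride : params.gitBranch
    checkout([$class: 'GitSCM',
              branches: [[name: effectiveBranch]],
              userRemoteConfigs: [[url: params.gitUrl, credentialsId: params.gitCreds]]])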
Please bear with me; the description might be long, but it should give a clear picture of the intent and the issue.
I have used the Job DSL plugin to create a seeder job, which in turn creates two new jobs. I have 2 separate repositories:
- one for maintaining the Jenkins pipeline scripts, and
- one for the actual code to build.
First, I created a pipeline job in Jenkins which in turn creates the view and the 2 jobs. The Jenkinsfile given below uses the Job DSL plugin API; it reads the groovy script and creates the required 2 jobs.
node('master') {
    checkout scm
    jobDsl targets: ['dsl/seedJobBuilder.groovy'].join('\n'),
           removedJobAction: 'IGNORE',
           removedViewAction: 'IGNORE',
           lookupStrategy: 'SEED_JOB'
}
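As an aside, the targets parameter of the jobDsl step is a newline-separated string of script patterns, which is why the single-element list is joined with '\n'. With a second DSL script it would look like this (the second script name is hypothetical):

    jobDsl targets: ['dsl/seedJobBuilder.groovy',
                     'dsl/deployJobBuilder.groovy'].join('\n'), // second script is hypothetical
           removedJobAction: 'IGNORE',
           removedViewAction: 'IGNORE',
           lookupStrategy: 'SEED_JOB'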
seedJobBuilder.groovy creates a DSL pipeline job whose task is to build the actual codebase.
listView('Build Pipelines') {
    description('All build and deploy jobs')
    jobs {
        names(
            'build',
            'deploy',
        )
    }
    columns {
        status()
        weather()
        name()
        lastSuccess()
        lastFailure()
        lastDuration()
        buildButton()
    }
}

def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}") // URL of the actual code repo, not the Jenkins script repo
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}") // Jenkins script repo URL
                        credentials("somecredentials")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
            }
            scriptPath('pipelines/pipelineBuildEveryDay.groovy')
            lightweight(false)
        }
    }
    triggers {
        githubPush()
    }
}
The job created above by Job DSL reads the pipelineBuildEveryDay groovy script, checks out the actual codebase, and builds and deploys it.
The place where I am struggling is how to trigger a build of this second job through a GitHub hook or through ghprb. I don't want to manipulate the second job manually, and the Git URL of the job is the script repo URL, not the codebase URL. Is it even possible to do this? If yes, what am I missing? I have the webhook configured.
pipelineBuildEveryDay.groovy
pipeline {
    libraries {
        lib("shared-library@${params.JENKINS_SCRIPT_BRANCH}")
    }
    agent {
        node {
            label 'master'
        }
    }
    options {
        skipDefaultCheckout(true) // No more 'Declarative: Checkout' stage
    }
    stages {
        stage('Crazy Build Pipeline') {
            tools {
                jdk 'java11'
            }
            stages {
                stage('Prepare build name') {
                    steps {
                        script {
                            currentBuild.displayName = "${currentBuild.number}-build"
                        }
                    }
                }
                stage('Checkout') {
                    steps {
                        cleanWs()
                        script {
                            checkoutRepository("${projectDir}", "${params.PROJECT_TAG}", "${params.PROJECT_REPO}")
                        }
                    }
                }
                stage('Run Tests') {
                    steps {
                        echo "Running test coming soon..."
                    }
                }
            }
        }
    }
    // post build actions
    post {
        success {
            echo "success"
        }
        failure {
            echo "failure"
        }
    }
}
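Note that checkoutRepository is not a built-in step; it presumably comes from the shared library loaded at the top. A hypothetical implementation, only to make the stage above self-contained (the real library step may differ):

    // vars/checkoutRepository.groovy (hypothetical shared-library step):
    // clones the given ref of the given repo into the given directory.
    def call(String directory, String ref, String repoUrl) {
        dir(directory) {
            checkout([$class: 'GitSCM',
                      branches: [[name: ref]],
                      userRemoteConfigs: [[url: repoUrl]]])
        }
    }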
Well, the suffering comes to an end. I'm posting this answer for anyone struggling with a similar sort of issue.
Make sure you uncheck all other trigger types; the only one checked should be the pull request builder.
The part which screwed me was the project URL. In my case, the GitHub URL in the SCM section was the Jenkins-scripts repository URL, not the URL of the codebase I want to build, so I used my codebase repository URL in the GitHub Project URL textbox.
But the real problem was using the repository URL in the format 'https://code-base-repo-url.git'; instead it should be 'https://code-base-repo-url'. Sounds stupid? Yeah, I know!
Finally, the complete job config pipeline script, if it helps:
def pipelineRepo = 'https://jenkins-script-repo'
def projectRepo = 'https://code-base-repo-url'
def projectTag = '${GIT_BRANCH}'

def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}")
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}")
                        credentials("use-your-own-user-pass-cred")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
            }
            scriptPath('pipelines/pipelineBuildEveryDay.groovy')
            lightweight(false)
        }
    }
    triggers {
        githubPullRequest {
            admin('use_your_own_admin')
            triggerPhrase('build please')
            useGitHubHooks()
            permitAll()
            displayBuildErrorsOnDownstreamBuilds()
            extensions {
                commitStatus {
                    context('Jenkins')
                    completedStatus('SUCCESS', 'All is well')
                    completedStatus('FAILURE', 'Something went wrong. Investigate!')
                    completedStatus('ERROR', 'Something went really wrong. Investigate!')
                }
            }
        }
    }
}
I'm trying to use the Configuration as Code (JCasC) plugin to create a pipeline job that takes in build parameters, but I can't find the syntax for this anywhere online. I'm writing the configuration in YAML.
On the GUI, the field is called "This build is parameterized" and it is under the 'General' heading. I need to define two string parameters: CLUSTER_ID=cluster_id and OPENSHIFT_ADMINSTRATION_BRANCH=develop.
This is the YAML file that I am trying to edit:
jobs:
  - script: >
      folder('test1') {
        pipelineJob('test1/seedJobTest') {
          description 'seedJobTest'
          logRotator {
            daysToKeep 10
          }
          definition {
            cpsScm {
              scm {
                git {
                  remote {
                    credentials "xxx"
                    url 'xxx'
                  }
                  branches 'refs/heads/master'
                  extensions { }
                }
              }
              scriptPath 'Jenkinsfile'
            }
          }
          configure { project ->
            project / 'properties' / 'EnvInjectJobProperty' {
              'on'('true')
            }
            project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.DisableConcurrentBuildsJobProperty' {}
          }
        }
      }
Thanks for your help!
Solution
jobs:
  - script: >
      folder('test1') {
        pipelineJob('test1/seedJobTest') {
          description 'seedJobTest'
          logRotator {
            daysToKeep 10
          }
          parameters {
            stringParam("CLUSTER_ID", "cluster_id", "your description here")
            stringParam("OPENSHIFT_ADMINSTRATION_BRANCH", "develop", "your description here")
          }
          definition {
            cpsScm {
              scm {
                git {
                  remote {
                    credentials "xxx"
                    url 'xxx'
                  }
                  branches 'refs/heads/master'
                  extensions { }
                }
              }
              scriptPath 'Jenkinsfile'
            }
          }
          configure { project ->
            project / 'properties' / 'EnvInjectJobProperty' {
              'on'('true')
            }
            project / 'properties' / 'org.jenkinsci.plugins.workflow.job.properties.DisableConcurrentBuildsJobProperty' {}
          }
        }
      }
How to Figure this Stuff Out in the Future - XML Job To DSL (Jenkins Plugin)
Here's how I would go about figuring out this kind of thing:
1. Manually create a temporary pipeline job with the things you want in your seed job (the one you want to automate).
2. Install (if only temporarily) the "XML Job To DSL" Jenkins plugin.
3. Go to the main Jenkins dashboard.
4. In the left navigation, you'll find "XML Job To DSL." Click it.
5. Select the temporary job you created and click "Convert selected to DSL".
When I went about getting the params snippet for this answer, I did as I described above, but simply created two parameters. I ended up with this:
pipelineJob("test") {
description()
keepDependencies(false)
parameters {
stringParam("CLUSTER_ID", "cluster_id", "your description here")
stringParam("OPENSHIFT_ADMINSTRATION_BRANCH", "develop", "your description here")
}
definition {
cpsScm {
"" }
}
disabled(false)
}
Read-Only Parameter Option
One more thing, in case it's useful to you (as it was to me): if you want to create a parameterized seed job but you don't want those parameters to be editable at build time, you can install the "Readonly Parameter" Jenkins plugin. Then you'll be able to do this kind of thing:
jobs:
  - script: >
      pipelineJob("Param Example") {
        description()
        keepDependencies(false)
        parameters {
          wHideParameterDefinition {
            name('AGENT')
            defaultValue('docker-host')
            description('Node on which to run.')
          }
          wHideParameterDefinition {
            name('ENV_FILE_DIR')
            defaultValue('local2')
            description('Name of environment directory which houses .env')
          }
          booleanParam("include_search_stack", false, "Build/run the local Fess, Elasticsearch, and Kibana containers.")
          booleanParam("SKIP_404_GENERATION", false, "Helpful sometimes during local development.")
        }
        definition {
          cpsScm {
            scm {
              git {
                remote {
                  url("https://myrepo/blah.git")
                  credentials("scm")
                }
                branch("master")
              }
            }
            scriptPath("pipeline/main/Jenkinsfile")
          }
        }
        disabled(false)
      }
In this example, the top two params, AGENT and ENV_FILE_DIR, are sort of "hard-coded" from CasC, because those parameters are not editable at build time. However, the include_search_stack and SKIP_404_GENERATION parameters are editable. I used this mixed example to show that both kinds are usable in the same job.
Read-only parameters have been useful in some of my use cases.
Is there a way to specify the location of the checkout using "agent" (not "node") in a Jenkinsfile?
pipeline {
    agent { label 'my_label' }
    stages {
        stage('Checkout') {
            steps {
                // Dang. my_repo has already been checked out
                dir('my_repo') {
                    checkout scm
                }
            }
        }
    }
}
It seems that if you use "node" you have the ability to do this, but I can't find a way to do it with "agent".
If you set skipDefaultCheckout(), then you can check out your SCM when you want:
pipeline {
    agent { label 'my_label' }
    options {
        skipDefaultCheckout()
    }
    stages {
        stage('Checkout') {
            steps {
                // SWEET! my_repo has not been checked out
                dir('my_repo') {
                    checkout scm
                }
            }
        }
    }
}
Alternatively, some of the SCMs offer advanced checkout options that let you do the checkout into a different path.
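With the git plugin, for example, that is the RelativeTargetDirectory extension. A scripted sketch (the URL is a placeholder):

    // Check the code out into a subdirectory of the workspace.
    checkout([$class: 'GitSCM',
              branches: [[name: '*/master']],
              extensions: [[$class: 'RelativeTargetDirectory', relativeTargetDir: 'my_repo']],
              userRemoteConfigs: [[url: 'https://example.com/my_repo.git']]])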
Be aware, though, that if you use multiple agents, you will need to do the checkout manually each time you use another agent. They MAY use the same workspace, but there is no guarantee; you should always run checkout scm, just in case they don't.
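A sketch of that pattern, assuming two stages that may land on different agents (the labels are placeholders):

    pipeline {
        agent none
        options { skipDefaultCheckout() }
        stages {
            stage('Build') {
                agent { label 'linux' }
                steps {
                    dir('my_repo') { checkout scm } // explicit checkout on this agent
                    echo 'build here'
                }
            }
            stage('Test') {
                agent { label 'windows' }
                steps {
                    // a different agent may mean a different workspace,
                    // so check out again rather than assuming the files exist
                    dir('my_repo') { checkout scm }
                    echo 'test here'
                }
            }
        }
    }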
Following my title: I use Groovy to do this, but it doesn't work. Can anyone tell me how to do it?
The following is my source:
job("AAA") {
parameters {
stringParam('branch_name', 'master', 'input branch name')
stringParam('commit_id', '123456', 'input commit id')
}
gitSCM {
userRemoteConfigs {
userRemoteConfig {
url("ssh://git#abc/abc.git")
name("${branch_name}")
}
}
branches {
branchSpec {
name("${commit_id}")
}
}}
Thanks.
We do it like this:
parameters {
    stringParam('GERRIT_REFSPEC', library.GERRIT_REFSPEC.default, library.GERRIT_REFSPEC.description)
    stringParam('GERRIT_PATCHSET_REVISION', library.GERRIT_PATCHSET_REVISION.default, library.GERRIT_PATCHSET_REVISION.description)
}
scm {
    git {
        remote {
            name('ci-config')
            url('ssh://url-to-repo')
            refspec('$GERRIT_REFSPEC')
        }
        branch('${GERRIT_PATCHSET_REVISION}')
    }
}
But these defaults are only used when the job is started manually. Since the job is triggered by Gerrit, the trigger sets the values that are used. Is that your problem: the job is triggered, and then the values are not set?
I have installed the Pipeline plugin, which used to be called the Workflow plugin.
https://wiki.jenkins-ci.org/display/JENKINS/Pipeline+Plugin
I want to know how I can use Job DSL to create and configure a job of type Pipeline.
You should use pipelineJob:
pipelineJob('job-name') {
    definition {
        cps {
            script('logic-here')
            sandbox()
        }
    }
}
You can define the logic by inlining it:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage 1') {
                            steps {
                                echo 'logic'
                            }
                        }
                        stage('Stage 2') {
                            steps {
                                echo 'logic'
                            }
                        }
                    }
                }
            '''.stripIndent())
            sandbox()
        }
    }
}
or load it from a file located in the workspace:
pipelineJob('job-name') {
    definition {
        cps {
            script(readFileFromWorkspace('file-seedjob-in-workspace.jenkinsfile'))
            sandbox()
        }
    }
}
Example:
Seed-job file structure:
jobs
\- productJob.groovy
logic
\- productPipeline.jenkinsfile
then productJob.groovy content:
pipelineJob('product-job') {
    definition {
        cps {
            script(readFileFromWorkspace('logic/productPipeline.jenkinsfile'))
            sandbox()
        }
    }
}
I believe this question is asking how to use Job DSL to create a pipeline job which references the project's Jenkinsfile, rather than combining the job creation with the detailed step definitions as the answers to date have done. This makes sense: Jenkins job creation and metadata configuration (description, triggers, etc.) could belong to Jenkins admins, but the dev team should have control over what the job actually does.
@meallhour, is the below what you're after? (Works as of Job DSL 1.64.)
pipelineJob('DSL_Pipeline') {
    def repo = 'https://github.com/path/to/your/repo.git'

    triggers {
        scm('H/5 * * * *')
    }
    description("Pipeline for $repo")
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(repo) }
                    branches('master', '**/feature*')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
                // the single line below also works, but it only covers the
                // 'master' branch and may not give you enough control:
                // git(repo, 'master', { node -> node / 'extensions' << '' } )
            }
            scriptPath('misc/Jenkinsfile.v2')
        }
    }
}
Ref the Job DSL pipelineJob: https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob, and hack away at it on http://job-dsl.herokuapp.com/ to see the generated config.
This example worked for me. Here's another example based on it:
pipelineJob('Your App Pipeline') {
    def repo = 'https://github.com/user/yourApp.git'
    def sshRepo = 'git@git.company.com:user/yourApp.git'

    description("Your App Pipeline")
    keepDependencies(false)
    properties {
        githubProjectUrl(repo)
        rebuild {
            autoRebuild(false)
        }
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote { url(sshRepo) }
                    branches('master')
                    extensions { } // required as otherwise it may try to tag the repo, which you may not want
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}
If you build the pipeline first through the UI, you can use the config.xml file and the Jenkins documentation https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob to create your pipeline job.
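If you prefer not to hunt for config.xml on disk, a script-console sketch that prints a job's XML (the job name is a placeholder):

    import jenkins.model.Jenkins

    // Print an existing job's config.xml so you can translate it to Job DSL.
    println Jenkins.instance.getItemByFullName('my-pipeline').configFile.asString()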
In Job DSL, pipeline is still called workflow; see workflowJob.
The next Job DSL release will contain some enhancements for pipelines, e.g. JENKINS-32678.
First, install the Job DSL plugin, then create a freestyle project in Jenkins and select "Process Job DSLs" from the dropdown in the Build section.
Select "Use the provided DSL script" and provide the following script:
pipelineJob('job-name') {
    definition {
        cps {
            script('''
                pipeline {
                    agent any
                    stages {
                        stage('Stage name 1') {
                            steps {
                                // your logic here
                            }
                        }
                        stage('Stage name 2') {
                            steps {
                                // your logic here
                            }
                        }
                    }
                }
            ''')
        }
    }
}
Or you can create your job by pointing it at a Jenkinsfile located in a remote Git repository:
pipelineJob("job-name") {
definition {
cpsScm {
scm {
git {
remote {
url("<REPO_URL>")
credentials("<CREDENTIAL_ID>")
}
branch('<BRANCH>')
}
}
scriptPath("<JENKINS_FILE_PATH>")
}
}
}
If you are using a Git repo, add a file called Jenkinsfile at the root directory of your repo. This should contain your pipeline definition.
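A minimal Jenkinsfile sketch to start from (the stage content is a placeholder):

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    echo 'build logic here'
                }
            }
        }
    }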