freeStyleJob SCM credential from parameters not working - jenkins

I'm trying to use a simple FreeStyleJob SCM and set the credentialsId UUID from a build parameter. The problem is that the credentials entry does not seem to resolve the parameter correctly.
scm {
    git {
        remote {
            github('\${MY_REPO_HANDLE}', 'ssh')
            credentials('\${MY_REPO_CREDENTIALS}')
        }
        branch('\${MY_BRANCH}')
    }
}
My MY_REPO_CREDENTIALS is a simple String parameter
stringParam {
    name("MY_REPO_CREDENTIALS")
    defaultValue("teste-credential")
}
Log:
Warning: CredentialId "${MY_REPO_CREDENTIALS}" could not be found.
UPDATE
This Jenkins job is created by another Jenkins job using an external DSL script. In short, when Job 1 is triggered it creates Job 2 on Jenkins. When I use "$" (without the backslash) the seed job fails, because the parameter doesn't exist in Job 1's context.
job config.xml:
<scm class="hudson.plugins.git.GitSCM">
    <userRemoteConfigs>
        <hudson.plugins.git.UserRemoteConfig>
            <url>git@github.com:${MY_REPO_HANDLE}.git</url>
            <credentialsId>${MY_REPO_CREDENTIALS}</credentialsId>
        </hudson.plugins.git.UserRemoteConfig>
    </userRemoteConfigs>
    <branches>
        <hudson.plugins.git.BranchSpec>
            <name>${MY_BRANCH}</name>
        </hudson.plugins.git.BranchSpec>
    </branches>
    <configVersion>2</configVersion>
    <doGenerateSubmoduleConfigurations>false</doGenerateSubmoduleConfigurations>
    <gitTool>Default</gitTool>
    <browser class="hudson.plugins.git.browser.GithubWeb">
        <url>https://github.com/${MY_REPO_HANDLE}/</url>
    </browser>
</scm>

The final solution was to set up the SSH agent before the build, using a wrapper:
wrappers {
    sshAgent("\${MY_REPO_CREDENTIALS}")
}
keepDependencies(false)
scm {
    git {
        remote {
            github("\${MY_REPO_HANDLE}", 'ssh')
        }
        branch("\${MY_BRANCH}")
    }
}

You can switch from single quotes (') to double quotes (") so the values are filled in automatically when the DSL script runs:
scm {
    git {
        remote {
            github("${MY_REPO_HANDLE}", 'ssh')
            credentials("${MY_REPO_CREDENTIALS}")
        }
        branch("${MY_BRANCH}")
    }
}
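For reference, a minimal sketch (using the parameter name from the question) of what the quoting changes inside a Job DSL script:

// Double quotes: a Groovy GString, resolved while the seed job runs the DSL script,
// so MY_REPO_CREDENTIALS must exist in the seed job's binding/parameters.
credentials("${MY_REPO_CREDENTIALS}")

// Single quotes: a plain string, so the literal ${MY_REPO_CREDENTIALS} is written into
// the generated job's config.xml and would have to be expanded at build time instead.
credentials('${MY_REPO_CREDENTIALS}')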

Related

Job DSL plugin | Shared Library | Pipeline jobs | Github Hook not working

Please bear with me; the description might be long, but it should give a clear picture of the intent and the issue.
I have used the Job DSL plugin to create a seeder job, which in turn creates two new jobs. I have two separate repositories:
One for maintaining Jenkins pipeline scripts.
One for the actual code to build.
First I created a pipeline job in Jenkins which in turn creates a view and the two jobs. Config shown below:
The Jenkinsfile below uses the Job DSL plugin API, reads the Groovy script, and creates the required two jobs.
node('master') {
    checkout scm
    jobDsl targets: ['dsl/seedJobBuilder.groovy'].join('\n'),
           removedJobAction: 'IGNORE',
           removedViewAction: 'IGNORE',
           lookupStrategy: 'SEED_JOB'
}
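As a side note, the jobDsl step's targets parameter takes a newline-separated list of file patterns, which is why the single-element list above is joined with '\n'. With several DSL files it would look roughly like this (the second file name is hypothetical):

jobDsl targets: ['dsl/seedJobBuilder.groovy', 'dsl/viewBuilder.groovy'].join('\n'),
       removedJobAction: 'IGNORE',
       removedViewAction: 'IGNORE',
       lookupStrategy: 'SEED_JOB'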
seedJobBuilder.groovy creates a DSL pipeline job whose task is to build the actual codebase.
listView('Build Pipelines') {
    description('All build and deploy jobs')
    jobs {
        names(
            'build',
            'deploy',
        )
    }
    columns {
        status()
        weather()
        name()
        lastSuccess()
        lastFailure()
        lastDuration()
        buildButton()
    }
}

def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}") // URL of the actual code repo, not the Jenkins script repo
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}") // Jenkins script repo URL
                        credentials("somecredentials")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
            }
            scriptPath('pipelines/pipelineBuildEveryDay.groovy')
            lightweight(false)
        }
    }
    triggers {
        githubPush()
    }
}
The config of the above job is created by Job DSL. This job reads the pipelineBuildEveryDay Groovy script, checks out the actual codebase, and builds and deploys it.
Where I am struggling is how to trigger a build of this second job through a GitHub hook or through ghprb, since I don't want to manipulate the second job manually and the Git URL of the job is the script repo URL, not the codebase URL. Is it even possible to do this? If so, what am I missing?
I have the webhook configured.
pipelineBuildEveryDay.groovy
pipeline {
    libraries {
        lib("shared-library@${params.JENKINS_SCRIPT_BRANCH}")
    }
    agent {
        node {
            label 'master'
        }
    }
    options {
        skipDefaultCheckout(true) // No more 'Declarative: Checkout' stage
    }
    stages {
        stage('Crazy Build Pipeline') {
            tools {
                jdk 'java11'
            }
            stages {
                stage('Prepare build name') {
                    steps {
                        script {
                            currentBuild.displayName = "${currentBuild.number}-build"
                        }
                    }
                }
                stage('Checkout') {
                    steps {
                        cleanWs()
                        script {
                            checkoutRepository("${projectDir}", "${params.PROJECT_TAG}", "${params.PROJECT_REPO}")
                        }
                    }
                }
                stage('Run Tests') {
                    steps {
                        echo "Running test coming soon..."
                    }
                }
            }
        }
    }
    // post build actions
    post {
        success {
            echo "success"
        }
        failure {
            echo "failure"
        }
    }
}
Well, the suffering has come to an end. Posting this answer for anyone struggling with a similar sort of issue.
Make sure you uncheck all other trigger types; the only checked one should be the pull request builder.
The part that tripped me up was the Project URL. In my case, the GitHub URL in the SCM section was the Jenkins-scripts repository URL, not the URL of the codebase I want to build, so I used my codebase repository URL in the GitHub Project URL textbox.
But the real problem was using the repository URL in the format 'https://code-base-repo-url.git'; it should be 'https://code-base-repo-url' instead. Sounds stupid? Yeah, I know!
Finally, the complete job config pipeline script, if it helps:
def pipelineRepo = 'https://jenkins-script-repo'
def projectRepo = 'https://code-base-repo-url'
def projectTag = '${GIT_BRANCH}'

def buildCommerce = pipelineJob('build') {
    properties {
        githubProjectUrl("${projectRepo}")
    }
    definition {
        cpsScm {
            scm {
                git {
                    remote {
                        url("${pipelineRepo}")
                        credentials("use-your-own-user-pass-cred")
                    }
                    branch('${JENKINS_SCRIPT_BRANCH}')
                }
            }
            scriptPath('pipelines/pipelineBuildEveryDay.groovy')
            lightweight(false)
        }
    }
    triggers {
        githubPullRequest {
            admin('use_your_own_admin')
            triggerPhrase('build please')
            useGitHubHooks()
            permitAll()
            displayBuildErrorsOnDownstreamBuilds()
            extensions {
                commitStatus {
                    context('Jenkins')
                    completedStatus('SUCCESS', 'All is well')
                    completedStatus('FAILURE', 'Something went wrong. Investigate!')
                    completedStatus('ERROR', 'Something went really wrong. Investigate!')
                }
            }
        }
    }
}

Groovy to list proper Url of Repository from Pipeline script from SCM (Git)

I have Pipeline jobs that are defined as Pipeline script from SCM where Git is selected.
Example: (screenshot of the "Pipeline script from SCM" definition)
I am trying to run a Groovy script in the Script Console to report all jobs and the repository URL configured in the GUI, but none of the solutions I have found, such as getUserRemoteConfigs()[0].getUrl(), return the correct repository URL.
I don't know where it is getting the value from, but getUrl() returns some other value that does NOT match the value shown in the Pipeline definition section of the GUI.
Does anyone have any clue what code may work?
You can use the following Groovy script to get the Git URLs.
Jenkins.instance.getAllItems(Job.class).each { jobitem ->
    if (jobitem instanceof org.jenkinsci.plugins.workflow.job.WorkflowJob) {
        if (jobitem.definition instanceof org.jenkinsci.plugins.workflow.cps.CpsScmFlowDefinition) {
            jobitem.definition.getScm().getRepositories().each { repo ->
                println("Job Name: " + jobitem.getName() + " URLs: " + repo.getURIs())
            }
        }
    }
}
I figured it out. I really could not find this solution anywhere; I had to look at the job's config.xml and study the model to come up with this code, which returns the proper repository URL, script path, and branch:
import org.jenkinsci.plugins.workflow.job.WorkflowJob

Jenkins.instance.getAllItems(Job.class).each {
    project = it.getFullName()
    if (it instanceof WorkflowJob) {
        myDef = it.getDefinition()
        try {
            myDef1 = myDef.getScm()
            myDef1.each {
                println(project + "\t" + it.getUserRemoteConfigs()[0].getUrl() + "\t" + myDef.getScriptPath() + "\t" + it.branches[0])
            }
        }
        catch (Exception e) {
            //println(project + "\t[Script]")
        }
    }
}

Coalesce parameters in Jenkins Job DSL

I have a pipelineJob defined in Job DSL.
It runs a pipeline/Jenkinsfile which it checks out of git.
I want people to be able to type in the Git branch from which to pull the Jenkinsfile (i.e. in a stringParam) or, if they have not typed in a branch, to default to a branch I have set in a choiceParam (this will be 'develop' or 'master').
This does not work:
pipelineJob('some-job') {
    parameters {
        choiceParam('gitCreds', [gitCreds], 'Stash credential')
        stringParam('gitUrl', 'https://some-repo.git', 'URL for the Stash repo')
        stringParam('gitBranchOverride', '', 'Type in some feature branch here if you wish')
        choiceParam('gitBranch', ['develop'], '...otherwise the job should default to a branch here')
    }
    definition {
        cpsScm {
            scm {
                git {
                    branch('$gitBranchOverride' ?: '$gitBranch')
                    extensions {
                        wipeOutWorkspace()
                    }
                    remote {
                        credentials(gitCreds)
                        url('$gitUrl')
                    }
                }
            }
        }
    }
}
It works if I enter a value into gitBranchOverride, but if I don't, it seems to enumerate all the branches and check out a random one; i.e. it's not honouring the value in gitBranch.
I don't know if I'm understanding your problem correctly, but this is how my code for creating pipeline jobs looks:
def git_branch = getBinding().getVariable('GIT_BRANCH')
def gitrepo = "ssh://git@some.git.repo/somerepo.git"
def credential_id = "awesomecredentials"

pipelineJob("MyAwesomeJob") {
    description("""This job is awesome\n\n__input__:\n* My parameter\n* Branch\n\n__branch__: ${git_branch}""")
    parameters {
        stringParam('MyParameter', '', 'AwesomeParameterHere')
        stringParam('branch', 'origin/develop', 'Branch to build')
    }
    definition {
        cpsScm {
            scm {
                git {
                    branch('$branch')
                    remote {
                        url(gitrepo)
                        credentials(credential_id)
                    }
                }
            }
            scriptPath("jenkins/my_awesome_pipeline/Jenkinsfile")
        }
    }
}
With this, my job is created with a branch parameter that has a default value selected.
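Regarding the original coalescing question: the Elvis operator runs when the DSL script is evaluated, and '$gitBranchOverride' is a non-empty literal string at that point, so the fallback never triggers; at build time an empty gitBranchOverride then expands to an empty branch spec, which the Git plugin treats as "any branch". One pragmatic workaround (a sketch, not from the thread; the credential ID is a placeholder) is to let a single parameter carry the default, so the spec never expands to empty:

pipelineJob('some-job') {
    parameters {
        // A single parameter with the default baked in; typing a feature branch overrides it.
        stringParam('gitBranch', 'develop', 'Branch to pull the Jenkinsfile from')
    }
    definition {
        cpsScm {
            scm {
                git {
                    branch('$gitBranch') // expanded by the Git plugin at build time, never empty
                    remote {
                        url('https://some-repo.git')       // repo URL from the question
                        credentials('stash-credential-id') // placeholder credential ID
                    }
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}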

Generated DSL job configuration gets changed after job is run

I have a DSL script that creates a job. As soon as I run the job, its config.xml is changed. Because of this, the job doesn't get updated when I run the seed job again.
I suspect some plugin does this. Can you tell me the best way to find out what changes the config when the job is run?
[
    [name: "Sonar/co", repo: "repo.git", pomPath: "pom.xml", branch: "development", mvnGoal: "-am -P dev -pl project clean test"]
].each { Map config ->
    mavenJob(config.name) {
        description "Sonar job for ${config.name}"
        logRotator {
            numToKeep(1)
        }
        label "sonar"
        scm {
            git {
                branch "*/${config.branch}"
                remote {
                    url "git@repository:${config.repo}"
                }
                browser {
                    gitLab("https://gitlab.DOMAIN.de/", "9.0")
                }
            }
        }
        mavenInstallation "maven339"
        goals config.mvnGoal
        rootPOM config.pomPath
        configure { node ->
            node / settings(class: 'jenkins.mvn.DefaultSettingsProvider') {
            }
            node / globalSettings(class: 'jenkins.mvn.DefaultGlobalSettingsProvider') {
            }
        }
    }
}
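One way to narrow down which plugin rewrites the configuration (a sketch, not from the thread; the job name is taken from the DSL above) is to dump the generated job's config.xml from the Script Console before and after a run and diff the two outputs:

import jenkins.model.Jenkins

// Print the job's on-disk configuration; run this before and after a build
// and diff the two outputs to see what the run changed.
def job = Jenkins.instance.getItemByFullName('Sonar/co')
println job.getConfigFile().asString()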

Jenkins multibranch pipeline Scan without execution

Is it possible to scan a multibranch pipeline to detect the branches with a Jenkinsfile, but without executing the pipelines?
My projects have different branches, and I don't want all the child branch pipelines with a Jenkinsfile to start executing when I launch a branch scan from the parent multibranch pipeline.
In your Branch Sources section you can add a property named "Suppress automatic SCM triggering".
This prevents Jenkins from building every branch that has a Jenkinsfile.
You can also do it programmatically:
import jenkins.branch.*
import jenkins.model.Jenkins

for (f in Jenkins.instance.getAllItems(jenkins.branch.MultiBranchProject.class)) {
    if (f.parent instanceof jenkins.branch.OrganizationFolder) {
        continue;
    }
    for (s in f.sources) {
        def prop = new jenkins.branch.NoTriggerBranchProperty();
        def propList = [prop] as jenkins.branch.BranchProperty[];
        def strategy = new jenkins.branch.DefaultBranchPropertyStrategy(propList);
        s.setStrategy(strategy);
    }
    f.computation.run()
}
This is a Groovy snippet you can execute in the Jenkins Script Console; it does the scanning but will not start new builds for all discovered branches.
If you are using Job DSL, you could simply do this, and it will scan everything without actually running the builds the first time you index:
organizationFolder('Some folder name') {
    buildStrategies {
        skipInitialBuildOnFirstBranchIndexing()
    }
}
To add to #Stqs's answer, you could also set noTriggerBranchProperty using the Job DSL plugin, e.g.:
multibranchPipelineJob('example') {
    ...
    branchSources {
        branchSource {
            ...
            strategy {
                defaultBranchPropertyStrategy {
                    props {
                        // Suppresses the normal SCM commit trigger coming from branch indexing
                        noTriggerBranchProperty()
                        ...
                    }
                }
            }
        }
    }
    ...
}
organizationFolder('my-folder') {
    buildStrategies {
        buildRegularBranches()
        buildChangeRequests {
            ignoreTargetOnlyChanges true
            ignoreUntrustedChanges false
        }
    }
}
Note: the basic-branch-build-strategies plugin is required.
REFERENCES:
https://issues.jenkins.io/browse/JENKINS-63799
http://jenkins-ci.361315.n4.nabble.com/JobDSL-an-example-of-configuring-a-bitbucket-source-trait-of-bitbucketForkDiscovery-in-the-multibrand-td5014968.html#a5015085
After much struggle I've found this solution; it only avoids triggering the build on branch indexing and does not disable the automatic build after a commit. Just add it to the first stage of your project:
when {
    not {
        expression {
            def causes = currentBuild.getBuildCauses()
            String causesClass = causes._class[0]
            return causesClass.contains('BranchIndexingCause')
        }
    }
}
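Placed in context, the condition sits at the top of the first stage roughly like this (a sketch; the stage name and step are placeholders):

stage('Build') {
    when {
        not {
            expression {
                // Skip this stage when the build was caused only by branch indexing.
                def causes = currentBuild.getBuildCauses()
                return causes._class[0].contains('BranchIndexingCause')
            }
        }
    }
    steps {
        echo 'Triggered by a real change, building...'
    }
}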
