We are using the bitbucket-push-and-pull-request plugin to build our project. The pipeline is set to check out the source repo when the webhook is triggered, and everything works fine when PRs come from the origin repo.
When the pull request comes from a forked repo, the problem appears: the commit cannot be found because it still only exists in the fork.
Any idea how we can solve it?
Here is an example of the Jenkinsfile:
pipeline {
    triggers {
        bitBucketTrigger credentialsId: 'GIT_CREDS',
            triggers: [
                [$class: 'BitBucketPPRPullRequestServerTriggerFilter',
                    actionFilter: [$class: 'BitBucketPPRPullRequestServerCreatedActionFilter',
                        allowedBranches: ''
                    ]
                ],
                [$class: 'BitBucketPPRPullRequestServerTriggerFilter',
                    actionFilter: [$class: 'BitBucketPPRPullRequestServerMergedActionFilter',
                        allowedBranches: ''
                    ]
                ],
                [$class: 'BitBucketPPRPullRequestServerTriggerFilter',
                    actionFilter: [$class: 'BitBucketPPRPullRequestServerSourceUpdatedActionFilter',
                        allowedBranches: ''
                    ]
                ]
            ]
    }
    agent any
    stages {
        stage('Checkout Bitbucket repo') {
            steps {
                script {
                    // Note: env.CHANGE_BRANCH must not be quoted, otherwise the
                    // literal string 'env.CHANGE_BRANCH' is used as the branch name
                    git branch: env.CHANGE_BRANCH,
                        credentialsId: 'GIT_CREDS',
                        url: env.GIT_URL
                }
            }
        }
        stage('Start the build') {
            steps {
                script {
                    sh 'echo "BUILD"'
                }
            }
        }
    }
}
Here is the log:
stderr: fatal: ambiguous argument '0648334d4491907a45a840a7326c3b8f54180144^{commit}': unknown revision or path not in the working tree.
Use '--' to separate paths from revisions, like this:
'git <command> [<revision>...] -- [<file>...]'
I was able to fix this by adding a refspec in the Advanced tab.
The refspec is:
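For reference, Bitbucket Server advertises pull-request refs under refs/pull-requests/, so a refspec along the following lines is commonly used to make PR commits (including those coming from forks) fetchable; the exact value here is an assumption, not the original author's:

```
+refs/pull-requests/*/from:refs/remotes/origin/pr/*
```

This maps each pull request's source ref to a local pr/&lt;id&gt; remote branch, so the PR commit exists in the build clone even when it only lives in the fork.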
I have a declarative pipeline. In this pipeline I want to add a verify step that checks the time of the last code commit, which lives in another repository. Based on the commit time, I have to decide whether to proceed with the next steps.
I am facing issues in fetching the commit time. I am very new to git commands, so I am unable to resolve the issue.
Points: my Jenkinsfile is in one repository, and I need the commit details of another repository, specifically the commit time (epoch).
My code looks like this -
@Library('xx-libs') _
pipeline {
    agent any
    options {
        timeout(time: 2, unit: 'HOURS')
    }
    parameters {
        xxxxx
    }
    environment {
        xxxxxx
    }
    stages {
        stage('Checkout source code') {
            steps {
                script {
                    checkout([$class: 'GitSCM',
                        branches: [[name: '*/' + branch]],
                        extensions: [
                            [$class: 'CloneOption', noTags: true, timeout: 20],
                            [$class: 'RelativeTargetDirectory', relativeTargetDir: '/tmp/core/'],
                            [$class: 'SparseCheckoutPaths', sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'code/System/Infra/Version/version.cpp']]],
                            [$class: 'CleanBeforeCheckout']
                        ],
                        userRemoteConfigs: [[
                            credentialsId: 'azure-bearer-auto-updated',
                            url: 'https://xx.xx.com/xxx/xxx/_git/core'
                        ]]
                    ])
                    git_time = sh script: "git show -s --format=%ct", returnStdout: true
                    echo "$git_time"
                }
            }
        }
    } // stages
}
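A likely cause of the issue above is that the `sh` step runs in the workspace root, while the checkout went to the RelativeTargetDirectory (`/tmp/core/`), so `git show` is not executed inside that clone. A minimal, self-contained sketch of reading the last commit time as a Unix epoch (it creates a throwaway repo so it can run anywhere; in the real pipeline you would point `-C` at `/tmp/core`):

```shell
# Create a throwaway repo so the example is self-contained.
repo=$(mktemp -d)
git -C "$repo" init -q
git -C "$repo" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m init

# %ct = committer date as a Unix epoch timestamp
epoch=$(git -C "$repo" show -s --format=%ct HEAD)
echo "$epoch"
```

In the pipeline itself, the same idea is to wrap the call in a `dir` step, e.g. `dir('/tmp/core') { git_time = sh(script: 'git show -s --format=%ct', returnStdout: true).trim() }`.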
I have created a job in Jenkins to trigger a build whenever changes happen in GitHub. But I want to know how to get the details of the user who made the changes, so I can configure email notifications.
Could anyone help me with a solution, please?
You can use the git show command to fetch the author name and email with the commands below. Assign them to variables and use them anywhere in the pipeline.
To fetch the author name: git show -s --pretty=%an
To fetch the author email: git show -s --pretty=%ae
The pipeline script can be written as below.
pipeline {
    agent any
    options {
        timestamps()
    }
    stages {
        stage('Test Stage') {
            steps {
                checkout changelog: true, poll: false, scm: [$class: 'GitSCM',
                    branches: [[name: '*/Sample_branch']],
                    doGenerateSubmoduleConfigurations: false,
                    extensions: [], submoduleCfg: [],
                    userRemoteConfigs: [[credentialsId: 'TestCredentials', url: '']]]
                script {
                    // %an is the author name, %ae is the author email
                    AUTHOR_NAME = sh(script: "git show -s --pretty=%an", returnStdout: true).trim()
                    AUTHOR_EMAIL = sh(script: "git show -s --pretty=%ae", returnStdout: true).trim()
                }
                echo "${AUTHOR_NAME} and ${AUTHOR_EMAIL}"
            }
        }
    }
}
I'm pretty new to Jenkins and Groovy, and I'm trying to do a sparse checkout in my Jenkinsfile. Currently I simply do this:
stage('Check out branch from Gitlab') {
    echo 'Pulling...' + env.BRANCH_NAME
    checkout scm
}
I wish to execute a sparse checkout from a Jenkins Groovy script, and I'm struggling to find a good way of doing this. Is there a way of using the checkout command to do this?
You should configure a set of parameters for GitSCM; more info here.
A basic configuration is presented as an example below:
pipeline {
    agent any
    stages {
        stage('Git Checkout') {
            steps {
                script {
                    checkout([
                        $class: 'GitSCM',
                        branches: [[name: "devel"]],
                        doGenerateSubmoduleConfigurations: false,
                        extensions: [[
                            $class: 'RelativeTargetDirectory',
                            relativeTargetDir: "/tmp/jenkins/devel"
                        ]],
                        submoduleCfg: [],
                        userRemoteConfigs: [[
                            credentialsId: 'jenkinsCredentialsId',
                            url: 'https://git.example.com/git/example'
                        ]]
                    ])
                }
            }
        }
    }
}
I attached a fully working Jenkins pipeline of one stage. It checks out the devel branch of the repository https://git.example.com/git/example into the directory /tmp/jenkins/devel. Also note that you should add (if not already done) the credentials of the repository in Jenkins Credentials (/jenkins/credentials/); in the above example they are under the id jenkinsCredentialsId.
You can read the GitSCM link to find out more details and attributes that you can configure.
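Since the question asks specifically about sparse checkout, the same `checkout` call can take a `SparseCheckoutPaths` extension; a sketch below (the class names are the GitSCM plugin's, while the `docs/` path is just an illustrative assumption):

```groovy
checkout([
    $class: 'GitSCM',
    branches: [[name: 'devel']],
    extensions: [
        // Only materialize the listed paths in the workspace
        [$class: 'SparseCheckoutPaths',
         sparseCheckoutPaths: [[$class: 'SparseCheckoutPath', path: 'docs/']]]
    ],
    userRemoteConfigs: [[
        credentialsId: 'jenkinsCredentialsId',
        url: 'https://git.example.com/git/example'
    ]]
])
```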
I'm creating a new job based on a pipeline in Jenkins. I want my Jenkinsfile to be in a Bitbucket repository. Let's say my config file is at bitbucket.org/config.git.
The job's mission is to clean install a project, bitbucket.org/myProject.git.
How can I configure the pipeline so it triggers if any push is made to bitbucket.org/myProject.git, following the steps defined in bitbucket.org/config.git?
I do not want to create a multibranch pipeline, and I do not want my Jenkinsfile to be in the same repository as the project to compile.
My current config is:
pipeline {
    agent any
    parameters {
        string(defaultValue: '', description: 'URL', name: 'GIT_URL')
        string(defaultValue: '', description: 'Credential', name: 'CREDENTIAL_ID')
    }
    stages {
        stage('Initialize') {
            steps {
                git branch: 'develop', credentialsId: "${params.CREDENTIAL_ID}", url: "${params.GIT_URL}"
            }
        }
        stage('Build') {
            steps {
                sh 'mvn clean install'
                echo 'build'
            }
        }
    }
}
You can use shared libraries in Jenkins. You would still need a Jenkinsfile in your code, but it would not contain any logic; it would simply refer to the shared library and pass any params, like the git repo path.
For more information on shared libraries, please refer to https://jenkins.io/doc/book/pipeline/shared-libraries/.
For triggering the build, you can define a trigger in your pipeline. Example:
triggers {
pollSCM('H/5 * * * *')
}
or use webhooks if you don't want to poll.
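A minimal sketch of such a thin Jenkinsfile, assuming a shared library named `config-lib` that exposes a `buildMyProject` global step (both names are hypothetical, not from the original answer):

```groovy
// The project repo's Jenkinsfile contains no logic of its own:
// it loads the shared library and delegates to a step defined there.
@Library('config-lib') _

buildMyProject(
    gitUrl: 'https://bitbucket.org/myProject.git', // repo to compile
    branch: 'develop'
)
```

The actual clean-install logic then lives in the shared library's repository (bitbucket.org/config.git in the question), and every project repo only carries these few lines.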
Actually, I managed to make it work.
In my Jenkins pipeline, I activated "Build when a change is pushed to BitBucket".
node {
    checkout([$class: 'GitSCM',
        branches: [[name: 'feature/test-b']],
        doGenerateSubmoduleConfigurations: false,
        extensions: [[$class: 'SubmoduleOption', disableSubmodules: false,
            parentCredentials: false, recursiveSubmodules: true, reference: '',
            trackingSubmodules: false]],
        submoduleCfg: [],
        userRemoteConfigs: [[credentialsId: 'admin',
            url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
}
When a change is made in boutique-a on the branch 'feature/test-b', my job is triggered, which is cool.
Now I have this other issue: how can I trigger when changes are made in feature/*?
It looks like I cannot access env.BRANCH_NAME when I'm not in a multibranch pipeline.
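One option, as far as the branch matching goes: the GitSCM branch specifier accepts wildcards, so a sketch along these lines (same repo and credentials as above; treat the wildcard form as an assumption to verify against the plugin docs) can build whichever feature branch changed:

```groovy
checkout([$class: 'GitSCM',
    // '*/feature/*' matches any remote feature/... branch
    branches: [[name: '*/feature/*']],
    userRemoteConfigs: [[credentialsId: 'admin',
        url: 'http://localhost:7990/scm/bout/boutique-a.git']]])
```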
I have 40-50 GitHub repositories, and each repo contains one Maven job.
I want to create a multibranch pipeline job for each repository.
Can I use the same Jenkinsfile for all projects without adding a Jenkinsfile to each repository (take it from another SCM repo)?
I know that I can use a shared library to create a full pipeline, but I would prefer something cleaner.
To accomplish this, I would suggest creating a pipeline with two parameters and passing the values based on the repo to build:
1) GIT BRANCH - to build and deploy the required branch
2) GIT URL - to provide the git URL to check out the code.
Providing a reference template:
node('NODE NAME') {
    withEnv([/* REQUIRED ENV VARIABLES */]) {
        withCredentials([[$class: 'UsernamePasswordMultiBinding', credentialsId: 'CREDENTIALS ID',
                          passwordVariable: 'PW', usernameVariable: 'USER']]) {
            try {
                stage('Build') {
                    checkout changelog: false, poll: false, scm: [$class: 'GitSCM',
                        branches: [[name: gitbranch]],
                        doGenerateSubmoduleConfigurations: false,
                        extensions: [], submoduleCfg: [],
                        userRemoteConfigs: [[credentialsId: 'CREDENTIALS ID',
                            url: 'GIT URL']]]
                    // MAVEN BUILD
                }
                stage('Docker Image build & Push') {
                    // DOCKER BUILD AND PUSH TO REPO
                }
                stage('Deploy to ENV') {
                    // DEPLOYMENT TO REQUIRED ENV
                    notify('Success - Deployed to Environment')
                }
            }
            catch (err) {
                notify("Failed ${err}")
                currentBuild.result = 'FAILURE'
            }
        }
    }
}

def notify(status) {
    // NOTIFICATION FUNCTION
}
Link the Jenkinsfile in the pipeline job and provide the values (Build with Parameters) when building the Jenkins job.
Hope this helps.