I'm using a Jenkinsfile that is located in my git repository.
I have configured a new job using the Pipeline script from SCM option that points to my Jenkinsfile. In my Jenkinsfile's pipeline script I'm trying to use the git module to pull my data from my git repo without configuring a pre-static variable, just using the Repository URL value under Pipeline script from SCM that is already configured in the job.
Is there a way to get the Repository URL variable from this plugin without using parameters in my Jenkins pipeline script?
I have already tried the GIT_URL environment variable and other git-related variables from here, but this didn't work.
You can find all the information about the SCM in the scm variable (an instance of GitSCM if you are using git).
You can get the repository URL this way:
def repositoryUrl = scm.userRemoteConfigs[0].url
But if you just want to check out that repository, you can simply invoke checkout scm without needing to specify anything else. See the checkout step.
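For example, a minimal sketch of reading the URL without a checkout (assuming a Pipeline script from SCM job, so the scm global is a GitSCM instance; accessing its properties may require script approval):
node {
    // Read the repository URL that the job configuration already knows about,
    // without performing a checkout.
    def repositoryUrl = scm.userRemoteConfigs[0].url
    echo "Repository URL: ${repositoryUrl}"
}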
From this post I found a way to use checkout scm to get the git repo URL, like this:
checkout scm
def url = sh(returnStdout: true, script: 'git config remote.origin.url').trim()
But checkout scm will pull the code, and I want to avoid that.
So I found another way (not a pretty one):
node('master') {
    try {
        // Extract the repository URL from the job's config.xml on the master node.
        def command = "grep -oP '(?<=url>)[^<]+' /var/lib/jenkins/jobs/${JOB_NAME}/config.xml"
        GIT_REPO_URL = sh(returnStdout: true, script: command).trim()
        echo "Detected Git Repo URL: ${GIT_REPO_URL}"
    }
    catch (err) {
        // error both logs the message and fails the build
        error "Could not find any Git repository for the job ${JOB_NAME}"
    }
}
This did the trick for me.
Probably not directly a solution for your particular case, as you're working with git.
But for those still working with SVN using the SubversionSCM, the repository URL can be obtained using
def repositoryUrl = scm.locations[0].remote
I believe the best solution is similar to this answer.
An example using a declarative pipeline:
pipeline {
    agent any
    stages {
        stage('test') {
            steps {
                script {
                    def s = checkout scm
                    if (s.GIT_URL != null) print s.GIT_URL
                    else if (s.SVN_URL != null) print s.SVN_URL
                    else print s
                }
            }
        }
    }
}
Note: this does a full checkout. If that is not desirable, I would try to handle that in the checkout parameters (like here).
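For example, a hedged sketch that keeps the checkout but makes it shallow, to be placed inside the script block above (it assumes the Git plugin's CloneOption extension is available; the depth value is just an illustration):
script {
    // Reuse the job's own SCM configuration, but add a shallow-clone option
    // so the checkout stays as light as possible.
    def s = checkout([$class: 'GitSCM',
                      branches: scm.branches,
                      userRemoteConfigs: scm.userRemoteConfigs,
                      extensions: scm.extensions + [[$class: 'CloneOption', shallow: true, depth: 1, noTags: true]]])
    if (s.GIT_URL != null) print s.GIT_URL
}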
I'm fairly new to Jenkins Pipeline groovy scripts, but I have written a script which performs an SVN checkout, NuGet restore, etc., and eventually copies an .msi file to the server. After successfully packaging the .msi file I would like to tag the source in SVN, but I'm struggling to find a method of doing this.
The SVN checkout is performed as follows:
def svn = checkout scm
I was sort of hoping I could just do the following:
svn = copy scm "svn://svn/MyPath/MyApp/tag/${versionNumber}" -m "V${versionNumber}"
I could obviously use the bat command and specify the full svn command, but then I'd have to enter the Jenkins credentials into the groovy script, which is not ideal.
Any help/pointers would be greatly appreciated.
If you are tied to Subversion you could extract bits of code from this post, but I would HIGHLY recommend, if you can, migrating your code to Git: the tooling support in Git is so much better, and it is where the world has already gone.
In Git, this is all I have to do to tag a branch as part of my Jenkins multibranch pipeline:
success {
    script {
        if ("${env.BRANCH_NAME}" == "develop") {
            bat "git tag ${JOB_NAME}_${BUILD_NUMBER}"
            bat "git push ${env.REPO} --tags"
        }
    }
}
${env.REPO} is my clone URL.
SO much easier.
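If you have to stay on Subversion for now, here is a hedged sketch of the tagging step that avoids hard-coding credentials by using withCredentials. The credentialsId 'svn-creds' and the trunk URL are assumptions; the tag URL and version variable come from the question.
withCredentials([usernamePassword(credentialsId: 'svn-creds',
                                  usernameVariable: 'SVN_USER',
                                  passwordVariable: 'SVN_PASS')]) {
    // Tagging in SVN is a server-side copy from the trunk URL to the tag URL.
    bat "svn copy svn://svn/MyPath/MyApp/trunk svn://svn/MyPath/MyApp/tag/${versionNumber} -m \"V${versionNumber}\" --username %SVN_USER% --password %SVN_PASS% --non-interactive"
}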
I have 200-300 multibranchPipelineJob jobs and I want to create all of them with DSL.
I have this script to get the job names:
for (job in Hudson.instance.getAllItems(org.jenkinsci.plugins.workflow.job.WorkflowJob)) {
    println job.fullName
}
It gives me the job name, but I can't figure out how to get the git repository from it.
Any idea?
In a Multibranch Pipeline project, only the top-level job contains information about the repository, so you should iterate over WorkflowMultiBranchProject instead of WorkflowJob.
This way you can get the repository URL and a list of RefSpecs:
for (job in Hudson.instance.getAllItems(org.jenkinsci.plugins.workflow.multibranch.WorkflowMultiBranchProject)) {
    def repositoryUrl = job.SCMSources[0].remote
    def refSpecs = job.SCMSources[0].refSpecs
}
Note that this applies only to Git repositories. For SVN it would be slightly different.
I'm trying to write my first Jenkins shared library and I'm struggling with something basic - getting the branch name.
I could do:
sh(returnStdout: true, script: 'git rev-parse --abbrev-ref HEAD').trim()
However, that requires a checkout. Would it be possible to get the branch name for both multibranch and freestyle pipeline projects? I know I'll be using git, but I would like to avoid doing a checkout until it is necessary.
The GIT_BRANCH environment variable should give you what you want. It won't work in Pipeline jobs until Jenkins 2.60 and an upgraded Pipeline Model Definition plugin.
If you are using a pipeline job, you can either:
Capture the object returned from checkout scm
Reference the environment variable
pipeline {
    // ...
    stages {
        stage('Setup') {
            steps {
                script {
                    // capture scm variables
                    def scmVars = checkout scm
                    String branch = scmVars.GIT_BRANCH
                    // or use the environment variable
                    branch = env.GIT_BRANCH
                }
            }
        }
        // ...
    }
}
Environment variable reference.
I ended up using this:
env.CHANGE_BRANCH ?: env.GIT_BRANCH ?: scm.branches[0]?.name?.split('/')[1] ?: 'UNKNOWN'
However, this requires me to approve several things on the In-process Script Approval page.
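For readability in a shared library, the same fallback chain can be wrapped in a small helper. This is just a sketch of the expression above, with each fallback commented:
String resolveBranchName() {
    // Try the most specific source first, then fall back.
    return env.CHANGE_BRANCH ?:                      // set on multibranch PR/change builds
           env.GIT_BRANCH ?:                         // set by the git plugin on regular builds
           scm.branches[0]?.name?.split('/')[1] ?:   // read from the job's SCM configuration
           'UNKNOWN'
}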
I'm using a Jenkins Scripted Pipeline, which uses Groovy-style scripting, and I have created a Jenkinsfile to describe the pipeline. I need to create the workspace with a folder name that matches the git repo name, and then check out the code into that workspace folder.
My question is: before doing checkout scm, is there a way to know the git repo name or the git repo URL?
String determineRepoName() {
    return scm.getUserRemoteConfigs()[0].getUrl().tokenize('/')[3].split("\\.")[0]
}
This relatively ugly code is what I use to get the repoName. The key is that the URL of the repo is stored in:
scm.getUserRemoteConfigs()[0].getUrl()
From there you need to do some string operations to get what you want.
Update:
String determineRepoName() {
    return scm.getUserRemoteConfigs()[0].getUrl().tokenize('/').last().split("\\.")[0]
}
This also works for repositories with a deeper hierarchy (https://domain/project/subproject/repo) or for ssh git repos, which do not contain the two slashes at the start.
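A hedged usage sketch for the original question, checking out into a directory named after the repository (determineRepoName() is the helper defined above):
node {
    // Derive the directory name from the job's configured repository URL,
    // then perform the checkout inside that directory.
    String repoName = determineRepoName()
    dir(repoName) {
        checkout scm
    }
}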
Maybe a silly answer, but isn't it possible using the Jenkins environment variable env.BITBUCKET_REPOSITORY?
I have several projects that use a Jenkinsfile which is practically the same. The only difference is the git project that it has to check out. This forces me to have one Jenkinsfile per project, although they could share the same one:
node {
    def mvnHome = tool 'M3'
    def artifactId
    def pomVersion
    stage('Commit Stage') {
        echo 'Downloading from Git...'
        git branch: 'develop', credentialsId: 'xxx', url: 'https://bitbucket.org/xxx/yyy.git'
        echo 'Building project and generating Docker image...'
        sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
        ...
Is there a way to preconfigure the git location as a variable during the job creation so I can reuse the same Jenkinsfile?
...
stage('Commit Stage') {
    echo 'Downloading from Git...'
    git branch: 'develop', credentialsId: 'xxx', url: env.GIT_REPO_LOCATION
    ...
I know I can set it up this way:
This project is parameterized -> String Parameter -> GIT_REPO_LOCATION, default= http://xxxx, and access it with env.GIT_REPO_LOCATION.
The downside is that the user is prompted to start the build with the default value or change it. I would need it to be transparent to the user. Is there a way to do it?
You can use the Pipeline Shared Groovy Library plugin to have a library that all your projects share in a git repository. In the documentation you can read about it in detail.
If you have a lot of Pipelines that are mostly similar, the global variable mechanism provides a handy tool to build a higher-level DSL that captures the similarity. For example, all Jenkins plugins are built and tested in the same way, so we might write a step named buildPlugin:
// vars/buildPlugin.groovy
def call(body) {
    // evaluate the body block, and collect configuration into the object
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    // now build, based on the configuration provided
    node {
        git url: "https://github.com/jenkinsci/${config.name}-plugin.git"
        sh "mvn install"
        mail to: "...", subject: "${config.name} plugin build", body: "..."
    }
}
Assuming the script has either been loaded as a Global Shared Library or as a Folder-level Shared Library, the resulting Jenkinsfile will be dramatically simpler:
Jenkinsfile (Scripted Pipeline)
buildPlugin {
    name = 'git'
}
The example shows how a Jenkinsfile passes name = 'git' to the library.
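A hedged sketch of the same pattern adapted to the question above: a global step that takes the repository URL as configuration, so every project shares one library script and keeps only a tiny Jenkinsfile. The step name buildMyService is hypothetical; the tool name M3 and the credentialsId 'xxx' are taken from the original Jenkinsfile.
// vars/buildMyService.groovy
def call(body) {
    // collect the per-project configuration (here: just the repository URL)
    def config = [:]
    body.resolveStrategy = Closure.DELEGATE_FIRST
    body.delegate = config
    body()

    node {
        def mvnHome = tool 'M3'
        stage('Commit Stage') {
            echo 'Downloading from Git...'
            git branch: 'develop', credentialsId: 'xxx', url: config.repoUrl
            echo 'Building project and generating Docker image...'
            sh "${mvnHome}/bin/mvn clean install docker:build -DskipTests"
        }
    }
}
Each project's Jenkinsfile then shrinks to:
buildMyService {
    repoUrl = 'https://bitbucket.org/xxx/yyy.git'
}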
I currently use a similar setup and am very happy with it.
Instead of having a Jenkinsfile in each Git repository, you can have an additional git repository from which you get the common Jenkinsfile - this works when using a Pipeline-type job and selecting the option Pipeline script from SCM. This way Jenkins checks out the repo where you have the common Jenkinsfile before checking out the user repo.
In case the job can be triggered automatically, you can create a post-receive hook in each git repo that calls the Jenkins Pipeline with the repo as a parameter, so that the user does not have to manually run the job entering the repo as a parameter (GIT_REPO_LOCATION).
In case the job cannot be triggered automatically, the least annoying method I can think of is having a Choice parameter with a list of repositories instead of a String parameter.
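A hedged sketch of that last idea in a declarative pipeline (the listed repository URLs are placeholders; on older versions of the plugin, choices may need to be given as a newline-separated string instead of a list):
pipeline {
    agent any
    parameters {
        // The user picks a repository from a fixed list instead of typing a URL.
        choice(name: 'GIT_REPO_LOCATION',
               choices: ['https://bitbucket.org/xxx/yyy.git', 'https://bitbucket.org/xxx/zzz.git'],
               description: 'Repository to build')
    }
    stages {
        stage('Commit Stage') {
            steps {
                git branch: 'develop', credentialsId: 'xxx', url: params.GIT_REPO_LOCATION
            }
        }
    }
}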