How to run multiple Jenkins pipelines for multiple branches - jenkins

I have a scenario wherein I have a frontend repository with multiple branches.
Here's my repo vs. application structure.
I have a single Jenkinsfile like below:
parameters {
    string(name: 'CUSTOMER_NAME', defaultValue: 'customer_1')
}
stages {
    stage('Build') {
        steps {
            sh '''
                yarn --mutex network
                /usr/local/bin/grunt fetch_and_deploy:$CUSTOMER_NAME -ac test
                /usr/local/bin/grunt collect_web'''
        }
    }
}
The above Jenkinsfile is the same for all customers, so I would like to understand the best way to build multiple customers using the same Jenkinsfile and run different pipelines based on the parameter $CUSTOMER_NAME.

I am not sure if I understood your problem, but I guess you could use a shared pipeline library: https://jenkins.io/doc/book/pipeline/shared-libraries/
You can put the build step in the library and call it with CUSTOMER_NAME as a parameter.
(Please note: a shared pipeline library must be stored in a separate Git repository!)
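A minimal sketch of what that could look like, assuming a hypothetical shared-library step in vars/customerBuild.groovy (the library and step names are illustrative, not from the original question):

```groovy
// vars/customerBuild.groovy in the shared-library repository (hypothetical name)
def call(String customerName) {
    sh """
        yarn --mutex network
        /usr/local/bin/grunt fetch_and_deploy:${customerName} -ac test
        /usr/local/bin/grunt collect_web
    """
}
```

Each customer's Jenkinsfile (or a single parameterized one) could then reduce to something like:

```groovy
@Library('my-shared-lib') _   // name configured in Jenkins, illustrative here
pipeline {
    agent any
    parameters {
        string(name: 'CUSTOMER_NAME', defaultValue: 'customer_1')
    }
    stages {
        stage('Build') {
            steps {
                customerBuild(params.CUSTOMER_NAME)
            }
        }
    }
}
```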

Related

Teamcity Shared Library and Stash/Unstash Like Jenkins

I am currently a Jenkins user and am exploring TeamCity.
In Jenkins we have the concept of shared libraries, which extract generic Groovy code so it can be reused across different Jenkins pipelines, avoiding re-writing the same functionality in each Jenkinsfile. This follows DRY (don't repeat yourself), hides implementation complexity, and keeps pipelines short and easier to understand.
Example:
There could be a repository having all the Groovy functions like:
Repo: http://github.com/DEVOPS/Utilities.git (repo Utilities)
Sample Groovy script ==>> GitUtils.groovy with the below functions:
public void setGitConfig(String userName, String email) {
    sh "git config --global user.name ${userName}"
    sh "git config --global user.email ${email}"
}

public void gitPush(String branchName) {
    sh "git push origin ${branchName}"
}
In the Jenkinsfile we can just call these functions like below (of course we need to configure Jenkins so it knows the repo URL of the shared library, and give it a name):
Pipeline
//name of shared library given in jenkins
@Library('utilities') _
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                // log.info 'Starting'
                script {
                    def gitUtils = new GitUtils()
                    gitUtils.setGitConfig("Ray", "Ray@rayban.com")
                }
            }
        }
    }
}
And that's it: anyone wanting the same functionality just has to include the library in their Jenkinsfile and use it in the pipeline.
Questions:
Can we migrate the same setup over to TeamCity, and if yes, how can it be done? We do not want to spend a lot of time re-writing it.
Jenkins also supports stashing and unstashing the workspace between stages; is a similar concept present in TeamCity?
Example:
pipeline {
    agent any
    stages {
        stage('Git checkout') {
            steps {
                stash includes: '/root/hello-world/*', name: 'mysrc'
            }
        }
        stage('maven build') {
            agent { label 'slave-1' }
            steps {
                unstash 'mysrc'
                sh label: '', script: 'mvn clean package'
            }
        }
    }
}
As for reusing common TeamCity Kotlin DSL libraries, this can be done via Maven dependencies. For that you have to mention the library in the pom.xml file within your DSL code. You can also consider using JitPack if your DSL library code is hosted on GitHub, for example, and you do not want to handle building it separately and publishing its Maven artifacts.
Although with a migration from Jenkins to TeamCity you will most likely have to rewrite the common library (if you still need one at all), as the TeamCity project model and DSL are quite different from what you have in Jenkins.
Speaking of stashing/unstashing workspaces, this may be covered either by artifact rules and artifact dependencies (as described here: https://www.jetbrains.com/help/teamcity/artifact-dependencies.html) or by repository clone mirroring on agents.

Is it Possible to Run Jenkinsfile from Jenkinsfile

Currently we are developing a centralized control system for our CI/CD projects. There are many projects with many branches, so we are using a multibranch pipeline (this forces us to use the Jenkinsfile from the project branches, so we can't provide a custom Jenkinsfile like in Pipeline projects). We want to control everything under one Git repo where for every project there would be Kubernetes YAMLs, a Dockerfile and a Jenkinsfile. When a developer presses the build button, the Jenkinsfile from their project repo is supposed to run our Jenkinsfile. Is it possible to do this?
E.g. :
pipeline {
    agent any
    stages {
        stage('Retrieve Jenkinsfile From Repo') { // RETRIEVE JENKINSFILE FROM REPO
            steps {
                git branch: "master",
                    credentialsId: 'gitlab_credentials',
                    url: "jenkinsfile_repo"
                script {
                    // RUN JENKINSFILE FROM THE REPO
                }
            }
        }
    }
}
The main reason we are doing this is that there is sensitive content in the Jenkinsfile, like production database connections. We don't want to store the Jenkinsfile in the developers' repos. You can also suggest a better way to achieve this besides using only one repo.
EDIT: https://plugins.jenkins.io/remote-file/
This plugin solved all my problems. I did not get to try the suggestions below.
As an option you can use the build pipeline step.
pipeline {
    agent any
    stages {
        stage('build another job') {
            steps {
                build 'second_job_name_here'
            }
        }
    }
}
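If the downstream job is parameterized, the build step can also pass values down. A hedged sketch (the job and parameter names are illustrative):

```groovy
pipeline {
    agent any
    stages {
        stage('build another job') {
            steps {
                // trigger a parameterized downstream job and wait for it
                build job: 'second_job_name_here',
                      parameters: [string(name: 'CUSTOMER_NAME', value: 'customer_1')]
            }
        }
    }
}
```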
Try the load step:
script {
    // rename Jenkinsfile to .groovy
    sh 'mv Jenkinsfile Jenkinsfile.groovy'
    // RUN JENKINSFILE FROM THE REPO
    load 'Jenkinsfile.groovy'
}

Jenkins declarative pipeline with Docker/Dockerfile agent from SCM

With Jenkins using the Declarative Pipeline syntax, how do I get the Dockerfile (Dockerfile.ci in this example) from the SCM (Git), since the agent block is executed before all the stages?
pipeline {
    agent {
        dockerfile {
            filename 'Dockerfile.ci'
        }
    }
    stages {
        stage('Checkout') {
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        [...]
    }
}
In all the examples I've seen, the Dockerfile seems to be already present in the workspace.
You could try declaring an agent for each stage separately: for the checkout stage you could use some default agent, and the Docker agent for the others.
pipeline {
    agent none
    stages {
        stage('Checkout') {
            agent any
            steps {
                git(
                    url: 'https://www.github.com/...',
                    credentialsId: 'CREDENTIALS',
                    branch: "develop"
                )
            }
        }
        stage('Build') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                }
            }
            steps {
                [...]
            }
        }
    }
}
If you're using a multi-branch pipeline it automatically checks out your SCM before evaluating the agent. So in that case you can specify the agent from a file in the SCM.
The answer is in the Jenkins documentation on the Dockerfile parameter:
In order to use this option, the Jenkinsfile must be loaded from
either a Multibranch Pipeline or a Pipeline from SCM.
Just scroll down to the Dockerfile section, and it's documented there.
The obvious problem with this approach is that it impairs pipeline development: now, instead of testing code in a pipeline field on the server, it must be committed to the source repository for each testable change. Note also that the Jenkinsfile checkout cannot be sparse or lightweight, as that will only pick up the script and not any accompanying Dockerfile to be built.
I can think of a couple of ways to work around this.
Develop against agents in nodes with the reuseNode true directive. Then, when the code is stable, the separate agent blocks can be combined at the top of the Jenkinsfile, which must then be loaded from the SCM.
Develop using the dir() solution that specifies the exact workspace directory, or alternatively use one of the other examples in this solution.
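A rough sketch of the first workaround: a per-stage Docker agent with reuseNode true runs on the same node and workspace as the earlier checkout, so the fetched Dockerfile.ci is visible when the agent is built (the repository URL and build command are placeholders):

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout') {
            steps {
                git url: 'https://www.github.com/...', branch: 'develop'
            }
        }
        stage('Build') {
            agent {
                dockerfile {
                    filename 'Dockerfile.ci'
                    reuseNode true   // reuse the workspace that already holds Dockerfile.ci
                }
            }
            steps {
                sh 'make build'   // illustrative build command
            }
        }
    }
}
```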

How to graphically visualize/tag build branch in Jenkins?

I am building a Jenkins build pipeline and I was wondering if it is possible to somehow tag/visualize the build branch in Jenkins, in a similar way as is automatically possible in TeamCity.
I am using declarative pipeline defined in separate git repository and Jenkins 2.46.3.
From the picture it is not obvious that the last two builds were executed on a separate branch:
Thanks
You can modify the current build's display name and description using the following code:
currentBuild.displayName = env.BRANCH_NAME
currentBuild.description = 'Final Release'
This was recently highlighted in the BlueOcean 1.1 announcement, which shows both of them, in contrast to the regular interface, which only shows the displayName.
An example of a modified displayName from our public instance looks as follows:
You can find the code which generates this in our shared library here and here, essentially it is:
currentBuild.displayName = "#${currentBuild.getNumber()} - ${newVersion} (${increment})"
As you are mentioning Declarative Pipelines, let me add that you have to wrap this code in a script block, of course. So probably (untested):
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Hello World'
                script {
                    currentBuild.displayName = env.BRANCH_NAME
                }
            }
        }
    }
}
Alternatively, you can extract it into a separate function.
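An untested sketch of such a helper, defined below the pipeline block in the same Jenkinsfile (the name format is just an example):

```groovy
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                script { setDisplayName() }
            }
        }
    }
}

// illustrative helper; adjust the format to taste
def setDisplayName() {
    currentBuild.displayName = "#${currentBuild.number} - ${env.BRANCH_NAME}"
}
```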

Set build parameter values at runtime with the Pipeline

I'm using a private GitHub repo with a Jenkinsfile to build a project. I'd actually like to do two separate builds: one for develop that builds whenever a branch is pushed, and one for qa that builds nightly. I've set up a GitHub Organization job, as this seems to be the only way to use credentials to check out the repository and perform the build.
My Jenkinsfile looks like:
node {
    stage('Preparation') {
        properties([[$class: 'ParametersDefinitionProperty',
            parameterDefinitions: [
                [$class: 'StringParameterDefinition', name: 'build_url'],
                [$class: 'StringParameterDefinition', name: 'build_url2'],
            ]
        ]])
        checkout scm
    }
    stage('Build') {
        dir('Vecna_iDeliver_Torso') {
            sh 'npm install'
            sh 'node_modules/.bin/gulp build'
        }
    }
    stage('Upload') {
        sh 'aws s3 sync dist s3://app-dev'
    }
    stage('Cleanup') {
        deleteDir()
    }
}
This all works great, but I need to be able to set the environment variables (the build URLs) when running gulp, and their values will depend on the environment I want to build for. The S3 bucket I want to upload to will also depend on the environment.
When I set the properties above and then find the build job under my GitHub organization, I can see that it accepts the build parameters. However, there doesn't seem to be any way for me to set these externally; I can only use them with "Build with Parameters." This would be fine if I wanted to run the build manually each time, but I want it to run nightly. Since the two environments require different build parameter values, I can't set them as defaults.
Is there any way for me to set build parameter values ahead of time using the Jenkins pipeline?
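One pattern worth noting (a sketch, not a verified answer to this exact setup): parameters declared in the Jenkinsfile fall back to their defaultValue when the build is triggered automatically, so per-environment defaults can be combined with a cron trigger for the nightly build. The URLs and bucket below are placeholders:

```groovy
pipeline {
    agent any
    parameters {
        string(name: 'build_url', defaultValue: 'https://dev.example.com')   // placeholder
        string(name: 'build_url2', defaultValue: 'https://dev2.example.com') // placeholder
    }
    triggers {
        cron('H 2 * * *')   // nightly run uses the declared defaults
    }
    stages {
        stage('Build') {
            steps {
                // expose the parameters to gulp as environment variables
                sh "BUILD_URL1=${params.build_url} BUILD_URL2=${params.build_url2} node_modules/.bin/gulp build"
            }
        }
    }
}
```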
