Jenkins pipeline configuration in the Pipeline configuration section

I'm trying to set up a "generic" build system, using Docker with Jenkins to build and run tests through a pipeline.
I use a wrapper script (pulled from the repo) that contains most of the stuff Docker needs. The only thing that changes is the tag for the images.
How can I define this tag in the build configuration, as an environment variable or similar, so it can then be passed to the actual pipeline script?
Simplified script:
pipeline {
    agent any
    stages {
        stage("Build test image") {
            steps {
                script {
                    dockerImage = docker.build("...", "--build-arg MYBRANCH=${SOMEVAR}")
                }
            }
        }
    }
}
So how can I set SOMEVAR per build configuration?
I could have a custom Jenkinsfile per branch, but that would eventually become a maintenance nightmare (I already have 7 branches to build).

It can be defined statically in the environment block or dynamically in the parameters block. In the case of parameters, you provide the values when running the build through the UI or the API.
pipeline {
    agent any
    environment {
        SOMEVAR = "123"
    }
    parameters {
        choice(name: 'CHOICE_VAR', choices: ['1', '2', '3'], description: 'Type...')
        string(name: 'STRING_VAR', defaultValue: '', description: 'Type...')
    }
    stages {
        stage("Build test image") {
            steps {
                script {
                    dockerImage = docker.build("...", "--build-arg MYBRANCH=${env.SOMEVAR}")
                    dockerImage = docker.build("...", "--build-arg MYBRANCH=${params.CHOICE_VAR}")
                    dockerImage = docker.build("...", "--build-arg MYBRANCH=${params.STRING_VAR}")
                }
            }
        }
    }
}
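When the build is started from the UI, Jenkins prompts for these parameter values; they can also be passed programmatically, for example when triggered from another pipeline via the build step. A minimal sketch (the job name here is a placeholder):
// hypothetical upstream trigger passing the parameters defined above
build job: 'my-generic-build', parameters: [
    string(name: 'CHOICE_VAR', value: '2'),
    string(name: 'STRING_VAR', value: 'feature-branch')
]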

Related

Passing Jenkins env variables between stages on different agents

I've looked at "Pass Artifact or String to upstream job in Jenkins Pipeline", "Pass variables between Jenkins stages", and "How do I pass variables between stages in a declarative Jenkins pipeline?", but none of these questions seems to deal with my specific problem.
Basically I have a pipeline consisting of multiple stages, each run in its own agent.
In the first stage I run a shell script. Here two variables are generated. I would like to use these variables in the next stage. The methods I've seen so far seem to only work when passing variables within the same agent.
pipeline {
    agent none
    stages {
        stage("stage 1") {
            agent {
                docker {
                    image 'my_image:latest'
                }
            }
            steps {
                sh """
                    export VAR1=foo
                    export VAR2=bar
                """
            }
        }
        stage("stage 2") {
            agent {
                docker {
                    image 'my_other_image:latest'
                }
            }
            steps {
                sh 'echo "$VAR1 $VAR2"'
                // expecting to see "foo bar" printed here
            }
        }
    }
}
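The exports above disappear as soon as the sh step exits, and environment changes never cross agents. One common workaround is to write the values to a file, stash it, and unstash it on the next agent; a minimal sketch of the two steps sections, assuming the surrounding stages stay as above:
// stage 1: persist the values to a file and stash it
steps {
    sh 'echo "foo bar" > vars.txt'
    stash name: 'vars', includes: 'vars.txt'
}
// stage 2: unstash on the new agent and read the values back
steps {
    unstash 'vars'
    script {
        def parts = readFile('vars.txt').trim().split(' ')
        echo "${parts[0]} ${parts[1]}" // prints "foo bar"
    }
}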

How to share environment variable value across different Jenkins Pipelines?

I have two Jenkins Pipelines :
Pipeline A : In a stage, I defined an environment variable called MAVEN_PROFILE (the user can choose a value from a list)
Pipeline B : I need to get the MAVEN_PROFILE environment variable value that was set in Pipeline A
I need two pipelines because I can't do it in a single pipeline for process reasons.
I saw there were some answers on how to share a variable between stages within a single pipeline, but that's not my case.
I want to share an environment variable's value between different pipelines.
Pipeline A
pipeline {
    agent any
    ...
    stages {
        stage('Profile Selection') {
            steps {
                script {
                    env.MAVEN_PROFILE = input message: 'Choose the profile :',
                        parameters: [choice(name: 'MAVEN_PROFILE',
                            choices: 'all\nserver\nclient', description: 'Profiles')]
                }
            }
        }
        stage(...) {
            steps {
                script {
                    bat "mvn deploy -P ${env.MAVEN_PROFILE}"
                }
            }
        }
        ... other stages
    }
}
Pipeline B
pipeline {
    agent any
    ...
    stages {
        ... other stages
        stage(...) {
            steps {
                script {
                    bat "mvn release ... -P ${env.environmentVariableValueFromPipelineA}"
                }
            }
        }
    }
}
They're not running in the same environment, so they can't directly share environment variables. The easiest approach is probably to write the values to a file in the workspace in pipeline A and read them back in pipeline B. Something like this:
Pipeline A:
sh "echo ${MAVEN_PROFILE} > .MAVEN_PROFILE"
Pipeline B:
def MAVEN_PROFILE = sh(script: 'cat .MAVEN_PROFILE', returnStdout: true).trim()
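Note that this relies on both pipelines seeing the same workspace, which is not guaranteed across nodes. A more robust variant, assuming the Copy Artifact plugin is installed, archives the file in pipeline A and copies it into pipeline B's workspace ('pipeline-A' is a placeholder for the real job name):
// Pipeline A: publish the file as a build artifact
sh "echo ${MAVEN_PROFILE} > .MAVEN_PROFILE"
archiveArtifacts artifacts: '.MAVEN_PROFILE'

// Pipeline B: fetch it from the last successful run of A and read it
copyArtifacts projectName: 'pipeline-A', selector: lastSuccessful()
def MAVEN_PROFILE = readFile('.MAVEN_PROFILE').trim()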

How to build docker images using a Declarative Jenkinsfile

I'm new to using Jenkins....
I'm trying to automate the production of an image (to be stashed in a repo) using a declarative Jenkinsfile. I find the documentation to be confusing (at best). Simply put, how can I convert the following scripted example (from the docs)
node {
    checkout scm
    def customImage = docker.build("my-image:${env.BUILD_ID}")
    customImage.push()
}
to a declarative Jenkinsfile?
You can use scripted pipeline blocks in a declarative pipeline as a workaround:
pipeline {
    agent any
    stages {
        stage('Build image') {
            steps {
                echo 'Starting to build docker image'
                script {
                    def customImage = docker.build("my-image:${env.BUILD_ID}")
                    customImage.push()
                }
            }
        }
    }
}
I'm using the following approach:
steps {
    withDockerRegistry([ credentialsId: "<CREDENTIALS_ID>", url: "<PRIVATE_REGISTRY_URL>" ]) {
        // the following commands are executed while logged in to the docker registry
        sh 'docker push <image>'
    }
}
Where:
CREDENTIALS_ID stands for the key in Jenkins under which you store the credentials to your Docker registry.
PRIVATE_REGISTRY_URL stands for the URL of your private Docker registry. If you are using Docker Hub, it should be empty.
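Putting that together with the build, a complete stage might look like the following sketch (credentials ID and image name are placeholders):
stage('Build and push') {
    steps {
        withDockerRegistry([ credentialsId: "docker-hub-creds", url: "" ]) {
            // build and push while the registry login is active
            sh 'docker build -t my-image:latest .'
            sh 'docker push my-image:latest'
        }
    }
}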
I cannot recommend the declarative syntax for building a Docker image because it seems that every important step requires falling back to the old scripted syntax. But if you must, a hybrid approach seems to work.
First, a detail about the scm step: when I defined the Jenkins "Pipeline script from SCM" project that fetches my Jenkinsfile with a declarative pipeline from git, Jenkins cloned the repo as the first step in the pipeline even though I did not define an scm step.
For the build and push steps, I can only find solutions that are a hybrid of old-style scripted pipeline steps inside the new-style declarative syntax. For example, see gustavoapolinario's work at Medium:
https://medium.com/@gustavo.guss/jenkins-building-docker-image-and-sending-to-registry-64b84ea45ee9
which has this hybrid pipeline definition:
pipeline {
    environment {
        registry = "gustavoapolinario/docker-test"
        registryCredential = 'dockerhub'
        dockerImage = ''
    }
    agent any
    stages {
        stage('Cloning Git') {
            steps {
                git 'https://github.com/gustavoapolinario/microservices-node-example-todo-frontend.git'
            }
        }
        stage('Building image') {
            steps {
                script {
                    dockerImage = docker.build registry + ":$BUILD_NUMBER"
                }
            }
        }
        stage('Deploy Image') {
            steps {
                script {
                    docker.withRegistry( '', registryCredential ) {
                        dockerImage.push()
                    }
                }
            }
        }
        stage('Remove Unused docker image') {
            steps {
                sh "docker rmi $registry:$BUILD_NUMBER"
            }
        }
    }
}
Because the first step here is a clone, I think he built this example as a standalone pipeline project in Jenkins (not a Pipeline script from SCM project).
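If the pipeline is run as a "Pipeline script from SCM" project instead, the hard-coded git step can be replaced with the generic checkout step, which reuses the SCM configuration of the Jenkins project itself:
stage('Cloning Git') {
    steps {
        // reuse the repo configured in the Jenkins project
        checkout scm
    }
}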

Initialize Jenkins with Pipeline Job

I am building Jenkins with a Dockerfile, and during the Docker build I would like to have Jenkins pre-configured with a set of jobs. I find this works well with Job DSL, where jobs are seeded, but I have yet to preconfigure a "Pipeline" job. Given the direction of Jenkins and the use of Jenkinsfile, Pipeline, etc., I think there must be some way to have Jenkins automatically run with a set of jobs that were built using the Pipeline approach.
Example Pipeline:
pipeline {
    agent {
        label 'cft'
    }
    parameters {
        string(name: 'StackName', defaultValue: 'cft-stack', description: 'The name to give the CFT stack.')
        string(name: 'KeyName', defaultValue: 'ACCOUNT', description: 'The account key to use for encryption.')
        string(name: 'VpcId', defaultValue: 'vpc-1234', description: 'The VPC to assign to the cluster resources.')
        string(name: 'SubnetID', defaultValue: 'subnet-1234, subnet-6789', description: 'The subnet(s) to assign to the cluster resources.')
    }
    stages {
        stage('Build') {
            steps {
                s3Download(file: 'cft.yaml',
                    bucket: 'cft-resources',
                    path: 'cft.yaml',
                    force: true)
                cfnUpdate(stack: "${params.StackName}",
                    file: "cft.yaml",
                    params: [
                        "SnapshotId=${params.SnapshotId}",
                        "KeyName=${params.KeyName}",
                        "VpcId=${params.VpcId}"
                    ],
                    timeoutInMinutes: 20
                )
            }
        }
    }
    post {
        failure {
            echo 'FAILURE'
            cfnDelete(stack: "${params.StackName}")
        }
    }
}
Dockerfile:
COPY ./groovy/*.groovy /usr/share/jenkins/ref/init.groovy.d/
Pipeline Groovy files differ from the Groovy code that can be executed to configure Jenkins. You can't add pipelines the way you're trying to.
Your options include
copy the XML file for the job definition (pointing to your repo, as the pipeline should be in the Jenkinsfile in the repo)
create a job using Groovy and configure it (not really practicable IMHO)
use Job DSL (again, with XML as a starting point) to specify your Jenkins jobs. An example of adding this automatically can be found in tknerr/jenkins-pipes-infra.
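For the Job DSL route, a seed script can also declare a pipeline job directly, without going through XML; a minimal sketch (repository URL and job name are placeholders):
// Job DSL seed script: creates a pipeline job that reads its
// Jenkinsfile from the given repository
pipelineJob('cft-pipeline') {
    definition {
        cpsScm {
            scm {
                git {
                    remote { url('https://example.com/org/repo.git') }
                    branch('master')
                }
            }
            scriptPath('Jenkinsfile')
        }
    }
}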

How to set Jenkins environment variables in run-time

I want to set some Jenkins environment variables at run time, based on my computation. How can I set them at run time in my Jenkinsfile's steps section?
For example: based on my calculation I get abc=1. How can I set this at run time in my Jenkinsfile's steps section so that I can use it later by calling $abc?
I am declaring my pipeline and environment variables as explained here:
https://jenkins.io/doc/pipeline/tour/environment/
I'm using Jenkins ver. 2.41
Here is an example of how to set a variable and use it later in the same Jenkinsfile.
The variable versionToDeploy will be used by the build job step.
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'build the artifacts'
                script {
                    versionToDeploy = '2.3.0'
                }
            }
        }
    }
    post {
        success {
            echo 'start deploy job'
            build job: 'pipeline-declarative-multi-job-deploy', parameters: [[$class: 'StringParameterValue', name: 'version', value: versionToDeploy]]
        }
    }
}
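If the value must be exposed as a real environment variable, so that a later sh step can read $abc, it can also be assigned to env at run time inside a script block; a minimal sketch:
stage('Compute') {
    steps {
        script {
            // computed at run time; visible to later stages as $abc
            env.abc = sh(script: 'echo 1', returnStdout: true).trim()
        }
    }
}
stage('Use') {
    steps {
        sh 'echo $abc' // prints 1
    }
}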