How to read from a config file in a Jenkins pipeline BEFORE stages

I have a config file, which is a JSON file, and I want to be able to read it before any steps run, as it provides variables the steps need. However, I don't know where to put the call. I tried creating a separate node block before the pipeline to contain the Config File Provider call, to no avail, and I also tried setting up a script block in stages, in a stage, and as a post action.

Here is a simple example that I tested on my Jenkins instance:
def config
node() {
    // read the managed config file before the declarative pipeline starts
    configFileProvider([configFile(fileId: '<your config file id>', targetLocation: 'myConfig')]) {
        config = readJSON file: 'myConfig'
    }
}
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo config.myKey // or config['myKey']
            }
        }
    }
}
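If you'd rather not copy the file into the workspace, the same plugin also lets you expose the provided file's temporary path through a variable instead of a target location; a minimal sketch, where myConfigId is a placeholder for your file ID:

def config
node() {
    // 'variable' exposes the temp file's path as an environment variable
    configFileProvider([configFile(fileId: 'myConfigId', variable: 'CONFIG_PATH')]) {
        config = readJSON file: env.CONFIG_PATH
    }
}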

Related

How to use file parameter in Jenkins declarative pipeline?

I am currently using the following code to upload a file in a Jenkins declarative pipeline and read its content. But the file is not stored in the Jenkins workspace or anywhere else, so whenever I run the pipeline it shows a file-not-found error.
I tried other approaches available on the Internet but got no output. Can anyone suggest a proper way to upload a file to Jenkins and read the data from it?
pipeline {
    agent any
    parameters {
        file(name: 'yamlFile', description: 'Upload file test')
    }
    stages {
        stage('Checkout demo repo') {
            steps {
                script {
                    echo "${WORKSPACE}"
                    def configVal = readYaml file: yamlFile
                }
            }
        }
    }
}
I never had luck using the default File Input with a declarative Pipeline. Instead, I used the File Parameters plugin. Here is an example.
pipeline {
    agent any
    parameters {
        base64File 'yamlFile'
    }
    stages {
        stage('Example') {
            steps {
                // withFileParameter writes the upload to a temp file and
                // exposes its path in the 'yamlFile' environment variable
                withFileParameter('yamlFile') {
                    script {
                        def configVal = readYaml file: env.yamlFile
                    }
                }
            }
        }
    }
}
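As I understand the plugin, a base64File parameter is also exposed as an environment variable holding the base64-encoded content, so you can decode it without the wrapper; a sketch under that assumption:

script {
    // outside withFileParameter, 'yamlFile' holds the upload as a base64 string
    def configVal = readYaml text: new String(env.yamlFile.decodeBase64())
}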

Multiple Jenkinsfiles, One Agent Label

I have a project which has multiple build pipelines to allow for different types of builds against it (no, I don't have the ability to make one build out of it; that is outside my control).
Each of these pipelines is represented by a Jenkinsfile in the project repo, and each one must use the same build agent label (they need to share other pieces of configuration as well, but it's the build agent label which is the current problem). I'm trying to put the label into some sort of a configuration file in the project repo, so that all the Jenkinsfiles can read it.
I expected this to be simple, as you don't need this config data until you have already checked out a copy of the sources to read the Jenkinsfile. As far as I can tell, it is impossible.
It seems to me that a Jenkinsfile cannot read files from SCM until the project has done its SCM step. However, that's too late: the argument to agent{label} is read before any stages get run.
Here's a minimal case:
final def config
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                checkout scm // we don't need all the submodules here
                echo "Reading configuration JSON"
                script { config = readJSON file: 'buildjobs/buildjob-config.json' }
                echo "Read configuration JSON"
            }
        }
        stage('Build and Deploy') {
            agent {
                label config.agent_label
            }
            steps {
                echo 'Got into Stage 2'
            }
        }
    }
}
When I run this, I get:
java.lang.NullPointerException: Cannot get property 'agent_label' on null object. I don't get either of the echoes from the 'Configure' stage.
If I change the label for the 'Build and Deploy' stage to 'master', the build succeeds and prints out all three echo statements.
Is there any way to read a file from the Git workspace before the agent labels need to be set?
Please see https://stackoverflow.com/a/52807254/7983309. I think you are running into this issue: label is unable to resolve config.agent_label to its updated value, so whatever is set on the first line is what gets passed to your second stage.
EDIT1:
env.agentName = ''
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                script {
                    env.agentName = 'slave'
                    echo env.agentName
                }
            }
        }
        stage('Finish') {
            steps {
                // a scripted node step accepts a dynamically computed label
                node(env.agentName as String) { println env.agentName }
                script {
                    echo env.agentName
                }
            }
        }
    }
}
Source - In a declarative jenkins pipeline - can I set the agent label dynamically?
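Combining that with the approach from the first question above, you can compute the label in a scripted block before the pipeline block, so it is already resolved by the time agent { label } is evaluated. A sketch, reusing the question's config file path:

def agentLabel
node('master') {
    checkout scm
    // resolve the label from the repo before the declarative block runs
    def config = readJSON file: 'buildjobs/buildjob-config.json'
    agentLabel = config.agent_label
}
pipeline {
    agent { label agentLabel }
    stages {
        stage('Build and Deploy') {
            steps {
                echo "Running on ${agentLabel}"
            }
        }
    }
}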

External workspace manager plugin with declarative pipeline

I want to use the mentioned plugin with a declarative pipeline; to be precise, I want to convert the following documentation example to a declarative pipeline:
The pipeline code in the upstream job is the following:
stage('Stage 1. Allocate workspace in the upstream job')
def extWorkspace = exwsAllocate 'diskpool1'
node('linux') {
    exws(extWorkspace) {
        stage('Stage 2. Build in the upstream job')
        git url: 'https://github.com/alexsomai/dummy-hello-world.git'
        def mvnHome = tool 'M3'
        sh "${mvnHome}/bin/mvn clean install -DskipTests"
    }
}
And the downstream's Pipeline code is:
stage('Stage 3. Select the upstream run')
def run = selectRun 'upstream'
stage('Stage 4. Allocate workspace in the downstream job')
def extWorkspace = exwsAllocate selectedRun: run
node('test') {
    exws(extWorkspace) {
        stage('Stage 5. Run tests in the downstream job')
        def mvnHome = tool 'M3'
        sh "${mvnHome}/bin/mvn test"
    }
}
Thanks!
I searched everywhere for a clear answer to this, yet never found a definitive one. So, I pulled the External Workspace Plugin code and read it. The answer is simple as long as the plugin's model doesn't change.
Anytunc's answer is very close, but the issue is getting the path from the External Workspace Plugin and getting it into the customWorkspace configuration.
What I ended up doing was creating a method:
def getExternalWorkspace() {
    def extWorkspace = exwsAllocate diskPoolId: "jenkins"
    return extWorkspace.getCompleteWorkspacePath()
}
and setting my agent to:
agent {
    node {
        label 'Linux'
        customWorkspace getExternalWorkspace()
    }
}
If you'd rather not set the entire pipeline to that path, you could create as many external workspaces as you want, then use
...
steps {
    dir(getExternalWorkspace()) {
        // do fancy stuff
        ...
    }
}
...
You can use this agent directive:
agent {
    node {
        label 'my-defined-label'
        customWorkspace '/some/other/path'
    }
}
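For completeness, here is how the two snippets above fit together in one declarative pipeline; a sketch, assuming a disk pool named jenkins and a node labelled Linux as in the answer:

def getExternalWorkspace() {
    // allocate a directory from the External Workspace Manager disk pool
    def extWorkspace = exwsAllocate diskPoolId: 'jenkins'
    return extWorkspace.getCompleteWorkspacePath()
}
pipeline {
    agent {
        node {
            label 'Linux'
            customWorkspace getExternalWorkspace()
        }
    }
    stages {
        stage('Build') {
            steps {
                sh 'pwd' // runs inside the allocated external workspace
            }
        }
    }
}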

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them at one place, and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are Libraries, but it does not seem like I could include the stage definition using them.
You will have to create a shared library to implement what I am about to suggest. For the shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (kind of a template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with a vars directory in it, you just have to create a Groovy file (let's say, commonPipeline.groovy) inside that vars directory.
Here's an example that works; I have used it in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you probably already have a Jenkinsfile in each Jenkins project (if not, create one), you just have to replace the existing content of that file with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` and set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [:]
params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy step, so each bulky Jenkinsfile gets reduced to just five or six lines of code, and those few lines are common across all the projects. Also note the jenkins_var entry: the name is arbitrary and the value isn't actually used here; an argument has to be passed only because call(Map config) declares a Map parameter, so an empty map would work as well.
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
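If you do want to use the map inside the library, its entries arrive through the config parameter of call; a minimal sketch (the greeting key is hypothetical):

// shared-lib/vars/commonPipeline.groovy
def call(Map config) {
    pipeline {
        agent any
        stages {
            stage('Greet') {
                steps {
                    // entries passed from the Jenkinsfile are read off 'config'
                    echo "Job says: ${config.greeting}"
                }
            }
        }
    }
}

The Jenkinsfile would then call it as: commonPipeline greeting: 'hello'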

How to set PATH in Jenkins Declarative Pipeline

In a Jenkins scripted pipeline you can set the PATH environment variable like this:
node {
    git url: 'https://github.com/jglick/simple-maven-project-with-tests.git'
    withEnv(["PATH+MAVEN=${tool 'M3'}/bin"]) {
        sh 'mvn -B verify'
    }
}
Notice the PATH+MAVEN syntax, as explained at https://jenkins.io/doc/pipeline/steps/workflow-basic-steps/#code-withenv-code-set-environment-variables:
A list of environment variables to set, each in the form VARIABLE=value, or VARIABLE= to unset variables otherwise defined. You may also use the syntax PATH+WHATEVER=/something to prepend /something to $PATH.
But I didn't find how to do it in declarative pipeline using environment syntax (as explained here : https://jenkins.io/doc/pipeline/tour/environment).
environment {
    DISABLE_AUTH = 'true'
    DB_ENGINE = 'sqlite'
}
Ideally I would like to update the PATH to use custom tools for all my stages.
It is possible with the environment section:
pipeline {
    agent { label 'docker' }
    environment {
        PATH = "/hot/new/bin:${env.PATH}"
    }
    stages {
        stage('build') {
            steps {
                echo "PATH is: ${env.PATH}"
            }
        }
    }
}
See this answer for info.
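Note that the PATH+WHATEVER prepend syntax also works inside declarative steps via withEnv, if you only need the tool for part of a stage; a sketch, assuming a Maven installation named M3:

pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // prepend the M3 tool's bin directory to PATH for this block only
                withEnv(["PATH+MAVEN=${tool 'M3'}/bin"]) {
                    sh 'mvn -B verify'
                }
            }
        }
    }
}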
As a workaround, you can define an environment variable and use it in the sh step:
pipeline {
    agent any
    environment {
        MAVEN_HOME = tool('M3')
    }
    stages {
        stage('Maven') {
            steps {
                // single-quoted, so the shell expands $MAVEN_HOME at run time
                sh '${MAVEN_HOME}/bin/mvn -B verify'
            }
        }
    }
}
Check the following link; it explains how to configure your tools.
With a declarative pipeline things become a bit different, but overall it is easier to understand.
declarative-maven-project
Using the tool section in a pipeline is only allowed for pre-installed global tools. Some tools are provided by plugins, but if no plugin exists for yours, I'm afraid you cannot set up the environment via the pipeline tool declaration.
I hope to be wrong!
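For the pre-installed case, the declarative tools directive is the cleanest way to get a tool onto PATH for all stages; a sketch, assuming a Maven installation named M3 in Global Tool Configuration:

pipeline {
    agent any
    tools {
        // 'M3' must match an installation name in Global Tool Configuration
        maven 'M3'
    }
    stages {
        stage('Build') {
            steps {
                sh 'mvn -B verify' // mvn resolves from the M3 tool on PATH
            }
        }
    }
}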
