Is there a way to use the Jenkins "Execute system groovy script" step from a pipeline file that is committed to SCM?
If yes, how would I access the predefined variables (like build) in it?
If no, could I replicate the functionality otherwise, for example using the Shared Library Plugin?
Thanks!
You can put groovy code in a pipeline in an (always source-controlled) Jenkinsfile, like this:
pipeline {
    agent { label 'docker' }
    stages {
        stage ('build') {
            steps {
                script {
                    // I used a script block because you can jam arbitrary groovy in here
                    // without being constrained by the declarative Jenkinsfile DSL
                    def awesomeVar = 'so_true'
                    print "look at this: ${awesomeVar}"
                    // accessing a predefined variable:
                    echo "currentBuild.number: ${currentBuild.number}"
                }
            }
        }
    }
}
Produces console log:
[Pipeline] echo
look at this: so_true
[Pipeline] echo
currentBuild.number: 84
Click on the "Pipeline Syntax" link in the left navigation of any pipeline job to get a bunch of examples of things you can access in the "Global Variables Reference."
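The same script block can read other predefined globals documented in that reference. A quick sketch (env.JOB_NAME, env.BUILD_URL, and currentBuild.currentResult are standard pipeline globals):
script {
    echo "env.JOB_NAME: ${env.JOB_NAME}"
    echo "env.BUILD_URL: ${env.BUILD_URL}"
    echo "currentBuild.currentResult: ${currentBuild.currentResult}"
}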
I have a question about the issue below; can someone please help me with it?
I want to pass a Maven pom.xml property from the shell in a Jenkins pipeline, and it needs to be substituted by Maven, not by groovy or the shell.
Example:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=${db.url} package'
            }
        }
    }
}
Here ${db.url} should be substituted with the url from the Maven settings.xml properties, not by groovy or the shell in the Jenkins pipeline.
I have tried different combinations, but each gives me an error in the Jenkins pipeline.
If the above Maven property is a constant (some constant url) then it is easy to pass, but when I want to pass a variable property (${db.url}) I am not able to do so with any syntax.
If you want Maven to evaluate ${db.url}, the dollar sign has to be escaped from the shell, like this:
pipeline {
    agent any
    stages {
        stage('build') {
            steps {
                sh 'mvn -Doracle.db.url=\\${db.url} package'
            }
        }
    }
}
Now Jenkins will prepare the Maven command with the literal ${db.url} left in place: the shell sees \${db.url}, strips the backslash, and hands ${db.url} through untouched for Maven to resolve.
If you don't escape it, the shell will try to expand ${db.url} itself and give you a "Bad substitution" error.
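For illustration, a minimal sketch contrasting the two forms (assuming db.url is defined in settings.xml or in the POM's <properties>):
// the shell tries (and fails) to expand ${db.url} -> "Bad substitution"
sh 'mvn -Doracle.db.url=${db.url} package'
// the shell strips the backslash and passes the literal ${db.url} to Maven
sh 'mvn -Doracle.db.url=\\${db.url} package'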
I'm using the Jenkins pipeline. My use case is that the developers use a simple *.ini file that is parsed by a Python script to add or remove stages within the Jenkinsfile whenever they want. I don't want them to edit the Jenkinsfile manually because they won't know how it works.
Expected behaviour is:
When a build is triggered, I would like to first execute a Python script which might write into the Jenkinsfile to add/remove stages according to the *.ini file.
As far as I understand, when an event triggers a Jenkins build, the first thing it does is open the Jenkinsfile. However, I would like to know if it's possible to run some prebuild script before that?
Thanks
Edit: here's a simple view of a run of the pipeline (Blue Ocean UI).
The ini file might, for example, remove the step Building Plan C from the stage Compilation by removing the groovy code doing that in the Jenkinsfile.
Here's an example for reference.
node {
    git url: '', branch: '', credentialsId: ''
    def parseStr = sh(script: 'python parser.py xxx.ini', returnStdout: true).trim()
    // the python parser is expected to return a JSON string like:
    // {"run_stage1": false, "run_stage2": true}
    def parseObj = readJSON text: parseStr
    stage('stage 1') {
        if (parseObj.run_stage1) {
            echo 'stage1'
            ...
        }
    }
    stage('stage 2') {
        if (parseObj.run_stage2) {
            echo 'stage2'
            ....
        }
    }
}
Jenkins pipeline supplies the readJSON, readYaml, and readProperties steps to read JSON, YAML, and properties files.
If you choose one of those formats to replace the ini file, you can drop the Python parser and make your pipeline simpler.
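For example, a sketch of the same pipeline with the Python parser replaced by readYaml (the file name stages.yaml and its keys are assumptions; readYaml comes from the Pipeline Utility Steps plugin):
node {
    // stages.yaml in the repo root, e.g.:
    //   run_stage1: false
    //   run_stage2: true
    def config = readYaml file: 'stages.yaml'
    stage('stage 1') {
        if (config.run_stage1) {
            echo 'stage1'
        }
    }
    stage('stage 2') {
        if (config.run_stage2) {
            echo 'stage2'
        }
    }
}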
Is it possible to take an entire stage('foo') {...} definition and extract it into a shared library within Jenkins? The docs are very clear on how to pull an individual step out, but I can't find any way to take an entire stage, parameterize it, and re-use it globally. I thought perhaps just return stage... would work, but it errors out as an invalid return value.
It depends on whether you use scripted or declarative pipeline.
Scripted pipeline is more flexible: it allows you, for example, to create stages based on conditions (each pipeline run can have a different number and kind of stages). In this kind of pipeline you can extract a full stage to a shared library class and call it from inside the node {} block. Consider the following example:
// src/ScriptedFooStage.groovy
class ScriptedFooStage {
    private final Script script

    ScriptedFooStage(Script script) {
        this.script = script
    }

    // You can pass as many parameters as needed
    void execute(String name, boolean param1) {
        script.stage(name) {
            script.echo "Triggering ${name} stage..."
            script.sh "echo 'Execute your desired bash command here'"
            if (param1) {
                script.sh "echo 'Executing conditional command, because param1 == true'"
            }
        }
    }
}
Then the Jenkinsfile may look like this:
node {
    new ScriptedFooStage(this).execute('Foo', true)
}
As you can see, the whole stage is encapsulated in the ScriptedFooStage.execute() method. The stage's name is even taken from the name parameter - scripted pipeline allows you to do such things.
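Because the stage name is just a method parameter, a scripted Jenkinsfile can even build its stages in a loop. A sketch reusing the same ScriptedFooStage class (the stage names here are made up):
node {
    def foo = new ScriptedFooStage(this)
    for (name in ['Build', 'Test', 'Deploy']) {
        foo.execute(name, name == 'Deploy')
    }
}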
Declarative pipeline, on the other hand, is more strict and opinionated. It is fixed when it comes to the number of stages and their names (you can't model dynamically which stages are present in a given build or what their names are). You can still take advantage of shared library classes, but you are limited to executing them inside a script {} block inside a stage('Name') { steps {} } block. This means you can't extract a whole stage to a separate class, only the part that gets executed at the steps level. Consider the following example:
// src/DeclarativeFooStage.groovy
class DeclarativeFooStage {
    private final Script script

    DeclarativeFooStage(Script script) {
        this.script = script
    }

    // You can pass as many parameters as needed
    void execute(String name, boolean param1) {
        script.echo "Triggering script with name == ${name}"
        script.sh "echo 'Execute your desired bash command here'"
        if (param1) {
            script.sh "echo 'Executing conditional command, because param1 == true'"
        }
    }
}
And the Jenkinsfile may look like this:
// Jenkinsfile
pipeline {
    agent any
    stages {
        stage('Foo') {
            steps {
                script {
                    new DeclarativeFooStage(this).execute('something', false)
                }
            }
        }
    }
}
If we tried to execute new DeclarativeFooStage(this).execute('something', false) outside a script {} block in the declarative pipeline, we would get compilation errors.
Conclusion
The choice between scripted and declarative pipeline depends on the specific use case. If you want the best flexibility when it comes to modeling your pipeline business logic, scripted pipeline might be the better choice. However, it comes at a price: for instance, scripted pipeline does not support restarting a pipeline build from a specific stage - that is supported only by declarative pipeline. (Imagine you have 10 stages in the pipeline and stage 7 failed because of some silly mistake, and you would like to restart the build from the 7th stage - with scripted pipeline you have to re-run from the very beginning, while declarative pipeline can restart from the 7th stage, remembering the results of all 6 previous stages.)
To complete Szymon Stepniak's answer, I will leave a note here that in declarative pipeline you may also share a whole pipeline:
// vars/myDeliveryPipeline.groovy
def call(Map pipelineParams) {
    pipeline {
        agent any
        stages {
            stage('build') {
                ...
            }
            stage('test') {
                ...
            }
            ...
        }
    }
}
And then call it
// Jenkinsfile
myDeliveryPipeline(foo: 'FOO', bar: 'BAR')
But as far as I remember, you may call only one pipeline in a Jenkinsfile, which makes it not very customizable.
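Inside the shared pipeline, the map entries are read off pipelineParams. A sketch (the foo and bar keys simply mirror the call above):
// vars/myDeliveryPipeline.groovy
def call(Map pipelineParams) {
    pipeline {
        agent any
        stages {
            stage('build') {
                steps {
                    echo "building with foo=${pipelineParams.foo} and bar=${pipelineParams.bar}"
                }
            }
        }
    }
}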
Source
https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
I have a Jenkins pipeline job with parameters (name, group, taskNumber).
I need to write a pipeline script which will call a groovy script (this one?: https://github.com/peterjenkins1/jenkins-scripts/blob/master/add-job.groovy)
I want to create a new job (with the name name_group_taskNumber) every time I build the main pipeline job.
I don't understand:
Where do I need to put my groovy script?
What should the pipeline script look like?
node {
    stage('Build') {
        def pipeline = load "CreateJob.groovy"
        pipeline.run()
    }
}
You can use and configure a shared library like this one (a git repo): https://github.com/lvthillo/shared-library . You need to configure it in your Jenkins global configuration.
It contains a vars/ folder where you can manage pipelines and groovy scripts, such as my slackNotifier.groovy. That script is just a groovy script which prints the build result in Slack.
In the Jenkins pipeline job we import our shared library:
@Library('name-of-shared-pipeline-library') _

mavenPipeline {
    //define parameters
}
In the case above the pipeline itself also lives in the shared library, but this isn't necessary.
You can write your pipeline in the job itself and call only a function from the library, like this:
This is the script in the shared library:
// vars/sayHello.groovy
def call(String name = 'human') {
    echo "Hello, ${name}."
}
And in your pipeline:
library 'my-shared-library'
...
stage('stage name') {
    echo "output"
    sayHello('Peter') // global vars from the library's vars/ folder are callable directly
}
...
EDIT:
In new declarative pipelines you can use:
pipeline {
    agent { node { label 'xxx' } }
    options {
        buildDiscarder(logRotator(numToKeepStr: '3', artifactNumToKeepStr: '1'))
    }
    stages {
        stage('test') {
            steps {
                sh 'echo "execute say hello script:"'
                sayHello("Peter")
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}

def sayHello(String name = 'human') {
    echo "Hello, ${name}."
}
output:
[test] Running shell script
+ echo 'execute say hello script:'
execute say hello script:
[Pipeline] echo
Hello, Peter.
[Pipeline] }
[Pipeline] // stage
We do it by using the Jobcopy Builder plugin (https://wiki.jenkins.io/display/JENKINS/Jobcopy+Builder+plugin): add it as another build step in the pipeline script and pass the params that are to be considered.
One stage of my declarative Jenkins pipeline executes a bash script (sh '''./a.sh'''; the script a.sh is maintained outside the pipeline). In that script, the value of jarVersion is written to ${WORKSPACE}/.jarVersion (echo "jarVersion=${jarVersion}" > ${WORKSPACE}/.jarVersion). In a later stage we need to get the value of jarVersion, so we use load "${WORKSPACE}/.jarVersion" and read ${jarVersion}. This works when we do it in a pipeline script.
However, when we put this pipeline into a shared library (in /vars/testSuite.groovy) and call it from another pipeline script, it cannot recognize the variable ${jarVersion}.
Please advise how to solve this issue. The general question is: how do you transfer a value in a script from stage A to stage B?
stage('getJarVersion') {
    steps {
        script {
            load "${WORKSPACE}/.jarVersion"
            currentBuild.description = "jarVersion:${jarVersion}"
        }
    }
}
I expected it to work as it does in pipeline scripts.
But it shows:
groovy.lang.MissingPropertyException: No such property: jarVersion for class: testSuite
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:53)
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.getProperty(ScriptBytecodeAdapter.java:458)
at com.cloudbees.groovy.cps.sandbox.DefaultInvoker.getProperty(DefaultInvoker.java:34)
at com.cloudbees.groovy.cps.impl.PropertyAccessBlock.rawGet(PropertyAccessBlock.java:20)
at testSuite.call(/jenkins/jobs/TestSuite1/builds/11/libs/pipelineUtilities/vars/testSuite.groovy:84)
With the stages in the same groovy file, you have to declare the variable outside the stage blocks and before the node block. Each stage can then set or read it:
def my_var

node {
    stage('stage1') {
        my_var = 'value set in stage A'
    }
    stage('stage2') {
        echo my_var // still visible in stage B
    }
}
If you are defining a stage per file, you have to create closures that take the input object and pass it in the call from the parent groovy file:
test.groovy:
def call(def my_obj, String my_string) {
    stage('my_stage') {
        println(my_obj)
    }
}
parent_test.groovy
test(obj_value,string_value)
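Tying this back to the original jarVersion question, a sketch of a shared-library variable staying in scope across both stages (the file name comes from the question; readProperties, from the Pipeline Utility Steps plugin, is one assumed way to parse the jarVersion=... line):
// vars/testSuite.groovy
def call() {
    def props
    stage('getJarVersion') {
        // .jarVersion contains a line like: jarVersion=1.2.3
        props = readProperties file: "${env.WORKSPACE}/.jarVersion"
        currentBuild.description = "jarVersion:${props.jarVersion}"
    }
    stage('useJarVersion') {
        echo "jarVersion is still visible here: ${props.jarVersion}"
    }
}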