Jenkins scripted pipeline or declarative pipeline

I'm trying to convert my old-style, project-based workflow to a pipeline based on Jenkins. While going through the docs I found there are two different syntaxes, named Scripted and Declarative, with the Declarative syntax released fairly recently (end of 2016). Even though there is a newer syntax, Jenkins still supports the Scripted syntax as well.
Now I'm not sure in which situations each of these two types would be the best match. Will Declarative be the future of the Jenkins Pipeline?
Can anyone share some thoughts about these two syntax types?

When Jenkins Pipeline was first created, Groovy was selected as the foundation. Jenkins has long shipped with an embedded Groovy engine to provide advanced scripting capabilities for admins and users alike. Additionally, the implementors of Jenkins Pipeline found Groovy to be a solid foundation upon which to build what is now referred to as the "Scripted Pipeline" DSL.
As it is a fully featured programming environment, Scripted Pipeline offers a tremendous amount of flexibility and extensibility to Jenkins users. The Groovy learning-curve isn’t typically desirable for all members of a given team, so Declarative Pipeline was created to offer a simpler and more opinionated syntax for authoring Jenkins Pipeline.
The two are both fundamentally the same Pipeline sub-system underneath. They are both durable implementations of "Pipeline as code." They are both able to use steps built into Pipeline or provided by plugins. Both are able to utilize Shared Libraries.
Where they differ, however, is in syntax and flexibility. Declarative limits what is available to the user with a more strict and pre-defined structure, making it an ideal choice for simpler continuous delivery pipelines. Scripted provides very few limits, insofar that the only limits on structure and syntax tend to be defined by Groovy itself, rather than any Pipeline-specific systems, making it an ideal choice for power users and those with more complex requirements. As the name implies, Declarative Pipeline encourages a declarative programming model, whereas Scripted Pipelines follow a more imperative programming model.
Copied from Syntax Comparison
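To make the contrast concrete, here is a minimal sketch of the same trivial pipeline in both styles (assuming any available agent and a simple shell step):
// Declarative: fixed, pre-defined structure (pipeline / agent / stages / steps)
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'make'
            }
        }
    }
}
// Scripted: plain Groovy driving the same steps inside a node block
node {
    stage('Build') {
        sh 'make'
    }
}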

Another thing to consider is that Declarative Pipelines have a script step. This can run any Scripted Pipeline code. So my recommendation would be to use Declarative Pipelines, and if needed use script blocks for Scripted Pipeline code. That way you get the best of both worlds.
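As a minimal sketch of what that fallback looks like (the loop is just an illustrative placeholder):
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'Declarative step'
                script {
                    // Arbitrary Scripted Pipeline (Groovy) code is allowed in here.
                    def browsers = ['chrome', 'firefox']
                    for (int i = 0; i < browsers.size(); ++i) {
                        echo "Testing the ${browsers[i]} browser"
                    }
                }
            }
        }
    }
}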

I made the switch to declarative recently from scripted with the kubernetes agent. Up until July '18 declarative pipelines didn't have the full ability to specify kubernetes pods. However with the addition of the yamlFile step you can now read your pod template from a yaml file in your repo.
This then lets you use e.g. vscode's great kubernetes plugin to validate your pod template, then read it into your Jenkinsfile and use the containers in steps as you please.
pipeline {
  agent {
    kubernetes {
      label 'jenkins-pod'
      yamlFile 'jenkinsPodTemplate.yml'
    }
  }
  stages {
    stage('Checkout code and parse Jenkinsfile.json') {
      steps {
        container('jnlp') {
          script {
            inputFile = readFile('Jenkinsfile.json')
            config = new groovy.json.JsonSlurperClassic().parseText(inputFile)
            containerTag = env.BRANCH_NAME + '-' + env.GIT_COMMIT.substring(0, 7)
            println "pipeline config ==> ${config}"
          } // script
        } // container('jnlp')
      } // steps
    } // stage
  } // stages
} // pipeline
As mentioned above, you can add script blocks. Here is an example pod template with custom jnlp and docker containers.
apiVersion: v1
kind: Pod
metadata:
  name: jenkins-pod
spec:
  containers:
  - name: jnlp
    image: jenkins/jnlp-slave:3.23-1
    imagePullPolicy: IfNotPresent
    tty: true
  - name: rsync
    image: mrsixw/concourse-rsync-resource
    imagePullPolicy: IfNotPresent
    tty: true
    volumeMounts:
      - name: nfs
        mountPath: /dags
  - name: docker
    image: docker:17.03
    imagePullPolicy: IfNotPresent
    command:
      - cat
    tty: true
    volumeMounts:
      - name: docker
        mountPath: /var/run/docker.sock
  volumes:
    - name: docker
      hostPath:
        path: /var/run/docker.sock
    - name: nfs
      nfs:
        server: 10.154.0.3
        path: /airflow/dags

Declarative appears to be the more future-proof option and the one that people recommend. It's the only one the Visual Pipeline Editor can support, and it supports validation. It also ends up having most of the power of Scripted, since you can fall back to Scripted in most contexts. Occasionally someone comes up with a use case where they can't quite do what they want with Declarative, but this is generally someone who has been using Scripted for a while, and these feature gaps are likely to close over time.
More context: https://jenkins.io/blog/2017/02/03/declarative-pipeline-ga/

The Jenkins documentation properly explains and compares both the types.
To quote:
"Scripted Pipeline offers a tremendous amount of flexibility and extensibility to Jenkins users. The Groovy learning-curve isn’t typically desirable for all members of a given team, so Declarative Pipeline was created to offer a simpler and more opinionated syntax for authoring Jenkins Pipeline.
The two are both fundamentally the same Pipeline sub-system underneath."
Read more here: https://jenkins.io/doc/book/pipeline/syntax/#compare

The Declarative Pipeline is defined within a block labelled 'pipeline', whereas the Scripted Pipeline is defined within a 'node' block.
Syntax - the Declarative Pipeline has 'stages' and 'steps' sections.
If the build fails, the Declarative Pipeline gives you the option to restart the build from that stage again, which is not possible with the Scripted Pipeline.
If there is any issue in the script, the Declarative Pipeline will notify you as soon as you build the job, whereas a Scripted Pipeline will pass the stages that are okay and throw an error only on the stage that is not.
You can also refer to this very good read: https://e.printstacktrace.blog/jenkins-scripted-pipeline-vs-declarative-pipeline-the-4-practical-differences/
@Szymon.Stepniak https://stackoverflow.com/users/2194470/szymon-stepniak?tab=profile

I also have this question, which brought me here. Declarative pipeline certainly seems like the preferred method, and I personally find it much more readable, but I'm trying to convert a mid-level-complexity Freestyle job to Declarative and I've found at least one plugin, the Build Blocker plugin, that I can't get to run even in a script block in a step (I've tried putting the corresponding "blockOn" command everywhere with no luck, and the returned error is usually "No such DSL method 'blockOn' found among steps"). So I think plugin support is a separate issue even with the script block (someone please correct me if I'm wrong on this). I've also had to use the script block several times to get what I consider simple behaviors to work, such as setting the build display name.
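For reference, a minimal sketch of that last point, inside a stage's steps (the RELEASE_VERSION parameter is just an illustrative placeholder):
script {
    // Declarative has no dedicated directive for this, so a script block is needed.
    currentBuild.displayName = "#${env.BUILD_NUMBER} - ${params.RELEASE_VERSION}"
}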
Due to my experience, I'm leaning towards redoing my work as Scripted, since support for Declarative still isn't up to where we need it. That's unfortunate, as I agree Declarative seems the most future-proof option, and it is officially supported. Maybe consider how many plugins you intend to use before making a choice.

Related

Is a Jenkinsfile valid standalone groovy?

I'm trying to wrap my head around how this declarative Jenkinsfile is Groovy. I want to write supporting code to execute this outside the Jenkins environment, in pure Groovy, if that's possible. I've been writing example groovy code but still am unsure what "pipeline", "agent", and "stages" are.
Any tips to understand this structure are appreciated.
EDIT: I edited this question with simplified code below. I'm just wondering if there is a way that this can be turned into valid groovy code without the preprocessor/groovyshell environment that is utilized by Jenkins
pipeline {
    stages {
        // extra code here
    }
}
No, you can't run a Jenkinsfile as a standalone Groovy script. In short, Jenkins executes the pipeline code inside a pre-configured GroovyShell that knows how to evaluate things like pipeline, agent, stages, and so forth. However, there is a way to execute a Jenkinsfile without the Jenkins server - you can use the JenkinsPipelineUnit test library to write JUnit/Spock unit tests that will evaluate your Jenkinsfile and display the call stack tree. It uses mocks, so you can treat it as interaction-based testing, to see if a specific part of your pipeline gets executed. Plus, you can catch some code errors prior to running the pipeline on the server.
A simple unit test for the declarative pipeline can look like this:
import org.junit.Test
import com.lesfurets.jenkins.unit.declarative.*

class TestExampleDeclarativeJob extends DeclarativePipelineTest {

    @Test
    void should_execute_without_errors() throws Exception {
        def script = runScript("Jenkinsfile")
        assertJobStatusSuccess()
        printCallStack()
    }
}
You can find more examples in the official README.md - https://github.com/jenkinsci/JenkinsPipelineUnit
Alternatively, you can try Jenkinsfile Runner command-line tool that can execute your Jenkinsfile outside of the Jenkins server - https://github.com/jenkinsci/jenkinsfile-runner
UPDATE
I edited this question with simplified code below. I'm just wondering if there is a way that this can be turned into valid groovy code without the preprocessor/groovyshell environment that is utilized by Jenkins.
Your pipeline code example looks like a valid Jenkinsfile, but you can't turn it into Groovy code that can be run, e.g. from the command line, as a regular Groovy script:
$ groovy Jenkinsfile
This won't work, because Groovy is not aware of the Jenkins Pipeline syntax. The syntax is added as a DSL via the Jenkins plugin, and it uses a dedicated GroovyShell that is pre-configured to interpret the pipeline syntax correctly.
If you are interested in checking if the syntax of the Jenkins Pipeline is correct, there are a few different options:
npm-groovy-lint (https://github.com/nvuillam/npm-groovy-lint) can validate (and even auto-fix) the syntax of your Jenkinsfile without connecting to the Jenkins server,
Command-Line Pipeline Linter (https://www.jenkins.io/doc/book/pipeline/development/#linter) can send your pipeline code to the Jenkins server and validate its syntax.
These are a few tools that can help you catch syntax errors before you run the pipeline. But that's just a nice addition to your toolbox. The first step, as always, is to understand what the syntax means, and the official documentation (https://www.jenkins.io/doc/book/pipeline/syntax) is the best place to start.

Is there any way to dynamically convert a Jenkinsfile into a Drone.yml pipeline configuration?

I currently have some existing Jenkinsfiles from an older Jenkins CI/CD pipeline configuration. I've started migrating services to Drone CI recently but am not quite sure how some of the Jenkins (Groovy) commands translate to Drone's YAML syntax.
Example (redacted / sample):
// ...
stage('version')
choice = new ChoiceParameterDefinition('VERSION', ['x', 'y', 'z'] as String[], '...')
def type = input(id: 'type', message: 'Select one', parameters: [choice])
stage('Tag') {
    sh "./some-script/.sh -t ${type}"
}
// ...
Is there anything that could do the conversion automatically? The DroneCI docs are pretty vague and don't cover many important pipeline design aspects (at least not from what I've found).
Unfortunately this is impossible to achieve in DroneCI by the same means. This is because Jenkins allows entering input from the UI when running a pipeline, while DroneCI does not.
You can however specify properties such as the version number in a different file that the pipeline can identify and process accordingly.

How to test Jenkinsfile

I am trying to write test cases to validate the Jenkinsfile, but the loadScript function is not working: it expects a file extension to be provided and throws a ResourceException for loadScript("Jenkinsfile").
Is there a better way to test the Jenkinsfile?
The problem is that there are not enough tools for the development of pipelines. Pipeline is a DSL, and that imposes restrictions.
One interesting approach is to use flags - for example, a test flag defined outside the pipeline (in the job). If test=true, the pipeline switches some "production" logic to "test" logic: it selects another agent, uploads artifacts to another repository, runs different commands, and so on.
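A minimal Scripted sketch of that flag approach (the agent labels, script name, and test parameter are illustrative placeholders):
node(params.test ? 'test-agent' : 'prod-agent') {
    stage('Publish') {
        // Switch the "production" logic to "test" logic based on the flag.
        if (params.test) {
            sh './publish.sh --repo test-repo'
        } else {
            sh './publish.sh --repo release-repo'
        }
    }
}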
But recently the Pipeline Unit Testing Framework appeared. It allows you to unit test Pipelines and Shared Libraries before running them in full. It provides a mock execution environment where real Pipeline steps are replaced with mock objects that you can use to check for expected behavior.
Useful links:
Jenkins World 2017: JenkinsPipelineUnit: Test your Continuous Delivery Pipeline
Pipeline Development Tools
You can validate your Declarative Pipeline locally thanks to Jenkins built-in features. This can be done using a Jenkins CLI command or by making an HTTP POST request with appropriate parameters.
The command is the following:
curl -s -X POST -F "jenkinsfile=<YourJenkinsfile" \
    https://user:password@jenkins.example.com/pipeline-model-converter/validate
For a practical example follow this guide:
https://pillsfromtheweb.blogspot.com/2020/10/validate-jenkinsfile.html

Is there any way to build jobs with groovy script

I am passing Extended Choice Parameters from one job to another job. In the second job I am writing a Groovy script to receive the parameter, and on the basis of that parameter the job must run multiple times in parallel. But there is no method available to build jobs in Groovy.
Use the build job step from Jenkins Pipeline:
build job: 'jobName',
    parameters: [
        [$class: 'StringParameterValue', name: 'val1', value: '1'],
        [$class: 'LabelParameterValue', name: 'SLAVE_NODE', label: 'slavename']
    ]
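Since the question mentions running the job multiple times in parallel based on a parameter, here is a minimal sketch of how that could be combined with the parallel step (the TARGETS parameter and its comma-separated format are assumptions for illustration):
// Build one parallel branch per value in the (hypothetical) TARGETS parameter.
def targets = params.TARGETS.split(',')
def branches = [:]
for (t in targets) {
    def target = t  // capture the loop variable for use inside the closure
    branches["build-${target}"] = {
        build job: 'jobName',
            parameters: [[$class: 'StringParameterValue', name: 'val1', value: target]]
    }
}
parallel branches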
The jenkins-pipeline that you added to your job is probably what you are searching for. With pipelines, you can define your build using a Groovy DSL.
You can find an introduction in the documentation. An (incomplete) list of steps available through plugins can be found in the steps reference.
P.S. Be warned that there are two different flavors: declarative pipelines (defined using the pipeline keyword) do not offer full freedom, but are a bit easier to handle regarding build failures and parse errors in your pipeline code. Scripted pipelines (with node steps allocating an executor) offer (nearly) the full power of Groovy.

Template workflows in Jenkins

Every Jenkins pipeline does pretty much the same thing - at least in a small team with multiple projects.
Build (from the same sourcecode repo) --> run tests --> publish artifacts (to the same artifact repo)
We are creating many new projects and they all have a very similar lifecycle. Is it possible to create a template pipeline from which I can create concrete pipelines and make the necessary changes to the jobs?
There are a couple of approaches that I use that work well for me and my team.
Part 1 is to identify which orchestration plugins suit you best in Jenkins.
Plugins and approaches that worked well for me were:
a) Use http://ci.openstack.org/jenkins-job-builder/
It abstracts the job definitions and flows using a higher-level library. It allows you to define jobs in YAML, which is fairly simple, and it supports most of the common use cases (jobs, templates, flows).
These YAML files can then be consumed by the jenkins-job-builder Python CLI tool through an orchestration tool such as Ansible, Puppet, or Chef.
You can use YAML anchors to replace blocks that are common to multiple jobs, or even template them from a template engine (ERB, Jinja2).
b) Use the workflow-plugin, https://github.com/jenkinsci/workflow-plugin
The workflow plugin allows you to have a single workflow in groovy, instead of a set of jobs that chain together.
"For example, to check out and build several repositories in parallel, each on its own slave:
parallel repos.collectEntries { repo -> [/* thread label */ repo, {
    node {
        dir('sources') { // switch to subdir
            git url: "https://github.com/user/${repo}"
            sh 'make all -Dtarget=../build'
        }
    }
}]}
"
If you build these workflow definitions from a template engine (ERB, Jinja2) and integrate them with a configuration management tool (again Ansible, Chef, or Puppet), it becomes a lot easier to make small and larger changes that affect one or all of the jobs.
For example, you can template that some jenkins boxes compile, publish and deploy the artifacts into a development environment, while others simply deploy the artifacts into a QA environment.
This can all be achieved from the same template, using if/then statements and macros in jinja2/erb.
Ex (an abstraction):
if ($environment == dev) then compile, publish, deploy($environment)
elif ($environment == qa) then deploy($environment)
Part 2 is to make sure all the Jenkins configuration for all the jobs and flows is kept in source control, and to make sure a change of a job definition in source control is automatically propagated to the Jenkins server(s) (again Ansible, Puppet, Chef).
Or even have a Jenkins job that monitors its own repo of job definitions and automatically updates itself.
When you achieve part 1 and part 2 you should be in a position where you can, with some confidence, allow all your team members to make changes to their jobs/projects, giving you information on who changed what and when, and be able to roll back changes easily from change control when things go wrong.
It's pretty much about getting Jenkins to deploy code from a series of templated jobs that were themselves defined in code.
Another approach we've been following is managing jobs via Ansible templates. We started way before jenkins_job module became available, and are using url module to talk to jenkins, but overall approach will be the same:
j2 templates created for different jobs
a loop goes over the project definitions and updates jobs and views in Jenkins
by default a common definition is used, and only a very minimal description is required:
default_project:
  jobs:
    Build:
      template: build.xml.j2
    Release: ...
projects:
  DefaultProject1:
    properties:
      repository: git://../..
  CustomProject2:
    properties:
      a: b
      c: d
    jobs:
      Custom-Build:
        template: custom.j2
