How to test Jenkinsfile - jenkins

I am trying to write test cases to validate my Jenkinsfile, but the loadScript function is not working: it expects a file extension to be provided and throws a ResourceException for loadScript("Jenkinsfile").
Is there a better way to test the Jenkinsfile?

The problem is that there are not many tools for pipeline development. A pipeline is a DSL, and that imposes restrictions.
One interesting approach is to use flags: for example, a test flag defined outside the pipeline (as a job parameter). If test=true, the pipeline swaps some of its "production" logic for "test" logic - it selects another agent, uploads artifacts to a different repository, runs different commands, and so on.
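A hedged sketch of that flag approach (the parameter name RUN_AS_TEST, the agent labels, and the publish script below are assumptions made purely for illustration):
// Hypothetical sketch of the "test flag" idea described above.
pipeline {
    // Pick the agent based on the flag defined on the job.
    agent { label "${params.RUN_AS_TEST ? 'test-agent' : 'prod-agent'}" }
    parameters {
        booleanParam(name: 'RUN_AS_TEST', defaultValue: false,
                     description: 'Run the pipeline against test resources')
    }
    stages {
        stage('Build') {
            steps {
                sh 'make build'
            }
        }
        stage('Publish') {
            steps {
                // Upload the artifact to a test or production repository, depending on the flag.
                sh "./publish.sh ${params.RUN_AS_TEST ? 'https://repo.example.com/test' : 'https://repo.example.com/releases'}"
            }
        }
    }
}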
More recently, the Jenkins Pipeline Unit testing framework appeared. It allows you to unit test pipelines and shared libraries before running them for real. It provides a mock execution environment where real pipeline steps are replaced with mock objects that you can use to check for expected behavior.
Useful links:
Jenkins World 2017: JenkinsPipelineUnit: Test your Continuous Delivery Pipeline
Pipeline Development Tools
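A minimal sketch of such a test, assuming a scripted Jenkinsfile in the repository root (the class name is made up; for a declarative Jenkinsfile extend DeclarativePipelineTest instead). If loading a file with no extension fails, as in the ResourceException above, the scriptRoots and scriptExtension fields of the base class control how script files are resolved; check the framework's README for the current details.
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

// Minimal JenkinsPipelineUnit sketch: load the Jenkinsfile with mocked steps and run it.
class JenkinsfileTest extends BasePipelineTest {

    @Before
    void setUp() throws Exception {
        super.setUp()
    }

    @Test
    void jenkinsfile_runs_without_errors() throws Exception {
        runScript('Jenkinsfile')
        printCallStack()
        assertJobStatusSuccess()
    }
}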

You can validate your Declarative Pipeline locally thanks to Jenkins' built-in features. This can be done using a Jenkins CLI command or by making an HTTP POST request with the appropriate parameters.
The command is the following:
curl -s -X POST -F "jenkinsfile=<YourJenkinsfile" \
https://user:password@jenkins.example.com/pipeline-model-converter/validate
For a practical example follow this guide:
https://pillsfromtheweb.blogspot.com/2020/10/validate-jenkinsfile.html

Related

Is a Jenkinsfile valid standalone groovy?

I'm trying to wrap my head around how this declarative Jenkinsfile is Groovy. I want to write supporting code to execute this outside the Jenkins environment, in pure Groovy, if that's possible. I've been writing example Groovy code but am still unsure what "pipeline", "agent", and "stages" are.
Any tips for understanding this structure are appreciated.
EDIT: I edited this question with simplified code below. I'm just wondering if there is a way that this can be turned into valid Groovy code without the preprocessor/GroovyShell environment that Jenkins uses.
pipeline {
    stages {
        // extra code here
    }
}
No, you can't run a Jenkinsfile as a standalone Groovy script. In short, Jenkins executes the pipeline code inside a pre-configured GroovyShell that knows how to evaluate things like pipeline, agent, stages, and so forth. However, there is a way to execute a Jenkinsfile without a Jenkins server: you can use the JenkinsPipelineUnit test library to write JUnit/Spock unit tests that evaluate your Jenkinsfile and display the call-stack tree. It uses mocks, so you can treat it as interaction-based testing and check whether a specific part of your pipeline gets executed. Plus, you can catch some code errors before running the pipeline on the server.
A simple unit test for the declarative pipeline can look like this:
import com.lesfurets.jenkins.unit.declarative.*
import org.junit.Test

class TestExampleDeclarativeJob extends DeclarativePipelineTest {
    @Test
    void should_execute_without_errors() throws Exception {
        def script = runScript("Jenkinsfile")
        assertJobStatusSuccess()
        printCallStack()
    }
}
You can find more examples in the official README.md - https://github.com/jenkinsci/JenkinsPipelineUnit
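The mock environment also lets you register your own step implementations and then assert on the recorded call stack, which is the interaction-based testing mentioned above. A hedged sketch (the mocked sh return value and the assertion are illustrative assumptions):
import com.lesfurets.jenkins.unit.declarative.DeclarativePipelineTest
import org.junit.Before
import org.junit.Test

// Sketch of interaction-based testing: stub a step, run the Jenkinsfile,
// then inspect the recorded calls.
class TestPipelineInteractions extends DeclarativePipelineTest {

    @Before
    void setUp() throws Exception {
        super.setUp()
        // Make any map-style sh step (e.g. with returnStdout) return a canned value.
        helper.registerAllowedMethod('sh', [Map.class], { Map args -> 'abc1234' })
    }

    @Test
    void runs_the_sh_step() throws Exception {
        runScript('Jenkinsfile')
        assertJobStatusSuccess()
        // Verify that the pipeline actually invoked sh at least once.
        assert helper.callStack.findAll { call -> call.methodName == 'sh' }.size() > 0
    }
}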
Alternatively, you can try Jenkinsfile Runner command-line tool that can execute your Jenkinsfile outside of the Jenkins server - https://github.com/jenkinsci/jenkinsfile-runner
UPDATE
I edited this question with simplified code below. I'm just wondering if there is a way that this can be turned into valid Groovy code without the preprocessor/GroovyShell environment that Jenkins uses.
Your pipeline code example looks like a valid Jenkinsfile, but you can't turn it into Groovy code that can be run, for example, from the command line as a regular Groovy script:
$ groovy Jenkinsfile
This won't work, because Groovy is not aware of the Jenkins Pipeline syntax. The syntax is added as a DSL via the Jenkins plugin, and it uses a dedicated GroovyShell that is pre-configured to interpret the pipeline syntax correctly.
If you are interested in checking if the syntax of the Jenkins Pipeline is correct, there are a few different options:
npm-groovy-lint (https://github.com/nvuillam/npm-groovy-lint) can validate (and even auto-fix) the syntax of your Jenkinsfile without connecting to the Jenkins server,
Command-Line Pipeline Linter (https://www.jenkins.io/doc/book/pipeline/development/#linter) can send your pipeline code to the Jenkins server and validate its syntax.
These are a few tools that can help you catch syntax errors before you run the pipeline, but they are just a nice addition to your toolbox. The first step, as always, is to understand what the syntax means, and the official documentation (https://www.jenkins.io/doc/book/pipeline/syntax) is the best place to start.

Can Jenkins run some code, or is it only for scheduling external programs?

I'm new to Jenkins and I heard it is really good for continuous integration.
My flow is not complicated: I need to get a list from SQL with some query, parse it line by line, send each line to some virtual machines (which will run that line and create a file as a result), and then analyze the results.
Where in Jenkins can I program my code?
Is Jenkins' purpose only to schedule external programs one by one, and not to run code in Jenkins itself?
Is there a way to write code in Jenkins that is not just a bunch of CMD commands?
You can do the following (using pipeline syntax):
- stage 1: execute a command over SSH (Jenkins plugin) to run the SQL query
- stage 2: send each line to a dedicated VM with an Ansible playbook
- stage 3: analyze; depending on the tech you are using, there are plenty of Jenkins plugins that connect to monitoring and analysis tools like Grafana or Zabbix, which can ease the process
I think what you are asking for is scripted pipeline: https://jenkins.io/doc/book/pipeline/
This allows you to write Groovy code to execute on the Jenkins master. You can do almost anything you want from there, just like any programming language.
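For the flow described in the question, a hedged scripted-pipeline sketch could look like the following; run_query.sh, process.sh, analyze_results.sh, and the VM host names are placeholders for whatever your environment actually provides:
node {
    def lines = []

    stage('Fetch list from SQL') {
        // Run the query and capture its output, one item per line.
        def output = sh(script: './run_query.sh', returnStdout: true).trim()
        lines = output ? (output.split('\n') as List) : []
    }

    stage('Run on VMs') {
        def branches = [:]
        for (int i = 0; i < lines.size(); i++) {
            def line = lines[i]        // local copies so each closure keeps its own values
            def name = "vm-${i}"
            branches[name] = {
                // Send one line to one VM; plain ssh here, an Ansible playbook works too.
                sh "ssh user@${name}.example.com ./process.sh '${line}'"
            }
        }
        parallel branches
    }

    stage('Analyze results') {
        sh './analyze_results.sh'
    }
}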

Establish relationship between two Jenkins jobs on different Jenkins servers

I am building a Jenkins job for test/QA automation scripts; let's name it TEST_JOB. For the application, I have a Jenkins build of the application source code; name it DEV_JOB.
My scenario is: when DEV_JOB completes execution successfully, execute TEST_JOB immediately. I am aware of setting up upstream/downstream projects [Build after other projects are built] to accomplish this. The problem here is that DEV_JOB is on a different server than TEST_JOB, so TEST_JOB fails to recognize DEV_JOB.
Now, how would I achieve this scenario?
You can use the Jenkins API to trigger a job remotely.
Say you have the job DEV_JOB on JENKINS_1; add a penultimate step (or an upstream/downstream project containing only this step) that invokes TEST_JOB through a remote API call to the JENKINS_2 server.
An example command would be:
$(curl --user "username:password" "http://JENKINS_2/job/TEST_JOB/buildWithParameters?SOMEPARAMETER=$SOMEPARAMETER")
username:password is a valid user on JENKINS_2.
Avoid using your own account here; use a dedicated 'build trigger' account that only has permission to start those jobs.
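If DEV_JOB is (or becomes) a pipeline job, the same remote call can be made from its post section, keeping the trigger account's credentials in the Jenkins credentials store instead of plain text. A hedged sketch: the credentials ID jenkins2-trigger-account, the build script, and the JENKINS_2 URL are placeholders:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh './build.sh'   // placeholder for the real DEV_JOB build steps
            }
        }
    }
    post {
        success {
            // Only trigger TEST_JOB on the second Jenkins when DEV_JOB succeeded.
            withCredentials([usernamePassword(credentialsId: 'jenkins2-trigger-account',
                                              usernameVariable: 'TRIGGER_USER',
                                              passwordVariable: 'TRIGGER_PASS')]) {
                sh 'curl --user "$TRIGGER_USER:$TRIGGER_PASS" "http://JENKINS_2/job/TEST_JOB/buildWithParameters?SOMEPARAMETER=value"'
            }
        }
    }
}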

JMeter & Jenkins - passing jmeter parameters to downstream build

The Setup - a Jenkins job using the Jenkins parameters testApp and testEnv. The execute-batch step looks like this:
C:\jmeter\apache-jmeter-3.2\bin\jmeter.bat -n -t C:\JMeter\Scripts\API_scripts\%testApp%.jmx -Jtestenv=%testEnv% -JtestApp=%testApp% -JtestBrowser=NA -l C:\AUTO_Results\jtl\%testApp%_%testEnv%.jtl
Post-build Actions
Console output (build log) parsing with a global rule, so that failures logged in the Jenkins console mark the JMeter script as failed (discussed in: Jenkins shows JMeter script failure even though the script actually passed).
Triggered parameterized build - this is a separate JMeter script that updates a wiki page with either PASS/FAIL and uploads the JMeter report.
The Issue - how do I get the downstream triggered build to use the parameters from the upstream job? I set the parameters to 'Current build parameters', but it's not applying them. Also, I won't know the value of the testResult parameter until the upstream build finishes. I tried adding %testResult%=PASS to the 'Predefined parameters' box.
As per the Parameterized Trigger Plugin page:
The parameters section can contain a combination of one or more of the following:
a set of predefined properties
properties from a properties file read from the workspace of the triggering build
the parameters of the current build
Subversion revision: makes sure the triggered projects are built with the same revision(s) of the triggering build. You still have to make sure those projects are actually configured to checkout the right Subversion URLs.
Restrict matrix execution to a subset: allows you to specify the same combination filter expression as you use in the matrix project configuration and further restricts the subset of the downstream matrix builds to be run.
So you basically need to copy over the parameters you would like to have in the "downstream" job from the current one.
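If the upstream job ever moves to a pipeline, the same idea of handing the current parameters (plus the outcome) to the triggered build can be expressed with the build step. A hedged sketch: the downstream job name wiki-update and the run_jmeter.bat wrapper are assumptions, while testApp, testEnv, and testResult mirror the parameters from the question:
pipeline {
    agent any
    parameters {
        string(name: 'testApp', defaultValue: '', description: 'Application under test')
        string(name: 'testEnv', defaultValue: '', description: 'Target environment')
    }
    stages {
        stage('Run JMeter') {
            steps {
                // The existing JMeter batch command from the question goes here.
                bat 'run_jmeter.bat'
            }
        }
    }
    post {
        always {
            // Pass the current parameters and the build outcome downstream.
            build job: 'wiki-update',
                  wait: false,
                  parameters: [
                      string(name: 'testApp', value: params.testApp),
                      string(name: 'testEnv', value: params.testEnv),
                      // Only known once this build has actually finished.
                      string(name: 'testResult', value: currentBuild.currentResult)
                  ]
        }
    }
}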
As a workaround to the current Performance plugin limitations, you can consider running JMeter through the Taurus tool as a wrapper; it has a flexible and powerful pass/fail criteria subsystem that returns a non-zero exit code to Jenkins, triggering a build failure, when the test has issues. If everything goes well, the Taurus exit code will be 0, which Jenkins considers successful. Check out the How to Run Taurus with the Jenkins Performance Plugin article for more details.

Jenkins 2 pipeline deploying to udeploy

I am creating a CI/CD pipeline and trying to write a Groovy function to deploy a build to udeploy.
I know I will need to pass in the parameters used, such as:
udeployServer,
component,
artifactDirectory,
version,
deployApplication,
environment and
deployProcess.
I was wondering whether anyone has tried to implement this, or has any idea how I should approach it?
Thanks
I don't know anything about udeploy servers, but I do know there is no pipeline plugin for udeploy, which means you will not have a step such as:
udeploy: server=yourserver component=yourcomponent artifactDirectory=...
However, Jenkins allows you to use shell commands inside your Groovy pipeline, so you should be able to do pretty much everything you need. So I guess the real question is: how do you usually deploy a build to udeploy? Do you do it via a REST API, do you push a file via FTP, ...?
The Jenkins build itself will be pretty straightforward; have a look at how to check out and build using a Jenkins pipeline.
An example pipeline could look like:
node {
    stage 'Build'
    def mvnHome = tool 'M3'
    sh "${mvnHome}/bin/mvn clean install"

    //... Some other stages as needed...

    stage 'Deploy'
    sh "execute sh deploy script here..."
}
... where your deploy stage could use other plugins to copy files to your server, run REST API requests, etc. While writing a pipeline, have a look at the Pipeline Syntax link for a Snippet Generator giving more detailed information about existing plugins.
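Since the question asks specifically for a Groovy function taking those parameters, here is a hedged sketch of what such a wrapper could look like in a scripted pipeline; deploy.sh and its flags are placeholders for however you actually talk to your udeploy server (its CLI, a REST call, a file push, ...), and the parameter names simply mirror the question:
// Sketch only: deploy.sh is a placeholder, not a real udeploy client.
def deployToUdeploy(Map args) {
    def required = ['udeployServer', 'component', 'artifactDirectory', 'version',
                    'deployApplication', 'environment', 'deployProcess']
    for (key in required) {
        if (!args[key]) {
            error "deployToUdeploy: missing argument '${key}'"
        }
    }
    sh """
        ./deploy.sh \
            --server '${args.udeployServer}' \
            --component '${args.component}' \
            --dir '${args.artifactDirectory}' \
            --version '${args.version}' \
            --application '${args.deployApplication}' \
            --environment '${args.environment}' \
            --process '${args.deployProcess}'
    """
}

node {
    stage('Deploy') {
        deployToUdeploy(
            udeployServer: 'https://udeploy.example.com',
            component: 'my-component',
            artifactDirectory: 'build/libs',
            version: env.BUILD_NUMBER,
            deployApplication: 'my-app',
            environment: 'QA',
            deployProcess: 'deploy'
        )
    }
}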
