Interacting with Jenkins Pipeline Stage

Is it possible to somehow interact with the Jenkins Pipeline from a script (bat or sh), e.g. to initiate a new stage?
echo This is a batch file in no stage
pipeline.stage(build)
echo This is a batch file in build stage
I have a pretty compact build job written in PowerShell. My team is now experimenting with the Jenkins Pipeline feature, and it would be great to split our PowerShell build code into stages (build core, build modules, test, coverage, and so on). We could easily do it by creating a function for every stage, but that would be inefficient (loading PowerShell modules each time...)

I would propose another approach:
you can define your different steps as cmdlets in your PowerShell script:
function step_1()
{
    [CmdletBinding(SupportsShouldProcess=$true)]
    param ()
    Write-Verbose "step 1"
}
function step_2()
{
    [CmdletBinding(SupportsShouldProcess=$true)]
    param ()
    Write-Verbose "step 2"
}
Then you can define a PowerShell method in your Groovy script, as I describe here: To call a PowerShell script from the Groovy script.
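A minimal sketch of such a helper (an assumption about its shape, not the exact method from the linked answer; it assumes a Windows agent with powershell.exe on the PATH):
def PowerShell(psCmd) {
    // Run the given PowerShell command from a batch step; stop on errors
    // and propagate the PowerShell exit code back to Jenkins.
    bat "powershell.exe -NonInteractive -ExecutionPolicy Bypass -Command \"\$ErrorActionPreference = 'Stop'; & { ${psCmd} }; exit \$LASTEXITCODE\""
}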
In your Groovy pipeline script:
node ('MyWindowsSlave') {
    stage ('Stage 1') {
        PowerShell(". '.\\all-stages.ps1'; step_1 -Verbose")
    }
    stage ('Stage 2') {
        PowerShell(". '.\\all-stages.ps1'; step_2 -Verbose")
    }
}
You may also consider splitting your Groovy script into several files, letting the PowerShell developers maintain their own Groovy sub-scripts, as sketched below.
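For instance, with the load step (the file name stages.groovy and the runAll method are hypothetical; the sub-script must end with return this so its methods are callable from the caller):
node ('MyWindowsSlave') {
    // Load a Groovy sub-script from the workspace checkout and call into it.
    def stages = load 'stages.groovy'
    stages.runAll()
}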
Regards.

You can use the input step (https://jenkins.io/doc/pipeline/steps/pipeline-input-step/) to wait for a user response before continuing the pipeline: a simple 'yes' or 'no', a multiple choice, a password confirmation...
You can control who can respond to this input event, using LDAP groups for example.
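A minimal sketch in scripted-pipeline style (the submitter group name deployers is hypothetical):
stage ('Deploy approval') {
    // Pause the pipeline until an authorized user responds; 'submitter'
    // restricts who may answer the prompt.
    input message: 'Deploy to production?', ok: 'Deploy', submitter: 'deployers'
}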

Related

Is it possible to have multiple Jenkinsfiles with custom names

All my scripts are in one branch of the repo, and I have multiple Jenkins pipeline jobs:
1. Smoke
2. Regression
3. Epic-wise execution
Each has a different pipeline script, so is it possible to have multiple Jenkinsfiles with custom names?
node('Slave-Machine-1') {
    env.NODE_HOME = "${tool '8.9.4'}"
    env.PATH = "${env.NODE_HOME}/bin:${env.PATH}"
    def AUTO = ''
    stage("Install Dependency") {
        sshagent(['agent-id']) {
            sh 'npm install'
            sh 'npm run webdriver-install'
        }
    }
    stage("smoke") {
        sh 'npm run smoke-test'
    }
}
This is my sample pipeline script; I have multiple similar pipeline scripts.
You can name your pipeline scripts random_joe or anything you like, as long as:
You do not use multibranch or organization pipeline projects, which specifically look for the filename Jenkinsfile to automatically create new jobs
You do not mind your text editor not syntax-highlighting the pipeline scripts until you add the extension .groovy to them
It is advisable to follow conventions wherever practicable, though.

How to get an automated build server with Jenkins for Delphi projects

In a nutshell: how can I set up an automated build server, with Jenkins or another tool, to build several Delphi projects using MSBuild?
I am currently a trainee at a company. I managed to migrate the old SCM software from PVCS to SVN. But they are using old shell scripts and Cygwin to build, with several options to compile/release all or certain Delphi projects and produce DLLs and EXEs. I first wanted to use Jenkins to reproduce the same mechanism, but I am not sure it is the best way to deal with this. I have tried to set up a free-style job and a multibranch pipeline. The former is fine for building one project, but the latter was not a success; I don't know Groovy...
I am not interested in the test part of continuous integration; I just want an automated build for several Delphi projects.
I don't know how to deal with this. Maybe the best way is to create as many Jenkins jobs as there are Delphi projects? But how do I orchestrate them afterwards?
I have read about Maven and Ant, but I am not sure they are relevant in my case.
Any advice is welcome.
You can create simple "free-style" jobs or "pipelines". Pipelines are more powerful, but more complicated if you are starting out.
You can start by creating a job for each project. Then you can chain the projects using Jenkins' build-trigger options, so that when one job finishes, the next one starts.
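If you adopt pipelines later, the same chaining can be expressed with the build step; a minimal sketch, with hypothetical job names:
node {
    stage('Build core') {
        // Trigger a downstream job and wait for it to finish.
        build job: 'delphi-core'
    }
    stage('Build modules') {
        build job: 'delphi-modules'
    }
}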
You can also use the existing RAD Studio plugin for Jenkins to compile; use it in a free-style job.
The other option is to use pipelines, but you should know something about Groovy.
For example, a simple pipeline with several stages would look like this:
pipeline {
    agent any
    stages {
        stage('Stage: Show message Hello World') {
            steps {
                echo 'Step 1. Hello World'
            }
        }
        stage('Download source from GIT') {
            steps {
                echo 'Downloading...'
                git([url: 'https://XXX_repository_xxxx.git/gitProject', branch: 'master', credentialsId: 'a234234a-344e-2344-9440-423444xxxxxx'])
            }
        }
        stage('Executing MSDOS file (BAT)') {
            steps {
                echo '-- Sample Executing BAT file'
                bat '"c:\\Program Files (x86)\\Embarcadero\\Studio\\19.0\\bin\\rsvars.bat"'
            }
        }
        stage('MSBuild a Delphi project') {
            steps {
                println("************ EXECUTING MSBUILD ******************")
                echo '-- Launch the execution of rsVars ---------'
                bat '"c:\\Program Files (x86)\\Embarcadero\\Studio\\19.0\\bin\\rsvars.bat"'
                echo '-- MSBuild the TestLauncher project -------'
                bat '"c:\\local\\AutomaticTestsProject\\compilar.bat"'
            }
        }
        stage('Execute a test project (EXE)') {
            steps {
                bat 'c:\\local\\AutomaticTestsProject\\BIN\\AutomaticTestsProject.exe'
            }
        }
        stage('Send email') {
            steps {
                emailext (
                    subject: "Job '${env.JOB_NAME} ${env.BUILD_NUMBER}'",
                    body: """<p>Check console output at ${env.JOB_NAME}</p>""",
                    to: "destinatary@hotmail.com",
                    from: "JenkinsMachine@mail.com" )
            }
        }
    }
}

Jenkins declarative pipeline: input with conditional steps without blocking the executor

I'm trying to get the following features to work in Jenkins' Declarative Pipeline syntax:
Conditional execution of certain stages only on the master branch
input to ask for user confirmation to deploy to a staging environment
While waiting for confirmation, it doesn't block an executor
Here's what I've ended up with:
pipeline {
    agent none
    stages {
        stage('1. Compile') {
            agent any
            steps {
                echo 'compile'
            }
        }
        stage('2. Build & push Docker image') {
            agent any
            when {
                branch 'master'
            }
            steps {
                echo "build & push docker image"
            }
        }
        stage('3. Deploy to stage') {
            when {
                branch 'master'
            }
            input {
                message "Deploy to stage?"
                ok "Deploy"
            }
            agent any
            steps {
                echo 'Deploy to stage'
            }
        }
    }
}
The problem is that stage 2 needs the output from stage 1, but that output is not available when stage 2 runs. If I replace the various agent directives with a global agent any, then the output is available, but the executor is blocked waiting for user input at stage 3. And if I try to combine stages 1 and 2 into a single stage, then I lose the ability to conditionally run some steps only on master.
Is there any way to achieve all the behaviour I'm looking for?
You need to use the stash step at the end of your first stage, and then unstash when you need the files.
I think these are available in the Snippet Generator.
As per the documentation:
Saves a set of files for use later in the same build, generally on another node/workspace. Stashed files are not otherwise available and are generally discarded at the end of the build. Note that the stash and unstash steps are designed for use with small files. For large data transfers, use the External Workspace Manager plugin, or use an external repository manager such as Nexus or Artifactory.
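Applied to the pipeline above, a minimal sketch (the stash name and the build/** include pattern are assumptions; adjust them to whatever your compile stage actually produces):
pipeline {
    agent none
    stages {
        stage('1. Compile') {
            agent any
            steps {
                echo 'compile'
                // Save the compile output so a later stage, possibly on
                // another node/workspace, can restore it.
                stash name: 'build-output', includes: 'build/**'
            }
        }
        stage('2. Build & push Docker image') {
            agent any
            when { branch 'master' }
            steps {
                // Restore the files stashed in stage 1 into this workspace.
                unstash 'build-output'
                echo 'build & push docker image'
            }
        }
    }
}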

Jenkins pipeline groovy testing in shell

Can Jenkins pipeline scripts be tested using groovysh or groovy scriptname, to run tests for validation without using the Jenkins UI?
For example, for a simple script
pipeline {
    stages {
        stage ('test') {
            steps {
                sh '''
                env
                '''
            }
        }
    }
}
Running a test like this, depending on the subset of the script used, gives:
No signature of method: * is applicable for argument types
groovysh_evaluate.pipeline()
or for
stage('test') {
    sh '''
    env
    '''
}
reports:
No signature of method: groovysh_evaluate.stages()
or simply
sh '''
env
'''
reports:
No signature of method: groovysh_evaluate.sh()
The question may be: which imports are required, and how can they be installed outside of a Jenkins installation?
Why would anyone want to do this?
To simplify and shorten iteration over test cases, to validate library versions without modifying Jenkins installations, and for other unit and functional test scenarios.
JenkinsPipelineUnit is what you're looking for.
This testing framework lets you write unit tests on the configuration and conditional logic of the pipeline code, by providing a mock execution of the pipeline. You can mock built-in Jenkins commands, job configurations, see the stacktrace of the whole execution and even track regressions.
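For illustration, a minimal sketch of such a test (assuming JUnit 4 and the JenkinsPipelineUnit dependency on the classpath, and a scripted-style job file job/example.jenkins that defines an execute() method and ends with return this; all names here are hypothetical):
import com.lesfurets.jenkins.unit.BasePipelineTest
import org.junit.Before
import org.junit.Test

class ExampleJobTest extends BasePipelineTest {

    @Before
    void setUp() {
        super.setUp()
        // Mock the 'sh' step so the script can run outside Jenkins.
        helper.registerAllowedMethod('sh', [String], { cmd -> println "mock sh: $cmd" })
    }

    @Test
    void should_execute_without_errors() {
        // Load the pipeline script and run its entry point, then dump
        // the recorded call stack for inspection.
        def script = loadScript('job/example.jenkins')
        script.execute()
        printCallStack()
    }
}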

How do I create a Body block for a custom Jenkins Pipeline Step in the Java code of my Jenkins plugin?

Context
I am creating a Jenkins plugin that adds a custom pipeline step.
I have successfully got Java code to execute when my step (dostuff) is used in a pipeline script, such as:
script1
node {
    stage( 'doingstuff' ) {
        dostuff()
    }
}
However, I want my custom step to render some stages and parallel streams in the Jenkins web UI visualization, so I want to do the equivalent of parallel([...]) and stage( ... ) { ... } and maybe even node { ... } blocks in my plugin. For example, if the following were my intended pipeline:
script2
node {
    stage( 'one' ) {
        parallel([
            "stream one": {
                sh "echo hi from stream one"
            },
            "stream two": {
                sh "echo hi from stream two"
            }
        ])
    }
}
I would like to reduce it to
script3
node {
    dostuff()
}
where dostuff() will do the equivalent of
script4
stage( 'one' ) {
    parallel([
        "stream one": {
            sh "echo hi from stream one"
        },
        "stream two": {
            sh "echo hi from stream two"
        }
    ])
}
and, importantly, will render properly in the "Pipeline Steps" view of the Jenkins web UI, with the parallel streams also rendering properly in the Blue Ocean UI.
While I have seen how to execute a body block that is passed to a custom step from a pipeline script (for example), I cannot figure out the idiom for creating a body block in Java code.
I do not want to/cannot do this in pure Groovy because:
The "real" logic I want to write depends on classes in several non-Jenkins JARs that aren't on the classpath by default, from several Maven repositories (and so cannot be imported, though they could maybe be @Grab'd)
The "real" logic I want to write will use concurrency primitives, including Java's synchronized, and as far as I can tell pipeline scripts do not support this.
The "real" logic I want to write would like to share state across all instances of the logic in the JVM, and as far as I can tell pipeline scripts cannot communicate with other running pipeline scripts.
Question(s)
Is it/should it be possible to create a pipeline body block in Java, and have a custom pipeline step execute that block?
Are there any plugins that do this that I could look at?
I spoke to Andrew Bayer, an engineer working on Pipeline at CloudBees, at Jenkins World 2017, and he confirmed that it is architecturally impossible to compose a pipeline step out of other pipeline steps from pure Java in a Jenkins plugin.
