How to use "Parameterized Remote Trigger Plugin" in Jenkins Pipeline script? - jenkins

I searched but didn't find any example. I tried https://jenkins.io/doc/pipeline/examples/#trigger-job-on-all-nodes, but that covers triggering jobs on different nodes of the same Jenkins instance.
I would like to trigger a build on another Jenkins. I have configured the Remote Hosts and Authentication in the system configuration of my Jenkins.
How do I call the "Parameterized Remote Trigger Plugin" from a Jenkins Pipeline script?

Seems to be an open bug: https://issues.jenkins-ci.org/browse/JENKINS-38657
As a workaround, you could create another local job of the old (freestyle) type and configure the plugin there in the classic, non-pipeline way. Then in your pipeline script you would just trigger that job. I know it's an ugly adapter, but you can parameterize it and have it up and running for almost anything ;)
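A minimal sketch of the pipeline side of that adapter approach, assuming the local freestyle job is called 'remote-trigger-adapter' and takes a TARGET_JOB string parameter (both names are made up here):
// Trigger the local adapter job, which in turn uses the plugin's classic configuration
build job: 'remote-trigger-adapter',
      parameters: [string(name: 'TARGET_JOB', value: 'RemoteJob')],
      wait: true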
EDIT:
Bug JENKINS-38657 has since been closed; the plugin has been available as a pipeline step since 16 May 2018. Usage should be as easy as:
//Trigger remote job
def handle = triggerRemoteJob(remoteJenkinsName: 'remoteJenkins', job: 'RemoteJob')
More information on the triggerRemoteJob step
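A slightly fuller sketch of the scripted usage, passing parameters and blocking until the remote build finishes. Treat the parameters format shown here (newline-separated key=value pairs) as an assumption and verify the exact option names via the snippet generator on your own instance ($JENKINS_URL/pipeline-syntax):
// Trigger the remote job, pass parameters, and wait for it to complete.
// NOTE: the 'parameters' format below is an assumption; check your plugin version's docs.
def handle = triggerRemoteJob(
    remoteJenkinsName: 'remoteJenkins',
    job: 'RemoteJob',
    parameters: 'environment=staging\nversion=1.2.3',
    blockBuildUntilComplete: true
)
echo "Remote build handle: ${handle}"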

For anyone wondering how to do this using the Declarative Jenkinsfile Syntax:
steps {
    triggerRemoteJob remoteJenkinsName: 'configured-remote-jenkins-name', job: 'trigger-job-folder/trigger-job-name', blockBuildUntilComplete: true
}
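For completeness, a minimal sketch of where that steps block lives in a full declarative Jenkinsfile (the agent and stage name are placeholders):
pipeline {
    agent any
    stages {
        stage('Trigger remote job') {
            steps {
                triggerRemoteJob remoteJenkinsName: 'configured-remote-jenkins-name',
                                 job: 'trigger-job-folder/trigger-job-name',
                                 blockBuildUntilComplete: true
            }
        }
    }
}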

Related

Notifications on jenkins job failures - with pipeline from scm

We have several Jenkins pipeline jobs set up as "pipeline from scm" that check out a Jenkinsfile from GitHub and run it. There is sufficient try/catch based error handling inside the Jenkinsfile to trap error conditions and notify the right channels. This blog post goes into quite a bit of depth about how to achieve this.
However, if there is an issue fetching the Jenkinsfile in the first place, the job fails silently. How does one generate notifications from general job launch failures, before the pipeline has even started?
A Jenkins SCM pipeline doesn't have any execution hook similar to catch/finally that is called if loading the Jenkinsfile fails, and I don't think there will be one in the future.
However, there is the global-post-script plugin, which runs a Groovy script after every build of every job on Jenkins. You have to place that script in the $JENKINS_HOME/global-post-script/ directory.
Using this, you can send notifications or emails to admins based on which project failed and/or the reason/exceptions behind the failure.
Sample code that you can put in the script:
if ("$BUILD_RESULT" != 'SUCCESS') {
def job = hudson.model.Hudson.instance.getItem("$JOB_NAME")
def build = job.getBuild("$BUILD_NUMBER")
def exceptionsToHandle = ["java.io.FileNotFoundException","hudson.plugins.git.GitException"]
def foundExection = build
.getLog()
.split('\n')
.toList()
.stream()
.filter{ line ->
!line.trim().isEmpty() && !exceptionsToHandle.stream().filter{ex -> line.contains(ex)}.collect().isEmpty()
}
.collect()
.size() > 0;
println "do something with '$foundExection'"
}
You can validate your Jenkinsfile before pushing it to the repository.
Command-line Pipeline Linter
There are some IDE Integrations as well
Apparently this is an open issue with Jenkins: https://issues.jenkins.io/browse/JENKINS-57946
I have decided not to use Yogesh's answer mentioned earlier. For me it is simpler to copy the content of the Jenkinsfile directly into the Jenkins project instead of pointing Jenkins at the Git location of the Jenkinsfile. In addition, I still keep the Jenkinsfile in Git; just make sure the Git copy and the version stored in Jenkins stay identical.

How to use SauceOnDemand Plugin with Jenkins Pipeline

I have a freestyle Jenkins job with the Sauce OnDemand plugin installed. That plugin provides a UI so I can define which browsers and devices I wish to test on in the job config. It works as expected.
Now I want to migrate my job to a Declarative Pipeline job.
The Pipeline Snippet generator, under the "sauce: Sauce" sample step, says I should do:
sauce('<guid for my saucelabs user>') {
    // some block
}
Does anyone know what // some block should be so that I can specify the browsers I wish to test on?
I am not using Node, I am using .NET.
What am I missing?

Jenkins: how to trigger pipeline on git tag

We want to use Jenkins to generate releases/deployments on specific project milestones. Is it possible to trigger a Jenkins Pipeline (defined in a Jenkinsfile or Groovy script) when a tag is pushed to a Git repository?
We host a private Gitlab server, so Github solutions are not applicable to our case.
This is currently something that is sorely lacking in the pipeline / multibranch workflow. See a ticket around this here: https://issues.jenkins-ci.org/browse/JENKINS-34395
If you're not opposed to using release branches instead of tags, you might find that to be easier. For example, if you decided that all branches that start with release- are to be treated as "release branches", you can go...
if( env.BRANCH_NAME.startsWith("release-") ) {
    // groovy code on release goes here
}
And if you need to use the name that comes after release-, such as release-10.1 turning into 10.1, just create a variable like so...
if( env.BRANCH_NAME.startsWith("release-") ) {
    def releaseName = env.BRANCH_NAME.drop(8)
}
Both of these will probably require some method whitelisting in order to be functional.
I had the same desire and rolled my own, maybe not pretty but it worked...
In your pipeline job, mark that "This project is parameterized" and add a parameter for your tag. Then in the pipeline script, check out the tag if it is present (a sketch follows the steps below).
Create a freestyle job that runs a script to:
Checkout
Run git describe --tags --abbrev=0 to get the latest tag.
Check that tag against a running list of builds (like in a file).
If the build hasn't occurred, trigger the pipeline job via a URL, passing your tag as a parameter (in your pipeline job under "Build Triggers" set "Trigger builds remotely (e.g., from scripts)" and it will show the correct URL).
Add the tag to your running list of builds so it doesn't get triggered again.
Have this job run frequently.
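A rough sketch of the parameterized pipeline side of this workaround, assuming a string parameter named TAG and a hypothetical repository URL:
node {
    stage('Checkout') {
        if (params.TAG?.trim()) {
            // Check out exactly the tag that the freestyle "watcher" job passed in
            checkout([$class: 'GitSCM',
                      branches: [[name: "refs/tags/${params.TAG}"]],
                      userRemoteConfigs: [[url: 'https://gitlab.example.com/group/project.git']]])
        } else {
            // Fall back to the job's normal SCM configuration
            checkout scm
        }
    }
    // ... build and release steps go here ...
}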
If you use a multibranch pipeline, there is a "Discover tags" behaviour in the branch source configuration. Use that plus Spencer's solution.
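With tag discovery enabled, the tag name is exposed as env.TAG_NAME, and a declarative stage can be gated on it with the buildingTag() condition; a minimal sketch:
stage('Release') {
    when { buildingTag() }   // only runs when the build was triggered for a tag
    steps {
        echo "Releasing ${env.TAG_NAME}"
        // release/deployment steps go here
    }
}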

How to get notified about SCM / version control issues when using Jenkins 2.0 pipelines?

This blog post describes very well how to set up notifications for failed jobs within the Pipeline DSL.
Unfortunately, this approach has a severe drawback: There is no (email) notification at all if the SCM is not reachable because Jenkins is not able to checkout the Jenkinsfile. Does anyone know a solution or workaround for that so that I get notified if a Pipeline Job fails because of SCM issues while checking out the Jenkinsfile (or also in case of syntax errors within the Jenkinsfile)?
If you have the Email-ext plugin installed, you can call it from your pipeline script.
You can use the snippet generator that comes with the pipeline plugin (available at $JENKINS_URL/pipeline-syntax).
Select the plugin and configure it as you used to do in a post-build step.
Put the generated snippet in your pipeline. You might want to wrap it in a try {...} finally {...} block (a sketch follows the snippet):
emailext attachLog: true,
    body: 'Oops',
    recipientProviders: [[$class: 'DevelopersRecipientProvider']],
    subject: 'Failing tests',
    to: 'someone@example.com'
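A minimal sketch of the suggested try {...} finally {...} wrapping in a scripted pipeline, so the notification is sent whether or not the build steps fail (the recipient address is a placeholder):
node {
    try {
        // checkout / build / test steps go here
    } finally {
        // Always runs, even if a step above threw an exception
        emailext attachLog: true,
            body: 'Oops',
            recipientProviders: [[$class: 'DevelopersRecipientProvider']],
            subject: 'Failing tests',
            to: 'someone@example.com'
    }
}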
The beauty of the pipeline-syntax page is that it dynamically picks up the plugins you install (so if you used to use a different notification plugin, it should show up there as well).
You can also take a look at this documentation where they give some examples as well.
This is a similar question

Jenkins how to create pipeline manual step

Prior to Jenkins 2 I was using the Build Pipeline Plugin to build and manually deploy an application to a server.
Old configuration:
That works great, but I want to use the new Jenkins Pipeline, generated from a Groovy script (Jenkinsfile), to create the manual step.
So far I came up with the Jenkins input step.
The Jenkinsfile script used:
node {
    stage 'Checkout'
    // Get some code from repository

    stage 'Build'
    // Run the build
}

stage 'deployment'
input 'Do you approve deployment?'

node {
    //deploy things
}
But this waits for user input, and the build is not marked as completed while it waits. I could add a timeout to the input, but that won't allow me to pick a build and trigger its deployment later on:
How can I achieve the same/similar result for a manual step/trigger with the new Jenkins Pipeline as I had before with the Build Pipeline Plugin?
This is a huge gap in the Jenkins Pipeline capabilities, IMO. It is definitely hard to provide, given that a pipeline is a single job. One solution might be to "archive" the workspace as an "artifact" (tar it and archive **/* as 'workspace.tar.gz'), and then have another pipeline copy the artifact and untar it into its new workspace. This allows the second pipeline to pick up where the previous one left off. Of course there is no way to guarantee that the second pipeline cannot be executed out of turn or more than once, which is too bad. The Delivery Pipeline Plugin really shines here: you execute a new pipeline right from the view, instead of from the first job. Anyway, not much of an answer, but it's the path I'm going to try (a rough sketch follows).
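A rough sketch of that hand-off, using archiveArtifacts in the first pipeline and the Copy Artifact plugin's copyArtifacts step in the second (job names are made up, and the Copy Artifact plugin is assumed to be installed):
// First pipeline: package and archive the workspace at the point where the
// manual deployment decision would normally happen.
node {
    stage('Build') {
        // ... checkout and build ...
        sh 'tar --exclude=workspace.tar.gz -czf workspace.tar.gz .'
        archiveArtifacts artifacts: 'workspace.tar.gz'
    }
}

// Second (deploy) pipeline, triggered manually: restore the workspace and deploy from it.
node {
    stage('Deploy') {
        copyArtifacts projectName: 'build-pipeline', selector: lastSuccessful()
        sh 'tar -xzf workspace.tar.gz'
        // ... deploy things ...
    }
}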
EDIT: This plugin looks promising:
https://github.com/jenkinsci/external-workspace-manager-plugin/blob/master/doc/PIPELINE_EXAMPLES.md
