Load multiple Jenkins pipeline scripts from Git via a Job DSL seed script

My use case: I want to set up a Jenkins configuration via the Jenkins Helm chart, using the JCasC plugin. I would also like to define a number of jobs via the Pipeline plugin in a series of Jenkinsfiles, so that the entire setup is configured in code, with a clean, complete installation able to be performed just by running helm install.
However, I'm having trouble loading my Pipeline scripts. In my JCasC, I have defined a Job DSL seed script as follows:
job('seedJob') {
    scm {
        git {
            remote {
                url 'ssh://git@foo/bar.git'
                credentials 'creds'
            }
        }
    }
    steps {
        dsl {
            external 'jenkins/jobs/*.groovy'
        }
    }
}
This successfully pulls the scripts from the repo, an example of which is:
pipeline {
    // hello.groovy
    // Do stuff
}
However, the job fails when parsing the Pipeline script with the following error:
ERROR: (hello.groovy, line 1) No signature of method: hello.pipeline() is applicable for argument types: (hello$_run_closure1) values: [hello$_run_closure1@4c6f43b6]
Possible solutions: pipelineJob(java.lang.String), pipelineJob(java.lang.String, groovy.lang.Closure)
Finished: FAILURE
My suspicion is that Pipeline scripts can't be read this way via Job DSL. If that is the case, is there a way I can achieve loading multiple Pipeline scripts from a single seed job?

The files a seed job loads via external must be Job DSL scripts, not Pipeline scripts, so each one should define a pipelineJob rather than a pipeline block. A seed script should look like:
pipelineJob('job_name_here') {
    definition {
        cpsScm {
            scm {
                git {
                    branches('*/master')
                    branches('*/release')
                    remote {
                        credentials('credentials_id_from_jenkins_here')
                        name('name')
                        url('git@gitlab_repo_here.git')
                    }
                }
            }
        }
    }
    triggers {
        gitlab {
            ciSkip(true)
            triggerOnPush(true)
            triggerOnMergeRequest(false)
            triggerOnClosedMergeRequest(true)
            branchFilterType('RegexBasedFilter')
            targetBranchRegex('(.*master.*|.*release.*)')
            secretToken('paste_secret_token_for_webhook_here')
        }
    }
}
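Since the seed script is plain Groovy, one seed job can generate a pipeline job per Jenkinsfile by looping over a list. A minimal sketch, with hypothetical job names and script paths (adjust to your repository layout):

def pipelines = [
    [name: 'app-build',  script: 'jenkins/pipelines/build.Jenkinsfile'],
    [name: 'app-deploy', script: 'jenkins/pipelines/deploy.Jenkinsfile'],
]

pipelines.each { p ->
    pipelineJob(p.name) {
        definition {
            cpsScm {
                scm {
                    git {
                        remote {
                            url('ssh://git@foo/bar.git')   // same repo the seed job pulls from
                            credentials('creds')
                        }
                        branch('*/master')
                    }
                }
                scriptPath(p.script)   // each generated job points at its own Jenkinsfile
            }
        }
    }
}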

Related

How to build multiple projects at the same time using a Jenkins pipeline

I already wrote an example Jenkinsfile to check out, build, and deploy a single project. Is there a way to do all of this for multiple projects in different git repos at the same time, using just one Jenkinsfile? I know I can set these projects up as independent jobs and use a Jenkinsfile to call them, but I'm wondering if I can do this without independent jobs.
Thanks.
You can make use of the Job DSL Plugin to achieve this.
The Jenkins Job DSL API will help you write DSL scripts; it documents all the built-in DSL methods needed to construct jobs.
Example pipeline script:
pipeline {
    agent any
    stages {
        stage('Job1') {
            steps {
                // Pipeline job
                jobDsl scriptText: '''pipelineJob("$job1") {
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        name('origin')
                                        url('https://github.com/satta19/user-node.git')
                                        credentials('git2-cred')
                                    }
                                    branch('master')
                                }
                            }
                            scriptPath('Jenkinsfile')
                        }
                    }
                }'''
            }
        }
        stage('Job2') {
            steps {
                // Freestyle job
                jobDsl scriptText: '''job("$job2") {
                    steps {
                        shell('echo Hello World!')
                    }
                }'''
            }
        }
    }
}
Note: I have taken the job names as string parameters, i.e. $job1 and $job2 in the above example pipeline script.
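For reference, a minimal sketch of how those parameters could be declared in the same pipeline, so the jobDsl scripts above can resolve $job1 and $job2 (the default values here are assumptions, and this relies on the seed build's parameters being visible to the DSL scripts):

pipeline {
    agent any
    parameters {
        // Hypothetical defaults; DSL scripts run by the jobDsl step can read
        // the seed build's parameters as variables.
        string(name: 'job1', defaultValue: 'my-pipeline-job', description: 'Name of the generated pipeline job')
        string(name: 'job2', defaultValue: 'my-freestyle-job', description: 'Name of the generated freestyle job')
    }
    // ... stages as in the example above ...
}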

How to create a pass threshold for TestCafe tests on Jenkins

We have TestCafe.js UI tests that run a regression suite in our Jenkins environment.
We're exploring a mechanism whereby we can set a pass threshold for the test suite that determines whether the Jenkins job status is Pass / Fail,
i.e. if 98%+ of tests pass, then mark the test job as passed.
For xUnit projects the same could be achieved using the xUnit test plugin, etc.
Example reference: How can I have Jenkins fail a build only when the number of test failures changes?
How to fail a Jenkins job based on pass rate threshold of testng tests
How to not mark Jenkins job as FAILURE when pytest tests fail
Is something similar possible for TestCafe-based tests, either through TestCafe customization or through some Jenkins plugin?
Our Jenkinsfile:
#!groovy
pipeline {
    environment {
        CI = 'true'
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '50'))
        disableResume()
        ansiColor('xterm')
    }
    agent none
    // Define the stages of the pipeline:
    stages {
        stage('setup') {
            steps {
                script {
                    cicd.setupBuild()
                }
            }
        }
        // Use the make target to run tests:
        stage('Tests') {
            agent any
            steps {
                script {
                    cicd.withSecret(<keys>) {
                        cicd.runMake("test")
                    }
                }
            }
            post {
                cleanup {
                    archiveArtifacts artifacts: "screenshots/**", allowEmptyArchive: true
                }
            }
        }
    }
    post {
        success {
            script { cicd.buildSuccess() }
        }
        failure {
            script {
                slackSend channel: "#<test-notifications-channel>", color: 'bad', message: "Regression tests failed or unstable <${env.RUN_DISPLAY_URL}|${env.JOB_NAME}>"
                cicd.buildFailure()
            }
        }
    }
}
TestCafe provides a number of built-in reporters that generate reports in specific formats. Once a report is produced, the CI system (or a plugin therein) can parse it and perform threshold checks based on the number of failed/passed tests. The TestCafe documentation includes an example of Jenkins integration. The Jenkins JUnit Plugin used in that example doesn't support setting thresholds yet: issue. But you can follow the steps in the guide in a similar way, substituting the Jenkins xUnit Plugin.
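For illustration, a minimal sketch of what the Tests stage could look like with the xUnit Plugin enforcing a percentage threshold. The report path and the 2% failure threshold are assumptions, not from the original post, and the xunit step parameters should be checked against the plugin's snippet generator:

stage('Tests') {
    agent any
    steps {
        // TestCafe's built-in xunit reporter writes a JUnit-style XML report
        sh 'npx testcafe chrome:headless tests/ --reporter xunit:res/report.xml'
    }
    post {
        always {
            // thresholdMode 2 treats thresholds as percentages of the total test count,
            // so the build fails only if more than 2% of tests fail (~98% pass rate)
            xunit thresholdMode: 2,
                  thresholds: [failed(failureThreshold: '2')],
                  tools: [JUnit(pattern: 'res/report.xml')]
        }
    }
}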

Jenkins Scripted Pipeline - specifying the workspace directory before node allocates the workspace

I've got a multibranch pipeline, defined in a scripted pipeline (from a library), that coordinates ~100 builds, each build across multiple slaves (different operating systems). One of the operating systems is Windows, which has a 255-character path limitation. Because some of our jobs have ~200-character paths in them (which we can't control because it is vendor-provided hell), I need to change the step/node workspace on our Windows slaves, ideally via the node() step, so that git is automatically checked out only once into the custom workspace.
I've tried various styles:
This works in the Declarative Pipeline:
stage('blah') {
    agent {
        node {
            label 'win'
            customWorkspace "c:\\w\\${JOB_NAME}"
        }
    }
    steps {
        ...
    }
}
But I can't find the equivalent for scripted pipelines:
pipeline {
    stage('stage1') {
        node('win-node') {
            // the git repository is checked out to ${env.WORKSPACE}, but it's unusable due to the path length issue
            ws("c:\\w\\${JOB_NAME}") {
                // this switches the workspace, but doesn't clone the git repo again
                body()
            }
        }
    }
}
Ideally, I'd like something like this:
pipeline {
    stage('stage1') {
        node('win-node', ws="c:\\w\\${JOB_NAME}") {
            body()
        }
    }
}
Any recommendations?
Not tested (especially defining options inside node), but you could try skipping the default checkout and performing it after changing the workspace, something like this:
pipeline {
    stage('stage1') {
        node('win-node') {
            options {
                skipDefaultCheckout true // prevent checkout to the default workspace
            }
            ws("c:\\w\\${JOB_NAME}") {
                checkout scm // perform the default checkout here
                body()
            }
        }
    }
}
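Alternatively, since scripted pipeline is plain Groovy, the node-plus-workspace pattern can be wrapped in a small helper so every caller gets the short path automatically. A minimal sketch, assuming a shared-library context; the step name winNode is hypothetical:

// vars/winNode.groovy (hypothetical shared-library step)
def call(Closure body) {
    node('win-node') {
        // Switch to the short custom workspace before any checkout happens,
        // so the long default workspace path is never used for the clone.
        ws("c:\\w\\${env.JOB_NAME}") {
            checkout scm   // clone directly into the custom workspace
            body()
        }
    }
}

A stage would then call winNode { ... } instead of node('win-node') { ... }.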

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are libraries, but it does not seem like I could include the stage definitions using them.
You will have to create a shared library to implement what I am about to suggest. For the shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (a kind of template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with a vars directory in it, you just have to create a Groovy file (let's say commonPipeline.groovy) inside the vars directory.
Here's an example that works, because I have used it earlier in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you probably already have an existing Jenkinsfile in each Jenkins project (if not, create one), you just have to replace the content of that file with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` & set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _
def params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, I am calling the commonPipeline.groovy file, so all your bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all those projects. Also note that I have used jenkins_var above; it can be any name. It's not actually used here, but an argument is required because call(Map config) declares a parameter.
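For what it's worth, the values in that map are available inside the shared-library step as config.<key>, which is what makes the template parameterisable per project. A minimal sketch (the stage name and echo text are just for illustration):

// vars/commonPipeline.groovy (stripped down to show config usage)
def call(Map config) {
    pipeline {
        agent any
        stages {
            stage('Info') {
                steps {
                    // config.jenkins_var carries the value passed from the project Jenkinsfile
                    echo "Running shared pipeline for ${config.jenkins_var}"
                }
            }
        }
    }
}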
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/

How to do Deployit (XL Deploy) configuration in a Jenkins pipeline as code

I found one link on Google:
https://docs.xebialabs.com/xl-deploy/concept/jenkins-xl-deploy-plugin.html
The following steps are present there, but running them throws:
No dsl method xldCreatePackage
node {
    stage('Checkout') {
        git url: '<git_project_url>'
    }
    stage('Package') {
        xldCreatePackage artifactsPath: 'build/libs', manifestPath: 'deployit-manifest.xml', darPath: '$JOB_NAME-$BUILD_NUMBER.0.dar'
    }
    stage('Publish') {
        xldPublishPackage serverCredentials: '<user_name>', darPath: '$JOB_NAME-$BUILD_NUMBER.0.dar'
    }
    stage('Deploy') {
        xldDeploy serverCredentials: '<user_name>', environmentId: 'Environments/Dev', packageId: 'Applications/<project_name>/$BUILD_NUMBER.0'
    }
}
You need to install the XL Deploy plugin into your Jenkins installation.
Right now you have the Pipeline plugin installed, which gives you the Jenkinsfile and pipeline capability only. The XebiaLabs Jenkins plugin builds on that, but you need that plugin to get the xld* steps you are calling.
