Creating a Jenkins Pipeline build from a pipeline

I'm trying to automate the creation of a Jenkins Pipeline build from within a pipeline.
I have a pipeline which creates a Bitbucket repository and commits some code to it, including a Jenkinsfile.
I need to add another step to this pipeline to then create the Pipeline build for it, which would run the steps in the Jenkinsfile.
I think the Job DSL plugin should be able to handle this, but the documentation I've found for it has been sparse, and I'm still not entirely sure whether it's possible or how to do it.
Any help would be appreciated. I imagine the generated Pipeline job just needs a link to the repository and to be told to run the Jenkinsfile there?

Yes, Job DSL is what you need for your use case.
See this and this to help you get started.
EDIT
pipeline {
    agent {
        label 'slave'
    }
    stages {
        stage('stage') {
            steps {
                // some other steps
                jobDsl scriptText: '''pipelineJob('new-job') {
                    def repo = 'https://xxxxx@bitbucket.org/xxxx/dummyrepo.git'
                    triggers {
                        scm('H/5 * * * *')
                    }
                    definition {
                        cpsScm {
                            scm {
                                git {
                                    remote {
                                        url(repo)
                                        credentials('bitbucket-jenkins-access')
                                    }
                                    branches('master')
                                    scriptPath('Jenkinsfile')
                                    extensions { }
                                }
                            }
                        }
                    }
                }'''
            }
        }
    }
}
Documentation - https://jenkinsci.github.io/job-dsl-plugin/#path/pipelineJob-scm-git
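If you would rather keep the job definition in SCM than inline it as a string, the jobDsl step can also read DSL scripts from files in the workspace; a minimal sketch, assuming a hypothetical jobs/pipeline-seed.groovy checked into the repo:
steps {
    // check out the repo that contains the DSL script
    checkout scm
    // 'targets' points at DSL files instead of passing 'scriptText'
    jobDsl targets: 'jobs/pipeline-seed.groovy'
}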

Using the Python library jenkins-job-builder, you can easily create your expected pipeline or freestyle job from another pipeline or from any other remote location.
Example:
Step 1
python3 -m venv .venv
source .venv/bin/activate
pip install jenkins-job-builder
Step 2
Once you have done the above, create two files: one named config.ini and the other job.yml. Please note there are no strict rules about the file names; they are up to you.
The config.ini file can look like this:
[job_builder]
allow_duplicates = False
keep_descriptions = False
ignore_cache = True
recursive = False
update = all
[jenkins]
password = jenkins-password
query_plugins_info = False
url = http://jenkins-url.net
user = jenkins-username
If you are creating a pipeline job, your job.yml file can look like this:
- job:
    name: pipeline01
    display-name: 'pipeline01'
    description: 'Do not edit this job through the web!'
    project-type: pipeline
    dsl: |
      node() {
          stage('hello') {
              sh 'echo "Hello World!"'
          }
      }
Step 3
After all of the above, invoke the command below:
jenkins-jobs --conf config.ini update job.yml
Note: the jenkins-jobs command is only available if you have followed Step 1.
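If you want to preview the generated job XML without touching the Jenkins server, jenkins-job-builder also ships a test subcommand; a quick sketch (the output/ directory name is an arbitrary choice):
jenkins-jobs --conf config.ini test job.yml -o output/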

Related

Jenkins pipeline, how can I copy artifact from previous build to current build?

In Jenkins Pipeline, how can I copy the artifacts from a previous build to the current build?
I want to do this even if the previous build failed.
Stuart Rowe recommended on the Pipeline Authoring SIG Gitter channel that I look at the Copy Artifact Plugin, and he also gave me some sample Jenkins Pipeline syntax to use.
Based on his advice, I came up with this fuller Pipeline example, which copies the artifacts from the previous build into the current build whether the previous build succeeded or failed.
pipeline {
    agent any
    stages {
        stage("Zeroth stage") {
            steps {
                script {
                    if (currentBuild.previousBuild) {
                        try {
                            copyArtifacts(projectName: currentBuild.projectName,
                                          selector: specific("${currentBuild.previousBuild.number}"))
                            def previousFile = readFile(file: "usefulfile.txt")
                            echo("The current build is ${currentBuild.number}")
                            echo("The previous build artifact was: ${previousFile}")
                        } catch (err) {
                            // ignore error
                        }
                    }
                }
            }
        }
        stage("First stage") {
            steps {
                echo("Hello")
                writeFile(file: "usefulfile.txt", text: "This file ${env.BUILD_NUMBER} is useful, need to archive it.")
                archiveArtifacts(artifacts: 'usefulfile.txt')
            }
        }
        stage("Error") {
            steps {
                error("Failed")
            }
        }
    }
}
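As a side note, the copyArtifacts step also accepts an optional parameter, which can replace the try/catch when the only expected failure is missing artifacts; a minimal sketch:
// 'optional: true' lets the step succeed even if the previous
// build produced no matching artifacts
copyArtifacts(projectName: currentBuild.projectName,
              selector: specific("${currentBuild.previousBuild.number}"),
              optional: true)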
Suppose you want a single file from the previous build; you can even use curl to place the file in the workspace before the mvn invocation.
stage('Copy csv') {
    steps {
        sh "mkdir -p ${env.WORKSPACE}/dump"
        sh "curl http://<jenkins-url>:<port>/job/<job-folder>/job/<job-name>/job/<release>/lastSuccessfulBuild/artifact/dump/sample.csv/*view*/ -o ${env.WORKSPACE}/dump/sample.csv"
    }
}
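If your Jenkins does not allow anonymous reads, the same approach works with an API token; a minimal sketch, assuming a hypothetical username/password credential with ID 'jenkins-api-token':
withCredentials([usernamePassword(credentialsId: 'jenkins-api-token',
                                  usernameVariable: 'JUSER',
                                  passwordVariable: 'JTOKEN')]) {
    // -u sends the username and API token with the request
    sh 'curl -u "$JUSER:$JTOKEN" http://<jenkins-url>:<port>/job/<job-name>/lastSuccessfulBuild/artifact/dump/sample.csv -o dump/sample.csv'
}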
Thanks,
Ashish
You can use the Copy Artifact Plugin.
For configuration, visit https://wiki.jenkins.io/display/JENKINS/Copy+Artifact+Plugin
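A minimal pipeline usage sketch (the job name 'upstream-job' is a placeholder):
// copies artifacts from the last successful run of 'upstream-job'
// into the current build's workspace
copyArtifacts projectName: 'upstream-job', selector: lastSuccessful()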

Multiple Jenkinsfiles, One Agent Label

I have a project which has multiple build pipelines to allow for different types of builds against it (no, I don't have the ability to make one build out of it; that is outside my control).
Each of these pipelines is represented by a Jenkinsfile in the project repo, and each one must use the same build agent label (they need to share other pieces of configuration as well, but it's the build agent label which is the current problem). I'm trying to put the label into some sort of a configuration file in the project repo, so that all the Jenkinsfiles can read it.
I expected this to be simple, as you don't need this config data until you have already checked out a copy of the sources to read the Jenkinsfile. As far as I can tell, it is impossible.
It seems to me that a Jenkinsfile cannot read files from SCM until the project has done its SCM step. However, that's too late: the argument to agent{label} is read before any stages get run.
Here's a minimal case:
final def config
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                checkout scm // we don't need all the submodules here
                echo "Reading configuration JSON"
                script { config = readJSON file: 'buildjobs/buildjob-config.json' }
                echo "Read configuration JSON"
            }
        }
        stage('Build and Deploy') {
            agent {
                label config.agent_label
            }
            steps {
                echo 'Got into Stage 2'
            }
        }
    }
}
When I run this, I get:
java.lang.NullPointerException: Cannot get property 'agent_label' on null object
I don't get either of the echoes from the 'Configure' stage.
If I change the label for the 'Build and Deploy' stage to 'master', the build succeeds and prints out all three echo statements.
Is there any way to read a file from the Git workspace before the agent labels need to be set?
Please see https://stackoverflow.com/a/52807254/7983309. I think you are running into this issue. label is unable to resolve config.agent_label to its updated value. Whatever is set in the first line is being sent to your second stage.
EDIT1:
env.agentName = ''
pipeline {
    agent none
    stages {
        stage('Configure') {
            agent {
                label 'master'
            }
            steps {
                script {
                    env.agentName = 'slave'
                    echo env.agentName
                }
            }
        }
        stage('Finish') {
            steps {
                node(agentName as String) { println env.agentName }
                script {
                    echo agentName
                }
            }
        }
    }
}
Source - In a declarative jenkins pipeline - can I set the agent label dynamically?
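For comparison, a scripted pipeline avoids the problem entirely, because nothing has to be resolved before the stages run; a minimal sketch using the JSON file from the question:
def config
node('master') {
    checkout scm
    // read the label from the repo before any build node is requested
    config = readJSON file: 'buildjobs/buildjob-config.json'
}
node(config.agent_label) {
    echo "Got into Stage 2 on ${config.agent_label}"
}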

Jenkins multibranch pipeline post build actions

How do I define post-build actions for a Jenkins multibranch pipeline project?
There is a separate option available when you have a simple project, but not for a multibranch pipeline.
To add post-build steps to a Multibranch Pipeline, you need to code these steps into a finally block; an example is below:
node {
    try {
        stage("Checkout") {
            // checkout scm
        }
        stage("Build & test") {
            // build & unit test
        }
    } catch (e) {
        // fail the build if an exception is thrown
        currentBuild.result = "FAILED"
        throw e
    } finally {
        // Post build steps here
        /* Success or failure, always run post build steps */
        // send email
        // publish test results etc etc
    }
}
For most of the post-build steps you might want, there are online examples of how to write them in pipeline format. If you have a specific one in mind, please list it here.
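If you are writing a declarative Jenkinsfile rather than a scripted one, the built-in post section gives you the same hooks without the try/finally; a minimal sketch:
pipeline {
    agent any
    stages {
        stage('Build & test') {
            steps {
                echo 'build & unit test here'
            }
        }
    }
    post {
        always {
            // runs whether the build succeeded or failed
            echo 'publish test results etc.'
        }
        failure {
            // runs only on failure
            echo 'send email'
        }
    }
}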
When you write a pipeline, you describe the whole flow yourself, which gives you great flexibility to do whatever you want, including running post-build steps.
You can see an example of using post-build steps in a pipeline I wrote:
https://github.com/geek-kb/Android_Pipeline/blob/master/Jenkinsfile
Example from that code:
run_in_stage('Post steps', {
    sh """
        # Add libCore.so files to symbols.zip
        find ${cwd}/Product-CoreSDK/obj/local -name libCore.so | zip -r ${cwd}/Product/build/outputs/symbols.zip -@
        # Remove unaligned apk's
        rm -f ${cwd}/Product/build/outputs/apk/*-unaligned.apk
    """
})

gradle artifactorypublish: jenkins pipeline does not publish properties

I'm trying to set up a Jenkins pipeline for publishing a zip file to JFrog Artifactory.
I am using the com.jfrog.artifactory plugin to do so. This works great from command-line Gradle: I can run the artifactoryPublish task to publish the artifacts and tie them back to the module, which then has a tie back to the artifacts.
The artifacts show up with the properties:
build.name = `projectname`
build.number = `some large number`
And I can click from them to the build/module and back to the artifact.
However, when I run this from a Jenkinsfile pipeline, the artifacts get published and tied back to the module, but the module does not successfully tie back to the artifacts.
The artifacts do not receive the build.name and build.number properties, and I cannot click from the module back to the artifacts, as the module cannot find or resolve the paths back to the artifacts (a zip file and a generated pom).
I am passing the params from Jenkins like:
ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}
which seems to work on other projects... but for whatever reason I cannot shake this issue.
I can include more of the Jenkinsfile if that would help debug, but I'm really just checking out a repository and trying to publish it.
I have been reading the documentation here heavily:
https://www.jfrog.com/confluence/display/RTF/Gradle+Artifactory+Plugin
and haven't been able to make it work through -Pproject stuff.
Does anyone have any idea what else I can try? I don't really want to use the Jenkins pipeline Artifactory plugin directly, because it's so nice to be able to deploy from the command line too.
build.gradle:
publishing {
    publications {
        ManualUpdaterPackage(MavenPublication) {
            artifact assembleManualUpdaterPackage
        }
    }
}

artifactory {
    contextUrl = "${artifactoryUrl}" // The base Artifactory URL if not overridden by the publisher/resolver
    publish {
        defaults {
            publications('ManualUpdaterPackage')
        }
        repository {
            repoKey = project.version.endsWith('-SNAPSHOT') ? snapshotRepo : releaseRepo
            username = "${artifactory_user}"
            password = "${artifactory_password}"
            maven = true
        }
    }
}

task assembleManualUpdaterPackage(type: Zip) {
    dependsOn anotherTask
    from(packageDir + "/")
    include '**'
    // archiveName "manualUpdaterPackage-${version}.zip"
    destinationDir(file(manualUpdaterZipDir))
}
Jenkinsfile snippet:
withCredentials(
    [
        [
            $class          : 'UsernamePasswordMultiBinding',
            credentialsId   : 'validcreds',
            passwordVariable: 'ORG_GRADLE_PROJECT_artifactory_password',
            usernameVariable: 'ORG_GRADLE_PROJECT_artifactory_user'
        ]
    ]
) {
    withEnv(
        [
            "ORG_GRADLE_PROJECT_buildInfo.build.number=${env.BUILD_NUMBER}",
            "ORG_GRADLE_PROJECT_buildInfo.build.name=${artifactName}",
            "ORG_GRADLE_PROJECT_buildInfo.build.url=${env.JOB_URL}"
        ]
    ) {
        sh 'chmod +x gradlew'
        sh "./gradlew --no-daemon clean artifactoryPublish"
    }
}
https://www.jfrog.com/confluence/display/RTF/Working+With+Pipeline+Jobs+in+Jenkins#WorkingWithPipelineJobsinJenkins-GradleBuildswithArtifactory
Eventually my coworker recommended looking into the Artifactory Pipeline Gradle plugin instead. It is very nice to work with and we've had much quicker success with it.
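For reference, the plugin's Gradle integration looks roughly like this; a minimal sketch, assuming an Artifactory server configured in Jenkins under the ID 'my-artifactory' and a Gradle tool named 'gradle' (both names are placeholders):
// uses the Jenkins Artifactory plugin's pipeline DSL
def server = Artifactory.server 'my-artifactory'
def rtGradle = Artifactory.newGradleBuild()
rtGradle.tool = 'gradle' // Gradle installation configured in Jenkins
rtGradle.deployer server: server, repo: 'libs-release-local'
// runs the build and collects build-info (build.name/build.number)
def buildInfo = rtGradle.run tasks: 'clean artifactoryPublish'
server.publishBuildInfo buildInfo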

Can I "import" the stages in a Jenkins Declarative pipeline

I have several pipeline jobs, which are configured very similarly.
They all have the same stages (of which there are about 10).
I am now I am thinking about moving to the declarative pipeline (https://jenkins.io/blog/2016/09/19/blueocean-beta-declarative-pipeline-pipeline-editor/).
But I do not want to define the ~10 stages in every pipeline. I want to define them in one place and "import" them somehow.
Is this possible with declarative pipelines at all? I see that there are shared libraries, but it does not seem like I could include the stage definitions using them.
You will have to create a shared library to implement what I am about to suggest. For shared-library implementation, you may check the following posts:
Using Building Blocks in Jenkins Declarative Pipeline
Upload file in Jenkins input step to workspace (Mainly for images so one can easily figure out things)
Now if you want to use a Jenkinsfile (kind of a template) which can be reused across multiple projects (jobs), then that is indeed possible.
Once you have created a shared-library repository with vars directory in it, then you just have to create a Groovy file (let's say, commonPipeline.groovy) inside vars directory.
Here's an example that works; I have used it in multiple jobs.
$ cat shared-lib/vars/commonPipeline.groovy
// You can create function(s) as shown below, if required
def someFunctionA() {
    // Your code
}

// This is where you will define all the stages that you want
// to run as a whole in multiple projects (jobs)
def call(Map config) {
    pipeline {
        agent {
            node { label 'slaveA || slaveB' }
        }
        environment {
            myvar_Y = 'apple'
            myvar_Z = 'orange'
        }
        stages {
            stage('Checkout') {
                steps {
                    deleteDir()
                    checkout scm
                }
            }
            stage('Build') {
                steps {
                    script {
                        check_something = someFunctionA()
                        if (check_something) {
                            echo "Build!"
                            // your_build_code
                        } else {
                            error "Something bad happened! Exiting..."
                        }
                    }
                }
            }
            stage('Test') {
                steps {
                    echo "Running tests..."
                    // your_test_code
                }
            }
            stage('Deploy') {
                steps {
                    script {
                        sh '''
                            # your_deploy_code
                        '''
                    }
                }
            }
        }
        post {
            failure {
                sh '''
                    # anything_you_need_to_perform_in_failure_step
                '''
            }
            success {
                sh '''
                    # anything_you_need_to_perform_in_success_step
                '''
            }
        }
    }
}
With the above Groovy file in place, all you have to do now is call it from your various Jenkins projects. Since you might already have an existing Jenkinsfile (if not, create one) in your Jenkins project, you just have to replace the existing content of that file with the following:
$ cat Jenkinsfile
// Assuming you have named your shared library `my-shared-lib` & set `Default version` to the `master` branch in
// the `Manage Jenkins` » `Configure System` » `Global Pipeline Libraries` section
@Library('my-shared-lib@master') _

def params = [:]
params = [
    jenkins_var: "${env.JOB_BASE_NAME}",
]
commonPipeline params
Note: As you can see above, the Jenkinsfile simply calls commonPipeline.groovy. So all those bulky Jenkinsfiles get reduced to just five or six lines of code, and those few lines are common across all the projects. Also note that I have used jenkins_var above; it can be any name. It's not actually used in the example, but a parameter map is required because call(Map config) declares one.
Ref: https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/
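If you do want the map to drive behavior rather than sit unused, here is a sketch of consuming it inside call() (jenkins_var is the key from the example above; the '= [:]' default also makes the map optional):
// vars/commonPipeline.groovy, trimmed to show parameter use
def call(Map config = [:]) {
    pipeline {
        agent any
        stages {
            stage('Info') {
                steps {
                    // value passed in from the calling Jenkinsfile
                    echo "Triggered from job: ${config.jenkins_var}"
                }
            }
        }
    }
}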
