I have the Copy Artifact plugin installed and am trying to build and deploy through a Jenkins pipeline with the following Jenkinsfile.
The parameter DEPLOY_BUILD_NUMBER defaults to the current build number. I want the pipeline to build and deploy when DEPLOY_BUILD_NUMBER is the current build number, OR to just deploy whatever build number is specified in DEPLOY_BUILD_NUMBER.
pipeline {
    agent { label 'windows' }
    parameters {
        string(
            name: 'DEPLOY_BUILD_NUMBER',
            defaultValue: '${BUILD_NUMBER}',
            description: 'Fresh Build and Deploy OR Deploy Previous Build Number'
        )
    }
    stages {
        stage ('Build') {
            steps {
                echo "Building"
            }
            post {
                success {
                    archiveArtifacts artifacts: 'build.tar.gz', fingerprint: true
                }
            }
        }
        stage ('Deploy') {
            steps {
                echo "Deploying...."
                script {
                    step ([$class: 'CopyArtifact',
                           projectName: '${JOB_NAME}',
                           filter: "*.tar.gz"]);
                }
            }
        }
    }
    post {
        always {
            cleanWs()
        }
    }
}
When I run this pipeline I get the following error:
java.lang.UnsupportedOperationException: no known implementation of interface jenkins.tasks.SimpleBuildStep is named CopyArtifact
I also tried
stage ('Deploy') {
    steps {
        echo "Deploying...."
        copyArtifacts filter: '*.tar.gz', fingerprintArtifacts: true, projectName: '${JOB_NAME}'
    }
}
which failed with the following error:
java.lang.NoSuchMethodError: No such DSL method 'copyArtifacts' found among steps
and
stage ('Deploy') {
    steps {
        echo "Deploying...."
        script {
            copyArtifacts filter: '*.tar.gz', fingerprintArtifacts: true, projectName: '${JOB_NAME}'
        }
    }
}
which gave me
java.lang.NoSuchMethodError: No such DSL method 'copyArtifacts' found among steps
What is the correct syntax for copyArtifacts? What am I missing here?
I would check the version of the Copy Artifact plugin you have installed (you can see that at /pluginManager/installed); the minimum version that supports Pipeline is 1.39.
CopyArtifact defines a step, copyArtifacts, that you can use directly.
Check the step reference here
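Once the plugin is new enough, a minimal sketch of the Deploy stage (assuming you keep the DEPLOY_BUILD_NUMBER parameter from the question, and using the specific() build selector the plugin provides for picking a run by number) could look like this:

stage ('Deploy') {
    steps {
        echo "Deploying...."
        // use env.JOB_NAME rather than the single-quoted '${JOB_NAME}' literal,
        // which Groovy does not interpolate
        copyArtifacts projectName: env.JOB_NAME,
                      filter: '*.tar.gz',
                      fingerprintArtifacts: true,
                      // copy artifacts from the run selected by DEPLOY_BUILD_NUMBER
                      selector: specific(params.DEPLOY_BUILD_NUMBER)
    }
}

To only run the Build stage when DEPLOY_BUILD_NUMBER matches the current build, you could additionally guard it with when { expression { params.DEPLOY_BUILD_NUMBER == env.BUILD_NUMBER } }.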
I am doing code review with Gerrit Code Review and I need to create a Jenkins pipeline for CI/CD. I am using the events triggered by the Gerrit Trigger plugin.
I want to obtain this:
PatchSet Created
    build starts on the refs/changes/**/**/** branch
    report results to Gerrit for code review
Change Merged (into develop) or Ref Updated (develop)
    build starts on the origin/develop branch
    deploy code to the internal server
Ref Updated (master)
    build starts on the origin/master branch
    deploy code to the external server
Questions for which I didn't find good answers:
Do I need to use a simple pipeline or a multibranch pipeline?
How do I start the build on the correct branch?
How can I check out the correct branch using a Jenkinsfile instead of using the configuration page?
You should create a multibranch pipeline and write your declarative/scripted pipeline in a Jenkinsfile.
Example pipeline:
pipeline {
    agent any
    tools {
        maven 'maven-3.3.6'
        jdk 'jdk-11'
    }
    options {
        buildDiscarder(logRotator(numToKeepStr: '5'))
    }
    stages {
        stage('Build/Test') {
            when {
                changeRequest()
            }
            steps {
                sh "mvn clean verify"
            }
            post {
                success {
                    gerritReview labels: [Verified: 1], message: "Successful build, ${env.RUN_DISPLAY_URL}."
                }
                unstable {
                    gerritReview labels: [Verified: 0], message: "Unstable build, ${env.RUN_DISPLAY_URL}"
                }
                failure {
                    gerritReview labels: [Verified: -1], message: "Failed build, ${env.RUN_DISPLAY_URL}"
                }
            }
        }
        stage('Deploy') {
            when {
                branch 'develop'
            }
            steps {
                sh 'mvn deploy'
            }
        }
    }
}
The Build/Test stage will run for any change request; any new change or patchset will trigger this stage.
The Deploy stage will be triggered for any change merged into develop.
You can have multiple stages for one branch; they will be executed in sequence.
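To answer the checkout question: in a multibranch pipeline, the branch (or change ref) the build was triggered for is exposed through the scm object, so an explicit checkout in the Jenkinsfile is just a matter of calling checkout scm. A minimal sketch, with no custom branch logic assumed:

stage('Checkout') {
    steps {
        // checks out the branch/ref this multibranch build was created for
        checkout scm
    }
}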
A lot of the examples I see, like How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)?, share a file within the SAME pipeline. I want to share a file between two different pipelines.
I tried to use the Copy Artifact plugin like so:
Pipeline1:
node('linux-0') {
    stage("Create file") {
        sh "echo \"hello world\" > hello.txt"
        archiveArtifacts artifacts: 'hello.txt', fingerprint: true
    }
}
Pipeline2:
node('linux-1') {
    stage("copy") {
        copyArtifacts projectName: 'Pipeline1',
                      fingerprintArtifacts: true,
                      filter: 'hello.txt'
    }
}
and I get the following error for Pipeline2
ERROR: Unable to find project for artifact copy: Pipeline1
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
What am I missing?
NOTE: My real pipelines are scripted (not declarative) and are more complicated than these, so I can't readily convert them to declarative pipelines.
I just tested this and it worked fine for me. Here is my code, pipeline1:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                sh "echo \"hello world\" > hello.txt"
                archiveArtifacts artifacts: 'hello.txt', fingerprint: true
            }
        }
    }
}
pipeline2:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                copyArtifacts projectName: 'pipeline1',
                              fingerprintArtifacts: true,
                              filter: 'hello.txt'
            }
        }
    }
}
copyArtifacts projectName: 'pipeline1'
Ensure that the project name is exactly the same as the first pipeline's name (and that there are no conflicts on that name). If you have conflicts or use the Folders plugin, see this link on how to reference the project accordingly:
https://wiki.jenkins.io/display/JENKINS/How+to+reference+another+project+by+name
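For example, if the first pipeline lives inside a folder, the project name has to include the folder path. A small sketch (the folder name MyFolder below is hypothetical):

// 'MyFolder/pipeline1' is a hypothetical folder/job path; a name starting with '/'
// is resolved from the Jenkins root instead of relative to the current job
copyArtifacts projectName: 'MyFolder/pipeline1',
              fingerprintArtifacts: true,
              filter: 'hello.txt'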
I am using three stages here. If my second stage (Build) fails, it should skip the third stage (copy). How can I use conditions here in a pipeline job?
node('') {
    stage ('clone') {
        build job: 'Job1'
    }
    stage ('Build') {
        parallel(firstTask: {
            stage ('Job2') {
                build job: 'Job2', propagate: true
            }
        }, secondTask: {
            stage ('Job3') {
                build job: 'Job3', propagate: true
            }
        })
    }
    stage ('copy') {
        build job: 'copy'
    }
}
First and foremost, you will need to declare your stages under a stages block and not directly under node. As per the default behaviour of a pipeline, if a build fails in a stage, it will automatically skip the next stages.
There are a lot of options for using conditions in a pipeline. One of the options I often use is when {}.
Here is an example:
Jenkinsfile (Declarative Pipeline)
pipeline {
    agent any
    stages {
        stage('Example Build') {
            steps {
                echo 'Hello World'
            }
        }
        stage('Example Deploy') {
            when {
                branch 'production'
            }
            steps {
                echo 'Deploying'
            }
        }
    }
}
For more details and options, refer to this documentation: https://jenkins.io/doc/book/pipeline/syntax/
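For the scripted pipeline in the question specifically, note that build job: ..., propagate: true already throws when the downstream job fails, which aborts the script before later stages run. A minimal sketch of that behaviour, using the job names from the question:

node('') {
    stage('Build') {
        // propagate: true makes this step throw if Job2 fails,
        // so execution never reaches the 'copy' stage below
        build job: 'Job2', propagate: true
    }
    stage('copy') {
        build job: 'copy'
    }
}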
I have a Bitbucket repo, and I want to start my Jenkins pipeline job only after a commit with a tag like "release-1.0.*".
So, I set my job up with this pipeline script:
pipeline {
    agent any
    stages {
        stage ('Prepare') {
            when {
                tag "release*"
            }
            steps {
                git branch: 'tag1', url: 'git@bitbucket.org:m*********ny/tests.git'
            }
        }
        stage ('Deploy') {
            steps {
                sshPublisher(publishers: [sshPublisherDesc(configName: "JenkinsSrv", transfers: [sshTransfer(execCommand: 'pwd')])])
            }
        }
    }
    post ('POST BUILD') {
        always {
            echo 'This is post action!!!'
        }
    }
}
Also, I turned on the Bitbucket webhook plugin, so my repo notifies Jenkins about new changes.
But my solution doesn't work. Help me resolve this case.
According to the official documentation for a Jenkins pipeline, the option you are looking for is the changelog condition inside the when directive. For example:
when { changelog 'release*' }
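Placed into the Prepare stage from the question, that condition would look roughly like this (changelog matches the given pattern against commit messages in the build's changeset):

stage ('Prepare') {
    when {
        // run this stage only when a commit message in the changeset matches the pattern
        changelog 'release*'
    }
    steps {
        git branch: 'tag1', url: 'git@bitbucket.org:m*********ny/tests.git'
    }
}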
I have a Jenkins pipeline whose Build step has an archiveArtifacts command.
After the Build step there are Unit test, Integration test and Deploy steps.
In the Deploy step, I want to use one of the artifacts. I thought I could find it in the same place the Build step generated it, but apparently archiveArtifacts has deleted them.
As a workaround I can copy the artifact before it is archived, but that doesn't look elegant to me. Is there a better way?
As I understand it, archiveArtifacts is more for saving artifacts for use by something (or someone) after the build has finished. I would recommend looking at using "stash" and "unstash" for transferring files between stages or nodes.
You just go...
stash includes: 'globdescribingfiles', name: 'stashnameusedlatertounstash'
and when you want to later retrieve that artifact...
unstash 'stashnameusedlatertounstash'
and the stashed files will be put into the current working directory.
Here's the example of that given in the Jenkinsfile docs (https://jenkins.io/doc/book/pipeline/jenkinsfile/#using-multiple-agents):
pipeline {
    agent none
    stages {
        stage('Build') {
            agent any
            steps {
                checkout scm
                sh 'make'
                stash includes: '**/target/*.jar', name: 'app'
            }
        }
        stage('Test on Linux') {
            agent {
                label 'linux'
            }
            steps {
                unstash 'app'
                sh 'make check'
            }
            post {
                always {
                    junit '**/target/*.xml'
                }
            }
        }
        stage('Test on Windows') {
            agent {
                label 'windows'
            }
            steps {
                unstash 'app'
                bat 'make check'
            }
            post {
                always {
                    junit '**/target/*.xml'
                }
            }
        }
    }
}