Jenkinsfile copyArtifacts plugin: Use environment variable in filter - jenkins

I have the Jenkins copyArtifact plugin installed and am using declarative pipeline syntax.
I've set up my pipeline declaring environment variables as such:
pipeline {
    environment {
        ENVIRONMENT = "prod"
    }
}
I want to filter the copied artifacts using a pattern derived from the environment variable declared above. However, when I specify the following in my declarative pipeline:
steps {
    copyArtifacts(
        filter: "build_${env.ENVIRONMENT}_*.exe"
    )
    // do stuff
}
I get the following error:
Failed to copy artifacts from feature/project with filter: build_${env.ENVIRONMENT}_*.exe
The documentation indicates that filter is meant to be a string with this syntax:
ant-expression to filter artifacts to copy
However, I've been unable to get the filter parameter to read pipeline environment variables. Does anyone know if including environment variables in the filter string is possible? If so, what is the syntax?

Yes, it is possible, but the files must be archived first (so that they are available as build artifacts) before you can copy them.
Example:
#!groovy
import hudson.model.Result
import groovy.json.*

pipeline {
    agent any
    environment {
        ENVIRONMENT = "prod"
    }
    stages {
        stage('stage 1') {
            steps {
                script {
                    // Example of archiving data: the folder "My_Archive" is copied to the Jenkins artifacts as an example.
                    dir("C:\\Data") {
                        // Archive all files in the folder "My_Archive"
                        archiveArtifacts allowEmptyArchive: true, artifacts: "My_Archive/"
                        copyArtifacts filter: "My_Archive/build_${env.ENVIRONMENT}_*.exe", fingerprintArtifacts: true, projectName: 'Archivetest_Pipeline', selector: lastWithArtifacts(), target: 'C:\\Data\\Test'
                    }
                }
            }
        }
    }
}
You can also use the Pipeline Snippet Generator to configure the step's inputs as needed:
https://www.jenkins.io/doc/book/pipeline/getting-started/#snippet-generator
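A note on the string syntax itself: Groovy only interpolates ${env.ENVIRONMENT} inside double-quoted strings, while a single-quoted string passes the ${...} token through literally and leaves the expansion to the Copy Artifact plugin's own variable substitution. A minimal sketch of both variants (the project name is just a placeholder, and the plugin-side expansion depends on your copyartifact version):
// Double quotes: Groovy interpolates the value before the step runs
copyArtifacts projectName: 'Archivetest_Pipeline', filter: "build_${env.ENVIRONMENT}_*.exe", selector: lastWithArtifacts()
// Single quotes: the literal token is handed to the plugin, which is expected
// to expand ${ENVIRONMENT} from the build environment at run time
copyArtifacts projectName: 'Archivetest_Pipeline', filter: 'build_${ENVIRONMENT}_*.exe', selector: lastWithArtifacts()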

Related

Jenkins: unable to access the artifacts on the initial run

My setup: the main node runs on Linux and an agent runs on Windows. I want to compile a library on the agent, archive those artifacts, and copy them to the main node to create a release together with the Linux-compiled binaries.
This is my Jenkinsfile:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                archiveArtifacts artifacts: 'build_32/bin/mylib.dll'
            }
        }
    }
    post {
        success {
            node('linux') {
                copyArtifacts filter: 'build_32/bin/mylib.dll', flatten: true, projectName: '${JOB_NAME}', target: 'Win32'
            }
        }
    }
}
My problem is that when I run this project for the first time, I get the following error:
Unable to find project for artifact copy: mylib
But when I comment out the copyArtifacts block and rerun the project, it succeeds and the artifacts are visible in the project overview. After that I can re-enable copyArtifacts and the artifacts are copied as expected.
How to configure the pipeline so it can access the artifacts on the initial run?
The copyArtifacts capability is usually used to copy artifacts between different builds, not between agents in the same build. Instead, to achieve what you want, you can use the stash and unstash keywords, which are designed exactly for passing artifacts between different agents in the same pipeline execution:
stash: Stash some files to be used later in the build.
Saves a set of files for later use on any node/workspace in the same Pipeline run. By default, stashed files are discarded at the end of a pipeline run
unstash: Restore files previously stashed.
Restores a set of files previously stashed into the current workspace.
In your case it can look like:
pipeline {
    agent none
    stages {
        stage('Build-Windows') {
            agent {
                dockerfile {
                    filename 'docker/Dockerfile-Windows'
                    label 'windows'
                }
            }
            steps {
                bat "tools/ci/build.bat"
                // dir is used to control the path structure of the stashed artifact
                dir('build_32/bin') {
                    stash name: "build_artifact", includes: 'mylib.dll'
                }
            }
        }
    }
    post {
        success {
            node('linux') {
                // dir is used to control the output location of the unstash keyword
                dir('Win32') {
                    unstash "build_artifact"
                }
            }
        }
    }
}
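For completeness, if the artifacts really have to go through archiveArtifacts/copyArtifacts (for example because you also want them kept on the build page), here is a sketch of copying the current build's own archived artifacts in the post section, assuming a copyartifact plugin version that provides the specific() selector:
post {
    success {
        node('linux') {
            // requires archiveArtifacts in the Build-Windows stage, as in the question
            copyArtifacts projectName: env.JOB_NAME,
                          filter: 'build_32/bin/mylib.dll',
                          flatten: true,
                          target: 'Win32',
                          selector: specific(env.BUILD_NUMBER)
        }
    }
}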

Sharing files between Jenkins pipelines

A lot of the examples I see like How can I use the Jenkins Copy Artifacts Plugin from within the pipelines (jenkinsfile)? share a file within the SAME pipeline. I want to share a file between two different pipelines.
I tried to use the Copy Artifacts plugin like so
Pipeline1:
node('linux-0') {
    stage("Create file") {
        sh "echo \"hello world\" > hello.txt"
        archiveArtifacts artifacts: 'hello.txt', fingerprint: true
    }
}
Pipeline2:
node('linux-1') {
    stage("copy") {
        copyArtifacts projectName: 'Pipeline1',
                      fingerprintArtifacts: true,
                      filter: 'hello.txt'
    }
}
and I get the following error for Pipeline2
ERROR: Unable to find project for artifact copy: Pipeline1
This may be due to incorrect project name or permission settings; see help for project name in job configuration.
Finished: FAILURE
What am I missing?
NOTE: My real scripted, i.e., not declarative, pipelines are more complicated than these so I can't readily convert them to declarative pipelines.
I just tested this and it worked fine for me. Here is my code, pipeline1:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                sh "echo \"hello world\" > hello.txt"
                archiveArtifacts artifacts: 'hello.txt', fingerprint: true
            }
        }
    }
}
pipeline2:
pipeline {
    agent any
    stages {
        stage('Hello') {
            steps {
                copyArtifacts projectName: 'pipeline1',
                              fingerprintArtifacts: true,
                              filter: 'hello.txt'
            }
        }
    }
}
copyArtifacts projectName: 'pipeline1'
Ensure that the project name is exactly the same as the first pipeline's name (and that there are no conflicts on that name). If you have conflicts or use the Folders plugin, see this link on how to reference the project accordingly:
https://wiki.jenkins.io/display/JENKINS/How+to+reference+another+project+by+name
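For example, if the producing job lives inside a folder, the project name has to include the folder path; a sketch with placeholder folder names:
// Job at the Jenkins root
copyArtifacts projectName: 'pipeline1', filter: 'hello.txt'
// Job inside a folder: full path from the Jenkins root
copyArtifacts projectName: 'MyFolder/pipeline1', filter: 'hello.txt'
// Job in a sibling folder, addressed relative to the current job
copyArtifacts projectName: '../OtherFolder/pipeline1', filter: 'hello.txt'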

Use workspace location in the post-build script in Jenkins

I am trying to use the artifacts created in the workspace during the Jenkins build in a post-build shell script.
I am not able to use them because the workspace artifacts are automatically deleted before the post-build script runs.
Could anyone help me address this?
When the post-build stage is running, your workspace has already been removed. When you think about it, your regular stages and your post-build stage may even be running on different nodes, so there can't be any expectation that the files are in your workspace.
To access your artifacts in the post-build stage, you need to fetch them explicitly, e.g. by using the Copy Artifact plugin:
post {
    always {
        // fetch artifacts of this job and this build number to $WORKSPACE
        step([
            $class: 'CopyArtifact',
            filter: '*',
            fingerprintArtifacts: true,
            optional: true,
            projectName: "${JOB_NAME}",
            selector: [$class: 'SpecificBuildSelector',
                       buildNumber: "${BUILD_NUMBER}"]
        ])
        script {
            try {
                for (file in findFiles(glob: "*")) {
                    println "Found file ${file}"
                }
            } catch (error) {
                println "Failed to find files"
            }
        }
    }
}
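The same thing can also be written with the copyArtifacts step's shorter syntax instead of the generic step([$class: ...]) form; a sketch, assuming a copyartifact plugin version that exposes the specific() selector symbol:
post {
    always {
        // fetch this build's own artifacts into the post-build workspace
        copyArtifacts projectName: env.JOB_NAME,
                      filter: '*',
                      optional: true,
                      fingerprintArtifacts: true,
                      selector: specific(env.BUILD_NUMBER)
    }
}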

Extracting an entire Jenkins stage to a shared library?

Is it possible to take an entire stage('foo') {...} definition and extract it into a shared library within Jenkins? The docs are very clear on how to pull an individual step out, but I can't find any way to take an entire stage, parameterize it, and re-use it globally. I thought perhaps just return stage... would work, but it errors out as an invalid return value.
It depends on whether you use a scripted or a declarative pipeline.
A scripted pipeline is more flexible: it allows you, for example, to create stages based on conditions (each pipeline run can have a different number and kind of stages). In this kind of pipeline you can extract a full stage to a shared library class and call it from inside the node {} block. Consider the following example:
// src/ScriptedFooStage.groovy
class ScriptedFooStage {
    private final Script script

    ScriptedFooStage(Script script) {
        this.script = script
    }

    // You can pass as many parameters as needed
    void execute(String name, boolean param1) {
        script.stage(name) {
            script.echo "Triggering ${name} stage..."
            script.sh "echo 'Execute your desired bash command here'"
            if (param1) {
                script.sh "echo 'Executing conditional command, because param1 == true'"
            }
        }
    }
}
Then the Jenkinsfile may look like this:
node {
    new ScriptedFooStage(this).execute('Foo', true)
}
As you can see, the whole stage was encapsulated in the ScriptedFooStage.execute() method. Its name is also taken from the name parameter; a scripted pipeline lets you do that.
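Because this is plain Groovy, you can also create stages dynamically, for example by calling the same class in a loop (the stage names below are purely illustrative):
node {
    def fooStage = new ScriptedFooStage(this)
    // each iteration produces its own stage in the build
    ['Build', 'Test', 'Deploy'].each { stageName ->
        fooStage.execute(stageName, stageName == 'Deploy')
    }
}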
A declarative pipeline, on the other hand, is more strict and opinionated. It is fixed when it comes to the number of stages and their names (you can't dynamically model which stages are present in a given build or what their names are). You can still take advantage of shared library classes, but you are limited to executing them inside a script {} block within a stage('Name') { steps {} } block. This means you can't extract the whole stage to a separate class, only the part that gets executed at the steps level. Consider the following example:
// src/DeclarativeFooStage.groovy
class DeclarativeFooStage {
    private final Script script

    DeclarativeFooStage(Script script) {
        this.script = script
    }

    // You can pass as many parameters as needed
    void execute(String name, boolean param1) {
        script.echo "Triggering script with name == ${name}"
        script.sh "echo 'Execute your desired bash command here'"
        if (param1) {
            script.sh "echo 'Executing conditional command, because param1 == true'"
        }
    }
}
And the Jenkinsfile may look like this:
// Jenkinsfile
pipeline {
    agent any
    stages {
        stage('Foo') {
            steps {
                script {
                    new DeclarativeFooStage(this).execute('something', false)
                }
            }
        }
    }
}
If we tried to execute new DeclarativeFooStage(this).execute('something', false) outside the script {} block in the declarative pipeline, we would get compilation errors.
Conclusion
The choice between a scripted and a declarative pipeline depends on the specific use case. If you want the most flexibility in modeling your pipeline's business logic, a scripted pipeline might be the better choice. However, this comes at a price: for instance, a scripted pipeline does not support restarting a build from a specific stage, which only a declarative pipeline supports. (Imagine you have 10 stages in the pipeline and stage 7 failed because of some silly mistake, and you would like to restart the build from the 7th stage: in a scripted pipeline you would have to re-run from the very beginning, while a declarative pipeline can restart from the 7th stage, reusing the results of the 6 previous stages.)
To complement Szymon Stepniak's answer, I will note here that in a declarative pipeline you may also share a whole pipeline:
// vars/myDeliveryPipeline.groovy
def call(Map pipelineParams) {
    pipeline {
        agent any
        stages {
            stage('build') {
                ...
            }
            stage('test') {
                ...
            }
            ...
        }
    }
}
And then call it:
// Jenkinsfile
myDeliveryPipeline(foo: 'FOO', bar: 'BAR')
But as far as I remember, you may only call one pipeline in a Jenkinsfile, which makes it not very customizable.
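A sketch of how the shared pipeline could consume pipelineParams (the parameter names foo and bar and the stage bodies are hypothetical):
// vars/myDeliveryPipeline.groovy
def call(Map pipelineParams) {
    pipeline {
        agent any
        stages {
            stage('build') {
                steps {
                    // foo is a placeholder parameter, not from the original post
                    echo "Building ${pipelineParams.foo}"
                }
            }
            stage('test') {
                steps {
                    echo "Testing ${pipelineParams.bar}"
                }
            }
        }
    }
}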
Source
https://www.jenkins.io/blog/2017/10/02/pipeline-templates-with-shared-libraries/

How to set Jenkins environment variables at run time

I want to set some Jenkins environment variables at run time, based on my computation. How can I set them at run time in my Jenkinsfile's steps section?
For example: based on my calculation I get abc=1. How can I set this at run time in my Jenkinsfile's steps section so that I can use it later by referencing $abc?
I am declaring my pipeline and environment variables as explained here:
https://jenkins.io/doc/pipeline/tour/environment/
I'm using Jenkins ver. 2.41.
Here is an example of how to set a variable and use it in the same Jenkinsfile.
The variable versionToDeploy will be used by the build job step.
pipeline {
    agent any
    stages {
        stage('Example') {
            steps {
                echo 'build the artifacts'
                script {
                    versionToDeploy = '2.3.0'
                }
            }
        }
    }
    post {
        success {
            echo 'start deploy job'
            build job: 'pipeline-declarative-multi-job-deploy', parameters: [[$class: 'StringParameterValue', name: 'version', value: versionToDeploy]]
        }
    }
}
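If the value specifically needs to be an environment variable (so that later sh steps can read $abc from the question), a common approach is to assign it to env inside a script block, or to scope it with withEnv; a sketch using the abc example from the question:
pipeline {
    agent any
    stages {
        stage('Compute') {
            steps {
                script {
                    // visible as an environment variable for the rest of the build
                    env.abc = "1"
                }
                sh 'echo "abc is $abc"'
            }
        }
        stage('Scoped') {
            steps {
                // alternatively, limit the variable to a block
                withEnv(["abc=2"]) {
                    sh 'echo "abc is $abc"'
                }
            }
        }
    }
}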
