In a nutshell:
How can I access the location of the produced artifacts within a shell script started in a build or post-build action?
The longer story:
I'm trying to set up a Jenkins job to automate the building and propagation of Debian packages.
So far, I have already been successful in using the debian-pbuilder plugin to perform the build process, so that Jenkins presents the final artifacts after the job finishes successfully:
mypackage_1+020200224114528.NOREV.4_all.deb
mypackage_1+020200224114528.NOREV.4_amd64.buildinfo
mypackage_1+020200224114528.NOREV.4_amd64.changes
mypackage_1+020200224114528.NOREV.4.dsc
mypackage_1+020200224114528.NOREV.4.tar.xz
Now I would also like to automate the deployment into the local reprepro repository, which would actually just require invoking a simple shell script I've put together.
My problem: I find no way to determine the artifact location for that deployment script to operate on. The "debian-pbuilder" plugin generates the artifacts in a temporary directory ($WORKSPACE/binaries.tmp15567690749093469649), which changes with every build.
Since the artifacts are listed properly in the finished job status view, I would expect the artifact details to be provided to the script (e.g. via environment variables). But that is obviously not the case.
I've already searched extensively for a solution, but didn't find anything helpful.
Or is it me (still somewhat of a rookie with Jenkins) following a wrong approach here?
You can use archiveArtifacts. You have the binaries.tmp directory in the workspace and you can use it, but clear the workspace using deleteDir() before executing the build.
Pipeline example:
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                deleteDir()
                ...
            }
        }
    }
    post {
        always {
            archiveArtifacts artifacts: 'binaries*/**', fingerprint: true
        }
    }
}
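If the goal is to feed those artifacts straight into the deployment script mentioned in the question, a success stanza next to the always one could resolve the changing directory name with a shell glob. This is only a sketch: the repository path /srv/reprepro and the distribution name stable are assumptions, and older reprepro versions may need one includedeb call per .deb file.
success {
    // Locate the .deb files inside the changing binaries.tmp* directory via a
    // shell glob and hand them to reprepro (path and distribution are placeholders).
    sh 'reprepro -b /srv/reprepro includedeb stable "$WORKSPACE"/binaries.tmp*/*.deb'
}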
You can also check https://plugins.jenkins.io/copyartifact/
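If the deployment runs in a separate job instead, a rough sketch of pulling the archived files with the Copy Artifact plugin's pipeline step could look like this (the job name mypackage-build and the filter pattern are placeholders):
// Copy the archived artifacts of the build job into this job's workspace.
copyArtifacts projectName: 'mypackage-build',
              selector: lastSuccessful(),
              filter: 'binaries*/**',
              fingerprintArtifacts: true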
Related
I want to use the Jenkins "PRQA" plugin, which does not seem to have the option of being used from a pipeline. The plugin would run static code analysis and publish the results.
In my case, it requires some preparations that are already done in a pipeline job. Because of that, I want to include the job in that pipeline, but on the same executor and with the data prepared by the pipeline, as a kind of inlined job step.
I have tried to create a job for the PRQA-Plugin-Step and execute this with the build step from the pipeline. But this tries to start the job on a new executor (and stalls because I have only one executor).
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                echo 'Prepare'
            }
        }
        stage('SCA') {
            steps {
                //Run this without using a new executor with the Environment that exists now
                build 'PRQA_Job'
            }
        }
    }
}
What is the correct way to run the job on the same executor and with the current working directory?
With build 'PRQA_Job' specified it's not possible to run the second job on the same executor (1 job = 1 executor), since the main job is just waiting for the triggered job to finish. But you can run another job on the same agent, if it has more than one executor, to reach the workspace of the main job.
For test purposes, specify the agent name in both jobs: agent 'agent_name_here'
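For illustration only (the label and the workspace path are assumptions), the pinning could look like this in a declarative pipeline, optionally pointing the triggered job at the main job's workspace via customWorkspace:
// In both the main job and PRQA_Job: pin the build to the same node.
agent { label 'agent_name_here' }

// Or, in the triggered job, reuse the main job's workspace directly
// (the path below is a placeholder for your main job's actual workspace):
agent {
    node {
        label 'agent_name_here'
        customWorkspace '/var/lib/jenkins/workspace/main_pipeline_job'
    }
}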
If you want to use the functionality of a plugin which has no native pipeline support, you could try the "step: General Build step" feature of Jenkins Pipelines. You can use the Pipeline Syntax wizard linked in the job configuration window to generate the needed pipeline description.
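A snippet generated that way typically uses the generic step meta-step; the class name and option below are purely hypothetical placeholders, the snippet generator will output the real ones for the installed plugin:
// Generic build step invocation; '$class' and its options are placeholders.
step([$class: 'SomePluginPublisher', someOption: 'value'])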
If the plugin does not show up in the "step: General Build step" part of Jenkins, you can use a separate job. To copy all the needed files/data into this second job, you will need to use the Archive Artifact/Copy Artifact functionality of Jenkins to save the files from your pipeline build.
For more information on how to use Archive Artifact/Copy Artifact, see https://plugins.jenkins.io/copyartifact/ and
https://www.jenkins.io/doc/pipeline/tour/tests-and-artifacts/
For a given scripted pipeline (Jenkins), the pipeline should only get triggered through a webhook from GitLab.
The Build Now option should be disabled for that pipeline.
Can we configure Jenkins to disable the Build Now option for a specific pipeline script job?
EDIT: Here is the solution with a scripted pipeline:
node {
    def userIdCause = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
    stage("Authorize Usage") {
        if (userIdCause.size()) {
            error('Aborting Build due to manual start - thats not permitted!')
        }
    }
}
How about the following solution, without any extra plugin, in a declarative pipeline:
pipeline {
    ...
    stages {
        stage("Authorize Usage") {
            when { expression { getCause() == "USER" } }
            steps {
                script {
                    currentBuild.description = 'Aborting Build due to manual start - thats not permitted!'
                    error('Aborting Build due to manual start - thats not permitted!')
                }
            }
        }
        ...
    }
}
Have you taken a look at this plugin supplied on the Jenkins site? Matrix Authorization Strategy Plugin:
Matrix Strategy
Specifically this section: Allow configuring per-agent permissions. This allows e.g. restricting per-agent build permissions when using the Authorize Project plugin (JENKINS-46654).
Not ideal, but if this is a 'freestyle pipeline job',
a quick workaround is to add an "Execute shell" build step as the first step. You can use this to prevent a build when nothing has changed.
Every time your sources change and you push to your repo, a build will have been triggered, and as there are changes this script will not exit.
When you click 'Build Now', nothing should have changed in your repo (as the only way it can change is through a push, which would then trigger a build), so the script exits and fails the build.
if [[ "$GIT_COMMIT" == "$GIT_PREVIOUS_COMMIT" ]]
then
    echo "Exiting build - Nothing has changed"
    echo "This is to prevent the usage of Jenkins 'build now'"
    exit 1
fi
EDIT: This is the answer to the question of user #mohet in the comments of my other answer, because it was too long for the comment section (https://stackoverflow.com/a/55058788/7746963).
The currentBuild variable, which is of type RunWrapper, may be used to refer to the currently running build...
Source: https://opensource.triology.de/jenkins/pipeline-syntax/globals .
hudson.model is the package name of most of the corresponding core Jenkins classes. It is 'hudson' because Jenkins was once forked from the codebase of its ancestor named 'Hudson'.
You can look them up here: https://javadoc.jenkins.io/hudson/model/package-summary.html .
There you will also find https://javadoc.jenkins.io/hudson/model/Cause.UserIdCause.html . Specifying the package$classname directly in methods like getBuildCauses is a deliberate choice of the Jenkins dev team. It reduces the potential for failure and makes the code more readable and understandable.
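As a small sketch of how that looks in practice (the field access below assumes the JSON-like maps returned by getBuildCauses, which carry fields such as userId and shortDescription):
node {
    def userIdCause = currentBuild.getBuildCauses('hudson.model.Cause$UserIdCause')
    if (userIdCause.size()) {
        // Each cause entry is a map-like object; userId identifies who clicked "Build Now".
        echo "Manually started by: ${userIdCause[0].userId}"
    }
}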
I have a git monorepo with different apps. Currently I have a single Jenkinsfile in the root folder that contains the pipeline for all apps. It is very time consuming to execute the full pipeline for all apps when a commit changes only one app.
We use a GitFlow-like approach to branching, so Multibranch Pipeline jobs in Jenkins are a perfect fit for our project.
I'm looking for a way to have several jobs in Jenkins, each of which will be triggered only when the code of the appropriate application is changed.
The perfect solution for me looks like this:
I have several Multibranch Pipeline jobs in Jenkins. Each one looks for changes only to a given directory and its subdirectories, and each one uses its own Jenkinsfile. The jobs poll git every X minutes and, if there are changes to the appropriate directories in existing branches, initiate a build; if there are new branches with changes to the appropriate directories, they also initiate a build.
What stops me from this implementation:
I'm missing a way to define which folders' commits must be ignored during scan execution by the Multibranch Pipeline. The "Additional Behaviours" of a Multibranch Pipeline don't have the "Polling ignores commits in certain paths" option, while Pipeline or Freestyle jobs have it. But I want to use a Multibranch Pipeline.
The solution described here doesn't work for me, because if there is a new branch with changes only to "project1", then whenever the Multibranch Pipeline for "project2" is triggered it will discover this new branch anyway and build it. This means that for every new branch, each of my Multibranch Pipelines will be executed at least once, no matter whether there were changes to the appropriate code or not.
I would appreciate any help or suggestions on how I can implement several Multibranch Pipelines watching the same git repository but triggered only when the appropriate pieces of code change.
This can be accomplished by using the Multibranch build strategy extension plugin. With this plugin, you can define a rule where the build only initiates when the changes belong to a sub-directory.
Install the plugin
On the Multibranch pipeline configuration, add a Build strategy
Select Build included regions strategy
Put a sub-folder in the field, such as subfolder/**
This way the changes will still be discovered, but they won't initiate a build if they don't belong to a certain set of files or folders.
This is the best approach I'm aware of so far. But I think the ideal way would be one where the changes don't even get discovered.
Edit: Gerrit Code Review plugin configuration
In case you're using the Gerrit Code Review plugin, you can also prevent new changes from being discovered by using a custom query.
I solved this by creating a project that builds other projects depending on the files changed. For example, from your repo root:
/Jenkinsfile
#!/usr/bin/env groovy
pipeline {
    agent any
    options {
        timestamps()
    }
    triggers {
        bitbucketPush()
    }
    stages {
        stage('Build project A') {
            when {
                changeset "project-a/**"
            }
            steps {
                build 'project-a'
            }
        }
        stage('Build project B') {
            when {
                changeset "project-b/**"
            }
            steps {
                build 'project-b'
            }
        }
    }
}
You would then have other Pipeline projects with their own Jenkinsfile (i.e., project-a/Jenkinsfile).
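For illustration, such a project-a/Jenkinsfile could be as minimal as the following sketch (the build command and directory layout are assumptions):
#!/usr/bin/env groovy
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // Build only this application; replace with the project's real build command.
                dir('project-a') {
                    sh './build.sh'
                }
            }
        }
    }
}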
I know that this post is quite old, but I solved this problem by changing the "include branches" parameter for SVN repositories (this can possibly also be done using the property "Filter by name (with wildcards)" for git repos). Instead of supplying only the actual branch name, I also included the subfolder. So instead of only supplying "trunk", I used "trunk/subfolder". This limits scanning to only that specific directory. Note that I have not yet fully tested this solution.
I have an open-source project, that resides in GitHub and is built using a build farm, controlled by Jenkins.
I want to build it branch-wise using a pipeline, but I don't want to store the Jenkinsfile inside the code. Is there a way to accomplish this?
I have encountered the same issue as you. While the idea of having the build process as part of the code is good, there is information that the Jenkinsfile would include that is not intrinsic to the project build itself, but rather is specific to the build environment instance, which may change.
The way I accomplished this is:
Encapsulate the core build process in a single script (build.py or build.sh). This may call specific build tools like Make, CMake, Ant, etc.
Tell Jenkins via the Jenkinsfile to call a function defined in a single global library
Define the global Jenkins build function to call the build script (e.g. build.py) with appropriate environment settings. For example, using custom tools and setting up the PATH.
So for step 2, create a Jenkinsfile in your project containing just the line
build_PROJECTNAME()
where PROJECTNAME is based on the name of your project.
Then use the Pipeline Shared Groovy Libraries Plugin and create a Groovy script in the shared library repository called vars/build_PROJECTNAME.groovy containing the code that sets up the environment and calls the project build script (e.g. build.py):
def call() {
    node('linux') {
        stage("checkout") {
            checkout scm
        }
        stage("build") {
            withEnv([
                "PATH+CMAKE=${tool 'CMake'}/bin",
                "PATH+PYTHON=${tool 'Python-3'}",
                "PATH+NINJA=${tool 'Ninja'}",
            ]) {
                sh 'python build.py'
            }
        }
    }
}
First of all, why do you not want a Jenkinsfile in your code? The pipeline is just as much a part of the code as your build file is.
Other than that, you can load Groovy files to be evaluated as a pipeline script. You can do this from a different location with the "Pipeline script from SCM" option and then check out the actual code. But this will force you to manually take care of the branch builds.
Another option would be to have a very basic Jenkinsfile that merely checkouts an external pipeline.
You would get something like this:
node {
    deleteDir()
    git env.flowScm
    def flow = load 'pipeline.groovy'
    stash includes: '**', name: 'flowFiles'
    stage 'Checkout'
    checkout scm // short hand for checking out the "from scm repository"
    flow.runFlow()
}
Where the pipeline.groovy file containing the actual pipeline would look like this:
def runFlow() {
    // your pipeline code
}

// Has to exit with 'return this;' in order to be used as library
return this;
Prior to Jenkins 2, I was using the Build Pipeline Plugin to build and manually deploy the application to a server.
Old configuration:
That worked great, but I want to use the new Jenkins Pipeline, generated from a Groovy script (Jenkinsfile), to create the manual step.
So far I have come up with the Jenkins input step.
The Jenkinsfile script used:
node {
    stage 'Checkout'
    // Get some code from repository

    stage 'Build'
    // Run the build
}

stage 'deployment'
input 'Do you approve deployment?'

node {
    //deploy things
}
But this waits for user input, and notes that the build is not completed. I could add a timeout to the input, but this won't allow me to pick a build and trigger its deployment later on:
How can I achieve the same/similar result for a manual step/trigger with the new Jenkins Pipeline as before with the Build Pipeline Plugin?
This is a huge gap in the Jenkins Pipeline capabilities, IMO. It is definitely hard to provide, due to the fact that a pipeline is a single job. One solution might be to "archive" the workspace as an "artifact" (tar and archive **/* as 'workspace.tar.gz'), and then have another pipeline copy the artifact and untar it into the new workspace. This allows the second pipeline to pick up where the previous one left off. Of course there is no way to guarantee that the second pipeline cannot be executed out of turn or more than once, which is too bad. The Delivery Pipeline Plugin really shines here: you execute a new pipeline right from the view, instead of from the first job. Anyway, not much of an answer, but it's the path I'm going to try.
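A rough sketch of that tar-and-copy hand-off, assuming the Copy Artifact plugin is installed and the first pipeline job is named 'build-pipeline':
// First pipeline: pack the whole workspace and archive it as a single artifact.
node {
    stage('Build') {
        // ... build steps ...
        sh 'tar czf workspace.tar.gz --exclude=workspace.tar.gz .'
        archiveArtifacts artifacts: 'workspace.tar.gz'
    }
}

// Second (deployment) pipeline: restore the workspace and pick up where the first left off.
node {
    stage('Deploy') {
        copyArtifacts projectName: 'build-pipeline', selector: lastSuccessful()
        sh 'tar xzf workspace.tar.gz'
        // ... deployment steps ...
    }
}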
EDIT: This plugin looks promising:
https://github.com/jenkinsci/external-workspace-manager-plugin/blob/master/doc/PIPELINE_EXAMPLES.md