Jenkins with Copy Artifact plugin: copy directory contents with subdirectories - jenkins

I have a structure of artifacts in another build:
/
/bundle/docs
/bundle/bin
/bundle/bin/scripts
I want to copy all files and subdirectories from /bundle/bin into the current job's workspace subfolder 'product1'. I expect to see the contents of /bundle/bin in %WORKSPACE%/product1.
I've configured it like this:
Artifacts to copy: bundle/bin/**
But it creates %WORKSPACE%/product1/bundle/bin instead.
Is it possible?

Seems like that's just how the plugin works. Your options are:
Keep the same configuration and manipulate directories afterwards using sh mv (Linux) or cmd move (Windows) commands; see the sketch after this list. This is the workaround used in my environment.
Check the "Flatten directories" option (but this will mix together the contents of /bundle/bin and /bundle/bin/scripts)
Improve the plugin and contribute your code to the community :-)
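For the first option, a minimal sketch of a follow-up "Execute shell" build step (Linux), assuming Copy Artifact was configured with "Artifacts to copy: bundle/bin/**" and "Target directory: product1":
# Hoist the copied files out of the nested bundle/bin directory.
# Note: the glob skips dotfiles; adjust if your artifacts contain any.
mv product1/bundle/bin/* product1/
rm -rf product1/bundle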

Related

Add predefined Jenkins jobs to Jenkins master in docker

I have now searched the Internet for 2 whole days without any progress. It might be that the answer is just a click away, but I cannot find it. I'm blinded. I work on Windows 10, by the way...
The problem is that I want to create a simple Dockerfile, that can create a Docker image with a Jenkins master with everything I want it to contain. For instance:
general Jenkins configuration
users
plugins
jobs
slaves
etc...
Now I at least have some parts working... I'm able to get rid of the initial "hello, welcome to Jenkins" thing, and define my first user + install the plugins that I wrote down in a text file.
BUT when it comes to the freakin Jenkins jobs it seems to get a bit more complicated.
Since my Dockerfile takes a "base image" from jenkins/jenkins:lts-alpine, it creates a volume for /var/jenkins_home/, which seems to be where the jobs are stored. And it seems to be a bit problematic to copy files to this folder. I tried adding a COPY instruction to my Dockerfile to copy the folder & files that are created when I manually create a job in Jenkins, but it seems like Jenkins does not read them for some reason, even after restarting/reloading from disk. The thing is, it actually works when copying the jobs after "installation", like docker cp jobs-on-my-machine container:/var/jenkins_home/jobs, but I don't want a lot of extra stuff outside my Dockerfile. I want to keep it as simple as possible. With this solution I will most likely not even be able to commit this change, since docker diff shows this output:
C:\Dev>docker diff 3
C /tmp
A /tmp/hsperfdata_jenkins
A /tmp/hsperfdata_jenkins/6
A /tmp/jetty-0.0.0.0-8080-war-_-any-6459623464825334329.dir
A /tmp/jna--1712433994
A /tmp/winstone4948624124562796293.jar
you see... Nothing has changed in the /var/jenkins_home/jobs/ directory... Nothing trackable at all :(
Take a look at my Dockerfile content below:
# Pull the latest Jenkins docker image from Docker Hub.
# Currently Alpine is used because the Internet says its safer :)
FROM jenkins/jenkins:lts-alpine
# Disable Jenkins setup wizard normally showing up during initial startup
ENV JAVA_OPTS="-Djenkins.install.runSetupWizard=false"
# Copy the groovy script creating the first user into the
# run-on-startup-directory into the docker image
COPY security.groovy /usr/share/jenkins/ref/init.groovy.d/security.groovy
# Copy the plugins text file into the docker image
COPY plugins.txt /usr/share/jenkins/ref/plugins.txt
# Run the Jenkins default install-plugins script to install
# plugins defined in the plugins text file
RUN /usr/local/bin/install-plugins.sh < /usr/share/jenkins/ref/plugins.txt
# Add the predefined Jenkins jobs _inside_ folder 'jenkins-jobs' into Jenkins
COPY jenkins-jobs /var/jenkins_home/jobs/
security.groovy content:
#!groovy
import jenkins.model.*
import hudson.security.*
import jenkins.security.s2m.AdminWhitelistRule
def instance = Jenkins.getInstance()
def hudsonRealm = new HudsonPrivateSecurityRealm(false)
hudsonRealm.createAccount("admin", "admin")
instance.setSecurityRealm(hudsonRealm)
def strategy = new FullControlOnceLoggedInAuthorizationStrategy()
instance.setAuthorizationStrategy(strategy)
instance.save()
Jenkins.instance.getInjector().getInstance(AdminWhitelistRule.class).setMasterKillSwitch(false)
plugins.txt:
cloudbees-folder
bouncycastle-api
structs
script-security
workflow-step-api
scm-api
workflow-api
junit
antisamy-markup-formatter
workflow-support
workflow-job
token-macro
build-timeout
credentials
ssh-credentials
plain-credentials
credentials-binding
timestamper
durable-task
workflow-durable-task-step
matrix-project
resource-disposer
ws-cleanup
ant
gradle
pipeline-milestone-step
jquery-detached
jackson2-api
ace-editor
workflow-scm-step
workflow-cps
pipeline-input-step
pipeline-stage-step
pipeline-graph-analysis
pipeline-rest-api
handlebars
momentjs
pipeline-stage-view
pipeline-build-step
pipeline-model-api
pipeline-model-extensions
apache-httpcomponents-client-4-api
jsch
git-client
git-server
workflow-cps-global-lib
display-url-api
mailer
branch-api
workflow-multibranch
authentication-tokens
docker-commons
docker-workflow
pipeline-stage-tags-metadata
pipeline-model-declarative-agent
workflow-basic-steps
pipeline-model-definition
workflow-aggregator
github-api
git
github
github-branch-source
pipeline-github-lib
mapdb-api
subversion
ssh-slaves
matrix-auth
pam-auth
ldap
email-ext
mercurial
In case someone else finds this question:
From the GitHub - jenkinsci/docker:
In such a derived image, you can customize your jenkins instance with hook scripts or additional plugins. For this purpose, use /usr/share/jenkins/ref as a place to define the default JENKINS_HOME content you wish the target installation to look like:
When jenkins container starts, it will check JENKINS_HOME has this reference content, and copy them there if required. It will not override such files, so if you upgraded some plugins from UI they won't be reverted on next start.
In case you do want to override, append '.override' to the name of the reference file. E.g. a file named /usr/share/jenkins/ref/config.xml.override will overwrite an existing config.xml file in JENKINS_HOME.
What I've done myself, and the thing that is working for me, is: copy the jobs directory found on your Jenkins instance to where you are able to run docker build. Place the jobs directory from the Jenkins instance into another directory, e.g. jobs_jenkins, meaning you will have $PWD/jobs_jenkins/jobs.
Add the following line in your Dockerfile:
...
FROM jenkins/jenkins:lts-alpine
COPY jobs_jenkins /usr/share/jenkins/ref/
...
From my understanding, I think the Jenkins image itself runs a "startup script" utilizing /usr/share/jenkins/ref, and the stuff copied directly to /var/jenkins_home will not be present...
For more information, read the README.md (from GitHub - jenkinsci/docker).
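For completeness, building and running the derived image could look like this (the image name and the exact jobs layout are assumptions, not from the original post):
# assumed layout next to the Dockerfile:
#   jobs_jenkins/jobs/<job-name>/config.xml
docker build -t my-jenkins .
docker run -d -p 8080:8080 -p 50000:50000 my-jenkins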

Get source code into Jenkins WORKSPACE subdirectory

Is it possible to configure Jenkins to get source code into a subdirectory of a %WORKSPACE%? Right now the source gets pulled into %WORKSPACE% and for the build output I explicitly specify a directory outside of the %WORKSPACE%.
Ideally I would like to have something similar to this:
%WORKSPACE%\source for source code and %WORKSPACE%\artifacts for build outputs. Is it possible to have this configuration?
Create a 'run batch command' build step and use xcopy. This presumes Jenkins is running on a Windows machine; if it's a deployment directory, make it a post-build step.
cd c:\
xcopy /E /I /Y "c:\Program Files (x86)\junkies\workspace\app" "c:\path to new directory"
This is just a guess at your directories, so replace them with the correct ones. /E copies subdirectories (including empty ones), /I treats the destination as a directory, and /Y forces files to be overwritten every time the copy runs.

Jenkins and SCP

I have set up a jenkins build and everything is working fine except the very last step.
The whole build creates a directory called: build
This directory contains a web-inf folder, and I would like to publish all the files in it via SCP to a different location, so that all the content of the build/web-inf folder becomes the content of the target folder.
The settings for jenkins scp plugin are (it is a post-build step):
source: build/web-inf/**
destination: public_html/
that results in:
public_html/build/web-inf/...
but should be:
public_html/...
(the keep hierarchy box is ticked)
How can I make that happen?
EDIT
I could solve the problem without any additional script. The solution is so simple that my question turned out to be stupid.
All I did was tell Ant to copy all the web files to ./public_html instead of ./build/web-inf/, which made the Jenkins SCP plugin copy all files from public_html to public_html, exactly as intended.
If your goal is just to SCP files generated during the build, and the plugin doesn't seem to be working (I couldn't see anything wrong in your configuration), you can use an "Execute shell" build step and type the scp command yourself, something like (try it in a shell first, in your job's build directory, to get the syntax right):
scp -r build/web-inf/* user@host:/destination-directory

Copy generated folder from one job to another in Hudson/Jenkins

I have two jobs in my Hudson configuration. Let's call them A and B.
Job A was created specifically to generate a folder application_home. This folder is a ready-to-be-used-in-installations-application-home-folder.
Job B is the "pack-all-together-for-installation-job". It needs to copy the application_home generated by job A to generate the installer. My problem is that after some investigation, I was not able to do this in a simple way.
I could use a shell script, but then I would need to know job A's path plus where its workspace is to get the application_home folder.
Is there a simpler way to do this?
EDIT
I know the Copy Artifact Plugin. The problem is that it only copies artifacts. I need to copy the folder application_home as it is, because it's already in the structure to be used in the installer. If there's a way to use this plugin to copy only the folder, I haven't found it.
EDIT 2. Answer:
Ok, you can do it using the Copy Artifact Plugin. You need to:
Set its configuration to "copy from WORKSPACE of latest completed build".
Set the "Artifacts to copy" option to the folder, like this: target/application_home/**
Set "Target directory" to where you want it, something like: installation_bundle_folder/application_home.
and it's done :)
You could try the Copy Artifact Plugin.
Then you could add a build step to "pack-all-together-for-installation-job" that would copy application_home to the packaging directory. There is an option to only include the latest stable build of Project A.
Another alternative is to have a post-build step for a successful Project A build that scripts the copy of application_home over to where Project B will use it. You can use the WORKSPACE environment variable to get the absolute location (see here for a list of environment variables).
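A minimal sketch of that scripted copy, assuming both jobs run on the same node and that job A's workspace lives under $JENKINS_HOME/workspace/A (the exact path depends on your Jenkins version and configuration; all paths here are assumptions):
# Hypothetical "Execute shell" step in job B
mkdir -p "$WORKSPACE/installation_bundle_folder"
cp -r "$JENKINS_HOME/workspace/A/target/application_home" "$WORKSPACE/installation_bundle_folder/"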

How to set ANT_PATH in Jenkins from outside of the workspace?

My file structure contains an src/ folder with the project's source code, and this folder is the one I want to have in the Jenkins workspace.
However, I also have a build folder which is needed for Apache Ant, and it changes with every single "ant" command executed. The problem is that this folder weighs over 200 MB. I don't want to end up pushing it to the repo every time I run the "ant" command.
If someone who reads this has some experience with it - what's the best way to do it? Is it possible to pull the src/ folder from the repo, and the build/ folder from the system? But I guess that would be wrong, because then only I would be able to execute the ant command...
What's the best way to set this up?
Huh? You put build in your .hgignore file and then you won't commit that directory or what's in it. That's the usual setup, but maybe I'm missing some nuance of your question.
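For example, a one-liner to add the ignore rule (assuming the build directory sits at the repository root and the repo uses a plain .hgignore file):
# append a glob-syntax ignore rule for Ant's build output
printf 'syntax: glob\nbuild/**\n' >> .hgignore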
