Way to clone a job from one Jenkins to another

I have two Jenkins masters, each with five slave nodes. I have one job on the first Jenkins that needs to be cloned to the second.
I can clone the job on the first Jenkins and its slaves, but not on the second master. Is there a way to clone a job from one Jenkins to another?
One more question: can I archive the job at some defined location other than the Jenkins master, perhaps on a slave?

I assume you have a job called "JOB" on "Jenkins1" and you want to copy it to "Jenkins2":
curl JENKINS1_URL/job/JOB/config.xml | java -jar jenkins-cli.jar -s JENKINS2_URL create-job JOB
You might need to add a username and password if you have turned on security in Jenkins. The jenkins-cli.jar is available from your $JENKINS_URL/cli.
Ideally you should make sure you have the same plugins installed on both Jenkins1 and Jenkins2. The more similar you can make the two Jenkins masters, the fewer problems you will have importing the job.
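If security is turned on, both the download and the upload need credentials; a minimal sketch, assuming a user alice with an API token (both placeholders; recent versions of the CLI accept -auth USER:TOKEN):

    curl -u alice:API_TOKEN JENKINS1_URL/job/JOB/config.xml \
      | java -jar jenkins-cli.jar -s JENKINS2_URL -auth alice:API_TOKEN create-job JOB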

For the second part of your question: slaves don't store any Jenkins configuration. All configuration is done on the master. There are a lot of backup plugins: some back up the whole Jenkins, some back up just job configurations, some back up individual jobs, export them to files, or even store/track changes in an SCM such as SVN.
So "archiving job configuration to a slave" simply makes no sense. But at the end of the day, a job configuration is simply an .xml file, and you can take that file and copy it anywhere you want.
As for the first part of the question, it's unclear what you want. Do you want to clone a job automatically (as part of another job's process), programmatically (through some script), or manually (through the UI or other means)?
Edit:
Go to your JENKINS_HOME directory on the server filesystem and navigate to the jobs folder, then select the specific job folder you want.
Copy the config.xml to the other server; this will create the same job with the same configuration (make sure your plugins are the same).
Copy the whole job_name folder if you want to preserve history, builds, artifacts, etc.
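A minimal sketch of that copy from the shell, assuming JENKINS_HOME is /var/lib/jenkins on both servers and the job is called JOB (all placeholders):

    # Config only: recreates the job definition on the second master.
    ssh jenkins2 mkdir -p /var/lib/jenkins/jobs/JOB
    scp /var/lib/jenkins/jobs/JOB/config.xml jenkins2:/var/lib/jenkins/jobs/JOB/
    # Or the whole folder, to keep history, builds, and artifacts too:
    rsync -a /var/lib/jenkins/jobs/JOB/ jenkins2:/var/lib/jenkins/jobs/JOB/

Afterwards, reload the configuration on the second master (Manage Jenkins > Reload Configuration from Disk) or restart it so the new job shows up.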

Related

Jenkins - Copy Artifacts from upstream job built on a different node

There is a job controlled by the development team, which is built on a different node. I am on the testing team and want to take the artifacts and deploy them on a test device.
I can see that those artifacts from dev are stored at some path on dev's node. Does this mean they must first be archived on the Jenkins master before I can copy them to my job?
I am using the Copy Artifact plugin and constantly getting the error
Failed to copy artifacts from <dev-job> with filter: <path-in-dev-node>
(Some newbie questions, since I just moved from TeamCity.)
You probably want to use the Copy Artifact plugin:
Adds a build step to copy artifacts from another project.
Consider also the Jenkins post-build step "Archive the artifacts".
If you copy from the other job's workspace, what happens if another build is in progress or the workspace is wiped? The archive step copies the artifacts from the node to the master and stores a copy along with the build logs, etc. That makes them available via the UI for as long as the build logs remain. It can take up space, though.
If you do use archived artifacts, consider using the system property jenkins.model.Jenkins.buildsDir to store all the build logs (and artifacts) outside of the job's config directory. Some downtime and work is required to separate the two (config / logs).
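A hedged sketch of relocating the builds directory on a Debian-style install; the target path is a placeholder, and ${ITEM_FULL_NAME} is expanded per job by Jenkins itself:

    # In /etc/default/jenkins (or wherever your Jenkins JVM arguments are set):
    JAVA_ARGS="$JAVA_ARGS -Djenkins.model.Jenkins.buildsDir=/mnt/builds/\${ITEM_FULL_NAME}/builds"

Jenkins needs a restart to pick this up, and existing build records must be moved to the new location by hand.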
You may also want to consider using a proper repository manager (Nexus / Artifactory).
Finally, you may want to learn about using a Jenkins pipeline rather than relying on chained jobs, triggers, users and so forth. Why? Because it's much more controlled and easier to maintain.
PS: I'm not a huge fan of artifactDeployer, but it may work for you.
PPS: You may want to review this in-depth answer: Jenkins downstream job fails to find upstream artifacts

How to make Jenkins execute a job only for the files that triggered the build?

I have an svn repo and a certain Jenkins job for the stuff therein. Using the Jenkins svn plugin's "include regions" feature, I can configure Jenkins to poll changes in certain folders or file types. But that is for triggering the job. When the actual job starts to execute, how do I know which files' changes triggered the build?
I can easily grep the answer out of svn log in a shell script if there is only one commit that triggers the build. But if there is an unknown number of commits causing my Jenkins job to start, I'm in trouble.
I'm asking this because I want my Jenkins job to run certain analysis ONLY for those files whose change triggered the build.
Multiple commits pushed at once can also trigger a single build, so I think you are in trouble already. One approach: maintain a file in the job's workspace in which each build, at the end, saves the revision it built. In your script, diff from that saved revision to the current HEAD and check for changes to your files as per your constraints, then run your analysis only if the conditions are met (see the sketch below). Hope this helps.
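A minimal sketch of that bookkeeping as a shell build step, assuming the svn CLI is available on the node; the marker file LAST_BUILT_REV and the variable SVN_URL are placeholders:

    #!/bin/sh
    # Revision built last time; fall back to 1 on the first run.
    PREV=$(cat LAST_BUILT_REV 2>/dev/null || echo 1)
    CURR=$(svn info --show-item revision "$SVN_URL")
    # List files changed between the two revisions (simplified: second
    # whitespace-separated column of the --summarize output is the path).
    svn diff --summarize -r "$PREV":"$CURR" "$SVN_URL" | awk '{print $2}' > changed_files.txt
    # ...run the analysis only on the paths listed in changed_files.txt...
    echo "$CURR" > LAST_BUILT_REV

$SVN_URL stands in for your repository URL; svn info --show-item needs Subversion 1.9 or later.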
I assume this may be fixed by now but, just in case, there is also the Last Changes plugin for Jenkins.
https://github.com/jenkinsci/last-changes-plugin
It makes a diff between what was in that environment and what is about to be pushed, and gives you the result.

Perform Jenkins build on local files

We have a Jenkins server where I have already defined my job. It uses Perforce as SCM.
I would like to replicate all the steps that Jenkins takes to build the project, but use the files in my local workspace instead. Basically, I would like to run a Jenkins build locally, based on a job defined on another server.
How would I do that?
Something like what I created for my Perforce users might work for you. I added a job in Jenkins that grabs shelved files (so the user needs to shelve the files first), creates a build from there, then lets the user know if it was successful (they also have the option of running tests or creating a deployable build). The gist of it is to request the shelved changelist number, then do this: "p4 unshelve -s %SHELVEDCL%" and proceed as usual. They use it when they feel like it; it's been useful. But it does require access to Jenkins.
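A hedged sketch of such a build step, assuming a string job parameter SHELVEDCL and a Perforce client already configured on the node (both are assumptions about your setup):

    #!/bin/sh
    # Bring the client up to date, then overlay the user's shelved changelist.
    p4 sync
    p4 unshelve -s "$SHELVEDCL"
    # ...continue with the usual build/test steps against the unshelved files...

On a Windows node the parameter would be referenced as %SHELVEDCL%, as in the quoted command above.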
1) Install Jenkins on your local workstation (if you have not already done so).
2) Copy the /Jenkins/jobs/ directory to the /Jenkins/jobs/ directory on your local workstation.
3) Fire it up and edit the Perforce workspace (and any other settings) as necessary.
IMO, you should probably take these steps:
1) Create a new Jenkins job from the existing one.
2) Modify the job to be a string-"parameterized" job where you pass the branch name as the parameter. You can do this using the "This build is parameterized" option in the job configuration.
3) In the job configuration, under the Source Code Management section, change the Branch Specifier to use the string parameter variable name (created in #2 above).
4) Create your feature branch on Perforce and make your intended changes there.
5) Run the newly created job with your branch as the parameter (see the sketch below).
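For step 5, a minimal sketch of triggering the parameterized job from the command line; the job name MyJob, the parameter name BRANCH, and the credentials are all placeholders:

    java -jar jenkins-cli.jar -s $JENKINS_URL -auth alice:API_TOKEN \
      build MyJob -p BRANCH=//depot/feature/my-branch -s -v

Here -p passes the parameter, -s waits for the build to finish, and -v streams its console output.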
Hope this helps.

Access the Jenkins host drive, besides the job workspace

I would like to share byproducts of one Jenkins job with another one that runs after it.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to move a few files to a location that is read by the next job.
So far I can't find out how to tell Jenkins jobs to look in a specific folder, since Jenkins has no concept of the file system beyond what is going on in the job's workspace folder.
Is there a way to access the host file system, or to declare a shared folder inside Jenkins (like the main workspace folder, which contains all the other job folders?), so I can copy and read files in it from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
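Archived artifacts are also exposed over HTTP, so for debugging (or for consumers outside Jenkins) you can fetch them directly from the first job; a sketch with placeholder credentials, job name, and artifact path:

    curl -u alice:API_TOKEN -O \
      "$JENKINS_URL/job/first-job/lastSuccessfulBuild/artifact/build/output.tar.gz"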
Configure Jenkins to run a Maven build and deploy your artifacts with "mvn clean deploy". This will push them to an "artifact server", which you probably have; if not, you need to add/configure one.
Then in your downstream job, also a Maven job, configure it to depend on the same artifact that was published by the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.
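If the downstream job is not itself a Maven build, you can still pull the published artifact explicitly; a hedged sketch using the Maven dependency plugin (the coordinates are placeholders):

    # Resolves com.example:myapp:1.0.0 from the configured remote repositories
    # into the local Maven repository on the build node.
    mvn dependency:get -Dartifact=com.example:myapp:1.0.0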

How do I trigger a deploy after the successful build of a specific branch?

I have a Jenkins task that triggers on any changes made to a GitLab project.
There are a few situations I'd like to be able to set up, however I'm not sure how to best accomplish them. Most of it centers around being able to do the following:
Once the job is complete, I'd like to trigger another job that takes the contents of the first job's workspace (emptying out the initial one).
I'd like a way to run certain other jobs only when the workspace contains a specific branch (e.g. automatically deploy the develop branch to a preview environment).
"to trigger another job that takes the contents of the first job's workspace" see Shared workspace plugin:
This plugin allows to share workspaces by Jenkins jobs with the same SCM repos.
