Whenever I create a new job in Jenkins, it creates 2 workspaces on the Perforce server (with suffixes -0 and -1).
Is it possible to tell Jenkins to only create a local workspace on the machine and not create the workspace on the Perforce server?
As I have many Jenkins jobs, the Perforce server will soon be cluttered with workspaces.
Perforce is a centralized system where the central server is the source of truth for the state of each workspace. It's technically possible to pull files from Perforce without creating a tracked workspace, and you could technically rewrite the Jenkins plugin to do this, but practically speaking that's like surfing Wikipedia via curl because you don't want to clutter your desktop by installing a browser.
My recommendation is to configure Jenkins to prefix all of its workspaces with a conveniently filterable string like ~jenkins~ so you can easily ignore them, and then to move on with your life.
I have around 100 Linux servers that need to be added to a Jenkins master. I need to add them by Copy Existing Node, and the Jenkins master should not be shut down or restarted.
I don't want to do this manually a hundred times. Is there any way to automate such a request? Thank you in advance.
You could script this (self-automate). The Jenkins agent configuration files live in the nodes subdirectory of the Jenkins home directory. You'd create a subdirectory for each node and, inside it, a config.xml file with that node's configuration. I recommend shutting down your Jenkins server while doing this; we've observed Jenkins deleting things when this is done while it is running. Use an existing agent's config.xml file as a template. Assuming all of your servers are configured the same, you need only update the name and host tags, which can be automated using sed.
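A minimal sketch of that loop. The inline template XML, the host names, and the `./jenkins-home` path are illustrative assumptions; in practice you'd copy an existing agent's config.xml as the template and point the script at your real Jenkins home (e.g. /var/lib/jenkins):

```shell
#!/bin/sh
# Generate one node directory per host under $JENKINS_HOME/nodes.
JENKINS_HOME="./jenkins-home"   # set to your real Jenkins home, e.g. /var/lib/jenkins

# Stand-in template; in real use, copy an existing agent's config.xml instead.
cat > template.xml <<'EOF'
<slave>
  <name>TEMPLATE</name>
  <launcher class="hudson.plugins.sshslaves.SSHLauncher">
    <host>TEMPLATE</host>
    <port>22</port>
  </launcher>
  <remoteFS>/home/jenkins</remoteFS>
</slave>
EOF

# In practice, read the hundred host names from a file: for host in $(cat hosts.txt)
for host in server01 server02 server03; do
    mkdir -p "$JENKINS_HOME/nodes/$host"
    # Only the name and host tags differ between agents, so substitute those.
    sed -e "s|<name>[^<]*</name>|<name>$host</name>|" \
        -e "s|<host>[^<]*</host>|<host>$host</host>|" \
        template.xml > "$JENKINS_HOME/nodes/$host/config.xml"
done
```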
Update with zero-downtime:
CloudBees has a support article for creating a node using the Rest API. If you'd prefer to use the Jenkins CLI, here's an example shell script. Neither of these approaches will require restarting Jenkins.
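For the CLI route, a hedged sketch of creating one agent without any restart. The XML fields (remoteFS, label, launcher) are assumptions for a minimal JNLP agent, not taken from your setup; copying a real agent's config.xml from $JENKINS_HOME/nodes/ is safer. The CLI call is guarded so the script is harmless when no server is configured:

```shell
#!/bin/sh
# Create a new agent through the Jenkins CLI (no restart needed).
NODE=server01
cat > "$NODE.xml" <<EOF
<slave>
  <name>$NODE</name>
  <remoteFS>/home/jenkins</remoteFS>
  <numExecutors>1</numExecutors>
  <mode>NORMAL</mode>
  <launcher class="hudson.slaves.JNLPLauncher"/>
  <label>linux</label>
</slave>
EOF

# Only attempt the call when a server and the CLI jar are actually available.
if [ -n "$JENKINS_URL" ] && [ -f jenkins-cli.jar ]; then
    java -jar jenkins-cli.jar -s "$JENKINS_URL" -auth "$JENKINS_USER:$JENKINS_TOKEN" \
        create-node "$NODE" < "$NODE.xml"
fi
```

Wrapped in a loop over your hundred host names, this adds all agents to a running master.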
We have some jobs set up which share a workspace. The workflow for the various branches is:
Build a big honking C++ project called foo.
Execute several downstream tests, each of which uses the workspace of foo.
We accomplish this by assigning the Use custom workspace field of the downstream jobs to the build workspace.
Recently, we took one branch and assigned it to be built on a Jenkins slave machine rather than on the master. I was surprised to find that on the master, the foo repository was cloned to $JENKINS_JOBS_PATH/FOO/workspace/foo_repo, while on the slave, the repository was cloned to $JENKINS_JOBS_PATH/FOO/foo_repo.
Is this by design, or have we somehow configured master and slave inconsistently?
Older versions of Jenkins put the workspace under the ${JENKINS_HOME}/jobs/JOB/workspace directory, and after an upgrade this pattern stays with the Jenkins instance. Newer versions put workspaces in ${JENKINS_HOME}/workspace/. I suspect the slaves don't need to follow the old pattern (especially if they are newer slaves), so the directories may not be consistent across machines.
You can change the location of the workspaces on the master in Jenkins -> Configure Jenkins -> Advanced.
I think the safe way to handle this: if you are going to use a custom workspace, use it for all of your jobs, including the first one that builds the big honking C++ project.
If you did this all in a pipeline, you could run everything in a single job and have more control over where all the files are. You would also have the option of stash and unstash, though if the files are huge, stash may not be the way to go.
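A sketch of that single-pipeline approach; the agent labels, stage names, and build commands here are assumptions for illustration, not taken from the question:

```groovy
pipeline {
    agent none
    stages {
        stage('Build foo') {
            agent { label 'build' }
            steps {
                sh 'make foo'                        // the big C++ build
                stash name: 'foo-build', includes: 'build/**'
            }
        }
        stage('Downstream tests') {
            agent { label 'test' }
            steps {
                unstash 'foo-build'                  // restore the build output on this node
                sh './run_tests.sh'
            }
        }
    }
}
```

Note that stash/unstash copies files through the master, so for a very large build tree a shared filesystem or an artifact server may be a better fit.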
You can omit the 'Use custom workspace' option for each job and instead change the master and/or slave workspace paths, then use the path
%WORKSPACE%/../foo_repo
or (equivalently)
./../foo_repo
In that case:
%WORKSPACE% = [master or slave node workspace]/[job name]
and
%WORKSPACE%/../ = [master or slave node workspace]
I recently started working on speeding up the build time of a relatively large code base at my company. This code base uses RTC for source code management, and after some research and experimentation I ended up using Jenkins to automate the process. I started by creating my build server on a local machine and configuring the repository through the RTC plugin, which works quite well with the Poll SCM option and the workspace (repository-rtc) option. However, I now have to move this job to the official company Jenkins server while keeping job execution on the original local PC. I have added the PC as a Jenkins node and have had no problem reaching it through Jenkins, but my questions/assumptions are the following:
It looks like the job is executing the RTC build toolkit from the slave (or at least I had to configure the RTC path in the node).
For some reason, it looks like polling in Jenkins always looks for the repository on the master, even though I added a pre-SCM step that confirms the job is running on the slave.
My question: is there any way to ensure the polling happens on the slave (without scripting or adding external solutions, just using the RTC plugin)? For security reasons I cannot add plugins to Jenkins or create anything on the master; I only have one free job to configure.
Thanks.
I am setting up a Jenkins environment to manage workflows for Python projects. This Jenkins install runs on a Windows 7 machine, and I need to back up the Jenkins config to avoid potential loss of work in case of HDD failure (for example).
I tried the SCM Sync configuration plugin, but it is not compatible with the Subversion plugin I use and caused Jenkins to display only a white screen when I activated it. So it is not usable.
I also tried thinBackup. It works well but, because Jenkins runs as a local service, it cannot save backups to a network drive (and backing up to the same drive as Jenkins is not very useful). You might think I could just run Jenkins as a network user, but in that case it would not have sufficient local privileges.
I am thinking about writing a Batch (or Python) script that uses SVN to back up the Jenkins configuration, adapting what is described in this page, but I am not happy about putting an SVN account password in a script that could potentially be seen by anybody.
So I would like to know whether there is another way to achieve this Jenkins configuration backup.
Or at least, is there a way to run svn commands without exposing a cleartext password to anybody?
The issues with the SCM Sync configuration plugin sadden me, too. What we do with our Jenkins instances is use thinBackup to run regular backups and store them in the default folder on the same HDD; a daily cron job then rsyncs them to a folder on another HDD. Since your Jenkins runs on Windows, you could probably achieve the same with the Windows Task Scheduler and cwRsync, for example.
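On the Linux side, that second stage is just a cron entry; the paths below are placeholders for your thinBackup output folder and second disk (on Windows, the equivalent rsync command would go into a Task Scheduler job using cwRsync):

```
# /etc/cron.d/jenkins-backup-sync -- example paths, adjust to your setup.
# Mirror thinBackup's output onto a second disk every night at 02:30.
30 2 * * * jenkins rsync -a --delete /var/lib/jenkins/thinBackup/ /mnt/backup2/jenkins/
```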
I have two Jenkins instances, both masters, and each has 5 slave nodes. I have one job on the first Jenkins that needs to be cloned for each job.
I can clone the job on the first Jenkins and its slaves, but not on the second master. Is there a way to clone a job from one Jenkins to another?
One more question: can I archive the job at some defined location other than the master Jenkins, maybe on a slave?
I assume you have a job called "JOB" on "Jenkins1" and you want to copy it to "Jenkins2":
curl JENKINS1_URL/job/JOB/config.xml | java -jar jenkins-cli.jar -s JENKINS2_URL create-job JOB
You might need to add a username and password if you have turned on security in Jenkins. The jenkins-cli.jar is available from your $JENKINS_URL/cli.
Ideally you should make sure you have the same plugins installed on both Jenkins1 and Jenkins2. The more similar you can make the two Jenkins masters, the fewer problems you will have importing the job.
For the second part of your question: slaves don't store any Jenkins configuration. All configuration is done on the master. There are a lot of backup plugins; some back up the whole Jenkins, some back up just job configuration, and some back up individual jobs, export them to files, or even store/track changes in an SCM such as SVN.
So "archiving job configuration to slave" simply makes no sense. But at the end of the day, a job configuration is simply an .xml file, and you can take that file and copy it anywhere you want.
As for the first part of the question, it's unclear what you want: do you want to clone a job automatically (as part of another job's process), programmatically (through some script), or manually (through the UI or other means)?
Edit:
Go to your JENKINS_HOME directory on the server filesystem, navigate to the jobs folder, then select the specific job folder that you want.
Copy the config.xml to another server; this will create the same job with the same configuration (make sure your plugins are the same).
Copy the whole job_name folder if you want to preserve history, builds, artifacts, etc.
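A minimal sketch of those steps, assuming both Jenkins homes are reachable from the same shell; the job name, the demo directories, and the stand-in config.xml are all made up for illustration (on a real source master the job folder already exists):

```shell
#!/bin/sh
# Demo setup: stand in for the source master's existing job. In real life this
# already lives under the source JENKINS_HOME, e.g. /var/lib/jenkins/jobs/myjob.
SRC_HOME="./jenkins1"
DST_HOME="./jenkins2"
JOB="myjob"
mkdir -p "$SRC_HOME/jobs/$JOB"
printf '<project><description>demo</description></project>\n' \
    > "$SRC_HOME/jobs/$JOB/config.xml"

# The actual copy: config.xml alone recreates the job definition, no history.
mkdir -p "$DST_HOME/jobs/$JOB"
cp "$SRC_HOME/jobs/$JOB/config.xml" "$DST_HOME/jobs/$JOB/config.xml"

# To keep builds and artifacts too, copy the whole folder instead:
# cp -a "$SRC_HOME/jobs/$JOB" "$DST_HOME/jobs/"
```

After copying, reload the destination master's configuration (Manage Jenkins -> Reload Configuration from Disk, or the CLI's reload-configuration command) so the job appears without a restart.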