We are using thinBackup (1.73) on our Jenkins for Windows (1.53x) and the backups appear to be working properly. I then have a daily Jenkins job that copies the output to the NAS.
Our Jenkins jobs use .properties files in their respective job directories for things such as source control settings, which don't get backed up as part of the thinBackup job. So when running through a restore scenario, the jobs are obviously incomplete. Is there a way to get thinBackup to back up other files within the respective job directories?
The SCM Sync Configuration plugin looks like a better option, as it allows you to specify additional user-defined files to back up.
Is there a way to configure the JaCoCo Jenkins plugin's coverage thresholds through a shell script or API? For example: I want to build an app that changes the code coverage threshold values for my Jenkins jobs. How would I do that while keeping my Jenkins instance abstracted away?
Okay, turns out it's a bit simple really. Plugin configurations are stored in an XML file: global configurations in the .jenkins root folder, and job-specific configurations in $HOME/.jenkins/jobs/{JOB_NAME}/config.xml.
Modify the config.xml file to store new configurations. This configuration file is exposed by each job at http://<SERVER>:<PORT>/jenkins/job/<JOB NAME>/config.xml. Since Jenkins only loads this data at startup, you need to execute 'Reload Configuration from Disk' under Manage Jenkins.
Since we're updating the XML programmatically, you need to tell Jenkins to reload the configuration programmatically as well. To do that, execute a shell step that runs jenkins-cli.jar's reload-configuration command.
Reference: Does anyone know how to reload hudson configuration without restarting?
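A minimal sketch of that flow, assuming Jenkins serves from http://localhost:8080/jenkins, the job lives under $HOME/.jenkins, and jenkins-cli.jar has already been downloaded from the server; the JaCoCo element name and threshold values are assumptions about what your job's config.xml contains:

#!/bin/sh
# Hypothetical server URL and job name; substitute your own.
JENKINS_URL=http://localhost:8080/jenkins
JOB=my-app-build

# Edit the job's on-disk configuration, e.g. raise a JaCoCo line-coverage
# threshold (element name assumed from the JaCoCo publisher's XML format).
sed -i 's|<minimumLineCoverage>70</minimumLineCoverage>|<minimumLineCoverage>80</minimumLineCoverage>|' \
    "$HOME/.jenkins/jobs/$JOB/config.xml"

# Jenkins reads config.xml only at startup, so make it re-read the files now.
java -jar jenkins-cli.jar -s "$JENKINS_URL" reload-configuration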
I want to use Jenkins and store the configuration and the pipeline in my SCM (e.g. Git). To do so, I created a directory, let's say "jobs", in the root of my project, where I will store jobs.groovy files written as Job DSL plugin files.
Should I do all the things in a single job file, like fetching the source code, testing it, maybe building Docker images if necessary, and then deploying on the AWS cloud? Or should I create a different job for each operation? If so, how can I create a pipeline out of these job files?
Look at the Jenkins Configuration as Code plugin. The following link may be helpful:
https://github.com/tomasbjerre/jenkins-configuration-as-code-sandbox
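To illustrate the configuration-as-code idea, this is roughly how the plugin gets pointed at a YAML file kept in your repository. A hedged sketch: the file name, mount paths, and image tag are assumptions, and the CASC_JENKINS_CONFIG variable only works once the Configuration as Code plugin is installed in the image:

# Run Jenkins and point the Configuration as Code plugin at a YAML file
# checked out from SCM; jenkins.yaml and the paths are placeholders.
docker run -d -p 8080:8080 \
    -e CASC_JENKINS_CONFIG=/var/jenkins_conf/jenkins.yaml \
    -v "$(pwd)/jenkins.yaml:/var/jenkins_conf/jenkins.yaml" \
    jenkins/jenkins:lts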
I would like to share the byproducts of one Jenkins job with another one that runs after it.
I am aware that I can set "use custom workspace", but that would merge the jobs together, which is not what I want. I just need to move a few files to a location that the next job reads from.
So far I can't find out how to actually tell a Jenkins job to look in a specific folder, since Jenkins has no concept of the file system beyond what is going on in the job's workspace folder.
Is there a way to access the host file system, or declare a shared folder inside Jenkins (like the main workspace folder, which contains all the other jobs' workspaces?), so I can copy files to it and read them from different jobs?
Where possible I would like to avoid plugins and extras; I would like to use what is included with Jenkins base.
I realize you want to avoid plugins, but the Jenkins-y way to accomplish this is to use the Copy Artifacts plugin, which does exactly what you want.
There are a variety of problems that you may run into when trying to manage the filesystem yourself. (How do you publish to a common location when running on different build nodes? How do you handle unsuccessful builds?) This solution uses Jenkins to track builds and artifacts. In the absence of a separate artifact repository, it's a lot better than trying to manage it yourself.
To use Copy Artifacts:
As a Post-Build step, choose "Archive Artifacts" in the first job and enter the path(s) to the generated files.
Then in the second job, add a "Copy Artifacts from another project" build step to grab some or all files marked as artifacts in your first job. (By default, Jenkins will re-create the paths of the generated files in the second job's workspace, which may or may not be what you want, but you can change this behavior.)
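As a side note, anything archived in the first step is also exposed over Jenkins' HTTP interface, which is handy for checking what the first job actually published. A sketch with an assumed server, job name, and artifact path:

# Fetch an archived artifact from the last successful build of the first job.
# "upstream-job" and the artifact path are placeholders.
curl -O "http://localhost:8080/job/upstream-job/lastSuccessfulBuild/artifact/output/result.zip"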
Configure Jenkins to run a Maven build, and deploy your artifacts with "mvn clean deploy". This will push them to an artifact server, which you probably already have; if not, you will need to add/configure one.
Then in your downstream job, also a Maven job, you configure it to depend on the same artifact that was published in the upstream job. This will trigger a download of the artifact from the artifact server and make it available to the build.
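For example, the upstream job's build step boils down to something like the following, assuming the target repository is configured in your pom.xml's distributionManagement section or overridden on the command line as shown here (a sketch; the repository id and URL are placeholders):

# Build and push the artifacts to the artifact server. The
# altDeploymentRepository syntax (id::layout::url) is for maven-deploy-plugin
# 2.x; normally the target comes from <distributionManagement> in pom.xml.
mvn clean deploy \
    -DaltDeploymentRepository=internal::default::http://artifacts.example.com/repository/snapshots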
I have 15 Jenkins jobs configured in order to implement a specific flow. I am improving and editing these jobs as time goes by.
Is there a way to save all these jobs and their configurations to a repository, or at least export the jobs, save them, and import them when needed?
There are two plugins that will help you save Jenkins jobs, "SCM Sync Configuration" and "JobConfigHistory", both documented on the wiki.jenkins-ci.org website:
SCM Sync Configuration Plugin (which keeps the config in a SCM repository)
or
Job Config History Plugin (Saves copies of all job and system configurations)
The Job DSL Plugin allows you to define jobs in a DSL and store the DSL scripts in an SCM repo. The DSL increases the readability of the config files compared with the XML format.
For an intro, see the slides and video from the Configuration as Code: The Job DSL Plugin talk at the Jenkins User Conference 2015 in London.
You can move/copy jobs to another destination simply by copying the job's directory (the default path for job directories is /var/lib/jenkins/jobs).
You can get more info here - https://wiki.jenkins-ci.org/display/JENKINS/Administering+Jenkins
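A hedged sketch of that copy, assuming the default path above, followed by a reload so Jenkins notices the new directory:

# Clone an existing job by copying its directory (paths are the defaults;
# adjust for your installation, and mind file ownership).
cp -r /var/lib/jenkins/jobs/old-job /var/lib/jenkins/jobs/new-job

# Make Jenkins pick up the new job without a restart.
java -jar jenkins-cli.jar -s http://localhost:8080 reload-configuration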
The Workflow feature may let you write your entire process as one (Groovy) script, which you can then maintain in your version control system alongside other sources.
I have two Jenkins instances, both masters. Each has 5 slave Jenkins nodes. I have one job on the first Jenkins that needs to be cloned for each job.
I can clone the job on the first Jenkins and its slaves, but not on the second master Jenkins. Is there a way to clone a job from one Jenkins to another?
One more question: can I archive the job at some defined location other than the master Jenkins, maybe on a slave?
I assume you have a job called "JOB" on "Jenkins1" and you want to copy it to "Jenkins2":
curl JENKINS1_URL/job/JOB/config.xml | java -jar jenkins-cli.jar -s JENKINS2_URL create-job JOB
You might need to add a username and password if you have turned on security in Jenkins. The jenkins-cli.jar is available from your $JENKINS_URL/cli.
Ideally you should make sure you have the same plugins installed on both Jenkins1 and Jenkins2. The more similar you can make the two Jenkins masters, the fewer problems you will have importing the job.
For the second part of your question: slaves don't store any Jenkins configuration. All configuration is done on the master. There are a lot of backup plugins; some back up the whole Jenkins, some back up just job configurations, and some back up individual jobs, export them to files, or even store/track changes in an SCM such as SVN.
So "archiving job configuration to a slave" simply makes no sense. But at the end of the day, a job configuration is just an .xml file, and you can take that file and copy it anywhere you want.
As for the first part of the question, it's unclear what you want. Do you want to clone a job automatically (as part of another job's process), programmatically (through some script) or manually (through the UI, other means)?
Edit:
Go to your JENKINS_HOME directory on the server filesystem, navigate to the jobs folder, then select the specific job folder that you want.
Copy the config.xml to the other server; this will create the same job with the same configuration (make sure your plugins are the same)
Copy the whole job_name folder if you want to preserve history, builds, artifacts, etc
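A sketch of both variants, with hypothetical host names and the default job paths:

# Variant 1: configuration only - recreate the job on the second server.
scp server1:/var/lib/jenkins/jobs/my-job/config.xml .
java -jar jenkins-cli.jar -s http://server2:8080 create-job my-job < config.xml

# Variant 2: everything - copy the whole job folder (history, builds,
# artifacts), then make the second server re-read its jobs from disk.
rsync -a server1:/var/lib/jenkins/jobs/my-job server2:/var/lib/jenkins/jobs/
java -jar jenkins-cli.jar -s http://server2:8080 reload-configuration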