Is there a way to store the whole build.xml file within Jenkins?

Current Setup: Ant deployments to Salesforce using Jenkins for CI. Builds pull from a Bitbucket repository with a parameterized build.xml stored in the repo. The build properties are set individually for each job in Jenkins.
The Problem: Build properties could be bypassed if someone changes the build.xml so that it no longer references the variables set in Jenkins. Developers have the ability to change the test level, which we would like to prevent. Also, if we ever need to modify the build.xml, we don't want to have to cascade the changes across all of our branches.
Is there a way to remove the build.xml file entirely from the repo and store everything in Jenkins?

I think you are looking for the Config File Provider plugin:
https://wiki.jenkins.io/display/JENKINS/Config+File+Provider+Plugin
I have only played with it a little, but I think it will do exactly what you want. You will just have to copy the file into the workspace before you run your build.
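For a Pipeline job, the plugin exposes a configFileProvider step that copies the managed file into the workspace for you; a minimal sketch, where the fileId 'salesforce-build-xml' is an illustrative name you would assign when creating the managed file in Jenkins:

```groovy
// Sketch only: assumes the Config File Provider plugin is installed and that a
// managed file with id 'salesforce-build-xml' (illustrative name) exists.
node {
    checkout scm   // pull the source from Bitbucket as before
    configFileProvider([configFile(fileId: 'salesforce-build-xml',
                                   targetLocation: 'build.xml')]) {
        // build.xml now lives in the workspace but never in the repo,
        // so developers cannot edit it to bypass the Jenkins-set properties.
        sh 'ant -f build.xml deploy'
    }
}
```

For a freestyle job, the plugin offers a similar "Provide Configuration files" option in the Build Environment section that materializes the file before your Ant step runs.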

Related

Jenkins workspace root getting overwritten or not accepted

I am implementing Jenkins into an already established Perforce workflow.
Each of the workspaces we have in Perforce (and there are a lot of them) uses a drive letter (for example D:\) as the root directory for the workspace.
I am using the P4 Plugin in Jenkins to sync the code before running the actual scripts, and Jenkins has its own workspace which is used every time I start to sync the code.
I tried using the Spec File workspace behavior in the P4 Plugin, where I would specify the root to be D:\, but whenever it loads it still creates the Jenkins workspace root.
I also tried using the Static workspace behavior, and that works, but the problem is that for that workflow someone needs to create a workspace manually on the Jenkins worker and then create the job, which defeats the purpose of using Jenkins in the first place. Plus we need a workspace per job.
That made me think that if I used an already existing workspace with D:\ as the root, together with the Temp workspace behavior in Jenkins, it would copy the root setting along with the other ones. But unfortunately it also sets the sync to go to the Jenkins workspace.
In short, all I want is to be able to use the D:\ drive to sync all the code instead of putting it into the Jenkins root directory and syncing the code to the project folders inside (ex C:\JenkinsData\syncProject...).
That's the design of the P4 plugin: it puts the workspace where Jenkins asks it to.
See property jenkins.model.Jenkins.workspacesDir here: https://wiki.jenkins.io/display/JENKINS/Features+controlled+by+system+properties
I don't think the default in that wiki is correct.
On your master and all your slaves, you can try changing that to just D:\
That assumes your client view definitions (right-hand side) will not overlap.
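That property is a JVM system property passed when the Jenkins controller starts; a sketch of what that could look like (the exact mechanism varies by install - on a Windows service install it would go in jenkins.xml's `<arguments>` instead):

```shell
# Illustrative only: point Jenkins' workspace root at D:\ via the system property.
java -Djenkins.model.Jenkins.workspacesDir='D:\' -jar jenkins.war
```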
Otherwise:
A "form-in client" trigger script can alter the root. The script should only change Jenkins-relevant clients, so you'll need to pass something to the script in the trigger definition to signify that it is for a Jenkins job. Examples could be a client naming convention and/or the client IP.
Your Perforce Admin, if that's not you, will have to assist.
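A rough sketch of such a trigger script, where the "jenkins-" client naming convention, the script path, and the trigger name are all assumptions you would agree with your Perforce admin:

```shell
# Sketch of a Perforce 'form-in client' trigger body. A trigger table entry
# (edited via 'p4 triggers') might look like:
#   fix-jenkins-root form-in client "/p4/scripts/fix_root.sh %formfile%"
# The 'jenkins-' naming convention below is an assumption - use whatever
# marker identifies your Jenkins clients.

fix_root() {
    formfile="$1"
    # Pull the client name out of the spec form Perforce hands the trigger
    client=$(sed -n 's/^Client:[[:space:]]*//p' "$formfile" | head -n 1)
    case "$client" in
        jenkins-*)
            # Rewrite the Root: field so Jenkins clients sync under D:\
            sed -i 's|^Root:.*|Root:\tD:\\|' "$formfile"
            ;;
    esac
}

# The trigger would invoke: fix_root /path/to/%formfile%
```

Non-Jenkins clients fall through the case statement untouched, so developers' own workspaces keep their roots.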

Export file generated by a project

I have a Java project that is running as a job on a Jenkins server. The project generates a file that is stored locally on the server in the respective project folder. Currently, to get hold of this file, I log into the Jenkins server and fetch it manually. I would like to make this file available for download directly through the Jenkins job somehow. Not sure if there is a way to do that, though. Is there a plugin that might add this functionality, or is there any other way to achieve that?
Archive the file. For a freestyle job, that would be a "Post-build Action". For a Pipeline job, use archiveArtifacts.
You definitely don't want to rely on it being in a workspace on the Jenkins agent machine somewhere. Workspace directories can move, change, or be removed.
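A minimal Declarative Pipeline sketch of that, where 'target/report.txt' stands in for whatever file your build actually produces:

```groovy
// Sketch only - adjust the build step and the artifact path to your project.
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                sh 'mvn package'   // illustrative build step
            }
        }
    }
    post {
        success {
            // Makes the file downloadable from the build's page in Jenkins
            archiveArtifacts artifacts: 'target/report.txt', fingerprint: true
        }
    }
}
```

Once archived, the file shows up under "Build Artifacts" on each build's page, with a stable download URL.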

Jenkins ArtifactDeployer simply creates a new dir in base dir?

I am in Jenkins and using ArtifactDeployer. The console output tells me that the remote repo is http://myrepo, but all it does is create a new folder in my base directory, which I also specify in this plugin. It correctly finds only one file to copy, but strangely it just creates a new directory and copies the file in there. I thought this would enable me to deploy artifacts to another server... Can I do that?
No, you cannot do that with ArtifactDeployer, but there are other plugins you can use - read below:
Jenkins provides by default a feature for archiving artifacts generated by the build of a Jenkins job. These artifacts are archived in the JENKINS_HOME directory. However, this directory also contains tool configurations (global and job configurations). Therefore, there is no separation between infrastructure data, job data, and generated elements. This is often considered bad practice, and it doesn't help manageability from an administrator's point of view.
Unfortunately, it's not possible to extend the 'archived artifacts' feature to archive artifacts in a location other than JENKINS_HOME.
The main goal of the ArtifactDeployer plugin is to archive artifacts in your desired locations (other than JENKINS_HOME directory).
There are many Jenkins plugins close to ArtifactDeployer, such as the CopyArtifact plugin or the CopyArchiver plugin, for publishing artifacts from Jenkins resources (the current workspace, old builds of the same job or other jobs, ...) to remote locations with the file:// protocol.
There are also other plugins for handling other protocols such as ftp:// and ssh://.

Build multiproject Gradle on Jenkins

I have a Gradle multiproject hosted in a Mercurial repo. I would like to set up my Jenkins in such a way that if I commit changes to only one subproject, then only that subproject will be built and published to my Nexus repo.
Can somebody give me a hint? Or is it at all possible?
We sort of have this working.
We create a project in Jenkins for each Gradle subproject, and in the job configuration we build only that subproject by doing something like:
gradle clean :<subproject>:build
We still have the problem that the job is fired for all check-ins to the entire project. I would like to configure Jenkins to build only when there's a check-in to the subproject, but don't know how to specify this.
Leaving our final solution for the future here.
We created a separate Jenkins job for each subproject. Jenkins' Mercurial plugin allows you to specify "modules":
Reduce unnecessary builds by specifying a comma or space delimited list of "modules" within the repository. A module is a directory name within the repository that this project lives in. If this field is set, changes outside the specified modules will not trigger a build (even though the whole repository is checked out anyway due to the Mercurial limitation.)
This way our jobs are triggered only when a change occurs in the monitored subproject.
I guess you need to create a project in Jenkins for each subproject.
Another option would be to find a way to intercept the repo sync, see which subprojects have changed, and run the build dynamically.
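That interception idea can be sketched in shell: ask Mercurial which files a changeset touched, reduce each path to its top-level directory, and build only those subprojects. The hg invocation and Gradle task names in the comment are assumptions about a typical layout:

```shell
# Sketch: map a list of changed file paths (one per line, on stdin) to the
# set of top-level directories - i.e. Gradle subprojects - they belong to.
# Assumes each subproject is a top-level directory in the repo; adjust the
# awk field if your layout nests deeper.
changed_subprojects() {
    awk -F/ 'NF > 1 { print $1 }' | sort -u
}

# In a Jenkins build step this could drive per-subproject builds, e.g.:
#   hg status --change . --no-status | changed_subprojects | while read p; do
#       ./gradlew ":$p:build" ":$p:publish"
#   done
```

Files in the repo root (like a top-level build.gradle) produce no subproject name, so a root-only change would trigger nothing - decide whether that is the behavior you want.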

Jenkins - Is it necessary to have a repository for running a multi configuration project

I am new to Jenkins. I am trying to configure Jenkins to run my Selenium tests on multiple browsers, so I thought a multi-configuration project would be the best choice. I am using Sauce Labs to perform cross-browser testing.
My Selenium source code is in a directory on my local system; I have not uploaded it to any repository. I have configured a multi-configuration project with a custom workspace pointing to my local source code, and selected "None" in the Source Code Management section.
Now, when I build the job, it creates a workspace for each browser combination, e.g. <project workspace>\SELENIUM_DRIVER\Windows 2003firefox16 and <project workspace>\SELENIUM_DRIVER\Windows 2003internet explorer8. But the files are not copied to each of these workspaces automatically; I need to copy them into these directories manually for it to work.
Is it necessary to have a repository like SVN, CVS, or Git for this to work, or is there a way I can run the build from my local system?
For this to work, a repository is not required, but you do need some reliable way to access your artifacts and Selenium code.
I suggest you drop the artifacts on a shared drive as a preliminary step, and also keep your Selenium source code on a shared drive as a practice - this will allow you to run multiple tests from multiple machines.
Cheers
