I am just starting with Jenkins 1.487 and wanted to integrate Jenkins into my Ant project. But while configuring it, I can't find any way to make Jenkins reuse an already checked-out codebase instead of downloading a fresh copy relative to its workspace root. Is there a way to do that?
I tried to specify a custom workspace manually (where my codebase was already checked out) and clicked on 'Build Now'. The result was that it wiped out my currently checked-out code, saying:
"Checking out a fresh workspace because there's no workspace at /home/daud/Work
Cleaning local Directory ."
Not even a warning.
If you really want to build from an existing checkout somewhere on the file system, then do not use the "Source Code Management" section of Jenkins. Leave it as "None".
Go straight to the "Build" section
Click "Add Build Step"
Select Invoke Ant"
Click Advanced
And under "Build File", provide a full path to the ant build file on your file system. You would have to include the drive letter (if on Windows) or a leading / (if on Linux) to break from the Workspace (by default, this path is relative to Workspace). Or use a lot of ../../../.. if needed.
But like others have said, this is not the way a CI system is supposed to be used.
The idea behind Jenkins and CI is that it works on a fresh copy of the codebase. A build done by Jenkins should not depend on any external preconditions, and it should be reproducible.
You might want to try using the Clone Workspace SCM Plugin for Jenkins. It will allow you to zip up the workspace from one job and use it to create the workspace for another one. I've used this for downstream jobs that need to act on the work from a previous job.
This is also helpful if you're using something like Git for source control and want to avoid a second Git clone (or SVN checkout). Furthermore, you can limit the contents of the zip file that is used to recreate the workspace, for example to avoid carrying unnecessary files (e.g. the .git or .svn directories) downstream.
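For illustration, the plugin's archive step takes Ant-style glob patterns, so a configuration along these lines could keep the zip lean (the field labels and patterns here are from memory and should be treated as assumptions):

Files to include in cloned workspace: **/*
Exclude files: .git/**, .svn/**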
I am implementing Jenkins into an already established Perforce workflow.
Each of the workspaces we have in Perforce (and there are a lot of them) uses a drive letter (for example D:\) as the root directory for the workspace.
I am using the P4 Plugin in Jenkins to sync the code before running the actual scripts, and Jenkins has its own workspace which is used every time I start to sync the code.
I tried using the Spec File workspace behavior in the P4 Plugin, where I would specify the root to be D:\, but whenever it loads it still creates the Jenkins workspace root.
I also tried using the Static workspace behavior, and that works, but the problem is that for that workflow someone needs to create a workspace manually on the Jenkins worker and then create the job, which defeats the purpose of using Jenkins in the first place. Plus we need a workspace per job.
Which made me think: if I use an already existing workspace with D:\ as the root and use the Temp workspace behavior in Jenkins, it would copy the root setting as well as the others. But unfortunately it also sets the sync to go to the Jenkins workspace.
In short, all I want is to be able to use the D:\ drive to sync all the code, instead of putting it into the Jenkins root directory and syncing the code into the project folders inside it (e.g. C:\JenkinsData\syncProject...).
That's the design of the P4 plugin: it puts the workspace where Jenkins asks it to.
See property jenkins.model.Jenkins.workspacesDir here: https://wiki.jenkins.io/display/JENKINS/Features+controlled+by+system+properties
I don't think the default in that wiki is correct.
On all your masters and slaves, you can try changing that to just D:\.
That assumes your client view definitions (right-hand side) will not overlap.
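For instance, a hedged sketch of overriding that property when launching Jenkins (the exact launch command depends on how your instance is started and is an assumption here):

java -Djenkins.model.Jenkins.workspacesDir=D:\ -jar jenkins.war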
Otherwise:
A "form-in client" trigger script can alter the root. The script should only change jenkins relevant clients, so you'll need to pass something to the script in the trigger definition to signify that it is for a jenkins job. Examples could be a client naming convention and/or the clientip.
Your Perforce Admin, if that's not you, will have to assist.
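As a rough illustration (the trigger name, script path, and jenkins- naming convention are all assumptions), an entry in the Perforce triggers table could look like:

fix-jenkins-root form-in client "python /p4/scripts/fix_root.py %formfile%"

The script would then rewrite the Root: field of the submitted client spec, but only for clients matching the agreed Jenkins naming convention.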
I am using the cleanup feature in Jenkins, which deletes the previous build directory and creates a new one every time. This is great, except that I need to keep certain files in the build directory, so I am trying to delete just the source folder that contains the code.
The problem is that when the build starts, the first thing that happens is the Git checkout of the code, which means that if I put a delete command in the Jenkins script area, it will delete the directory that was just checked out, and that obviously won't work.
Is there a way to tell Jenkins to perform commands before the Git checkout happens? Or to clean up the build folder selectively, so Jenkins knows what to keep and what to delete?
Use the pre-scm-buildstep plugin. It will let you do all sorts of things before Jenkins touches your SCM.
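For example, a pre-SCM shell step along these lines could drop just the source folder and leave the rest of the build directory intact (the src folder name is an assumption for illustration):

# runs before the Git checkout, so the fresh clone is untouched
rm -rf "$WORKSPACE/src"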
I have renamed a Jenkins job from the Jenkins GUI: I changed the project name on the Configure page and hit Save afterwards.
However, the workspace for this Jenkins job has not been renamed. What I am finding is that upon job execution a new workspace is created with the given new name, and none of the contents of the old workspace are copied over.
So the issue is that the contents of the old workspace are not copied to the new workspace.
What should I do instead?
I know there are several questions on SO in this area; however, those do not answer my question:
Renaming job in jenkins/hudson
Rename a job in Jenkins
So please check these before marking this question as a duplicate.
I was able to work around this issue using the Use custom workspace option.
To change this location, choose Configure on the job and click the Advanced button in the Advanced Project Options section.
After opening the settings, you will find some more configuration options for your job. Look for the Use custom workspace option on the right-hand side and check the box.
Reference: Jenkins: Change Workspaces and Build Directory Locations
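For example, pointing the renamed job back at the old workspace directory (the path below is an assumption; the default workspace root depends on your installation):

Use custom workspace: /var/lib/jenkins/workspace/OldJobName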
Workspaces are volatile by nature and may reside on a build node that has gone offline, so your build job should not rely on files being present in the workspace. However, you will sometimes get a speed-up by reusing unchanged files already in the workspace and may decide not to clean them.
When you start a build, a new workspace is (as you noted) created. This is the correct behavior: you should not need to store files in your workspace between builds, but should set up your job to load all sources from your VCS. That way you will always be able to make a fresh build from source; there are also a few options available to clear old files from the workspace.
If you do not want to populate the workspace from a source-control plugin, you can always use a shell build step to run a few commands that copy the needed files.
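A minimal sketch of such a step, assuming a Unix node and a hypothetical location for the files you want to carry over:

# hypothetical source path -- copy the needed files into the fresh workspace
cp -r /path/to/saved/files/. "$WORKSPACE/"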
I have a repo that has two subfolders, $/Repo/project and $/Repo/thirdparty. I need to pull both of those into Jenkins for a single build. Naturally I tried just pulling $/Repo, but this gives me a bunch of other projects along with false polls (it will build every time ANYTHING is checked into $/Repo). I have tried the Multiple SCMs plugin, which works but does not save the configuration (annoying, but not unusable). I tried using the regular TFS plugin and manually putting the calls for the other repo into a Windows command (this did not work even though I bound them to different folders).
What is the best way to approach this? Some sort of subjob that pulls thirdparty? Fix the Multiple SCMs plugin? Is there some TFS command or trigger to pull a different repo when you pull a project?
I was able to get this working with a job pipeline. It's kinda hacky, but it works.
The program I'm trying to build uses $/Department/Framework/Main (as workspace\Framework), and $/Department/Products/TheProgram/Main (as workspace\TheProgram).
I created three jobs in Jenkins, each "downstream" of the other:
Framework-Get: normal source code triggering on TFS' Project Path of $/Department/Framework/Main. No build step.
TheProgram-Get: normal source code triggering on TFS' Project Path of $/Department/Products/TheProgram. No build step.
TheProgram-Build: no source code control, but the build steps xcopy the source from the two jobs above. Then you can run a normal build step.
TheProgram-Build's first build step is a windows batch command:
REM ====================================
REM First Get the Framework folder:
rmdir /s/q Framework
mkdir Framework
xcopy /y /q /e ..\..\Framework-Get\Workspace\Framework Framework
REM ====================================
REM Then Get the TheProgram Folder:
rmdir /s/q TheProgram
mkdir TheProgram
xcopy /y /q /e ..\..\TheProgram-Get\Workspace\TheProgram TheProgram
The second build step was a simple call to Ant, but you could use MSBuild or whatever you like here.
The TFS plugin for Jenkins currently does not support checking out sources from multiple locations. The Multiple SCMs plugin might be the answer, but as you pointed out in the question, it's really not an option at this point. There are really, as far as I can see, only two possible solutions for you to try:
Create a workspace within TFS that will include all the necessary imports. I use this functionality in my everyday encounters with TFS, although I have never had a chance to use it with the Jenkins plugin. It might work, it might not.
You can use - and please, this is a quite serious option, at least for me - Git. There is git-tfs, which lets you import all of the required projects into a Git repository. Having them in Git will open up a bunch of possibilities for you, including using separate repos for every folder, Git submodules, externals... and so on. So, at least for me, it is a valid option, although it seems like an ugly workaround at first glance...
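For illustration, a git-tfs import of the two folders from the question might look like this (the collection URL is an assumption):

git tfs clone http://tfs-server:8080/tfs/DefaultCollection $/Repo/project
git tfs clone http://tfs-server:8080/tfs/DefaultCollection $/Repo/thirdparty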
The TFS plugin supports the ability to cloak folders in your $/Repo that you are not interested in. Check-ins to cloaked folders will not trigger a build. Unfortunately that may be a lot of folders when you are only interested in two - you would need to maintain the list of cloaked folders as new ones are added.
We avoid the TFS plugin and instead script the setup of our TFS workspaces in a PowerShell step using the tf command line. Each build specifies the folders it wants, and the script takes care of cloaking/uncloaking the remainder.
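As a sketch, cloaking a folder from the tf command line looks roughly like this (the workspace name and folder are assumptions):

tf workfold /cloak "$/Repo/SomeOtherProject" /workspace:JenkinsWS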
My solution to this is to create two jobs: one that just downloads your dependency, and another that makes the build.
In my case I managed the build with Maven properties, for example:
pom.xml
<properties>
    <my.dir>../MyDir</my.dir>
</properties>
Jenkins Build
Goals: clean package -U -Dmy.dir=${WORKSPACE}/../../another-build/workspace/MyDir
I had to create a workaround myself for Jenkins. This was achieved using both TF and the PowerShell snap-in Microsoft.TeamFoundation.PowerShell.
Basically the workflow is as follows (a rough command-line sketch follows the list):
Get-TfsWorkspace (PowerShell: to check whether the workspace exists)
TF Workspace /new (to create a workspace)
TF Workfold /unmap (to remove the default $/ mapping made during workspace creation)
TF Workfold /map (to map specific locations, e.g. $/Repo/project)
TF Scorch (to remove any stale artifacts, if there are any)
TF Get (to get the code)
There may be other methods that people use, but this approach also allows you to use the TF Workfold /cloak functionality.
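A minimal sketch of that sequence as a Windows batch step (the collection URL, workspace name, and mapping are all assumptions for illustration):

REM Create a build workspace (name and collection URL are hypothetical)
tf workspace /new JenkinsWS /collection:http://tfs-server:8080/tfs/DefaultCollection /noprompt
REM Remove the default $/ mapping created with the workspace
tf workfold /unmap $/ /workspace:JenkinsWS
REM Map only the folders this build needs
tf workfold /map "$/Repo/project" ".\project" /workspace:JenkinsWS
REM (scorch, from the TFS Power Tools, could be run here to remove stray artifacts)
REM Finally, fetch the code
tf get . /recursive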
Cheers,
Hope this helps.
Can confirm that Multiple SCM 0.5 works with the Team Foundation Server plug-in 4.0.
The polling does seem to break, however.
I'm already finishing my project build automation :) with Hudson and NAnt.
My project structure is something like:

$/Project
    build.scripts
        script1.build
        script2.build
        build.properties.xml
    Code
        Project1
        Project2
So Hudson downloads everything from the root $/Project into the workspace folder.
And everything is OK: since the build scripts are in the workspace, I can run them very easily. However, what is bugging me is that, because the build scripts are inside the workspace, I can't schedule Hudson to run automatically, either based on time or on changes, because it will always detect changes to the files (note build.properties.xml, which I check out and check in at build time to store some stats).
Where do you recommend these files go so that I still get the advantage of having them source-controlled?
What I ended up doing was to NOT check in changes to those files. I changed my CI workflow to write the changes to another file, local to the workspace only.
This way, I still get the last build info written somewhere to pick it up, and avoid the issue of Jenkins detecting the change.
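As a minimal sketch (the file name and contents are assumptions), a build step could write the stats to a workspace-local file that is never checked in:

REM write build stats locally instead of checking in build.properties.xml
echo lastBuild=%BUILD_NUMBER% > build.stats.local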
PS: I changed from Hudson to Jenkins since I saw that most plugins moved away from the former. The transition was too smooth to be true.