I'm using Jenkins on Windows with the Windchill RV&S (Integrity) configuration management plugin. I've noticed that from time to time it deletes the #tmp directories for project workspaces but doesn't recreate them. For example, if I have this directory structure:
C:\PTC\workspace\Project1
C:\PTC\workspace\Project1#tmp
Then sometimes the build will fail when C:\PTC\workspace\Project1#tmp goes missing. I don't know why Jenkins deletes it. Maybe it happens when I enable the cleanCopy option in my pipeline? The build failure message is usually something like this:
C:\PTC\workspace\Project1\build>make
..\..\path\to\compiler ..\..\path\to\source_file.c
#error compiler ..\..\path\to\source_file.c:0 can't create C:\PTC\workspace\Project1#tmp\s634.cx1
..\..\path\to\source_file.c:
make: Error code 1
So I re-create C:\PTC\workspace\Project1#tmp manually and the build works again. How do I tell Jenkins to create these #tmp directories automatically?
Edit:
My builds are happening daily (#midnight). They'll work fine for about a month or so, and then on the same day, all the #midnight builds will fail. Each #midnight build will lose its respective #tmp directory and every file in its workspace will be gone (but all the directories will remain). To recover from this, I have to manually recreate the #tmp directory and specify the cleanCopy option (otherwise Windchill RV&S won't restore the missing files in the empty workspace directories).
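For reference, automating that manual recovery in my pipeline would look roughly like this (just a sketch assuming a scripted pipeline on a Windows node; I've left out the Windchill RV&S checkout step because its exact pipeline syntax depends on the plugin version):
node {
    // Recreate the #tmp directory if Jenkins has removed it
    bat 'if not exist "C:\\PTC\\workspace\\Project1#tmp" mkdir "C:\\PTC\\workspace\\Project1#tmp"'
    // ... Windchill RV&S checkout (with cleanCopy enabled) and the make build go here ...
}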
Related
I have many Jenkins jobs using a shared library
@Library('my_shared_library') _
In the folder of each build of each project using the library, there's a local copy of it, for example:
jenkins/jobs/my_project/jobs/builds/10/libs/my_shared_library
That means it's copied into each build's history.
Plus, I also see it in the project's workspace, under:
jenkins/workspace/my_project#libs/my_shared_library
In this last directory there are sometimes several copies of it, like:
my_shared_library
my_shared_library#2
my_shared_library#tmp
...
The question is: is there some special way of handling this so that there are fewer copies? And why is there a copy in the project's workspace as well as one in the build history?
This ends up creating lots of files, which caused my Jenkins to run out of inodes.
I am confused about how to move the "Build Record Root Directory". Right now this is my configuration:
So on disk, for a multibranch pipeline it looks like this:
Workspace:
Jobs:
So for my job "WindToolsService" the configuration ${ITEM_ROOTDIR}/builds results in the following path:
C:\Jenkins\jobs\WindToolsService\branches\develop\builds
Now I want to move the entire C:\Jenkins\jobs directory to a new disk, what is the correct setting to use?
We have tried the following, but it didn't work; the C:\Jenkins\jobs folder was recreated when I tried to build a job.
Copy the contents of “C:\Jenkins\jobs” to “E:\builds”
Rename “C:\Jenkins\jobs” to “C:\Jenkins\jobs_temp” to make sure new setting works
Restart Jenkins
What am I missing here?
Update
Ok - this is not really possible, at least the way I want to do it.
The original path structure is C:\Jenkins\jobs\JOBNAME\branches\BRANCHNAME\builds, but the new path is E:\builds\JOBNAME\BRANCHNAME\builds. This means I can't just change the setting and cut and paste the contents of the jobs folder, because the directory structure is different. Also, the jobs folder contains other job information apart from the build records.
It would be cleaner and easier just to move the entire Jenkins installation.
The jobs folder contains some other information besides the build history; for example, it contains the actual job definition (more info here). Additionally, the option Build Record Root Directory only affects the builds located in the job folder. So you will need to keep the jobs folder and all the sub-folders for the jobs (but not the builds folder inside each job folder).
Apparently Jenkins expands the variables ITEM_FULL_NAME and ITEM_ROOTDIR differently: for the develop branch above, ITEM_ROOTDIR points at C:\Jenkins\jobs\WindToolsService\branches\develop, while ITEM_FULL_NAME expands to WindToolsService/develop. This means that it's not a simple matter of copying the jobs folder to the new location, especially when you use multibranch pipelines, matrix jobs, organisation folders or normal folders. For more info see the discussion in the comments.
Anyway, with that in mind, here is an updated answer (old one is below):
Since it isn't as straightforward as copying a folder (the folder structure needs to be transformed), it is best to let Jenkins do the move. The following Groovy script moves the builds folders to the correct place by expanding the different macros. The script is executed in the Script Console (Manage Jenkins -> Script Console). Change newRootBuildsFolder to whatever location you need to move to.
import jenkins.model.Jenkins
import hudson.FilePath
import hudson.model.Job
import hudson.model.ItemGroup

// Destination root for the moved build records (note the escaped backslash)
newRootBuildsFolder = new File("E:\\builds")

def visitJob(job) {
    if (job instanceof Job) {
        // Current builds directory of this job
        def oldBuildsFolder = job.getBuildDir()
        // Target directory, mirroring the new setting E:\builds\${ITEM_FULL_NAME}\builds
        def newBuildsFolder = new File(newRootBuildsFolder, Jenkins.expandVariablesForDirectory('${ITEM_FULL_NAME}/builds', job.getFullName(), job.getRootDir().getPath()))
        new FilePath(oldBuildsFolder).copyRecursiveTo("**", new FilePath(newBuildsFolder))
    }
    if (job instanceof ItemGroup) {
        // Recurse into folders, multibranch projects, etc.
        for (item in job.getItems()) {
            visitJob(item)
        }
    }
}

visitJob(Jenkins.instance)
println "Done"
Please note that this may take several minutes or hours if the instance is big with loads of build history. You can monitor the progress of the script through the file system, i.e. go to the new builds folder and check which jobs have been moved.
After this script has been executed, the builds folder will need to be updated in the Jenkins configuration:
Go to Manage Jenkins -> Configure System and change Build Record Root Directory to E:\builds\${ITEM_FULL_NAME}\builds
Restart Jenkins
OLD ANSWER:
Now, with that in mind, the following should work:
Wait until no builds are ongoing, and prevent new builds from starting during the remainder of these procedures.
Copy the contents of C:\jenkins\jobs to E:\builds.
Go to Manage Jenkins -> Configure System and change Build Record Root Directory to E:\builds\${ITEM_FULL_NAME}\builds
Restart Jenkins
Start a new build and ensure that the new build information is written to the folder under E:\builds\. E.g. if we have the job X and the previous build was 123, then when we start a new build for X (build 124), the folder E:\builds\X\builds\124 should be created and the folder C:\jenkins\jobs\X\builds\124 should not.
Later on, when you see that everything is working, you can delete the old builds folders under C:\jenkins\jobs\**\.
I am trying to prevent a particular file from being included in the build. It is a Thumbs.db file, which Windows creates automatically whenever images are present in a folder. My solution contains the image folder, so whenever the build is triggered from Jenkins, the Thumbs.db file is created.
Is there any way I can stop the Thumbs.db file from being created via Jenkins?
I can turn Thumbs.db creation off in Windows, but I have to do that every time a build is created from Jenkins, so I want to stop the file from being created in the first place.
Below is the job creation flow in my Jenkins for the current project:
SCM
I have used the Team Foundation Server Plugin and have mentioned my SERVER URL and PROJECT PATH
POST BUILD steps simply copy the build to the staging location (folder location)
Any help is appreciated.
Thanks,
AFAIK, on Windows you can only turn it off or on for the whole system, not for particular folders.
So you have to choose: on or off.
If you still want to keep it on, you'll have to handle this in your Jenkins build scripts: remove any Thumbs.db files after the build is done.
If your output is an archive, make sure to exclude Thumbs.db from it; most archive tools support an exclude flag.
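If the job is (or becomes) a Pipeline job, a minimal sketch of that cleanup could look like this (the artifact pattern and the steps around it are placeholders, not your actual setup):
node {
    // ... build and staging steps ...
    // Delete any Thumbs.db files Windows created under the workspace;
    // returnStatus keeps the step from failing the build if none exist
    bat script: 'del /s /q Thumbs.db', returnStatus: true
    // Or simply keep them out of the archived artifacts
    archiveArtifacts artifacts: '**/*', excludes: '**/Thumbs.db'
}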
I hope this helps.
Sorry for the confusing title. I am totally new to Jenkins and have been handed a Jenkins setup to maintain that was set up by someone else.
This is a Jenkins master/slave configuration: I have 1 master and 3 slaves.
When I create a new job by "copying an existing" job, the new job works fine with no issues.
QUESTION: I see that in the Jenkins workspace, this new job creates a folder with the name of the original job it was copied from. Why does it not create a folder with the name of the new job instead?
Now, this is certainly not a show stopper for me, but it seems that Jenkins creates a folder in the workspace for each job that is run, and this particular folder is causing some confusion (even if only a notional one).
Hence, could you help me find out why the new job creates a workspace folder with the name of the original job it was copied from?
BTW, the above issue was seen on a Jenkins slave.
It can be solved by configuring the correct build workspace in the Jenkins job:
General > Advanced > Custom workspace > "give your correct workspace"
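For completeness, if the copied job were a Pipeline job, the same idea (pinning the job to an explicit workspace instead of the default one derived from the job name) could be sketched like this; the label and path are only placeholders:
pipeline {
    agent {
        node {
            label 'windows'
            // Explicit workspace instead of the default derived from the job name
            customWorkspace 'C:\\Jenkins\\workspace\\my_new_job'
        }
    }
    stages {
        stage('Build') {
            steps {
                echo "Building in ${env.WORKSPACE}"
            }
        }
    }
}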
I had the same problem:
I copied some Jenkins project and wondered about hard-coded workspace paths.
Console output of the copied project; the job failed due to the missing D: drive:
12:30:44 java.io.IOException: Failed to mkdirs: D:\TEAMS\WORKSPACE\RELEASE_1_1
The problem I had: the 'Advanced project options' section was not expanded, and the configure GUI had such an enormous width that I didn't see the button to expand and show the 'advanced' settings.
In fact (thanks to sti): the original project had a hard-coded workspace path.
One possibility is that you accidentally triggered the wrong job. You could change the job to print the directory where it executes by adding something like:
echo "XXX $JOB_NAME running in directory $WORKSPACE"
into the build step script. Then look for XXX in the build console log.
Second possibility is that you found an old workspace of the original job. Jenkins leaves workspaces lying around just in case it needs them again so it does not have to make them from scratch.
Third possibility is that the original job is configured to use a hard-coded path as its workspace (custom workspace). If you clone such a job, it would be a good idea to change the hard-coded path. An even better idea would be to let Jenkins manage the workspace and its naming.
And finally, if all the other possibilities have been checked, you may have found a bug. You could look for it in https://issues.jenkins-ci.org/ and create a bug report if it is a new one.
When an ANT build step fails in my build I'd like to archive the logs in order to determine the problem. The relevant logs, however, are not located in the workspace, so I have to use a full path to them.
The standard artifact archiving feature does not work well with full paths, so first I have to copy the logs into the workspace in some build step so that I can archive them later. I do not want to incorporate the copying code into the original ANT script (it does not really belong there). On the other hand, since the failing step fails the build, code that copies the artifacts into the workspace as a separate build step is never reached.
I am considering using the ANT -keep-going option, but how would I then fail the build?
Any other ideas (artifact plugins that handle full paths gracefully, for example)?
Update: I've worked around the problem by creating a symbolic link in the workspace to the directory that contains the files to be archived. Kludgy, but effective.
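For reference, the symlink workaround in Pipeline form boils down to something like this (a sketch assuming a Unix agent and a made-up external log directory /opt/build/logs; on Windows, mklink /D plays the same role):
node {
    // Expose the external log directory inside the workspace under a
    // workspace-relative path that the artifact archiver can reference
    sh 'ln -sfn /opt/build/logs "$WORKSPACE/external-logs"'
}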
I would recommend using the Flexible Publish plugin in conjunction with the Conditional BuildStep plugin.
The Flexible Publish plugin allows you to schedule build steps to run AFTER the normal build steps have finished. This lets you catch both successful and failed builds and execute something, say a script that copies the files from OUTSIDE the workspace to INSIDE the workspace. The Conditional BuildStep plugin lets you conditionalize those steps so that they only run when the build fails. Using these two plugins, you can copy the files into the workspace upon failure and then archive them with the usual Jenkins mechanisms.
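If the job were a Pipeline job instead of a freestyle one, the same idea could be sketched like this (assuming a Windows agent; the ANT invocation and log path are placeholders, and the freestyle setup described above uses Flexible Publish + Conditional BuildStep rather than a post block):
pipeline {
    agent any
    stages {
        stage('Build') {
            steps {
                // A failing ANT step still marks the build as failed
                bat 'ant -f build.xml'
            }
        }
    }
    post {
        failure {
            // Copy the externally written logs into the workspace, then archive them
            bat 'xcopy /s /y /i "C:\\full\\path\\to\\logs" "%WORKSPACE%\\failure-logs"'
            archiveArtifacts artifacts: 'failure-logs/**', allowEmptyArchive: true
        }
    }
}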