I'm already finishing my project build automation :) with Hudson and Nant.
My project structure is something like
$/Project
    build.scripts
        script1.build
        script2.build
        build.properties.xml
    Code
        Project1
        Project2
So Hudson downloads from the root $/Project to the workspace folder.
Everything is OK since the build scripts are in the workspace, so I can run them very easily. However, what is bugging me is that, because the build scripts are inside the workspace, I can't program Hudson to run automatically, either on a schedule or on changes, because it will always detect changes to the files (note build.properties.xml, which I check out and check in at build time to store some stats).
Where do you recommend putting these files so that I still get the advantage of having them source-controlled?
What I ended up doing is to NOT check in changes to those files. I changed my CI workflow to write the changes to another file that is local to the workspace only.
This way, I still get the last build info written somewhere I can pick it up, and I avoid the issue of Jenkins detecting the change.
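For illustration, here is a minimal sketch of that idea in a pipeline-style job; the file name last.build.info and the properties written to it are made up, only the point of keeping the file workspace-local (and never checked in) matters:

    pipeline {
        agent any
        stages {
            stage('Record build info') {
                steps {
                    // Write the stats to a file that lives only in the workspace
                    // and is never checked in, so SCM polling has nothing to detect.
                    writeFile file: 'last.build.info',
                              text: "buildNumber=${env.BUILD_NUMBER}\nbuildUrl=${env.BUILD_URL}\n"
                }
            }
        }
    }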
PS: I changed from Hudson to Jenkins since I saw that most plugins ran away from the former. The transition was too smooth to be true.
So I'm running into a specific issue: I have a Jenkins Declarative Pipeline (from an SVN-hosted Jenkinsfile) that is configured not to run concurrent builds and to abort the previous build when a new build is triggered.
This works perfectly fine. The problem I am running into is that Jenkins re-checks out the whole repository into a workspace directory with an @2 suffix for the subsequent build. This only happens when a build is automatically aborted after a new one is triggered; if the first build ends successfully, it reuses the same directory.
I've seen a ton of threads stating that this is by design, but from what I can see that only applies when concurrent builds are enabled. Since they aren't enabled here, I'm confused as to what could cause Jenkins not to reuse the same workspace directory.
In case the "why" matters: I have a few large repositories (for Unreal Engine games specifically) that I need to build, and as an optimization of the time spent compiling, cooking and uploading the game, it makes perfect sense to cancel old builds. Instead, Jenkins decides to do a clean checkout of 10+ GB of game code and assets (20+ GB in the case of some other games) into another folder because it can't reuse a folder that doesn't already have a job/build executing in it 😅.
Happy to accept all possible solutions/suggestions as I'm getting a lil' tired of pulling my hair out.
I was facing the same issue with my pipeline. I tried deleting the aborted builds and restarting Jenkins. I also deleted the @2 directories in my workspace and kept only the main directory. After this, I didn't face the same issue. It could be caused by the Jenkins cache. Make sure that your workspace correctly reflects the directory name mentioned in your Jenkinsfile.
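For reference, a minimal declarative sketch of the relevant options looks roughly like this; the label and workspace path are placeholders, and abortPrevious requires a reasonably recent Pipeline (workflow-job) plugin:

    pipeline {
        agent {
            node {
                label 'unreal-build'
                // Pinning a fixed workspace keeps every build of this job in one
                // directory instead of letting Jenkins allocate a @2 copy.
                customWorkspace 'D:/jenkins/ws/my-game'
            }
        }
        options {
            // A newly triggered build aborts the one still running
            // rather than queueing behind it.
            disableConcurrentBuilds(abortPrevious: true)
        }
        stages {
            stage('Build') {
                steps {
                    echo 'compile, cook and upload steps go here'
                }
            }
        }
    }

Be aware that pinning a custom workspace means the new build can enter the directory while the aborted one is still shutting down, which is exactly the overlap the @2 suffix normally protects against, so treat it as a trade-off rather than a clean fix.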
I am new to the world of scripting with TFS2015. I created a script that builds all of the projects within my solution (it is a rather large solution) and puts it out in a shared folder (where each project has its own subfolder).
I would like to create a separate script for each project that simply copies the bin folder from the shared and pastes it out on my Test environment. I rarely need to deploy everything, so the idea is one build...multiple deploys.
However, when I run my deploy script using the Copy Files step, it does another build. Although it copies the files that I expect, it does so only after a full build that creates the folder structure for the build.
Am I able to make the Copy Files step NOT do a Build?
Here are the steps that my script is currently doing:
As you can see, there is only one step (Copy Files), but it still does Get Sources and copies everything into a new folder on the build box, like so (where the number keeps incrementing with each run of the script):
I just want to copy the files from the Source to the Target and not do a build or Get Sources.
It looks like you're still on TFS 2015 RTM or Update 1, which is already pretty old technology if you compare it to the lifetime of the new build system introduced with this version.
TFS 2015 Update 2 introduced a system similar to the Build pipelines to orchestrate Releases. It doesn't require you to map any workspaces or Git repositories and can act on the artefacts of your builds or simply on the contents of file shares.
It makes sense that a Build has to build something and in order to build something, it has to get the things to build. If you're actually not building something, then you're probably deploying or releasing or packaging something else. Hence the distinction between Build and Release pipelines.
TFS 2017+ has an option to disable the syncing of sources, primarily to allow people to get the sources themselves in creative ways (e.g. a custom PowerShell script that invokes git.exe).
My primary advice would be to upgrade to TFS 2018 Update 3, or at least TFS 2017 Update 3.1, worst case TFS 2015 Update 4.1. The fact that versions older than 2015 Update 4.1 have a known XSS (cross-site scripting) vulnerability may be reason enough to convince your organisation to perform this update.
Barring that option you're left with one solution:
Link your build definition either to a Git repository with only a single commit (if I remember correctly, the 2015 agent still crashes when syncing an empty Git repo), or link it to a TFVC repository and set the workspace mappings to cloak everything. This essentially causes the build to sync an empty folder, which it can cache, before calling your PowerShell script.
I am noticing that every time I run one of my jobs in Jenkins, two files are created in the /workspace/build/distributions dir, with the extensions .tar and .tgz. Every time I run the job, another set of these files is created, so if I run the job 3 times, there will be 6 files altogether. I have noticed that these artifacts slow down the dependency check phase, so I want to remove them automatically each time before this job runs. I have attempted the configs in the image below. In addition, I have tried the Workspace Cleanup plugin, but that completely deleted the workspace, which is definitely not what I wanted.
Therefore, what would be the best way to go about this?
What SCM plugin are you using? Some of the SCM plugins allow you to clean the workspace before an update (e.g. SVN's "Emulate clean checkout" and Git's "Clean before checkout" options).
If you're not using an SCM plugin, can you remove the files in a batch/shell script during the first build step?
Or perhaps you can go about it from the reverse direction. Can you get rid of the files as the last build step of the job? That way, they are gone when the next build comes along.
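Either way, a minimal sketch of that cleanup as a dedicated first stage is below, written in Pipeline form; the build/distributions path is the one from your question, and in a freestyle job the same rm line would simply go into an "Execute shell" (or batch) build step:

    pipeline {
        agent any
        stages {
            stage('Clean old archives') {
                steps {
                    // Delete only the previous run's archives, not the whole
                    // workspace; -f keeps the step from failing when the files
                    // do not exist yet (e.g. on the very first build).
                    sh 'rm -f build/distributions/*.tar build/distributions/*.tgz'
                }
            }
            // ...the existing build stages follow here...
        }
    }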
I have renamed a Jenkins job from the Jenkins GUI: I changed the Project name in the Configure menu and hit Save afterwards.
However, the workspace for this Jenkins job has not been renamed. What I am finding is that upon job execution, a new workspace is created with the new name, and none of the contents of the old workspace are copied over.
So the issue is that the contents of the old workspace are not copied to the new workspace.
What should I do instead?
I know there are several questions in SO in this area. However those do not answer my question.
Renaming job in jenkins/hudson
Rename a job in Jenkins
So please check this before marking this question as a duplicate.
I was able to work around this issue using the Use custom workspace option.
To change this location, configure the job and click the Advanced button in the Advanced Project Options section.
After opening the settings, you will find some more configuration options for your job. Look for the Use custom workspace option on the right-hand side and check the box.
Reference: Jenkins: Change Workspaces and Build Directory Locations
Workspaces are volatile by nature and may reside on a build node which has gone offline, so your build job should not rely on files being present in the workspace. However, sometimes you will benefit from a speed-up by reusing unchanged files already in the workspace and decide not to clean them.
When you start a build, a new workspace is (as you noted) created. This is the correct behaviour: you should not need to store files in your workspace between builds, but should set up your system to load all sources from your VCS. This way you will always be able to make a fresh build from source. There are also a few options available to clear old files from the workspace.
If you do not want to populate the workspace from a source-control add-on, you can always use a shell build step to run a few commands that copy the needed files.
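For example, such a copy step might look like this in Pipeline form; the node-local path and old job name are entirely hypothetical and assume the old workspace still exists on the same machine:

    pipeline {
        agent any
        stages {
            stage('Copy from old workspace') {
                steps {
                    // Hypothetical path: reuse output left behind by the job
                    // under its previous name on this node.
                    sh 'cp -r /var/lib/jenkins/workspace/old-job-name/output ./output'
                }
            }
        }
    }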
I am just starting with Jenkins 1.487 and want to integrate Jenkins into my Ant project. While configuring it, I can't find any way to make Jenkins reuse an already checked-out codebase instead of downloading a fresh copy relative to its workspace root. Is there a way to do that?
I tried specifying a custom workspace manually (where my codebase was already checked out) and clicked 'Build Now'. The result was that it wiped out my currently checked-out code, saying:
"Checking out a fresh workspace because there's no workspace at /home/daud/Work
Cleaning local Directory ."
Not even a warning..
If you really want to build from an existing checkout somewhere on the file system, then do not use the "Source Code Management" section of Jenkins. Leave it as "None".
Go straight to the "Build" section:
Click "Add build step"
Select "Invoke Ant"
Click "Advanced"
Under "Build File", provide the full path to the Ant build file on your file system. You would have to include the drive letter (if on Windows) or a leading / (if on Linux) to break out of the workspace (by default, this path is relative to the workspace), or use a lot of ../../../.. if needed. A rough Pipeline equivalent is sketched below.
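The sketch assumes ant is on the agent's PATH and uses a hypothetical build.xml under the path from your question; there is deliberately no checkout step, mirroring the "leave SCM as None" advice:

    pipeline {
        agent any
        stages {
            stage('Build') {
                steps {
                    // No checkout step on purpose -- build the copy that is
                    // already checked out outside the Jenkins workspace.
                    sh 'ant -buildfile /home/daud/Work/build.xml'
                }
            }
        }
    }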
But like others have said, this is not the way a CI system is supposed to be used.
The idea behind Jenkins and CI is that it works on a fresh copy of the codebase. Every build done by Jenkins should not depend on any external preconditions and it should be reproducible.
You might want to try using the Clone Workspace SCM Plugin for Jenkins. It will allow you to zip up the workspace from one job and use it to create the workspace for another one. I've used this for downstream jobs that need to act on the work from a previous job.
This is also helpful if you're using something like Git for source control and want to avoid a second Git clone (or SVN checkout). Furthermore, you can limit the content of the zip file that is used to recreate the workspace, for example to avoid carrying unnecessary files (e.g. the .git or .svn directories) downstream.