We have several automated build scripts, some of which are run automatically every 2 hours, and some of which are only ever run manually.
If a build script is started manually while another is already running, it can cause problems, such as merging untested branches into the production branch.
I'd like to prevent this from happening again, and the simplest solution in my mind is to have each build script start by checking that another is not currently running.
Is there a way in Ant to directly check whether another Ant instance/script is currently running?
If not, what's the simplest way to add such a check? My first thought is a file created at the beginning and deleted at the end of a build. I'd prefer a way that handles user-cancelled builds nicely, but it's not necessary. It needs to work if a build succeeded and if a build failed (but was not killed by the user).
If these are separate Ant processes, then I think the only solution is to define a lockfile of some sort that each Ant process needs to acquire before it can continue.
Perhaps the tempfile task could be used for this?
Actually, a sort of semaphore based on a directory might be better, because the tempfile task really does generate a unique temp file each time, whereas a lock needs one fixed, shared name. The first thing your script does is use mkdir to create a shared resource directory, but only if the directory does not exist.
Upon exit it invokes delete on this shared resource name.
The idea is that the content and name of the directory are meaningless; it only serves as an "IPC" cooperative locking mechanism.
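Something like this minimal sketch could work (all the property and target names here are my own invention, not anything standard):

<project name="locked-build" default="build">

  <property name="lock.dir" location="${java.io.tmpdir}/mybuild.lock"/>

  <target name="acquire-lock">
    <!-- Fail fast if another build already created the lock directory.
         The check-then-mkdir isn't atomic, but it's good enough for
         cooperative scheduled builds. -->
    <available file="${lock.dir}" type="dir" property="lock.held"/>
    <fail if="lock.held"
          message="Another build appears to be running (${lock.dir} exists)."/>
    <mkdir dir="${lock.dir}"/>
  </target>

  <target name="release-lock">
    <delete dir="${lock.dir}"/>
  </target>

  <target name="build" depends="acquire-lock">
    <!-- ... the real build steps go here ... -->
    <!-- Plain Ant skips this call when a step above fails; wrap the real
         work in Ant-Contrib's <trycatch>/<finally> (or a wrapper script)
         if the lock must also be released after a failed build. -->
    <antcall target="release-lock"/>
  </target>

</project>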
This isn't particularly elegant, but I think your only other option is to set up a build server that handles scheduled and continuous builds based on various triggers. One that many people use is Jenkins (formerly Hudson).
[Update]
Perhaps "Do I have a way to check the existence of a directory in Ant (not a file)?" would do the trick?
To be honest, this approach may work in the short term, but it just moves the problem around. Instead of resetting unit test results you'll be removing lockfiles by hand to get builds working again. My advice is to set up a CI build system, but I recognize this is a fair amount of work (and introduces a whole different set of future problems.)
I’m using TFS 2012 Build and running into an error
Access to the path is denied
The solution being built contains about 15 projects, a number of which use the Castle.Components.Validator.2.5.0 assembly. I have seen other posts that talk about the TFS Build Access Denied errors, but they generally refer to having simultaneous builds running, whereas in this case only one build runs at a time. Also, the error occurs when the server is restarted or the build has not run for some time. Once a build runs and fails, the next one succeeds, and each one after that succeeds again until the build hasn't run for a while or the server is restarted. Although we can get around this, it is a manual headache. Here is the error:
C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (3513): Unable to copy file "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll". Access to the path 'D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll' is denied.
When looking at the log file you can see that the build is trying to copy the file twice. Because the first one has a lock on the file, the second one fails and thus the build fails. Here is a snippet of the log file that shows what is happening:
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
5>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\MvcContrib.Mvc3.FluentHtml-ci.3.0.96.0\lib\MvcContrib.FluentHtml.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\MvcContrib.FluentHtml.dll".
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\RhinoMocks.3.6\lib\Rhino.Mocks.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Rhino.Mocks.dll".
Any help on how to fix this would be greatly appreciated.
As others mentioned, this happens when performing multithreaded builds with a common destination directory and the file copy task happens to encounter a simultaneous conflict with a copy task running for a different project.
Normally this should result in a "file used by another process" exception (which is handled and retried by the file copy task) but sometimes the file operation results in an "Access is denied" exception instead. (I'm still not sure why)
Some suggest that you should "solve the duplication", but I don't see that as being feasible for cases where all the projects need to directly reference a library like log4net.
Obviously one way to prevent the issue is to explicitly run msbuild with /p:BuildInParallel=false or /m:1 or /maxcpucount:1 (or omit the argument entirely) to force single-threaded mode.
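For example, when invoking MSBuild directly (the solution name is a placeholder):

msbuild MySolution.sln /m:1 /p:BuildInParallel=false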
However, in TFS 2013, the default build template automatically always passes /m (use all cores) to msbuild, which silently overrides any single-thread setting you can manually pass in. (Determined by my own experimentation and examining diagnostic logs)
Another workaround I attempted was to manually pass /p:AllowedReferenceRelatedFileExtensions=none to msbuild, which prevents all pdb and xml files from being copied from referenced libraries. (Since for a while I only ever saw xml files having this issue.) But then I kept having problems with log4net.dll.
The ultimate workaround that I used was one I discovered by decompiling the source code for Microsoft.Build.Tasks.Copy:
if (hrForException == -2147024891)
{
    if (!Copy.alwaysRetryCopy)
        throw;
    else
        this.LogDiagnostic("Retrying on ERROR_ACCESS_DENIED because MSBUILDALWAYSRETRY = 1", new object[0]);
}
If error -2147024891 (0x80070005 access is denied) occurs, the Copy task will check a special variable to see if it should retry. That value is set via an environment variable:
Copy.alwaysRetryCopy = Environment.GetEnvironmentVariable("MSBUILDALWAYSRETRY") != null;
After setting the environment variable MSBUILDALWAYSRETRY = 1 (and restarting the build server), the problem went away. I also periodically started seeing "Retrying on ERROR_ACCESS_DENIED..." warnings in the build logs, proving that the setting was taking effect (rather than the builds merely coincidentally succeeding).
(Note that this environment variable is not well documented; use as appropriate.)
Update: Apparently TFS 2015 no longer overrides your /m:1 with /m (even on legacy/XAML build definitions), which should make /m:1 a valid fix again.
It looks like there are two projects copying the same file. Depending on the timing, they sometimes happen at the same time, resulting in the failure. You have to trace the node id back to find the source project. See http://blogs.msdn.com/b/buckh/archive/2012/01/21/a-tool-to-find-duplicate-copies-in-a-build.aspx for more details and code that may track it down for you.
As Buck Hodges and Nimblejoe have rightly said, this is mostly due to TFS running multiple MSBuild processes by default to build your projects.
You can override it in the build definition in Process -> 3. Advanced -> MSBuild Arguments by adding the MSBuild argument /p:BuildInParallel=false
This can also happen if you have a build agent's folder open.
I also had the same problem, with error messages saying files could not be copied because access to the path was denied. In my case all my DLLs, XML files, and so on were placed in the
D:\TFS\Example\Bin\Debug folder.
I right-clicked the Bin folder, clicked Properties, and saw that the Read-only checkbox was checked under Attributes.
I unchecked the Read-only box, clicked Apply, and clicked OK on the popup that appeared.
I went back to Visual Studio and built the solution that had been giving me the error messages.
Voilà, this time it built successfully without errors.
I don't know whether this is the perfect fix, but it solved my issue.
To work around this problem I had to remove the "ReadOnly" flag on the source directory, and then in the build definition set Clean Workspace to None.
Like Ziggler, I solved this problem by removing the 'read only' property of the bin folder in my project. It only happens to XML files stored in a /packages/ directory that is common to the solution containing this project. The 'bin' folder is not checked into source control. I am still stumped as to the root cause of the problem.
I found the same problem, which occurred after the build tried to overwrite files in the "Working Directory" (set in the agent) that it had created in a previous build attempt.
I resolved this by manually deleting the output folder it created (in my case [Working Directory]\Binaries) before attempting the build.
This can be done automatically by changing the build definition: under Process -> 2. Basic -> Clean Workspace, set this to the Outputs option.
Here's a variation of this problem which I had to deal with:
I couldn't figure out why my build kept failing on an "Access to the path is denied" error, even though I had added things like /p:BuildInParallel=false and /p:OverwriteReadOnlyFiles=true to the MSBuild Arguments of my XAML build. The cause turned out to be a "Post-build event command line" in my Project's properties.
After changing
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false
to
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false;OverwriteReadOnlyFiles=true
the error went away.
One possible cause is having the bin or obj folders of class libraries checked into TFS. Deleting the bin or obj folders of the projects from TFS will resolve this issue if that is the case.
I was having this problem and chose to ignore it, because I didn't want to sacrifice build performance for the sake of getting rid of some benign error messages from NuGet. However, I seem to have stumbled across a solution while trying to solve another problem, and I think it is related. I think the order in which NuGet packages are fetched is tied to the build order of the projects in the solution. So if this has somehow become disjointed, NuGet may be the first casualty before you run into build errors where you start getting "Metadata file 'XXX.dll' could not be found" errors, which annoyingly require you to build again until the build succeeds (as described here).
So, I believe the solution is to follow the steps described in the accepted answer to the aforementioned question. Or, follow the more comprehensive steps in one of the alternative answers. In other words, disable building of all projects, restart VS, then re-enable building of all projects. This will (normally) resolve build order. And that should hopefully resolve the NuGet issue. Please let me know if this fixes it for anyone.
I had this issue, with TFS 2015.
It turned out to be because the build Agent was running under the default (NETWORK SERVICE) credentials, which didn't have write permissions on the target folder.
Once I'd removed the agent and reinstalled it with credentials that had those permissions, it worked.
It did have me trawling through the logs for a while, checking and unchecking the multi-proc box and even restarting the build server in my hunt.
Check the obvious stuff first...
For me, it was that the build agent wasn't started in an administrator powershell.
MSBuild arguments: /tv:14.0 /t:Rebuild /m:1 /p:RunCodeAnalysis=false /p:TreatWarningsAsErrors=false /p:OverwriteReadOnlyFiles=true /p:BuildInParallel=false /p:AllowedReferenceRelatedFileExtensions=none
Set Clean workspace to false.
Go to the build agent and remove the read-only attribute from the mapped folder.
As a lot of people have already stated, this happens when building projects in parallel. Projects A and B both referencing third-party library C (Copy Local) will cause this when they are built at the same time, side by side.
The real problem is that TFS Build 2012 and below are configured so that when building a solution, the whole output of the solution is copied to a single folder. That is where the pains of parallel builds originate.
Since TFS 2013 you can easily solve this by setting the "Output location" in the build definition to "PerProject". This forces the build service to behave like a local MSBuild run, where the settings regarding output locations are read from the corresponding project files, so the output is written to the bin folder under each project.
For TFS 2012 and below, this article (plus linked articles) will help you get the same result as with TFS 2013:
http://blog.stangroome.com/2012/05/10/override-the-tfs-team-build-outdir-property-net-4-5/
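If you're on MSBuild 4.5 / .NET 4.5, my understanding is that the linked approach boils down to passing one extra MSBuild argument in the build definition, the same way as /p:BuildInParallel=false above (verify against the article before relying on this):

/p:GenerateProjectSpecificOutputFolder=true

That property makes each project write to its own subfolder under the output location instead of everything landing in one shared Binaries folder, which removes the file contention.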
I resolved a very similar issue by closing all open instances of Visual Studio, re-opening the solution and building it again.
We have an ASP.NET MVC project that we want to create a publish package from during an automated build. The build is using the unmodified default template with Arguments /p:DeployOnBuild=True /p:CreatePackageOnPublish=True.
If I do a WebDeploy directly to a server it works fine (if I change /p:CreatePackageOnPublish to false), but I would prefer to just create a package that I can deploy during a Lab build.
The error message looks like this:
TF270002: An error occurred copying files from 'C:\Builds\19\Binaries'
to '\\nas\Build\Drop\MyProject\MyProject_Development.Test\20120209.1'.
Details: The specified path, file name, or both are too long. The
fully qualified file name must be less than 260 characters, and the
directory name must be less than 248 characters.
The first part of the problem was the build folder path was too long (274 characters) but after changing the working directory from $(SystemDrive)\Builds\$(BuildAgentId)\$(BuildDefinitionPath) to $(SystemDrive)\Builds\$(BuildDefinitionId) it's down to 230 characters as the longest path so it should be ok.
The problem now seems to be the path in the drop folder: even though its root path is not that long by itself, \\nas\Build\Drop\MyProject, the build name and Build Number Format quickly add to the length: MyProject_Development.Test\MyProject_Development.Test_20120208.1. After that, all the nested paths create really deep folder structures: _PublishedWebsites\MyProject.Web_Package\Archive\Content\C_C\Builds\19\Sources\MyProject\Source\MyProject.Web\obj\Debug\Package\PackageTmp\Content\ui-lightness\Images\ui-bg_diagonals-thick_18_b81900_40x40.png.
So is there any way to get around this problem? I shortened the build number format from $(BuildDefinitionName)_$(Date:yyyyMMdd)$(Rev:.r) to $(Date:yyyyMMdd)$(Rev:.r) to save a few characters but it's not enough. I guess we could shorten the build name a bit but it would break the naming convention used (Ok, that would not be a really big problem but it would be annoying!) and still it would feel like a short term solution.
What else is there to do?
The short answer is the path length limitation is really annoying, and you're going to have to spend some (more) time tweaking your file/folder structure to make this work.
For example, instead of \\nas\Build\Drop\MyProject, just use \\nas\Build\Drop (or \\nas\Builds), since the project name is also in the build name.
Flatten the folder structure in your projects (do you really need a Source folder under MyProject?).
Also, go vote for the UserVoice suggestion for the TFS team to fix the path length limitations: http://visualstudio.uservoice.com/forums/121579-visual-studio/suggestions/2156195-fix-260-character-file-name-length-limitation
I know the question is old, but I faced the same problem and devised a solution to it, although it errs more on the side of preventing the problem from ever occurring than fixing an existing path-length condition. It can then be applied once the issue has been manually resolved.
Please note that it applies to TFS with Git. A similar approach could be devised for TFVC, although it would have to be run after the code is merged.
Essentially, it's a short script to be run as part of the PR build. It enforces that no file added or modified has a path longer than the one you allow.
It is described in this blog post
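For builds where a separate script isn't convenient, roughly the same check can be expressed as an MSBuild target. This is my own sketch, not the script from the blog post; the target name and the 200-character budget are arbitrary, and it needs MSBuild 4.0+ for the property-function syntax:

<Target Name="CheckPathLengths" BeforeTargets="Build">
  <ItemGroup>
    <FilesToCheck Include="$(MSBuildProjectDirectory)\**\*"
                  Exclude="$(MSBuildProjectDirectory)\**\bin\**;$(MSBuildProjectDirectory)\**\obj\**"/>
  </ItemGroup>
  <!-- Fails the build for every file whose full path exceeds the budget,
       leaving headroom for the drop-folder prefix. -->
  <Error Condition="$([System.String]::Copy('%(FilesToCheck.FullPath)').Length) &gt; 200"
         Text="Path longer than 200 characters: %(FilesToCheck.FullPath)"/>
</Target>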
I'm an Ant novice, but my expectation is that the build script in h5bp would use last-modified info to ensure that it only generates new files when it needs to. This does not happen: everything seems to run on every invocation of ant build. Is there a way I can prevent this? Is it a design feature of the h5bp build script?
I've extended project.xml to ftp to my server when I need to (and also to copy to my development server) which I find really useful. However since the images to be copied from the publish folder have new dates, even though unchanged, they are all ftp'd up each time which is slow and unnecessary. FWIW I'm running this on Windows 7 (uploading to a Linux/Apache server).
Looking through the build.xml file, I see plenty of overwrite settings - most to "no" and a few to "yes", so I guess some conscious decisions have been made. Appreciate understanding why.
Grateful for any suggestions.
Thanks
Abo
PS Apart from this, the build script is just great!
So, I think I can answer some of my own question.
The publish (and intermediate) directories are deleted on each build, so the published image files are always regenerated.
Anyone know if there's a reason that the img directory cannot be left in publish and intermediate?
Thanks
It generates a new build from the source each time you build one. Leaving old images would just result in a lot of unnecessary files.
Perhaps you could set a flag when the image files are FTPed for the first time, and then not upload the images again unless specifically asked to do so?
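Something along these lines might work (all the property and target names are mine, and the <ftp> task is an optional Ant task that needs Apache Commons Net on Ant's classpath):

<property name="img.uploaded.marker" location=".img-uploaded"/>

<target name="-check-img-marker">
  <available file="${img.uploaded.marker}" property="img.already.uploaded"/>
</target>

<!-- Skipped once the marker file exists; delete it to force a re-upload. -->
<target name="ftp-images" depends="-check-img-marker"
        unless="img.already.uploaded">
  <ftp server="${ftp.host}" userid="${ftp.user}" password="${ftp.password}"
       remotedir="img">
    <fileset dir="publish/img"/>
  </ftp>
  <touch file="${img.uploaded.marker}"/>
</target>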
I have a vcproj file that includes a simple pre-build event along the lines of:
Helpertask.exe $(ProjectDir)
This works fine on developer PCs, but when the solution is built on our TFS 2008 build server under MSBuild, $(ProjectDir) is either blank or points to an unrelated folder on the server!
So far the best workaround I have managed is to hard code the developer and server paths instead:
if exist C:\DeveloperCode\MyProject HelperTask.exe C:\DeveloperCode\MyProject
if exist D:\BuildServerCode\MyProject HelperTask.exe D:\BuildServerCode\MyProject
This hack works in post-build steps but it doesn't work for a pre-build step (the Pre-build task now does nothing at all under MSBuild!)
Do you have any ideas for a fix or workaround? I have very little hair left!
$(MSBuildProjectDirectory) worked for me
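That is, in the pre-build event (assuming the same Helpertask.exe call from the question):

Helpertask.exe $(MSBuildProjectDirectory)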
I think your problem may be related to how items are initialized. An item's Include attribute is evaluated at the beginning of a build, so if you depend on files that are created during the build process, you must declare these as dynamic items. Dynamic items are those defined inside of a target, or by using the CreateItem task. I've detailed this on my blog, MSBuild: Item and Property Evaluation.
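A rough illustration of the difference (my own example; the item and target names are made up):

<!-- Evaluated once, when the project file is parsed: files created
     later during the build are NOT picked up here. -->
<ItemGroup>
  <GeneratedFiles Include="$(OutDir)**\*.xml"/>
</ItemGroup>

<!-- CreateItem runs only when this target runs, so it sees files
     produced by targets that ran before it. (On MSBuild 3.5+ an
     <ItemGroup> inside the target does the same thing.) -->
<Target Name="CollectGeneratedFiles" DependsOnTargets="Build">
  <CreateItem Include="$(OutDir)**\*.xml">
    <Output TaskParameter="Include" ItemName="GeneratedFilesLate"/>
  </CreateItem>
  <Message Text="Found: @(GeneratedFilesLate)"/>
</Target>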
Sayed Ibrahim Hashimi
My Book: Inside the Microsoft Build Engine: Using MSBuild and Team Foundation Build
I think the problem is that build server's workspace probably isn't initialized properly.
I just kept getting problems with this - I tried many different approaches but they all failed in mysterious ways.
Once $(ProjectDir) started behaving properly again, the pre-build step stopped executing the command (I added echo commands above and below it - they were both executed, but the program in between them was not. No errors or output of any kind were generated to indicate why it failed).
I don't know if this is a dodgy server or if MSBuild is having a laugh.
I've given up now. I gave the build server a big kick and have changed tack: we now run this tool offline (manually) and check in the results for the build server to use. So much for an automated build :-( If only MSBuild would run solutions the same way Visual Studio does; it's maddening that it sets up the environment completely differently (different paths coming out of the solution variables, outputs redirected into different folders so you can't find them where they're supposed to be, etc.)
I branched an existing project and $(ProjectDir) kept the old directory in the newly branched code. But that's because I had some compiling errors. Once every project in the solution compiled without errors, $(ProjectDir) changed to the correct path.
Carlos A Merighe