TFS Online Build Fails on local Build Server with TF270016 / TF270002

We're using Visual Studio Online, but we have a local build controller and build agent. This has been running fine for the past 6 months or so, but just this week the builds have consistently failed.
The software itself appears to build successfully, and the tests also seem to pass, but it fails due to an error during the publication of the log files (see error below).
The build uses an unmodified Default Template, and is set up so that it "does not copy output files to a drop folder" (in the Build Defaults of the build definition).
After a few hours of head-banging, this feels like some sort of permissions issue, but I have no idea how to go about debugging or verifying this assumption.
Can anyone offer any suggestions, or better yet, a solution! :-)
One other thing to note is that we have been mucking about with our users in Visual Studio Online to change some accounts from Basic to Stakeholder accounts in order to reduce costs. I'm wondering if we've also managed to remove a critical account or permission that has caused this...?
Error
An error occurred while copying diagnostic activity logs to the drop location.
Details: TF270002:
An error occurred copying files from
'C:\Users\tfs\AppData\Local\Temp\BuildAgent\5498\Logs\2853\LogsToCopy\ActivityLog.AgentScope.5498.xml'
to
'ActivityLog.AgentScope.5498.xml'.
Details: BadRequest: Bad Request
An error occurred while copying diagnostic activity logs to the drop location.
Details: TF270002:
An error occurred copying files from
'C:\Users\tfs\AppData\Local\Temp\BuildController\4592\Logs\2853\LogsToCopy\ActivityLog.xml'
to
'ActivityLog.xml'.
Details: BadRequest: Bad Request
Edit
One thing to note is that this error is consistent across all builds for different C# projects that are executed through the same build controller. I've tried removing and re-registering the controller, restarting the build service and the build server itself.

We are also experiencing a similar issue. We have not made any changes to VSO permissions, so I doubt it is that.
Two things coincided with this:
1. There was an update to VS Online during the same time window in which this issue appeared
2. Our local build controllers/agents were updated with the latest Patch Tuesday updates

So the solution to this (in my case anyway) was to upgrade the Build controller software to v12 (TFS 2013).

Related

VSTS, create build definition gets AllowScriptsAuthAccess error

long time listener, first time caller!
I've spent two days searching for an answer to this so hopefully someone here may be able to help.
I've set up a personal/free VSTS instance and created a project.
One of the first tasks I want to do is set up the build pipeline, so I create a new pipeline, define the agent pool as VS2017, connect to my GitHub repo, etc., all of which is fine.
Next I try to add an Agent Job, again choosing VS2017 as the agent. With no other options chosen, if I try to save the build definition I get the following error message (and cannot save it):
The AllowScriptsAuthAccess build option is not supported in API versions greater than 4.0.
"Allow scripts to access the OAuth token" is unchecked on the Agent job configuration under phases, and on the Build/Options tab the slider is set to disabled.
I've googled and searched for all sorts of stuff to try and find someone with the same problem but it's almost like I'm the first to discover this - which is highly unlikely!! It has almost driven me to using Bing to search for a solution, but let's not get carried away.
Any ideas or suggestions would be greatly appreciated!
So it turns out that turning off the "New YAML pipeline creation experience" and "New Navigation" under preview features fixes the problem, insofar as I can now create and save a build pipeline without the error.
Also, if you have "Build YAML Pipelines" enabled under preview features for the Organisation, you get the "View YAML" link that I was missing.
Thanks all for your help. I'd be interested to know the root cause of this still. I'll update the Microsoft support ticket with the same and post back here if they have any insights.
There's a similar issue here: https://developercommunity.visualstudio.com/content/problem/123012/getting-multiconfiguration-build-option-not-suppor.html
It seems the build template was broken, so you can try other build templates or start over with an empty template, then add the needed tasks manually to check whether that works.
Besides, you can try the following:
1. Clean the caches on your client machine and clear your browser cache, then check again (see "How to clear the TFS cache on client machines"; a command-line sketch follows this list).
2. Create a new team project and create a new build pipeline within the new team project to check if that works.
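As a rough command-line sketch of the cache-clearing step (close Visual Studio first; the version folder, e.g. 4.0, 5.0 or 6.0, depends on your Visual Studio/Team Explorer release, so adjust the path accordingly):
rem Close Visual Studio before deleting the client-side TFS cache
rd /s /q "%LOCALAPPDATA%\Microsoft\Team Foundation\6.0\Cache"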
I am assuming this is a bug in the VSTS system and it will likely be fixed soon. But for the time being, I found a workaround:
I was also getting the AllowScriptsAuthAccess error and struggled with it for hours. I don't think any of the configuration settings you mentioned have anything to do with it (free account, GitHub, OAuth token unchecked).
To solve it, I converted the Agent Job to YAML (which is as easy as clicking "View YAML" in the upper right). Save the code to a file named .vsts-ci.yml, and save this in the root folder of your solution. Commit/push the new file, then queue the build. (Note that the conversion to YAML is one-way, so you may want to Clone your build.)
That should get rid of the AllowScriptsAuthAccess error. After that I had to add a few variables, but then it's just a matter of following the error messages.
I hope this helps. Sorry I can't answer this more authoritatively. Please post a comment if I am missing any steps.
I had this issue and it turned out that I didn't have Build Admin permissions in VSTS for the project. Not a very helpful error message for this.

Could not copy. The process cannot access the file because it is being used by another process

I am getting the following error when running my build on Visual Studio Online (using the built-in Build Controller):
C:\Program Files (x86)\MSBuild\14.0\bin\amd64\Microsoft.Common.CurrentVersion.targets (3962): Could not copy
"d:\a\src\MySolution\MyProject\Trunk\packages\Microsoft.Data.Edm.5.6.4\lib\net40\Microsoft.Data.Edm.xml"
to "..\Build\bin\Release\Microsoft.Data.Edm.xml". Beginning retry 1 in 1000ms.
The process cannot access the file '..\Build\bin\Release\Microsoft.Data.Edm.xml' because it is being used by another process.
It is never the same file either, but it always seems to be either an XML or DLL file from the packages folder.
EDIT: I'm not sure if it is worth mentioning, but I do have multiple workspaces and multiple build definitions using this repository.
I found the problem. Completely unrelated to the error above.
I went into the msbuild log files and found this:
Failed to produce diagnostics extension's config for MyRole\diagnostics.wadcfgx.
Error : Could not find a part of the path 'd:\a\src...\MyRole\diagnostics.wadcfgx'.
Done Building Project "d:\a\src...\MyCloudProject.Cloud.ccproj" (Publish target(s)) -- FAILED.
I was missing a file in source control.
I do wonder why this error did not bubble up into my build summary. And where did that initial error come from?
I am using TFS with Visual Studio 2013 and have been able to work around this issue by closing all open documents that I want to check in (it seems VS locked itself out) and/or resolving conflicts. The error message is sufficiently vague as to be useless in identifying the actual cause of the check-in failure.
Update 02 November 2016:
I'm not sure why VS 2013 and TFS don't play nice together via the Team Explorer Check-in Pending Changes button, but it consistently fails to launch the conflict resolver, a key piece of the check-in process.
The following works for me on VS 2013 and TFS hosted on a SQL Server Express 2014 database:
1. Launch the Source Explorer: Team Explorer tab -> Source Explorer
2. Navigate to your solution repository
3. Then proceed to do the following for each project that you want to check in:
a. Right click project
b. Check in pending changes
c. Resolve conflicts and repeat steps 3a and 3b until no pending changes remain for the project

Team foundation server's automated build is not getting the latest code

I have set up a build controller etc. and the builds were failing; I have fixed that now, and the build failed properly, as in because of a code error.
I have fixed the error and checked the code back in, but now the code is not being extracted, although sometimes one folder of many is.
I have deleted the code from the build machine and requeued a build but it keeps failing. It complains that it cannot find the solution that I specified as the build solution.
I have checked the check box to build even if nothing has changed.
Have I missed a setting somewhere for extracting the code?
TFS version is 2012 Express
Visual Studio version is 2010 Professional
I had this issue recently with TFS 2012. I think it boils down to this:
In the latest build definition files, it appears that a Clean task is performed before the workspace is updated. This means that if you do something that causes the Clean part of the build to fail, it will never download the new files needed to fix it.
Recently, I was making big changes to my build file and inevitably made a lot of mistakes. I found that if one of these mistakes caused the Clean to break, I had to go onto the build server and change the file manually to get it working again.
Does this sound like it might be the same issue?
There are several properties in your build definition you can check. I would start with setting the "Clean Workspace" to All to ensure the correct code is being pulled down and built.
There are other settings you can look at as well, like the agent set for the build and the "GetVersion" property. Check out the link below; it should help you in more detail.
Define a Build Process that is Based on the Default Template

TFS 2012 Build "Access to Path Denied"

I’m using TFS 2012 Build and running into an error
Access to the path is denied
The solution being built contains about 15 projects, a number of which use the Castle.Components.Validator.2.5.0 assembly. I have seen other posts that talk about TFS Build access-denied errors, but they generally refer to having simultaneous builds running; in this case only one build runs at a time. Also, the error occurs when the server is restarted or the build has not run for some time. Once a build runs and fails, the next one succeeds, and each one after that succeeds again until the build hasn't been run for a while or the server is restarted. Although we can get around this, it is a manual headache. Here is the error:
C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (3513): Unable to copy file "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll". Access to the path 'D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll' is denied.
When looking at the log file you can see that the build is trying to copy the file twice. Because the first one has a lock on the file, the second one fails and thus the build fails. Here is a snippet of the log file that shows what is happening:
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
5>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\MvcContrib.Mvc3.FluentHtml-ci.3.0.96.0\lib\MvcContrib.FluentHtml.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\MvcContrib.FluentHtml.dll".
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\RhinoMocks.3.6\lib\Rhino.Mocks.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Rhino.Mocks.dll".
Any help on how to fix this would be greatly appreciated.
As others mentioned, this happens when performing multithreaded builds with a common destination directory and the file copy task happens to encounter a simultaneous conflict with a copy task running for a different project.
Normally this should result in a "file used by another process" exception (which is handled and retried by the file copy task) but sometimes the file operation results in an "Access is denied" exception instead. (I'm still not sure why)
Some suggest that you should "solve the duplication", but I don't see that as being feasible for cases where all the projects need to directly reference a library like log4net.
Obviously one way to prevent the issue is to explicitly run msbuild with /p:BuildInParallel=false or /m:1 or /maxcpucount:1 (or omit the argument entirely) to force single-threaded mode.
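For a quick local check, the equivalent single-threaded invocation from a Developer Command Prompt would look roughly like this (the solution name is illustrative):
rem /m:1 limits msbuild to one node and BuildInParallel=false disables parallel project builds
msbuild MySolution.sln /m:1 /p:BuildInParallel=false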
However, in TFS 2013, the default build template automatically always passes /m (use all cores) to msbuild, which silently overrides any single-thread setting you can manually pass in. (Determined by my own experimentation and examining diagnostic logs)
Another workaround I attempted was to manually pass /p:AllowedReferenceRelatedFileExtensions=none to msbuild, which prevents all pdb and xml files from being copied from referenced libraries. (Since for a while I only ever saw xml files having this issue.) But then I kept having problems with log4net.dll.
The ultimate workaround that I used was one I discovered by decompiling the source code for Microsoft.Build.Tasks.Copy:
if (hrForException == -2147024891)
{
    if (!Copy.alwaysRetryCopy)
        throw;
    else
        this.LogDiagnostic("Retrying on ERROR_ACCESS_DENIED because MSBUILDALWAYSRETRY = 1", new object[0]);
}
If error -2147024891 (0x80070005 access is denied) occurs, the Copy task will check a special variable to see if it should retry. That value is set via an environment variable:
Copy.alwaysRetryCopy = Environment.GetEnvironmentVariable("MSBUILDALWAYSRETRY") != null;
After setting the environment variable MSBUILDALWAYSRETRY = 1 (and restarting the build server), the problem went away. I also periodically started seeing "Retrying on ERROR_ACCESS_DENIED..." as warnings in the build logs, proving that the setting was taking effect (instead of the builds merely coincidentally succeeding).
(Note that this environment variable is not well documented, use as appropriate.)
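As a sketch, the variable can be set machine-wide from an elevated command prompt; the build service then needs to be restarted so it picks up the new environment:
rem Requires an elevated prompt; restart the build service (or the machine) afterwards
setx MSBUILDALWAYSRETRY 1 /M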
Update: Apparently TFS 2015 no longer overrides your /m:1 with /m (even on legacy/XAML build definitions), which should make /m:1 a valid fix again.
It looks like there are two projects copying the same file. Depending on the timing, they sometimes happen at the same time, resulting in the failure. You have to trace the node id back to find the source project. See http://blogs.msdn.com/b/buckh/archive/2012/01/21/a-tool-to-find-duplicate-copies-in-a-build.aspx for more details and code that may track it down for you.
As Buck Hodges and Nimblejoe have rightly said, this is mostly due to TFS running multiple MSBuild processes by default to build your projects.
You can override it in the build definition in Process -> 3. Advanced -> MSBuild Arguments by adding the MSBuild argument /p:BuildInParallel=false
This can also happen if you have a build agent's folder open.
I also had the same problem. I got error messages saying files could not be copied because access to the path was denied. In my case, all my DLLs, XML files, and so on are placed in the
D:\TFS\Example\Bin\Debug folder.
I right-clicked the Bin folder, clicked Properties, and saw that the Read-only check box was checked under Attributes.
I unchecked the Read-only check box, clicked Apply, and clicked OK on the popup that appeared.
I went back to Visual Studio and built my solution, which had been giving me error messages.
Voila! This time it built successfully without errors.
I don't know whether this is the perfect fix, but it is what I did to solve my issue.
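The same attribute change can also be scripted; a hedged one-liner from a command prompt, using the folder from this answer, would be:
rem Recursively clear the read-only attribute on files and folders under the output directory
attrib -R "D:\TFS\Example\Bin\Debug\*" /S /D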
To work around this problem I had to remove the "ReadOnly" flag on the source directory. Then, in the build definition, I set Clean Workspace to None.
Like Ziggler, I solved this problem by removing the 'read only' property of the bin folder in my project. It is only happening to XML files stored in a /packages/ directory that is common to the solution that contains this project. The 'bin' folder is not checked into source control. I am still stumped as to the root cause of the problem.
I found the same problem, which occurred after the build tried to overwrite files in the "Working Directory" (set in the Agent) that it had created in a previous build attempt.
I resolved this by manually deleting the output folder it created (in my case [Working Directory]\Binaries) before attempting the build.
This can be done automatically by changing the Build Definition: under Process -> 2. Basic -> Clean Workspace, set this to the Outputs option.
Here's a variation of this problem which I had to deal with:
I couldn't figure out why my build kept failing on an "Access to the path is denied" error, even though I had added things like /p:BuildInParallel=false and /p:OverwriteReadOnlyFiles=true to the MSBuild Arguments of my XAML build. The cause turned out to be a "Post-build event command line" in my Project's properties.
After changing
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false
to
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false;OverwriteReadOnlyFiles=true
the error went away.
One possible cause is if you have the bin or obj folders for class libraries checked in to TFS. Deleting the bin or obj folders of the projects from TFS will resolve this issue if that is the case.
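If you prefer the command line over Source Control Explorer, a hedged sketch using tf.exe from a mapped workspace (the server paths are illustrative) is to pend the deletes and then check them in:
rem Pend deletion of the checked-in output folders, then check the change in
tf delete "$/MyTeamProject/MyLibrary/bin" /recursive
tf delete "$/MyTeamProject/MyLibrary/obj" /recursive
tf checkin /comment:"Remove bin and obj folders from source control"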
I was having this problem and chose to ignore it because I didn't want to sacrifice build performance for the sake of getting rid of some benign error messages by NuGet. However, I seem to have stumbled across a solution while trying to solve another problem, and I think it is related. I think the order of fetching of NuGet packages is related to the build order of projects in the solution. So if this has somehow become disjointed, then NuGet may be the first casualty before you run into build errors where you start getting "Metadata file 'XXX.dll' could not be found" errors which annoyingly require you to build again until the build succeeds (as described here).
So, I believe the solution is to follow the steps described in the accepted answer to the aforementioned question. Or, follow the more comprehensive steps in one of the alternative answers. In other words, disable building of all projects, restart VS, then re-enable building of all projects. This will (normally) resolve build order. And that should hopefully resolve the NuGet issue. Please let me know if this fixes it for anyone.
I had this issue, with TFS 2015.
It turned out to be because the build Agent was running under the default (NETWORK SERVICE) credentials, which didn't have write permissions on the target folder.
Once I'd removed the Agent and reinstalled it with credentials it worked.
It did have me trawling through the logs for a while, checking and unchecking the multi-proc box and even restarting the build server in my hunt.
Check the obvious stuff first...
For me, it was that the build agent wasn't started in an administrator PowerShell.
MSBuild arguments: /tv:14.0 /t:Rebuild /m:1 /p:RunCodeAnalysis=false /p:TreatWarningsAsErrors=false /p:OverwriteReadOnlyFiles=true /p:BuildInParallel=false /p:AllowedReferenceRelatedFileExtensions=none
Set Clean workspace to false.
Go to the build agent and remove read-only from the mapped folder.
As a lot of people have already stated, this happens when building projects in parallel. Projects A and B both referencing a third-party library C (Copy Local) will cause this when they are built at the same time, side by side.
The real problem is that TFS Build 2012 and below are configured so that, when building a solution, the whole output of the solution is copied to a single folder. That is where the pain of parallel builds originates.
Since TFS 2013 you can easily solve this by setting the "Output location" in the build definition to "PerProject". This forces the build service to behave like a local msbuild run, where the settings regarding output locations are read from the corresponding project files, so the output is written to the bin folders under each project.
For TFS 2012 and below, this article (plus linked articles) will help you get the same result as with TFS 2013:
http://blog.stangroome.com/2012/05/10/override-the-tfs-team-build-outdir-property-net-4-5/
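As a rough sketch of the approach described there (for TFS 2012 with .NET 4.5-era MSBuild), the idea is to add an extra argument to the MSBuild Arguments of the build definition so each project keeps its own output folder; verify the exact property name against the article before relying on it:
/p:GenerateProjectSpecificOutputFolder=true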
I resolved a very similar issue by closing all open instances of Visual Studio, re-opening the solution and building it again.

Why won't my 2008 Team Build trigger on developer check-ins despite CI being enabled

I have a Team Foundation Server 2008 Installation and a separate machine with the Team Build service.
I can create team builds and trigger them manually in Visual Studio or via the command line (where they complete successfully). However, check-ins to the source tree do not trigger a build, despite the option to build on every check-in being ticked on the build definition. Update: to be clear, I had a fully working build definition with the CI option enabled.
The source tree is configured in a pretty straightforward manner, with code either under a Main folder or under a Branch\branchName folder. Each branch of code (including main) has a standard Team Build definition relating to the solution file contained within. The only thing that is slightly changed from the default settings is the build server working folder; i.e. for main this is Server: "$\main", Local: "c:\build\main", due to path length.
The only thing I've been able to guess at (possible red herring) is that there might be some oddity with the developer workspaces. Currently each developer maps Server:"$\" to local:"c:\tfs\" so that there is only one workspace for all branches. This is mainly to avoid re-mapping problems that some of the developers had previously gotten themselves into. But I can't see how this would affect CI.
UPDATE: I found the answer indirectly; please read below.
OK, I have found the answer myself after several dead ends. In the end I fixed this unintentionally while fixing another issue. Basically, we had just turned on automatic execution of unit tests for our builds. The tests would run successfully, but then the build would immediately bomb out with a message saying it was unable to report to the build drop folder.
What was happening was that while the Build service runs under one account and has a set of rights, some of the functionality is actually driven through the TFSService account. After wading through a heap of permissions, I had my tests being reported. Then I noticed that builds had started to trigger on check-ins. I can't tell you exactly which permission fixed this, but hopefully this answer will at least set people down the right path.
One other note: a few of the builds started failing due to conflicting workspace mappings; this was a separate issue that I resolved by deleting some obsolete workspaces using the Attrice Sidekicks for Team Foundation tool.
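For reference, the same workspace cleanup can also be done with tf.exe instead of a GUI tool; a hedged sketch (the server URL, computer and workspace names are illustrative, and on newer TFS clients the switch is /collection: rather than /server:):
rem List the workspaces registered for the build machine, then delete the stale one
tf workspaces /computer:BUILDSERVER /owner:* /server:http://tfsserver:8080
tf workspace /delete "StaleWorkspace;DOMAIN\builduser" /server:http://tfsserver:8080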
Hope this helps somebody else.
Select your team project from Team Explorer, then right-click the Builds folder. Select a new build definition and then select the Trigger tab. Move the radio button to "Build each check-in (more builds)".
More info can be found here
MSDN How to: Create a Build Definition
Are there any errors in the log on the TFS application server? Anything that indicates that it tried to fire but failed?
