TFS 2010 Build, constant drop location, random access issue

We are using TFS 2010 Build to deliver libraries to a fixed location (\\server\product-R0\latest).
Other team projects reference the libraries from this location.
In my build process I check whether the build and unit tests passed; if they did, I:
Transform web/app.config
Delete the latest folder using a "DeleteDirectory" activity
Create the latest folder using a "CreateDirectory" activity
Copy the binaries into the folder using a "CopyDirectory" activity
I delete the folder first because otherwise, if we rename an assembly, the old one would never be deleted.
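In shell terms, the three drop steps are roughly equivalent to this PowerShell sketch (a paraphrase for illustration, not the actual workflow activities):

# Rough equivalent of the DeleteDirectory / CreateDirectory / CopyDirectory activities
Remove-Item -Recurse -Force '\\nas\Builds\BarService-R0\Latest'
New-Item -ItemType Directory '\\nas\Builds\BarService-R0\Latest' | Out-Null
Copy-Item -Recurse 'D:\Builds\1\FooTeam\BarService\Binaries\*' -Destination '\\nas\Builds\BarService-R0\Latest'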
The issue is random and happens roughly 40% of the time:
TF270002 : An error occurred copying files from
'D:\Builds\1\FooTeam\BarService\Binaries' to
'\\nas\Builds\BarService-R0\Latest'.
Details : Access to the path
'\\nas\Builds\BarService-R0\Latest\SomeFile.dll'
is denied.
If you launch the build several times, it eventually works.
I've tried the usual dumb idea of putting sleeps between the steps to see what happens, but it doesn't solve the problem; it just seems to reduce the probability of it happening.
It's as if TFS tries to copy while the directory is still being deleted, and sometimes it hangs on the directory-creation step.
Anyone? Thank you!

The most elegant solution is to create a link instead of copying, something like
mklink /J D:\Drops\MyBuild_LatestGood D:\Drops\MyBuild_2014-06-13
Plus: no copy involved, same ACLs.
Caveat: this command works only locally, when the drop share is located on the build server. There are options in the case of a NAS as well, as long as you are allowed to execute remote commands (e.g. via SSH).
Another option is to create a network share on the desired folder, even if the disk is remote, as long as it resides on a Windows server.
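To repoint the junction at each good build, you only need to delete the link itself and recreate it; removing a junction never touches the target folder. A minimal PowerShell sketch with placeholder paths:

# Repoint the "LatestGood" junction at the newest drop folder (paths are examples)
$link = 'D:\Drops\MyBuild_LatestGood'
# rmdir on a junction removes only the link, not the target directory's contents
if (Test-Path $link) { cmd /c rmdir $link }
New-Item -ItemType Junction -Path $link -Target 'D:\Drops\MyBuild_2014-06-13' | Out-Null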

Related

Visual Studio 2013, TFS is very slow

When I originally installed VS Ultimate 2013 everything was fine, but for the last month or so it's been a dog.
The Source Control Explorer in my Visual Studio 2013 install is very slow: just clicking on a node and displaying the node's contents takes 20+ seconds.
Everyone else on the team is fine, so it's not the TFS server; it's just my install.
I assumed it was some add-in I'd installed into VS, so I disabled them all, but no luck.
Any ideas?
Having tried all the suggestions (unloaded all add-ons, tried to reinstall VS, removed all extra workspaces, etc.), the answer to my problem was to unmap my workspace and then remap it.
Problem solved. I haven't got a clue what the underlying fault was.
In my case, the only way to get rid of the lag was to change my workspace location from "local" to server. You can do this under the advanced options for your workspace.
The 'full blast' solution that worked for me was:
remove workspace
delete all source code
rebuild the workspace
rebuild solution
Only takes a few minutes more than just rebuilding the workspace (see #DaveF's answer) but gave me a bit more confidence that everything hangs together.
Had this happen to me a few times now, so there are some things I'd like to add to the accepted answer.
I work in a place where we have a lot of VS solutions with a lot of files in them. Microsoft's guidelines suggest that you shouldn't be using a local workspace if it's going to have more than 100,000 items in it. So you could prevent this problem entirely by:
Not using local workspaces
Making sure never to map enough folders into a single workspace that it gets over 100,000 files associated with it.
Periodically declaring "TFS bankruptcy" and unmapping everything.
For me, the drawback of having to use strict locking and not having offline access makes #1 unacceptable. I'm going to try harder to do #2, but honestly #3 is what I've been living by.
It's kind of like early Windows, where every year or so you had to just reinstall the OS to remove all the accumulated cruft.
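For what it's worth, declaring that bankruptcy from the command line might look like this tf.exe sketch (the workspace name, owner, and collection URL are all placeholders):

# List the workspaces on this collection, then delete the stale one
tf workspaces /collection:http://tfs:8080/tfs/DefaultCollection
tf workspace /delete 'MyWorkspace;DOMAIN\me' /collection:http://tfs:8080/tfs/DefaultCollection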
Cleaning local folders helped. In 'Team Explorer - Pending Changes', under 'Excluded Changes', it said: 'Detected: 50000 add(s)'. Click it to see the paths of the offending folders.
This made me crazy too, for over six months, until I found this instruction. Now my VSO is flying. Note: I copied this information from somebody; I'd like to give them credit, but I cannot remember where I found it.
You can fix this problem of TFS by editing the registry.
Navigate to the key
HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\TeamFoundation\SourceControl\Proxy
and then change the value of URL to any dummy website like 'www.abcdummy.com'.
Restart VS after editing the registry key value.
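The same tweak in PowerShell form, as a sketch (this assumes VS 2013, i.e. key version 12.0, and that the value name is 'Url' as the answer describes):

# Point the TFS source-control proxy at a dummy host; value name 'Url' assumed from the answer
$key = 'HKCU:\Software\Microsoft\VisualStudio\12.0\TeamFoundation\SourceControl\Proxy'
Set-ItemProperty -Path $key -Name 'Url' -Value 'http://www.abcdummy.com'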
I had the same problem; it kept me busy for a week or so, but after investigating my complete setup I found the following:
Within my ASP.NET application, I had an image directory and an image-cache directory with lots of images in them (200,000+). Neither was included in my VS project, but Visual Studio / TFS still tripped over them.
First I found that, when checking in some files (which took over 10 minutes while the problem existed), 'Team Explorer - Pending Changes' showed under 'Excluded Changes': 'Detected: 50000 add(s)'.
Trying to get rid of this the 'normal' way, by opening the 'Promote Candidate Changes' window and setting these files to be ignored, still didn't do much.
But after moving those image directories to some other location outside my project, all the problems disappeared.
Of course I had to add the moved directories as virtual directories so I could still see my images.
I cleaned my workspace of unnecessary projects and it ran better. I think vh_click is on to something with the 50,000 add(s) thing. TFS keeps track of all your edits, and over time, with tons of projects, undos and craziness, you can amass a large set that TFS has to chug through. Get out the Clorox, the Comet or whatever else you clean with, and dump some junk or move it to an archive folder or backup drive.
Cleaning up the workspace was the solution for me. When opening Visual Studio 2015, the Source Control window would remain stuck in a loading phase; I had two workspaces, name and name_1, and I removed both.
There is no need to delete the entire folder, though. Keep in mind that if you do delete the workspace and keep the files, you will need to force a Get Latest to be on the safe side.
Getting Latest was soooooo slow. I was using a colleague's PC and had deleted his workspace.
After an hour waiting for Get Latest I got an error and realised my user account didn't have Full Control on the folder; granting write access made Get Latest run x1000 faster.
Just to throw another solution in the mix! I had the same problem which seemed to be caused by several layers of working folders configured in my workspace (some overlapping ones too).
The issue was resolved by going to Manage Workspaces, then Edit and then removing the additional folder bindings.
in short "Run it as an administrator".
No one of those solution does work at all, I even search on this link:
Why is Visual Studio 2013 very slow?
In vain, just do this ONE simple step:
Go to your visual studio path, usually installed on this path:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE namely the file "devenv.exe" , then right click on it, click "Run an administrator" ===> then open your visual studio project.
So, you can just send a shortcut of "devenv.exe" to the desktop to easily run it as an administrator each time.
Have ^_^ Fun
You can keep your workspace local and change it as follows. I did that and my TFS speed was great:
1- Remove all mapped folders in the workspace "Edit" dialog.
2- Change the workspace folder to the parent of all the previously mapped folders.
I hope it is useful for you.

TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the XAML for the build process called CopyDirectory. I think this may work, but I'm not sure what values to place into the Destination and Source properties. After each successful build, the build is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.
First, you might want to consider having your documentation author place his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server, such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed (see the sketch after this list).
The disadvantage is that you may need an extra CAL for him to be able to do this.
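As a rough illustration of that mapping (not from the original answer; the server path, local path, and workspace name are placeholders):

# Map the docs folder into the build workspace so a plain "get" fetches the docs in place
tf workfold /map '$/Product/Documentation/Help' 'D:\Builds\1\Product\Sources\Help' /workspace:MyBuildWorkspace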
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network during your build, you can use one of the MSBuild Copy commands (as you are already aware), or you can use Exec. The Copy commands are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), then you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and then just paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file) you want to use, so you won't need to set up the ItemGroup etc.
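For instance, here is a robocopy line you could test standalone and then drop into an Exec task's Command attribute (the paths are placeholders; note that robocopy exit codes 0-7 mean success, so in MSBuild you would pair it with IgnoreExitCode="true" or a condition on ExitCode to avoid false failures):

# Mirror the help folder from the network share into the build's sources (placeholder paths)
robocopy \\server\docs\help D:\Builds\1\Product\Sources\Help /MIR /NP /R:2 /W:5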

The workspace version table contains an unknown schema version

I just started up Visual Studio 2012 and opened my solution, which is in source control with Team Foundation Server 2012 Express, and encountered this - any ideas? I can't get latest, can't check in, and everything appears checked out :( Basically my workspace is unusable right now.
TF400018: The local version table for the local workspace MY-PC;My
User could not be opened. The workspace version table contains an
unknown schema version.
There is only one post I could find on the net, and the answers are pretty vague.
I had the same issue, and I just fixed it on mine.
If you don't mind re-mapping all your projects, you can try the following:
Click the box in "Workspace".
Click on "Workspaces".
Delete the workspace profile you're currently using
Re-connect to TFS and open "Source Control"
Be aware that you may lose all your TFS mappings, and you may need to re-map all your projects from TFS. Back up any changes that are not checked in yet.
cycle6 is correct, but it isn't made clear there that you can avoid losing your pending check-in list if you follow some additional steps:
Click the box labelled "Workspace".
Click on "Workspaces".
Delete the corrupt workspace profile, accepting the warning.
Re-connect to TFS and open "Source Control Explorer"
Create a new workspace
One by one, map your projects to the same folder as before
You will be presented with a list of conflicts, where you have matching writable files in the folder already.
Choose "Keep local copy" for each file you had checked out before, and "Take Server Version" for any files changed by other members of the team that you didn't have the latest version for. This might take a while depending on the length of the list, but it is worth comparing versions for any file you are unsure of.
You will be left with your solution and all pending items marked as checked out, with your work preserved.
I did the following steps and it solved the issue:
deleted the hidden folder named $tf, and then
in Visual Studio's Solution Explorer: right-click on the solution node > Source Control > Get Specific Version > Latest Version
If you already have multiple instances of Visual Studio open, close all of them. [In some cases you need to log out from Windows and log back in, or restart.]
Rename the $tf folder to any other name (e.g. $tft).
Start Visual Studio to see your issue fixed. :)
Hope this helps.
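The rename step in PowerShell form, as a sketch (run from the workspace root; the single quotes stop PowerShell expanding $tf as a variable, and -Force is needed because the folder is hidden):

# Rename the hidden $tf metadata folder so TFS re-creates its local metadata
Rename-Item -LiteralPath '.\$tf' -NewName '$tft' -Force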
Sometimes this happens when you are running out of disk space.
Check whether you have very little free space, e.g. < 10 MB.
If so, try cleaning up your Windows Temp folder and see if that solves the issue.
It's a misleading message to an extent.
What has happened is that the internal data structures of the workspace have become corrupt.
This ends up with the code (in the tf command, Visual Studio, et al.) that loads those data structures failing to read the relevant files, which surfaces as an error about a schema version problem.
In the case that I experienced, this was because the machine hosting the workspace ran out of disk space while doing operations of various kinds upon the workspace (check-outs, check-ins, adding pending changes - it was actually a bunch of workspaces being used by TFS 2017 build agents and multiple active builds).
This corrupted parts of the data that are held in the files under the hidden $tf subdirectory (it always being a local workspace on a TFS 2017 build agent), because source control wasn't able to rewrite/extend these files.
Other answers here discuss partly retaining some of the files, based upon more specific knowledge of what has not been corrupted (such as preserving the internal files storing pending changes if one wasn't creating any pending changes), but the basic idea is that one needs to reset all of the stuff in $tf to a sane state of some kind.
In my case, I had the disadvantage of multiple potential causes and no consistent knowledge of which parts of $tf were corrupted, but I conversely had some advantages:
It being a TFS build, arranged to build from the build agent's s (source) directory into its a (artifact staging) and b (binaries) directories, there were not masses of non-source-controlled object and other files in the workspace (which is the s directory) that would have ended up as pending additions.
There were not any pending changes (to actual source files) worthwhile to preserve. I could afford to lose all information about source files, and indeed all current locally-stored information about the workspace, and simply run the build again with a fresh sane and largely unpopulated workspace. I did not even need to restore source files and directories for the whole workspace, as the first task in any TFS ("vNext") build is a "Get Sources" task that uses (variously) tf vc scorch, tf vc undo, and tf vc get to check out the right source version.
So simply, in Developer PowerShell (Visual Studio being installed on the build machine):
Remove-Item -Recurse -Force 'X:\Agents\07\_work\1138\s'   # wipe the corrupted workspace contents, $tf included
tf vc get 'X:\Agents\07\_work\1138\s'                      # re-download a clean copy of the sources
(Note that one can always get at the tf command in some way on a TFS build machine. Every build agent has a local helper copy of tf.exe and its ancillary DLLs in its VSTS "OM" subdirectory.)
I possibly could have omitted the tf vc get step, but having had trouble with "Get Sources" in the past I do not trust it to robustly cope with arbitrary manual external alterations, such as no s directory when the build isn't configured to outright delete that entire directory itself (as it can be but was not here).
For the same reason, Microsoft's own "agent maintenance" (another way to clean things up) is quite dodgy, and ends up leaking workspaces on the TFS server (which I have raised a bug with Microsoft about).
There is a simple workaround. Remove the local mapping to the folder containing the sources (Advanced -> Remove Mapping), or just rename or delete the mapped folder. After that you will be able to connect to TFS. Download the project again.
If you already have multiple TFS-connected instances of Visual Studio open:
1.) Open File -> Source Control -> Manage Workspaces
2.) Delete all the TFS mappings
3.) Then re-map the folders
For the same issue in Eclipse: find the $tf folder and delete it.
You will find the $tf folder in the workspace directory; if not, search for it.
Once you have found it, delete it.
In my case, none of the other answers helped: the problem was occurring on a machine that didn't have Visual Studio, and no matter how I tried to get rid of the bad workspace data, it never worked. After working with procmon a bit, I discovered another critical folder that might be the source of this error: C:\Users\All Users\Microsoft Team Foundation Local Workspaces\ (it might also be under C:\ProgramData; on my system, 'All Users' is a symlink to that folder, but I'm not sure if this is typical). In this folder there are sub-folders named like GUIDs, each containing some other folders, one per workspace it appears. In my case, some of the data in these folders was old and some was corrupt. Once I deleted the bad workspace folders, all my problems disappeared. You might also want to delete the Cache folder as identified in the comments of this post, but that didn't help me (it didn't seem to hurt, either).
Alternatively, you could just back up your current workspace to a different location, re-create your workspace, and copy back the files you had changed. VS should detect the newest files and automatically check them out, allowing you to check in the newer versions you copied back from your backup.
What worked for me was to delete the local folder(s), restart the machine, and then map the projects again. If you have any pending changes, just save them somewhere else temporarily.

Copy files to another folder during check in (TFS Preview)

I have the following scenario: the company edits aspx/xml/xslt files and copies them manually to the servers in order to publish them. So, no build is done. For the sake of control we've decided to adopt TFS Preview, since it tracks the version, who edited, and so on. Needless to say, it works like a charm. :)
The problem is that, since we are unable to build the apps, we can't set up a build definition to automate the copying of the files to another place, which, as I've stated before, is done manually.
My question is: is it possible to copy the files to another place (a folder on a server, or local) during the check-in? If so, how? (Remember, we don't build, so we can't customize the build process...)
You have two options.
1) Create a custom check-in policy. I'm not familiar enough with this process to give you any pointers, but I believe it can be done.
2) Create a custom build template, and use that for your builds. You should be able to strip the build template down to nothing and then add only the copy operation to it. This is probably the route I would take. Get started here.
You mention you are using TFS Preview, which is hosted in the cloud, so it won't be able to access any machines in your network unless you're prepared to open up your firewalls :).
You can copy source-controlled files around the TFS instance (say, into a source-controlled drop folder) and then check this out after the build completes.
Start by familiarising yourself with customising the TFS build process.
When you're up to speed, you need to look at adding a "Copy" activity to the workflow to move the files to the drop folder.

TFS 2012 Build "Access to Path Denied"

I’m using TFS 2012 Build and running into an error
Access to the path is denied
The solution being built contains about 15 projects, of which a number use the Castle.Components.Validator.2.5.0 assembly. I have seen other posts that talk about TFS Build access-denied errors, but they generally refer to having simultaneous builds running. In this case only one build runs at a time. Also, the error occurs when the server is restarted or the build has not run for some time. Once a build is run and fails, the next one succeeds, and each one after that succeeds again until the build hasn't been run for a while or the server is restarted. Although we can get around this, it is a manual headache. Here is the error:
C:\WINDOWS\Microsoft.NET\Framework64\v4.0.30319\Microsoft.Common.targets (3513): Unable to copy file "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll". Access to the path 'D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll' is denied.
When looking at the log file you can see that the build is trying to copy the file twice. Because the first one has a lock on the file, the second one fails and thus the build fails. Here is a snippet of the log file that shows what is happening:
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
5>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\Castle.Components.Validator.2.5.0\lib\NET40\Castle.Components.Validator.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Castle.Components.Validator.dll".
2>_CopyFilesMarkedCopyLocal:
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\MvcContrib.Mvc3.FluentHtml-ci.3.0.96.0\lib\MvcContrib.FluentHtml.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\MvcContrib.FluentHtml.dll".
Copying file from "D:\Builds\12\Foo\Check-In Build\Sources\packages\RhinoMocks.3.6\lib\Rhino.Mocks.dll" to "D:\Builds\12\Foo\Check-In Build\Binaries\Rhino.Mocks.dll".
Any help on how to fix this would be greatly appreciated.
As others mentioned, this happens when performing multithreaded builds with a common destination directory, where a file copy task happens to encounter a simultaneous conflict with a copy task running for a different project.
Normally this should result in a "file used by another process" exception (which is handled and retried by the file copy task) but sometimes the file operation results in an "Access is denied" exception instead. (I'm still not sure why)
Some suggest that you should "solve the duplication", but I don't see that as being feasible for cases where all the projects need to directly reference a library like log4net.
Obviously one way to prevent the issue is to explicitly run msbuild with /p:BuildInParallel=false or /m:1 or /maxcpucount:1 (or omit the argument entirely) to force single-threaded mode.
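As a local repro of that, the invocation might look like this (the solution name is a placeholder; either switch on its own avoids concurrent project builds):

# Build with a single MSBuild node so copy tasks never race each other
msbuild MySolution.sln /m:1 /p:BuildInParallel=false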
However, in TFS 2013, the default build template automatically always passes /m (use all cores) to msbuild, which silently overrides any single-thread setting you can manually pass in. (Determined by my own experimentation and examining diagnostic logs)
Another workaround I attempted was to manually pass /p:AllowedReferenceRelatedFileExtensions=none to msbuild, which prevents all pdb and xml files from being copied from referenced libraries. (Since for a while I only ever saw xml files having this issue.) But then I kept having problems with log4net.dll.
The ultimate workaround that I used was one I discovered by decompiling the source code for Microsoft.Build.Tasks.Copy:
if (hrForException == -2147024891)
{
    if (!Copy.alwaysRetryCopy)
        throw;
    else
        this.LogDiagnostic("Retrying on ERROR_ACCESS_DENIED because MSBUILDALWAYSRETRY = 1", new object[0]);
}
If error -2147024891 (0x80070005 access is denied) occurs, the Copy task will check a special variable to see if it should retry. That value is set via an environment variable:
Copy.alwaysRetryCopy = Environment.GetEnvironmentVariable("MSBUILDALWAYSRETRY") != null;
After setting the environment variable MSBUILDALWAYSRETRY = 1 (and restarting the build server), the problem went away. I also periodically started seeing "Retrying on ERROR_ACCESS_DENIED..." warnings in the build logs, proving that the setting was taking effect (instead of the builds merely coincidentally succeeding).
(Note that this environment variable is not well documented, use as appropriate.)
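Setting it machine-wide from an elevated PowerShell prompt could look like this sketch (restart the build service afterwards so it picks up the new environment):

# Persist MSBUILDALWAYSRETRY for all processes on the build server (requires admin)
[Environment]::SetEnvironmentVariable('MSBUILDALWAYSRETRY', '1', 'Machine')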
Update: Apparently TFS 2015 no longer overrides your /m:1 with /m (even on legacy/XAML build definitions), which should make /m:1 a valid fix again.
It looks like there are two projects copying the same file. Depending on the timing, they sometimes happen at the same time, resulting in the failure. You have to trace the node id back to find the source project. See http://blogs.msdn.com/b/buckh/archive/2012/01/21/a-tool-to-find-duplicate-copies-in-a-build.aspx for more details and code that may track it down for you.
As Buck Hodges and Nimblejoe have rightly said, this is mostly due to TFS running multiple MSBuild processes by default to build your projects.
You can override it in the build definition in Process -> 3. Advanced -> MSBuild Arguments by adding the MSBuild argument /p:BuildInParallel=false
This can also happen if you have a build agent's folder open.
I also had the same problem. I got error messages saying files could not be copied because access to the path was denied. In my case all my dlls and xml files and so on were placed in the
D:\TFS\Example\Bin\Debug folder.
I right-clicked the Bin folder, clicked Properties, and saw that the Read-only check box was checked under Attributes.
I un-checked the Read-only check box, clicked Apply, and clicked OK in the popup that was shown.
I went back to Visual Studio and built my solution, which had been giving me the error messages.
Voilà... this time it built successfully without errors.
I don't know whether this is the perfect fix, but it is what solved my issue.
To work around this problem I had to remove the "ReadOnly" flag on the source directory, and then in the build definition set Clean Workspace to None.
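Clearing the flag recursively from a prompt could look like this sketch (the path is a placeholder):

# Strip the read-only attribute from all files and subfolders under the sources
attrib -R "D:\Builds\12\Foo\Check-In Build\Sources\*" /S /D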
Like Ziggler, I solved this problem when building a project by removing the 'read only' property of the bin folder in my project. It is only happening to XML files stored in a /packages/ directory that is common to the solution containing this project. The 'bin' folder is not checked into source control. I am still stumped as to the root cause of the problem.
I found the same problem, which occurred after the build tried to overwrite files in the "Working Directory" it had created in a previous attempt to build (set in the agent).
I resolved this by manually deleting the output folder it created (in my case [Working Directory]\Binaries) before attempting the build.
This can be done automatically by changing the build definition: under Process -> 2. Basic -> Clean Workspace, set this to the Outputs option.
Here's a variation of this problem which I had to deal with:
I couldn't figure out why my build kept failing on an "Access to the path is denied" error, even though I had added things like /p:BuildInParallel=false and /p:OverwriteReadOnlyFiles=true to the MSBuild Arguments of my XAML build. The cause turned out to be a "Post-build event command line" in my Project's properties.
After changing
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false
to
%WinDir%\Microsoft.NET\Framework\v4.0.30319\MSBuild.exe[SNIP]
/P:Configuration=$(ConfigurationName);[SNIP]
;AutoParameterizationWebConfigConnectionStrings=false;OverwriteReadOnlyFiles=true
the error went away.
One possible cause is having the bin or obj folders for class libraries checked into TFS. Deleting the bin or obj folders of the projects from TFS will resolve this issue if that is the case.
I was having this problem and chose to ignore it, because I didn't want to sacrifice build performance for the sake of getting rid of some benign error messages from NuGet. However, I seem to have stumbled across a solution while trying to solve another problem, and I think the two are related. I think the order in which NuGet packages are fetched is related to the build order of the projects in the solution. So if this has somehow become disjointed, then NuGet may be the first casualty, before you run into build errors where you start getting "Metadata file 'XXX.dll' could not be found" errors, which annoyingly require you to build again until the build succeeds (as described here).
So, I believe the solution is to follow the steps described in the accepted answer to the aforementioned question, or follow the more comprehensive steps in one of the alternative answers. In other words: disable building of all projects, restart VS, then re-enable building of all projects. This will (normally) resolve the build order, and that should hopefully resolve the NuGet issue. Please let me know if this fixes it for anyone.
I had this issue with TFS 2015.
It turned out to be because the build agent was running under the default (NETWORK SERVICE) credentials, which didn't have write permissions on the target folder.
Once I'd removed the agent and reinstalled it with proper credentials, it worked.
It did have me trawling through the logs for a while, checking and unchecking the multi-proc box and even restarting the build server in my hunt.
Check the obvious stuff first...
For me, it was that the build agent wasn't started from an administrator PowerShell prompt.
MSBuild arguments: /tv:14.0 /t:Rebuild /m:1 /p:RunCodeAnalysis=false /p:TreatWarningsAsErrors=false /p:OverwriteReadOnlyFiles=true /p:BuildInParallel=false /p:AllowedReferenceRelatedFileExtensions=none
Set Clean Workspace to false.
Go to the build agent and remove read-only from the mapped folder.
As a lot of people have already stated, this happens when building projects in parallel: project A and project B, both referencing a third-party library C (Copy Local), will cause this when they are built at the same time, side by side.
The real problem is that TFS Build 2012 and below are configured so that, when building a solution, the whole output of the solution is copied to a single folder. That's where the pains of parallel builds have their origin.
Since TFS 2013 you can easily solve this by setting the "Output location" in the build definition to "PerProject". This forces the build service to behave like a local msbuild run, where the settings regarding the output locations are read from the corresponding project files, so the output is written to the bin folders under each project.
For TFS 2012 and below, this article (plus the linked articles) will help you get the same result as with TFS 2013:
http://blog.stangroome.com/2012/05/10/override-the-tfs-team-build-outdir-property-net-4-5/
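If I recall the linked workaround correctly, it boils down to passing an MSBuild property (available with .NET 4.5-era targets) so each project gets its own output subfolder; treat this as a sketch rather than the article's exact recipe:

# Give each project its own output subfolder under the build's OutDir
msbuild MySolution.sln /p:GenerateProjectSpecificOutputFolder=true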
I resolved a very similar issue by closing all open instances of Visual Studio, re-opening the solution and building it again.
