When I originally installed VS Ultimate 2013 everything was fine but for the last month or so it's been a dog.
The Source Control Explorer in my Visual Studio 2013 install is very slow. Just clicking a node and displaying its contents takes 20+ seconds.
Everyone else on the team is OK, so it's not the TFS server; it's just my install.
I assumed it was some add-in I'd installed into VS, so I disabled them, but no luck.
Any ideas?
Having tried all the suggestions (unloading all add-ons, trying to reinstall VS, removing all extra workspaces, etc.), the answer to my problem was to unmap my workspace and then remap it.
Problem solved. Not got a clue what the underlying fault was.
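If you prefer the command line, the unmap/remap can presumably be done with tf.exe as well; a rough sketch, where the workspace name, server path and local folder are placeholders rather than anything from the original answer:
tf workfold /unmap C:\src\MyProject /workspace:MyWorkspace
tf workfold /map $/TeamProject/MyProject C:\src\MyProject /workspace:MyWorkspace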
In my case, the only way to get rid of the lag was to change my workspace location from "local" to server. You can do this under the advanced options for your workspace.
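If you would rather script it than use the dialog, recent tf.exe versions let you specify the location when creating a workspace; a sketch only, with a placeholder name and collection URL:
tf workspace /new MyServerWorkspace /location:server /collection:http://tfs:8080/tfs/DefaultCollection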
The 'full blast' solution that worked for me was (command-line sketch below):
remove workspace
delete all source code
rebuild the workspace
rebuild solution
Only takes a few minutes more than just rebuilding the workspace (see #DaveF's answer) but gave me a bit more confidence that everything hangs together.
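A rough command-line equivalent of those four steps might look like this (workspace name, server path, local path and collection URL are all placeholders; the solution is then rebuilt in VS afterwards):
tf workspace /delete MyWorkspace /collection:http://tfs:8080/tfs/DefaultCollection
rd /s /q C:\src\MyProject
tf workspace /new MyWorkspace /collection:http://tfs:8080/tfs/DefaultCollection
tf workfold /map $/TeamProject/MyProject C:\src\MyProject /workspace:MyWorkspace
cd /d C:\src\MyProject
tf get /recursive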
Had this happen to me a few times now, so there are some things I'd like to add to the accepted answer.
I work in a place where we have a lot of VS solutions with a lot of files in them. Microsoft's guidelines suggest that you shouldn't be using a local workspace if it's going to have more than 100,000 items in it. So you could prevent this problem entirely by:
Not using local workspaces
Making sure never to map enough folders into a single workspace that it gets over 100,000 files associated with it.
Periodically declaring "TFS bankruptcy" and unmapping everything.
For me, the drawback of having to use strict locking and not having offline access makes #1 unacceptable. I'm going to try harder to do #2, but honestly #3 is what I've been living by.
It's kind of like early Windows, where every year or so you had to just reinstall the OS to remove all the accumulated cruft.
Cleaning local folders helped. See 'Team Explorer - Pending Changes'; under 'Excluded Changes' it said: 'Detected: 50000 add(s)'. Click it to see the paths to the folders.
This drove me crazy too for over six months until I found these instructions. Now my VSO is flying. Note: I copied this information from somebody else. I'd like to give them credit, but I cannot remember how I found it.
You can fix this TFS problem by editing the registry.
Navigate to the key
HKEY_CURRENT_USER\Software\Microsoft\VisualStudio\12.0\TeamFoundation\SourceControl\Proxy
and then change the value of Url to any dummy website, like 'www.abcdummy.com'.
Restart VS after editing registry key value.
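For reference, the same edit can be made from a command prompt with reg.exe; a sketch, assuming the value name really is Url as described above:
reg add "HKCU\Software\Microsoft\VisualStudio\12.0\TeamFoundation\SourceControl\Proxy" /v Url /t REG_SZ /d "http://www.abcdummy.com" /f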
I had the same problem; it kept me busy for a week or so, but after investigating my complete setup I found the following:
Within my ASP.NET application, I had an image directory and an image cache directory with lots of images in them (200,000+). Neither was included in my VS project, but Visual Studio / TFS still tripped over them.
First I found that, when checking in some files (which took over 10 minutes while the problem existed), 'Team Explorer - Pending Changes' showed under 'Excluded Changes': 'Detected: 50000 add(s)'.
Trying to get rid of this the 'normal way', by opening that 'Promote Candidate Changes' window and setting these files to be ignored, still didn't do much.
But after moving those image directories to some other location, outside my project, all problems disappeared.
Of course I had to add those moved directories as virtual directories to still see my images.
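For what it's worth, the "set these files to be ignored" route mentioned above works through a .tfignore file in local workspaces; a minimal sketch with hypothetical folder names, placed at the root of the workspace:
# .tfignore - exclude generated image folders from candidate adds
\Images
\ImageCache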
I cleaned my workspace of unnecessary projects and it ran better. I think vh_click is on to something with the 50,000 adds thing. TFS keeps track of all your edits, and over time, with tons of projects, undos, and craziness, you can amass a large set of changes that TFS has to chug through. Get out the Clorox, the Comet or whatever else you clean with, and dump some junk or move it to some archive folder or backup drive.
Cleaning up the workspace was the solution for me. When opening Visual Studio 2015, the Source Control window would remain stuck in a loading phase; I had two workspaces, name and name_1, and I removed both.
There is no need to delete the entire folder, though. Keep in mind that if you do delete the workspace and still have the files, you will need to force the Get Latest to be on the safe side.
Getting Latest was painfully slow. I was using a colleague's PC and had deleted his workspace.
After an hour waiting for Get Latest I got an error and realised my user account didn't have Full Control on the folder; granting write access made Get Latest run 1000x faster.
Just to throw another solution in the mix! I had the same problem which seemed to be caused by several layers of working folders configured in my workspace (some overlapping ones too).
The issue was resolved by going to Manage Workspaces, then Edit and then removing the additional folder bindings.
In short: "Run it as an administrator".
None of those solutions worked at all; I even searched this link:
Why is Visual Studio 2013 very slow?
In vain. Just do this ONE simple step:
Go to your Visual Studio path, usually installed at:
C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE, namely the file "devenv.exe". Right-click it, click "Run as administrator", then open your Visual Studio project.
So, you can just send a shortcut of "devenv.exe" to the desktop to easily run it as an administrator each time.
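If you want the exe to elevate every time without the right-click, one option is the Windows compatibility-flags registry entry; this is a sketch using the install path above, not something from the original answer:
reg add "HKCU\Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers" /v "C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\devenv.exe" /t REG_SZ /d "RUNASADMIN" /f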
Have ^_^ Fun
You can keep your workspace local and change it to this. I did that and my TFS speed was great:
1. Remove all mapped folders in the workspace "Edit" dialog.
2. Change the workspace folder to the parent of all the mapped folders.
I hope it is useful for you.
I am having issues trying to check in changes to TFS hosted on Visual Studio Online. Started just last week. I am running Visual Studio Professional 2017 version 15.5.2.
When I try to check in changes, I get this error:
C:\My\Workspace\Path\Project\File.cs: Download of item $/Workspace/Path/Project/file.cs was not completed. Perform a get operation to correct.
Okay, whatever. Sounds simple enough.
So I go to the problem file and do a Get Latest Version.
When I do that, I then get this error:
Source Control Explorer
The network path was not found.
The output in the Output window after attempting the get is this:
Conflict C:\...\...\...\...\...\Program.cs - Unable to perform the get operation because you have a conflicting edit
Automatically resolved conflict: edit: C:\...\...\...\...\...\Program.cs as TakeTheirs
The network path was not found.
I have read a number of posts, and tried a number of things. None have fixed the issue. Things I have tried...
Delete the TFS cache under AppData. Did nothing to help.
Disconnect VS from the TFS project, then delete the hidden $tf folder under my local workspace and then reconnect VS to the team project and re-get everything. The initial re-gets all worked. But once I made some changes to a project and then tried to check it in, it started with these errors again.
Anyone have any other ideas? The next step I see in my future is having to uninstall and reinstall Visual Studio, but I'm REALLY trying to avoid that.
So, the solution was deleting and recreating the workspace.
You will meet similar issues when a workspace is messed up.
Generally, you can try the items below to fix such an issue (a command for the cache cleanup is sketched after the list):
Disconnect any instance, close VS, then delete the cache folder located at e.g. %localappdata%\Microsoft\Team Foundation\7.0\Cache, then restart VS and connect to TFS/VSTS again.
Remap the workspace to a new folder
Delete the old workspace and create a new one, map it.
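For the cache cleanup in the first item, something like this from a command prompt should do it (the version folder, 7.0 here, varies with the VS/TFS client version):
rd /s /q "%localappdata%\Microsoft\Team Foundation\7.0\Cache"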
By the by, ... I had this issue again today, and had a slightly different fix. When I went to delete my workspace, I found that a prior aborted new project had somehow added additional working folder mappings to my workspace. So I deleted those, after which VS said it needed to restart. So I let it, and now everything seems to be working fine.
Just figured I'd let people know, in case they find themselves in that situation, so they needn't go through all the other drastic measures: check this first and try it if it applies.
Just started up Visual Studio 2012 and opened my solution which is in source control with Team Foundation Server 2012 Express and encountered this, any ideas? Can't get latest, can't check in, everything appears checked out :( Basically my workspace is unusable right now.
TF400018: The local version table for the local workspace MY-PC;My
User could not be opened. The workspace version table contains an
unknown schema version.
There is only one post I could find on the net, and the answers are pretty vague.
I had the same issue, and I just fixed it on mine.
If you don't mind re-mapping all your projects, you can try the following:
Click the box labelled "Workspace".
Click on "Workspaces".
Delete the workspace profile you're currently using.
Re-connect to TFS and open "Source Control".
Be aware that you may lose all your TFS mappings, and you may need to re-map all your projects from TFS. Back up any changes that are not checked in yet.
cycle6 is correct, but it isn't made clear that you will not lose your pending check-in list if you follow some additional steps.
Click the box labelled "Workspace".
Click on "Workspaces".
Delete the corrupt workspace profile, accepting the warning.
Re-connect to TFS and open "Source Control Explorer"
Create a new workspace
One by one, map your projects to the same folder as before
You will be presented with a list of conflicts, where you have matching writable files in the folder already.
Choose "Keep local copy" for each file you had checked out before, and "Take Server Version" for any files changed by other members of the team that you didn't have the latest version for. This might take a while depending on the length of the list, but it is worth comparing versions for any file you are unsure of.
You will be left with your solution and all pending items marked as checked out, with your work preserved.
I did the following steps and it solved the issue:
deleted the hidden folder named $tf, and then
in Visual Studio's Solution Explorer: right-click the solution node > Source Control > Get Specific Version > Latest Version.
If you already have multiple instances of Visual Studio open:
Close all of them. (In some cases you need to log out from Windows and log back in, or restart.)
Rename the $tf folder to any other name (e.g. $tft); see the sketch after these steps.
Start Visual Studio to see your issue fixed. :)
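The rename itself can be done from a command prompt in the workspace root; $tf is hidden, so confirm it is there first (a sketch; the replacement name is arbitrary):
dir /ah
ren $tf $tft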
Hope this helps.
Sometimes this happens when you are running out of disk space.
Try to see if you have very low space, e.g. < 10 MB.
If so, try to clean up your Windows Temp folder and see if that solves the issue.
It's a misleading message to an extent.
What has happened is that the internal data structures of the workspace have become corrupt.
This ends up with the code (in the tf command, Visual Studio, et al.) that loads those data structures failing to read the relevant files, which surfaces as an error about a schema version problem.
In the case that I experienced, this was because the machine hosting the workspace ran out of disc space while doing operations upon the workspace of various kinds (check-outs, check-ins, adding pending changes — it was actually a bunch of workspaces being used by TFS 2017 build agents and multiple active builds).
This corrupted parts of the data that are held in the files under the hidden $tf subdirectory (it always being a local workspace on a TFS 2017 build agent), because source control wasn't able to rewrite/extend these files.
Other answers here discuss partly retaining some of the files, based upon more specific knowledge of what has not been corrupted (such as preserving the internal files storing pending changes if one wasn't creating any pending changes), but the basic idea is that one needs to reset all of the stuff in $tf to a sane state of some kind.
In my case, I had the disadvantage of multiple potential causes and no consistent knowledge of which parts of $tf were corrupted, but I conversely had some advantages:
It being a TFS build, arranged to build from the build agent's s (source) directory into its a (artifact staging) and b (binaries) directories, there were not masses of non-source-controlled object and other files in the workspace (which is the s directory) that would have ended up as pending additions.
There were not any pending changes (to actual source files) worthwhile to preserve. I could afford to lose all information about source files, and indeed all current locally-stored information about the workspace, and simply run the build again with a fresh sane and largely unpopulated workspace. I did not even need to restore source files and directories for the whole workspace, as the first task in any TFS ("vNext") build is a "Get Sources" task that uses (variously) tf vc scorch, tf vc undo, and tf vc get to check out the right source version.
So simply, in Developer PowerShell (Visual Studio being installed on the build machine):
Remove-Item -Recurse -Force 'X:\Agents\07\_work\1138\s'
tf vc get 'X:\Agents\07\_work\1138\s'
(Note that one can always get at the tf command in some way on a TFS build machine. Every build agent has a local helper copy of tf.exe and its ancillary DLLs in its VSTS "OM" subdirectory.)
I possibly could have omitted the tf vc get step, but having had trouble with "Get Sources" in the past I do not trust it to robustly cope with arbitrary manual external alterations, such as no s directory when the build isn't configured to outright delete that entire directory itself (as it can be but was not here).
For the same reason, Microsoft's own "agent maintenance" (another way to clean things up) is quite dodgy, and ends up leaking workspaces on the TFS server (which I have raised a bug with Microsoft about).
There is a simple workaround. Remove the local mapping to the folder where the sources are (Advanced -> Remove Mapping), or just rename or delete the mapped folder. After that you will be able to connect to TFS. Download the project again.
If you already have multiple TFS-connected instances of Visual Studio open:
1. Open File -> Source Control -> Manage Workspaces.
2. Delete all the TFS mappings.
3. Then select the folder mappings again.
For the same issue in Eclipse: find the $tf folder and delete it.
You will find the $tf folder in the workspace directory. If not then search for the $tf folder.
Once you have found it, delete it.
In my case, none of the other answers helped: the problem was occurring on a machine that didn't have Visual Studio, and no matter how I tried to get rid of the bad workspace data it never worked. After working with procmon a bit, I discovered another critical folder that might be the source of this error: C:\Users\All Users\Microsoft Team Foundation Local Workspaces\ (it might also be under C:\ProgramData; on my system, 'All Users' is a symlink to that folder, but I'm not sure if this is typical). In this folder there are sub-folders named like GUIDs, which contain some other folders, one per workspace it appears. In my case, some of the data in these folders was old and some was corrupt. Once I deleted the bad workspace folders, all my problems disappeared. You might also want to delete the Cache folder as identified in the comments of this post, but that didn't help me (it didn't seem to hurt, either).
Alternatively, you could just back up your current workspace to a different location, re-create your workspace, and copy back the files that you had made changes to. VS should detect the newer files and automatically check them out, allowing you to check in the newer versions that you copied back from your backup.
What worked for me is, delete the local folder(s), restart your machine, then map the projects again. Any pending changes you have just save them somewhere else temporarily.
When I open a solution for the first time after it has been downloaded from TFS, it (VS 2010) is unable to find the NuGet.targets file.
I've checked TFS and it's marked as downloaded, and it exists on the file system.
If I try to open the solution directly from TFS again, it suddenly works.
I feel this is the reason why my automated builds are also failing.
Has anyone come across this issue before?
Ran into this Friday and on another machine today.
For the machine on Friday I copied the .nuget directory, since I didn't have one.
For the machine today it had the .nuget directory and copying it from another machine didn't resolve the issue. Opening it from TFS's Source Control Explorer didn't work either.
We then followed the steps on Opening project in Visual Studio fails due to nuget.targets not found error (enable Package Restore on the solution) and it worked without issue.
Hadn't run into this before last week, and it's just one project of many, with none of the others having this problem.
When Visual Studio downloads solutions from TFS (double click sln file in solution explorer) it appears to download files one by one and load them up. Unfortunately it seems to try opening project files before it downloads the .nuget directory, which is why it can't find the file. The last thing it appears to do is download that file, which explains why it is on disk but gave the error. If you reopen the solution it's already there and works fine.
When TFS Build server downloads a solution to build, it does so on the solution directory instead. Which means it will get the .nuget directory before it tries to build so it shouldn't cause issues on the build server.
I believe this is a bug in Visual Studio, it really should download all the solution items first. Although it would be nice if it had the same behaviour as TFS Builds.
A work around for this issue is to get latest on the solution folder before you open the solution for the first time. Not ideal but it works.
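If you want to script that workaround, a get on the solution folder with tf.exe should achieve the same thing; a sketch with placeholder paths, run from inside the mapped workspace:
cd /d C:\src\MySolution
tf get $/TeamProject/MySolution /recursive /version:T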
I'd also suggest logging a bug with either the nuget or visual studio team, however I suspect they're probably already aware of it.
I had this problem trying to run through the tutorial at http://www.windowsazure.com/en-us/develop/net/tutorials/multi-tier-web-site/2-download-and-run/
Turns out the zip file the source code was in extracts into a folder containing commas, which I don't think msbuild liked. Moving it into a more safely named directory helped.
Try these steps
Install Nuget.
Right click on the solution and select "Enable NuGet Package Restore".
Click Ok on the warning.
Close and re-open the solution.
Why Why WHY doesn't TFS's get latest work consistently?
You would have thought that feature would have been tested thoroughly.
What I have to do is Get Specific Version, then check both "overwrite writeable files" and "overwrite all files".
Is my local setup messed up or you do this also?
TFS redefined what "Get Latest" does. In TFS terms, Get Latest means get the latest version of the files, but ignore the ones that the server thinks is already in your workspace. Which to me and just about everyone else on the planet is wrong.
See this link: http://blogs.microsoft.co.il/blogs/srlteam/archive/2009/04/13/how-get-latest-version-really-works.aspx
The only way to get it to do what you want is to Get Specific Version, then check both of the "Overwrite ..." boxes.
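From the command line, the rough equivalent of ticking both boxes is tf get with the /all and /overwrite switches; a sketch, run from a folder inside the workspace:
tf get /version:T /all /overwrite /recursive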
Sometimes Get specific version even checking both checkboxes won't get you the latest file. You've probably made a change to a file, and want to undo those changes by re-getting the latest version. Well... that's what Undo pending changes is for and not the purpose of Get specific version.
If in doubt:
undo pending check in on the file(s)
do a compare afterwards to make sure your file matches the expected version
run a recursive 'compare' on your whole project afterwards to see what else is different
keep an eye on pending changes window and sometimes you may need to check 'take server version' to resolve an incompatible pending change
And this one's my favorite, which I just discovered:
Keep an eye out in the Output window for messages such as this:
Warning - Unable to refresh R:\TFS-PROJECTS\www.example.com\ExampleMVC\Example MVC\Example MVC.csproj because you have a pending edit.
This critical message appears in the output window. No other notifications!
Nothing in pending changes and no other dialog message telling you that the file you just requested explicitly was not retrieved! And yes - you resolve this by just running Undo pending changes and getting the file.
TFS, like some other source control providers such as Perforce, does this because the system knows what the last version you successfully got was, so Get Latest turns into "get changes since x". If you play by its rules and actually check things out before editing them, you don't confuse matters, and "get latest" really does as it says.
As you've seen, you can force it to reassess everything, which has a much greater bandwidth usage, but behaves closer to how SourceSafe used to.
It's hard to respond to a statement without examples of how it's not working, but it's crucial to understand that TFVC (in "Server Workspace" mode, which was the mechanism prior to TFS 2012) does not examine the state of your local filesystem. TFVC Server Workspaces are a "checkout-edit-checkin" type of system where this is by-design, an intentional decision made to massively reduce the amount of file I/O required to determine the state of your workspace. Instead, the workspace information is saved on the server.
This allows TFVC Server Workspaces to scale to very large codebases very efficiently. If you are in a multi-gigabyte code base (like Visual Studio or the Windows source tree) then your client does not need to scan your local filesystem, looking for files that may have changed, because the contract you have with TFS is that you will explicitly check a file out when you want to edit it.
You are expected to not mark a file as write-only and change it without explicitly checking it out first. If you go down this route, then the server does not know that you have made changes to your file, and performing a "Get Latest" operation will not update your local workspace, because you haven't told the server that you've made changes.
If you do subvert this mechanism then you can use the tfpt reconcile command to examine your local workspace for changes that you have made locally.
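For example, something along these lines (tfpt ships with the TFS Power Tools; the path is a placeholder and the exact switches may differ between versions):
tfpt reconcile /promote C:\src\MyWorkspace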
If you find yourself using "Get Specific Version" and selecting the "force" and "overwrite" options, then it is very likely that you are in the habit of bypassing all of the enforcements that TFS has implemented to keep you from hurting yourself, and you should probably consider TFVC Local Workspaces.
TFVC Local Workspaces provide an "edit-merge-commit" type of version control system, which means that you do not need to explicitly check files out before editing them and they are not read-only on-disk. Instead, you simply need to edit the file, and your client will scan the filesystem, notice the change, and present this as a pending change.
TFVC Local Workspaces are recommended for small projects that do not require fine-grained permissions control, since they present a much nicer workflow. You are not required to be online, and you do not have to explicitly check files out before editing them.
TFVC Local Workspaces are the default in TFS 2012, and if they are not enabled for you, then you should ask your server administrator. (Organizations with very large codebases or strict auditing requirements may disable TFVC Local Workspaces.)
Eric Sink's excellent book Version Control By Example outlines the differences between checkout-edit-checkin and edit-merge-commit systems and when one is more appropriate than the other.
The Professional Team Foundation Server 2013 book also provides excellent information about the differences between TFVC Server Workspaces and TFVC Local Workspaces. The MSDN documentation and blogs also provide detailed information:
Decide between using a local or a server workspace
Server workspaces vs. local workspaces
Team Foundation Server – Trying to understand Server versus Local Workspaces
Team Foundation Server (TFS) keeps track of its local copy in a hidden directory called $tf. When you issue "Get Latest Version", TFS looks in this folder to see whether you have the latest copy or not. If it thinks you do, it will not download the latest copy. It does not matter whether you still have the original files or not. In fact, you might have deleted the entire folder (as in my case) and TFS won't fetch the latest copy, because it does not look at the actual files but at the hidden directory where it records changes. The flaw with this design is that anything done outside the system will not be recorded in TFS. For example, you may go into Windows Explorer and delete a folder or file, and TFS won't recognize it. It will be totally blind. At the very least I would expect Windows not to let you delete this folder, but it does!
One way to force the latest copy is to delete the hidden $tf folder manually. To do that, go to a command prompt, navigate to the root folder where your project was checked out, and issue this command:
rd /s $tf // remove the $tf folder and everything inside it
If you want to just check the hidden folder, you can do it using
dir /ah // display hidden files and folders
Note: if you do this, TFS will think you do not have any local copy, even though the files are still on disk, and it will sync everything up again.
Caution: Use this method at your own risk. Please do not use it on critical work.
"Get latest version" by default will only download the files that have changed on the server since the last time you ran "Get latest version". TFS keeps track of the files you download so it doesn't spend time downloading the same version of the files again. If you are modifying the files outside of Visual Studio, this can cause the consistency problems it sounds like you are seeing.
Unfortunately, there has to be one or more bugs in TFS 2008, since this problem regularly crop up on developer machines and build servers where I work as well.
I can do Get Latest, I can see in the history list of the project that there have been commits after I last did a Get Latest, I have not touched the files on disk in any way, but after the "Get Latest" function has completed, when I check the TFS tab, some of the files still says that they're not the latest version.
Obviously TFS is able to determine that I have old files locally, since the list says so. Yet, Get Latest fails to do that, get the latest version. If I do what you did, use the Get Specific version, and check the two checkboxes at the bottom of the dialog, then the files are retrieved.
We changed our build servers to always use the Get Specific version type of function instead, so this part now works, but since our build server (TeamCity) also relies on checking if there have been changes to the files in order to kick off a build, sometimes it lapses into a "nothing changed, nothing to see here, move along" mode and does nothing until we forcibly run the build configuration.
Note that I have experienced this problem on a machine that is never touched, except for get latest + build, both manually, so there's nothing tampering with the files. It's just TFS getting confused.
One time this cropped up I verified that the files on disk was indeed binary identical to the version previously retrieved, so no manual tampering had been done with the files.
Also, I fail to see how TFS can "know" whether files have changed on disk or not without actually looking at the contents. If one part of TFS can see that the files are indeed not the latest version, then the Get Latest version should absolutely be able to get the latest version. This in reference to comments to other answers here.
It might be because you are logging into TFS as the same user, and the workspace name (based on the machine name by default) is also the same, so TFS thinks you are on the same machine and in the same workspace, and thus that you already have the latest version of the files, so it won't get them for you.
Try renaming your machine, and create a new workspace as if from a new machine.
Right-click and go to: Advanced > Get Specific Version. Select "Latest Version" and now, importantly, tick two checkboxes:
The checkboxes are:
Overwrite writeable files that are not checked out
Overwrite all files even if the local version matches the specified version
When I run into this problem with it not getting latest and version mismatches, I first do a "Get Specific Version", set it to changeset, and put in 1. This removes all the files from your local workspace (for that project, folder, file, etc.) and also has TFS update so that it knows you now have NO VERSION DOWNLOADED. You can then do a "Get Latest" and voilà, you will actually have the latest.
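The same trick can be scripted with tf.exe if you prefer; a sketch with a placeholder server path:
tf get $/TeamProject/MyProject /version:C1 /recursive
tf get $/TeamProject/MyProject /version:T /recursive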
I had the same issue with Visual Studio 2012. No matter what I did, it didn't get the code from TFS source control.
In my case, the cause was mapping a folder and a subfolder from source control separately, but to the same tree on my local HD.
The solution was removing the subfolder mapping using the "manage workspaces" window.
Most of the issues I've seen with developers complaining that Get Latest doesn't do what they expect stem from the fact that they're performing a Get Latest from Solution Explorer rather than from Source Control Explorer. Solution Explorer only gets the files that are part of the solution and ignores anything that may be required by files within the solution, and therefore part of source control, whereas Source Control explorer compares your local workspace against the repository on the server to determine which files are needed.
It can happen when you use TFS from two different machines with the same account. If so, you should compare to see the changed files, check them out, get latest, and then undo pending changes to remove the checkout.
This worked for me:
1. Exit Visual Studio
2. Open a command window and navigate to the folder "%localappdata%\Microsoft\Team Foundation\" (see the sketch after these steps).
3. Navigate to the sub folders for every version and delete the sub folder "cache" and its contents
4. Restart Visual Studio and connect to TFS.
5. Test the Get Latest Version.
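Steps 2 and 3 can be done in one go from the command window; a sketch (use %%v instead of %v if you put it in a batch file):
for /d %v in ("%localappdata%\Microsoft\Team Foundation\*") do rd /s /q "%v\Cache"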
In my case, Get Specific Version, even with both check boxes checked and after undoing all pending changes, didn't work.
I checked the workspaces: edited the current workspace and checked all the paths.
The solution path was incorrect and was pointing to a deleted folder.
Fixed the path and get latest worked fine.
Every time this happens to me (so far) is because I have local edits pending on the .csproj project file. That file seems to keep a list of all the files included in the project. Any new files added by somebody else are "not downloaded" because they are not in my locally edited (now stale) project file. To get all the files I first have to undo pending changes to the .csproj file first then "get all". I do not have to undo other changes I have made, but I may have to go back and include my new files again (and then the next guy gets the same problem when he tries to "get all"...)
It seems to me there is some fundamental kludginess when multiple people are adding new files at the same time.
(this is in .Net Framework projects, maybe the other frameworks like Core behave differently)
Just want to add that TFS MSBuild does not support special characters in folder names, e.g. "#".
I experienced this in the past, where one of our project folders was named External#Project1.
We created a TFS build definition to run a custom MSBuild file, but the workspace folder was not getting any contents in the External#Project1 folder during the workspace get latest. It seems that the TFS get was failing but did not show any error.
After some trial and error, and renaming the folder to _Project1, voilà, we got files in the folder (_Project1).
Tool:
TFS Power Tools
Source:
http://dennymichael.net/2013/03/19/tfs-scorch/
Command:
tfpt scorch /recursive /deletes C:\LocationOfWorkspaceOrFolder
This will bring up a dialog box that will ask you to delete or download a list of files. Select or unselect the files accordingly and press OK. The grid shows the columns CheckBox, FileName, FileAction, and FilePath.
Cause:
TFS will only compare against items in the workspace. If alterations were made outside of the workspace TFS will be unaware of them.
Hopefully someone finds this useful. I found this post after deleting a handful of folders in varying locations. Not remembering which folders I had deleted ruled out the usual Force Get/Replace option I would have used.
I encountered the same problem:
My development server was corrupted and restored, but the information restored was from a few days ago.
TFS believed that all the files were up to date, but in practice my files were from a few days earlier!
Nothing I did helped; Get Latest did not get the latest version.
In the end I did a Get Specific Version from a month ago; my files were updated accordingly, and then I did a Get Latest.
And it worked: the files were updated.
I have a project on CodePlex which is using TFS, and I am using the TFS plugin for Visual Studio. I copied this project and worked on it on another PC without TFS, doing some refactoring. Foolishly, I then just used copy/paste and manual text editing to merge my changes, expecting TFS to just pick up the changes.
Apparently, that is not the case.
Here is a screenshot of my local directory:
My Local TFS http://img259.imageshack.us/img259/2897/tfslocal.jpg
Notice how some files are missing the lock symbol - those are missing. If you look at the current TFS Tree on Codeplex, there are some files which do not exist locally anymore, i.e. WikiPlexExtensions.cs in the main folder.
Is there any way to easily tell TFS to compare my local to the remote repository and pick up the changes? I could re-add the local files using "Exclude from local project" and re-adding them, and I could create the "deleted" files as empty files just to delete them, but if I can avoid the manual messing around that would be good as well :)
The easiest way is to exploit VS 2008's "online" feature. Basically you want to set your solution offline, then bring it online while connected to the proper Codeplex server. TFS should figure out the rest.
Feature overview: http://msdn.microsoft.com/en-us/teamsystem/bb898913.aspx
Tweaking the settings by hand: http://blogs.msdn.com/benryan/archive/2008/07/09/using-tfs-2008-power-tools-to-modify-server-s-offline-state.aspx
To compare local and server folders, you can check out TFS Power Tool. After installing it, you can bring up the source control explorer, right click on the server folder and then select 'Compare'. Folder difference window will display the differences. You can also right click on the differences to see available commands such as 'Get Latest' to update your local folder for example. Check out Bryan Harry's blog post on the power tool
I don't think there is an easy fix... What I've done in the past is back up those files that I have edited, then do a "Get Latest Version..." for the files I edited. This should change the files back to being read-only etc... Now, check out the files the regular way and paste the backups you had into the checked out files. Obviously this really only works when there are a couple of files you have edited.
TFS (in Visual Studio) has a "Reconcile" command for this, see Microsoft documentation, or this answer with steps.
BTW: this command may not have existed at the time of the original question, but this question came up first when I was searching.