Moving TFS workspace to another PC without re-downloading - tfs

I have a TFS workspace which I need to move to my new PC. I have copied the whole folder structure over and ensured that the workspace is mapping to the correct folders. However, the "Latest" column for every file displays as "Not downloaded". How can I reconcile this such that TFS is aware that the files match the server version?
The standard answer seems to be to re-download the whole thing. Unfortunately the repository is huge, my connection is unreliable, and I have monthly download quotas. Is there anything in the command-line tools or power tools that can make it compare file hashes or similar and realise that the files are identical?
Thanks.

There's a binary metadata file inside the working copy that stores the mapping of every path in your repo to the corresponding path on the filesystem.
It uses absolute paths, so unless your new project folder occupies the exact same location as it did on the original computer, the paths won't match.
Because it's a binary format, you can't do something simple like mass replace the paths with a text editor or sed.
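One thing you can do from the command line, without transferring anything, is check what TFS believes about the workspace before attempting a full get. This is only a diagnostic sketch and the local path is a placeholder:

rem Show the folder mappings for the workspace that contains the current directory
cd /d C:\YourWorkspaceRoot
tf workfold

rem Preview what a get would transfer, without modifying anything on disk
tf get /recursive /preview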

Related

Is there a way to see the size of files in Plastic Cloud Edition?

I am trying to do an audit of the files I have in Plastic Cloud to see if there's anything in there that is hogging space we don't need, and I'm not sure exactly how to do that. What I have in my local workspace is not the same as what's in the cloud (because of ignore lists, if I'm understanding correctly?), so is there any way I can view what's in the cloud, or am I thinking about this all the wrong way?
The working copy downloaded on your machine is just the latest version of the project, but the cloud storage includes the full history of the project (every previous revision of each file in the different branches...).
From the branch explorer, you can right-click any changeset and select "Browse repository in this changeset". This way, you can check the tree of this specific changeset (including file paths and file sizes).
There is a recent feature that allows you to archive old file revisions in the cloud and, this way, reduce the database usage. Please check: https://forum.plasticscm.com/topic/21799-trimming-to-reduce-size-of-the-database/?do=findComment&comment=45397

What do all the options on GetOptions mean?

The MSDN documentation lists four options, with limited explanation:
Overwrite "Overwrite existing writable files if they conflict with the downloaded files." Does this apply to all files, or just ones we've told TFS we've edited?
GetAll "Gets all files." What files does TFS not normally get?
Preview "Executes a get without modifying the disk." This one seems pretty clear.
Remap "Remaps existing items on the disk to the server items where the content and disk location are not changing." I have no idea what this means.
Overwrite: will blindly overwrite writable files that you have not pended for edit. If you have marked a file as 'writable' then you have violated the contract with TFS, and it assumes that you have done this for a good reason (e.g., modifying the file without taking a checkout because you were working offline). This will generally produce a writable conflict on the file, but if you specify this flag, the writable file will be overwritten.
This only applies to server workspaces (local workspaces are always writable). This has no effect on files that you have pended for edit. Get will always produce conflicts for files that are edited locally and updated on the server; if you want to update files that are checked out, you must undo the checkout (or resolve the conflict with TakeTheirs).
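For illustration, the two ways out of such a conflict from the command line might look like this (the itemspec is only an example):

rem Throw away the local checkout so the get can proceed
tf undo $/MyProject/Main/SomeFile.cs

rem Or keep the get and resolve the conflicts by taking the server version
tf resolve /auto:TakeTheirs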
GetAll: will download every file and update it, even if TFS believes that the local version is the same as the server version and that downloading a new version would be a no-op. TFS tracks every version that you have locally as well as on the server, so this is only useful if you edit files locally without checking them out.
If you have kept them writable then, as mentioned above, this will produce a writable conflict. If you have marked them read-only again, TFS assumes that you have not made any changes and will not bother updating them when you do a get (because it believes the file contents haven't changed). If you have manually changed the file contents, specifying this flag will update those files to the server version.
Preview: will just fire events and provide results that indicate what would be downloaded with the given parameters.
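For reference, these three options correspond (as far as I know) to switches on tf get, which can be handy for experimenting before wiring them into the API:

rem Overwrite: replace writable files that are not checked out
tf get /recursive /overwrite

rem GetAll: re-download every file, even ones TFS believes are already up to date
tf get /recursive /all

rem Preview: report what would be transferred without touching the disk
tf get /recursive /preview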
Remap: is a clever option that allows you to perform an in-place branch switch (which is very common with version control systems that branch at the repository level, like Git, but somewhat complicated in TFVC).
Consider that you have mapped $/Foo/main to C:\Foo and done a get latest. If you update your working folder mappings so that $/Foo/branches/feature now points to C:\Foo and then issue a get with Remap, the server will download only the files that differ between main and branches/feature, so it's an inexpensive way to switch your local workspace to the feature branch.
(If you're looking for an example, this functionality exists in the command-line interface and in Team Explorer Everywhere but not in Visual Studio.)
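Following the $/Foo example above, the command-line version of that branch switch might look like this (the workspace name and paths are illustrative):

rem Point the existing local folder at the feature branch instead of main
tf workfold /unmap $/Foo/main /workspace:MyWorkspace
tf workfold /map $/Foo/branches/feature C:\Foo /workspace:MyWorkspace

rem A get with remap then downloads only the files that differ between the branches
cd /d C:\Foo
tf get /recursive /remap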

TFS MSBuild Copy Files from Network Location Into Build Directory

We are using TFS to build our solutions. We have some help files that we don't include in our projects as we don't want to grant our document writer access to the source. These files are placed in a folder on our network.
When the build kicks off we want the process to grab the files from the network location and place them into a help folder that is part of source.
I have found an activity in the xaml for the build process called CopyDirectory. I think this may work but I'm not sure what values to place into the Destination and Source properties. After each successful build the build is copied out to a network location. We want to copy the files from one network location into the new build directory.
I may be approaching this the wrong way, but any help would be much appreciated.
Thanks.
First, you might want to consider having your documentation author place his documents in TFS. You can give him access to a separate folder or project without granting access to your source code. The advantages of this are:
Everything is in source control. Files dropped in a network folder are easily misplaced or corrupted, and you have no history of changes to them. The ideal for any project is that everything related to the project is captured in source control so you can lift out a complete historical version whenever one is needed.
You can map the documentation to a different local folder on your build server such that simply executing the "get" of the source code automatically copies the documentation exactly where it's needed.
The disadvantage is that you may need an extra CAL for him to be able to do this.
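If you do go this route, the build server's working folder mappings might look something like the following; the workspace name and all paths here are purely illustrative:

rem Map the source and the documentation so that a single get lays them out together
tf workfold /map $/MyProject/Source C:\Builds\Sources /workspace:BuildWorkspace
tf workfold /map $/MyProject/Documentation/Help C:\Builds\Sources\Help /workspace:BuildWorkspace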
Another (more laborious) approach is to let him save to the network location, and have a developer check the new files into TFS periodically. If the docs aren't updated often this may be an acceptable compromise.
However, if you wish to copy the docs from the network during your build, you can use one of the MSBuild copy tasks (as you are already aware), or you can use Exec. The copy tasks are more complicated to use because they are often populated with filename lists generated from the outputs of other build targets, and are usually used with solution-relative pathnames. But if you're happy with DOS commands (xcopy/robocopy), you may find it much easier just to use Exec to run an xcopy/robocopy command. You can then "develop" and test the xcopy command outside the MSBuild environment and just paste it into the MSBuild script with confidence that it will work - much easier than trialling copy settings as part of your full build process.
Exec is documented here. The example shows pretty well how to do what you want, but in your case you can probably just replace the Command attribute with the entire xcopy/robocopy command (or even the name of a batch file) you want to use, so you won't need to set up the ItemGroup etc.
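As a sketch of that approach, the command you would develop and test at a prompt, and then paste into the Exec task's Command attribute, could be as simple as the following; the share and drop paths are only examples:

rem Copy the help files from the network share into the Help folder of the build output
xcopy "\\fileserver\TeamDocs\Help\*.*" "C:\Builds\Drop\Help\" /S /Y /I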

TF GET Won't Repopulate Directory

I set up my TFS Workspace and did TF GET, which downloaded huge quantities of files, exactly as expected.
Then I wanted to trim my local copy down to just the folders I was supposed to work with. So I deleted all the others in my local copy of the workspace (not using TF commands, just Windows and DOS commands).
One of those directories has been added to my list of things to work on.
How do I get that directory back without re-downloading everything?
C:
cd \TFS
TF GET Zebra
(All Files are up to date)
All my non-existent files are apparently up to date with the extant files on the TF server.
I am baffled how FileExists is equal to FileNotExists!
After trying about twenty or so variations of this approach, I surrendered.
C:
cd \TFS
TF GET /FORCE
It's downloading a metric ton of stuff. As far as I can tell, it's downloading everything. Funny that the format of the output is different from my original TF GET (but that's really a distraction from the question, which is repeated below).
How do I get TFS to only download the current stuff from the Zebra folder and thus repopulate it so what I have actually matches the server?
And I suppose a side question - If deleting the files without telling TFS about it was the wrong way to clear up the space on my local disk, what should my approach have been?
Sorry if the question is noob-like. But I guess when it comes to TFS I am, in fact, a noob.
Using TFS 2010 you will have a server-based workspace, which means the server stores information about your workspace. This information includes which files, at which version, you have already downloaded. If you delete them using the command line or Windows Explorer, TFS doesn't notice the change. If you then do a "Get Latest", it compares the actual versions on the server with the versions you should (!) already have. So if there is nothing new, TFS will not send you anything, because it should already be there.
You can use the force option, so you get the newest files no matter what is (or is not) already in your workspace:
C:
cd \TFS
TF GET Zebra /FORCE
The problem you face is a result of your workspace mapping, and the right mapping depends on how you work. For example, I'm a TFS admin, so I have one workspace for a whole TeamProjectCollection and never do a recursive Get Latest; I only get the files/folders I need. You could do the same with your TeamProject, but then you always need to use the VS Source Control Explorer or the command line for specific folders, which is not very handy.
I would suggest creating a workspace for the branch/folder you are working on and cloaking the folders you don't need. That way you can do a Get Latest on your root, but still only get the files you are interested in.
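A sketch of that setup from the command line, with the workspace and folder names made up for the example:

rem Map the branch you actually work on
tf workfold /map $/TeamProject/Main C:\TFS /workspace:MyWorkspace

rem Cloak the folders you don't need so a recursive get skips them
tf workfold /cloak $/TeamProject/Main/HugeAssets /workspace:MyWorkspace
tf workfold /cloak $/TeamProject/Main/LegacyCode /workspace:MyWorkspace

rem A Get Latest from the root now only downloads the uncloaked folders
cd /d C:\TFS
tf get /recursive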

Is it possible to get a file from TFS into a local unmapped folder?

Looking at the docs for tf get I think the answer is no. Still, I could be wrong. I'd like to have a file that's mapped to C:\Projects\MyProject\SQL\myScript.sql. I'd then like to run a batch file that gets several files (including that one) from the repository, puts them into a local temp folder, runs them, then deletes them.
It's the first part that's the issue: I think that TFS won't let you get files into a folder without remapping the source folder to point to that folder first.
So I suppose I need to remap the remote folder to point to a new local folder (C:\Temp\Scripts for instance) then get the files, then map the folder back to where it was. Seems like an extra step that helps nothing.
Have a look at tf view to see if that does what you need. Otherwise, the usual pattern for doing something like this is to create a new workspace using the tf workspace command, map the folder, do a get, then destroy the workspace.
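A rough sketch of that workspace-based pattern as a batch file; the collection URL, workspace name and paths are all placeholders:

rem Create a throwaway workspace and map only the SQL folder to a temp location
tf workspace /new TempScripts /noprompt /collection:http://tfsserver:8080/tfs/DefaultCollection
tf workfold /map $/MyProject/SQL C:\Temp\Scripts /workspace:TempScripts

rem Get the files into the temp folder
md C:\Temp\Scripts
cd /d C:\Temp\Scripts
tf get /recursive

rem ... run the scripts here ...

rem Clean up: delete the temporary workspace and the local files
cd /d C:\
tf workspace /delete TempScripts /noprompt
rd /s /q C:\Temp\Scripts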
Even better for you would probably be to use the .NET API VersionControlServer.DownloadFile(), especially if you are doing this from a PowerShell script rather than a simple batch file.
