I checked out a binary file in TFS - it's a source code file used by a proprietary system.
When I come to check it in, I discover someone else has since committed a change. Since it's a binary file I cannot do a diff, but worse, I can't even remember when I checked the file out, i.e. which revision of the file I'm working on, so I can't tell how many commits I need to reconcile.
How can I find this out?
You could try the tf status command with /format:detailed to see whether it gives you the information you want:
tf stat itemspec /format:detailed
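For example, run from inside the workspace and assuming the file lives at a server path like $/MyProject/Config.bin (a placeholder; the exact detailed-format layout also varies between TFS versions):
REM Show the pending change in detail; the output should include the version your checkout is based on
tf status $/MyProject/Config.bin /format:detailed
REM List the changesets that have touched the file; anything newer than that base version is what you still need to reconcile
tf history $/MyProject/Config.bin /format:brief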
On one of our builds we kick off an automated process which checks out and checks in some files.
This all works rather well, but at the moment we are running a checkin command which looks like the following:
tf.exe checkin /force /comment:"foo" /noprompt /bypass /override:"bar"
All of the files with a Pending status will get checked in.
I'd like to make this script a bit more specific and only check in the files (2 in total) which we actually change during the build, so we know for sure no files will get checked in 'by accident'.
I've already seen that we can specify a single filename with the checkin command, but doing so gives us 2 different changesets in TFS instead of 1. We would really like to have 1 changeset containing both changed files, as the changes belong to each other.
Any ideas on how to approach this?
Minor addition / Short term solution
For the moment I've solved our 'problem' by specifying the folder where our modified files are located, which kind of looks like this
tf.exe checkin "/my/folder/location/" /recurse /force /comment:"foo" /noprompt /bypass /override:"bar"
Note the folder location and the /recurse parameter added.
You simply separate the files by spaces:
tf.exe checkin file1.ext file2.ext /force /Comment:"foo" /noprompt /bypass /override:"bar"
The documentation is not clear on this point, but it appears to be a general property of an itemspec that it can consist of multiple items.
See similar question about checkout: Is there a way to check-out multiple files from various folders in TFS in a single operation
As mentioned by others you might run into problems with the command line being longer than the system supports, in which case you might need to look at other solutions.
cmd.exe has a limit on how long a command can be. Using the version control API, or simply 'tf checkin /i' (no arguments) is likely to be a better choice than passing lots of long filenames.
It's normal for a file to be automatically checked out when it is changed, even if its contents are ultimately changed back to the original state. At that point you would see the message about identical contents upon comparison. You could also use the tfpt uu /noget /r * command to undo pending changes for files which are identical to the originals. You'll need to have TFS Power Tools installed for this to work. Note: there is no TFS Power Tools release for TFS 2017.
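A rough sketch of that command, run from somewhere inside the mapped workspace (the local path is a placeholder):
cd /d C:\src\MyWorkspace
REM Undo pending changes whose contents are identical to the server version,
REM recursively (/r) and without re-downloading files (/noget)
tfpt uu /noget /r *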
For more details, please refer to the two links below:
Visual Studio TFS shows unchanged files in the list of pending changes
Can TFS Pending Changes show files that are truly changed like SourceGear Vault?
I have a bunch of files in the 'added' state across many folders that were accidentally deleted from the file system. How can I easily either undo them or convert all of them to a 'delete' status? I'd prefer not to have to manually undo each file one at a time.
What I've tried so far:
In the Pending Changes window, using the Undo command for each missing file is tedious. Since the window does not identify which files are missing, I have to compare this window to the file explorer and compare the contents of each folder.
The answers for this similar question don't apply to me because my files are in the 'added' state, so comparing my workspace to the server will not identify these missing files.
I've looked through the TFS Power Tools for something to identify missing files but haven't found anything that directly addresses missing files.
The tfpt online command doesn't address missing files in the 'added' state.
If I read you correctly, in this case your underlying file system and what TFS thinks is on your file system have gotten out of sync.
The best, easiest, way I know how to rectify this is to undo all your changes, then redo the adds, deletes, and edits that you actually require. I know that with many files this will be a pain, but let me reiterate: I mean the best, easiest way and not the fun, effortless way (which I don't think exists)!
Do a 'clean' in your project within Visual Studio, then delete any bin/ and obj/ folders in the source.
Then undo all changes for your project.
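From the command line, undoing every pending change under the project would look roughly like this (the path is a placeholder):
cd \dev\path\to\project\root\
REM Undo all pending changes under this folder without prompting per file
tf undo * /recursive /noprompt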
If you are using TFS < 2013, I would recommend the use of Team Foundation Power Tools online functionality. For instance:
cd \dev\path\to\project\root\
tfpt online /adds /diff /deletes /recursive .
Otherwise, if you are using TFS 2013, you can use the built-in 'reconcile' functionality (I cannot find a web URL for this; the 2010 docs are incorrect, so use 'tf reconcile /?' for a description):
cd \dev\path\to\project\root\
tf reconcile /adds /deletes /diff /recursive /noignore /promote .
With TFS Power Tools installed, run this command from a VS command prompt in the appropriate folder:
tf reconcile /deletes /diff /recursive /noignore /promote
This will display a list of pending changes. The missing files will all be selected with a new status of 'delete'. Click Promote to save the change, then try again to check in.
This is a slight variation of d3r3kk's answer but without the /adds flag, which causes more files to be selected than necessary.
I'm using TFS and VS 2012 and my project is in a broken state and I can't figure out why. I'd like to go back to a previous version of my solution when I know it worked and make changes on that working version.

However, when I choose to check out a specific changeset, it seems to me like it's only changing the files that were changed in that changeset. When I use git and check out a revision, my code looks exactly like it did at that revision. Files that didn't yet exist at that revision are removed, files that did exist have contents as they were at that revision, etc.

But I can't seem to do the same in TFS. I can't figure out how to get all of the files (and only the files) in the state that they existed when a particular changeset was checked in. Am I missing something? Any help REALLY appreciated.
Try using the Advanced option when you right click on a solution or folder in Source Control:
Then when the dialog appears, check both check boxes so that the version you have is overwritten with the specific version you want, by selecting Changeset from the dropdown and entering the changeset number you are after...
This should overwrite the existing solution files with the specific version.
If you have trouble doing it over top of existing files, delete the source on your local machine first and get the specific version after that.
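If you'd rather do this from the command line, a roughly equivalent get (run from a folder mapped in your workspace; the server path and changeset number are placeholders) is:
REM Get the whole project exactly as of changeset 1234, overwriting local files (/force = /all plus /overwrite)
tf get $/MyProject /version:C1234 /recursive /force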
A changeset is just the files checked in at one time, not a snapshot of the whole system. You want to use labels for that. A label will mark all the files in their present state, just as you describe Git doing.
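A minimal sketch of the label approach (the label name and server path are placeholders):
REM Stamp the current state of everything under the project with a label
tf label KnownGood $/MyProject /recursive /comment:"Last known working state"
REM Later, bring the workspace back to exactly that labelled state
tf get $/MyProject /version:LKnownGood /recursive /overwrite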
Find the changeset you want and "Get This Version" to only get the changed files.
Manually check out each file for edit in Source Control Explorer to match the changeset.
Now the previous changeset's edits can be checked in.
NOTE: This is MUCH quicker than getting the entire repo using "Get Specific Version."
I set up my TFS Workspace and did TF GET, which downloaded huge quantities of files, exactly as expected.
Then I wanted to trim my local copy down to just the folders I was supposed to work with. So I deleted all the others on my local copy of the workspace (not using TF commands, just Windows and DOS commands)
One of those directories has been added to my list of things to work on.
How do I get that directory back without re-downloading everything?
C:
cd \TFS
TF GET Zebra
(All Files are up to date)
All my non-existent files are apparently up to date with the extant files on the TF server.
I am baffled how FileExists is equal to FileNotExists!
After trying about twenty or so variations of this approach, I surrendered.
C:
cd \TFS
TF GET /FORCE
It's downloading a metric ton of stuff. As far as I can tell, it's downloading everything. Funny that the format of the output is different than my original TF GET (but that's really a distraction from the question, which is repeated below).
How do I get TFS to only download the current stuff from the Zebra folder and thus repopulate it so what I have actually matches the server?
And I suppose a side question - If deleting the files without telling TFS about it was the wrong way to clear up the space on my local disk, what should my approach have been?
Sorry if the question is noob-like. But I guess when it comes to TFS I am, in fact, a noob.
Using TFS 2010 you will have a server-based workspace, which means the server stores information about your workspace. This information includes which files, at which version, you have already downloaded. If you delete them using the command line or Windows Explorer, TFS won't recognize your changes. If you then do a "Get Latest", it compares the actual versions of the files with the versions you should (!) have. So if there is nothing new, TFS will not send you anything, because it should already be there.
You can use the force option, so you get the newest files no matter what is (or isn't) already in your workspace:
C:
cd \TFS
TF GET Zebra /FORCE
The problem you face is a result of your workspace mapping, but this depends on how you are working. For example, I'm a TFS admin, so I have one workspace for a whole TeamProjectCollection and never do a recursive Get Latest; I only get those files/folders I need. You could do the same with your TeamProject, but then you always need to use the VS Source Control Explorer or the command line for specific folders, which is not very handy.
I would suggest creating a workspace for the branch/folder you are working on and cloaking the folders you don't need. In this case you can do a Get Latest on your root, but still only get the files you are interested in.
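For example, cloaking within an existing workspace looks roughly like this (the server paths, local path, and workspace name are all placeholders):
REM Map only the branch you need, then cloak the subfolders you never want locally
tf workfold /map $/MyProject/Main C:\TFS\Main /workspace:MyWorkspace
tf workfold /cloak $/MyProject/Main/HugeFolderYouDontNeed /workspace:MyWorkspace
After that, a recursive Get Latest on the root skips the cloaked folders entirely.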
We are just in the process of migrating our TFS repo to Mercurial, as we've had enough of TFS. Unfortunately, TFS has thrown us one last curve ball before it lets us go. We've written a script that we intend to have "get" each changeset (including timestamp, check-in comment, etc.) and then add them to the Mercurial repo and check them in.
Unfortunately, TFS is acting very strangely when we execute the tf get * /version:C111 /overwrite command. It immediately returns "All files are up to date." But this is impossible: the workspace folder is empty! And viewing the details for changeset 111 quite clearly shows that the changeset contains "stuff", i.e. the repo is certainly not empty.
What could be causing this?
TF will return "All files are up to date" if the itemspec you pass in is not found. If you don't include an absolute path, a relative path is assumed.
For example if you send
tf get myFile.cs /version:1009 /force
it looks in the current directory for myFile.cs, which doesn't exist, so it returns "All files are up to date." What we really want is
tf get C:\myproject\myFile.cs /version:1009 /force
Same thing with wildcards, e.g.
tf get D:\project\* /version:C111 /overwrite
Check out the itemspec link for more info.
You should try /all instead of /overwrite; this will force it to get all files, not just the ones it remembers getting into this workspace on a previous get.
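For example (the local path matches the workspace mapping from the question):
REM Re-download every file under the folder, even ones TFS thinks you already have
tf get C:\TFS\Zebra /all /recursive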
MSDN Reference for Get
Instead of "Get latest version", you can "Get specific version" of type "Latest version" and check the "Overwrite all files even if the local version matches the specified version" checkbox. That will force a get latest.
I've had this same issue before, and after pulling my hair out, the only thing that corrected it for us was to un-map the workspace, delete all the local files, and then remap the workspace to disk - TFS would finally get a fresh copy of the files then.
We were using TFS 2005, for what it's worth - I'd be sad to hear that this situation still arises with newer versions. If you find another solution, please post it here, as I'd love to know how you resolved it.
I tried the /force, /recursive, and /all options and still had the problem ("All files are up to date."). Eventually I realized the problem was due to the mapping, so I deleted the mapping and recreated it.
My old mapping (incorrect) was done with a wildcard:
tf workfold $/* C:\DEV
So when I listed the working folders (tf workspaces /format:detailed), the mapping showed up like this:
Working folders:
$//*: C:\DEV
When I remapped as below, the get command started working:
tf workfold $/ C:\DEV
and the mapping was showing like this:
Working folders:
$/: C:\DEV
This can happen if you do not have adequate permissions to the source. I was able to see the entire source tree and all files, but I could not get the most recent version. I guess this is permission flexibility taken to the extreme (absurd?). To verify the issue was not workstation or mapping related, I tried looking at a code file on the team web pages and received:
Image demonstrating lack of access to source file
I just had to fix this problem:
Get TFS Power Tools. You can also get it from Tools > Add-in Manager inside Visual Studio.
It will require you to close Visual Studio to complete the installation.
Once complete, open a command prompt in admin mode.
cd to your branch/solution directory.
Run tfpt scorch (tfpt.exe comes with the Power Tools; if you don't see it, reinstall).
If it finds stuff missing, it will open up a dialog. Just hit Next or OK and it will overwrite anything that does not match the server.
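Put together, the sequence looks roughly like this (the local path is a placeholder):
cd /d C:\src\MyBranch
REM Compare the local folder against the server; a dialog lists anything that
REM will be overwritten or restored before the change is made
tfpt scorch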
You can always add the "/force" parameter to TF GET to force it to get all files regardless of what it thinks you have in your local workspace (it maintains the versions of all of your workspace files on the server).
It looks like there are multiple ways to trigger this issue. In my case it involved passing a relative path to a script that generated an absolute path and then passed that path to tf.exe. This is a Windows scripting problem more than anything else, but the output from tf.exe is confusing.
Really, what you'd like TFS to return is "File not found" instead of "All files are up to date".
In addition to the other suggestions made here, double-check what you're passing to tf.exe by rewriting the command with echo first. If you're coming from a unix/linux background, string building just seems broken on win32.
broken.bat
SET PARAM1=%1
SET CMD_PATH="c:\path\to\%PARAM1%"
echo %CMD_PATH%
Result: broken.bat "tool.exe" => "c:\path\to\"tool.exe""
fixed.bat
SET PARAM1=%1
REM Strip quotes: http://www.dostips.com/DtTipsStringManipulation.php
for /f "useback tokens=*" %%x in ('%PARAM1%') do set PARAM1=%%~x
SET CMD_PATH="c:\path\to\%PARAM1%"
echo %CMD_PATH%
Result: fixed.bat "tool.exe" => "c:\path\to\tool.exe"
Check your workspace. I went to delete it as described above (which probably would have fixed it as well), but I noticed that somehow a project within my project had gotten its own workspace assigned in addition to the overall workspace. I removed that project from the workspace, and it downloaded all my files when I clicked OK to exit the workspace menu.
My problem was that I was running the developer command prompt from VS 2012, but my workspace mapping was created with VS 2013.
Make sure you run the tf.exe from the Visual Studio installation that has the workspace mapping; then a simple tf.exe get "path" /all /recursive works just fine.
Choose a date in the future to do a "get specific version":
tf get * /version:D01/01/2099 /recursive /force /noprompt
Also make sure you have rights on the team project site in your project collections.
In my case, I could see the project folder and branches in TFS source control explorer, but I couldn't access the project's TFS website.
Instead of returning an error detailing a permissions issue, VS simply said all files were up to date when I tried to get the latest version.