We develop every user story and bug on its own branch so we can review the code related to that work item, which means we generate a fair number of branches. Every month we delete the old ones, and this process has traditionally been far too slow.
We can only check in the deletion of 6 to 10 branches at a time, otherwise we get a timeout, and deleting 10 branches takes more than 5 minutes.
We recently migrated from TFS 2010 and VS 2010 to TFS 2015 and VS 2015, and nothing changed.
Is this normal? Is there a way to speed it up? Searching Google, all I found was something related to local workspaces, which is not our case; the workspace is a server (remote) workspace.
Regards.
First, check whether other operations, such as adding a file, are also slow or whether they are fast.
You could also use the tf delete command instead of the VS GUI, which may do the trick. Deleting a branch only performs a "soft delete": the branch will still exist, complete with all of its history, it will just be hidden. If you want to permanently remove version-controlled files from Team Foundation version control, you need to run the tf destroy command.
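As a minimal command-line sketch (the server path and branch name below are placeholders), run from a folder mapped to the workspace:

rem Soft delete: pends the delete and checks it in (branch is hidden, history kept)
tf delete $/Project/Branches/US-1234 /recursive
tf checkin /comment:"Remove obsolete branch" /recursive

rem Hard delete: permanently destroys the item and its history (irreversible)
tf destroy $/Project/Branches/US-1234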
Also check whether it's a client-side issue, such as a conflicting Visual Studio add-in. Try clearing the TFS and VS caches. Besides that, you could try the delete operation from another account and machine next month.
That said, there is some delay when deleting branches in TFS through the VS GUI. I created 20 empty branches as a test, and the whole delete operation took about 30 seconds. So if your branches contain a number of folders with a large number of files, 10 branches taking about 5 minutes looks acceptable. After all, in TFVC we don't create and delete branches as frequently as in Git.
I'm housekeeping the TFS Server and have 40,000 shelvesets of which 6,500 are for the Build Service Accounts.
I assume I can simply remove these, as I don't think the builds will refer to old shelvesets.
I exported all current shelvesets.
Can someone confirm that we can just delete all of these? Or will we run into trouble if we want to run an old build?
We are running TFS 2017.2.
I don't see any reason to keep them, since commits should be occurring, giving you code history and reproducible builds.
If you delete a shelveset you cannot retrieve it later for any old builds.
Shelvesets are not the same as code commits.
See the MSDN documentation on shelvesets, which applies to TFS 2017, for the intended use cases: their purpose is not to be a version history of code changes or to associate code with builds. That's what commits are used for.
Shelvesets are temporary. There is no rule stopping you from keeping them forever, but it's unlikely any value would be gained from keeping so many.
There is no need to reproduce a build with a shelveset that is older than your code release cycle, because commits are occurring, making the shelvesets moot.
Note: I do not know why you would have the build agent creating shelvesets. I would look at your build definitions and take out anything that automatically creates shelvesets.
It's always advisable to check the details of a shelveset unless you know it's old enough to delete. You must be the shelveset owner, or your Administer shelved changes permission must be set to Allow, as explained here. The shelveset commands to delete/find/list etc. are listed here.
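As a minimal command-line sketch (the owner and shelveset names below are placeholders), you could list a build account's shelvesets and then delete one like this:

rem Run from a folder mapped to the collection, or add /collection:<url>
tf shelvesets /owner:"DOMAIN\BuildService" /format:brief
rem Delete a specific shelveset by name and owner
tf shelve /delete "OldShelvesetName;DOMAIN\BuildService"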
The TFS Sidekicks tool provides a dedicated UI for a number of clean-up actions, including old shelvesets, up to TFS 2015. I'm not sure whether it works with 2017.
We have a branch called "Main". In July 2012 I created a new branch from this called "Phase 3" for the next version of our project. We have been working on this since then, but from time to time some other changes get applied to Main.
In May this year we performed a merge from Main to Phase 3 with some of those changes, and that was all fine.
Between then and now, we upgraded our TFS server from 2008 to 2012 update 3. (I wasn't involved with the "upgrade", but I believe it was an install on a new server with some kind of backup/restore of the data.) We've not had any other issues with this.
Last week I tried to perform another merge from Main to Phase 3. I chose "selected changesets" because we have done a serious amount of rework in our phase 3 branch so merging changes across is quite difficult - so I wanted to do them bit by bit.
However, I was surprised to see that Visual Studio was trying to merge across changes from July 2011 - a good year before the branch was made (the very first changes made to this part of our project in fact.)
Oddly enough, if I view the history of the Phase 3 solution I can expand it and see all these changes. So TFS appears to know that they have already been applied.
I tried to merge some of the earlier changes across to see what would happen. The only changes it included were to do with items that had been renamed or deleted. For example, we had renamed our solution, so TFS wanted to branch and merge a copy of the SLN with the old name. Or we had some images that had been subsequently deleted in both branches, but not at the time of this new merge.
So I backed this out and tried to merge everything across from May this year - i.e. just before our last merge across. This carried over a hideous number of changes - all sorts of things including regular merge/edit type changes. So I backed that out too!
We had created another branch from Phase 3. I have been able to merge between the two branches OK. I think it was created about a week before the TFS upgrade. But it's not experiencing the issue.
We have other branches that were taken from Main. These are experiencing the issue in that TFS is wanting to apply changes that it has already made.
I am using VS2012 update 3 to do the merge. I also tried VS2010 just in case but that does the same. Also a colleague has tried it and confirmed the same symptoms.
I don't think it helps that our phase 3 is so vastly different to main that merging anything across is really difficult.
Does anyone know how I can best resolve this? I'm a little worried about doing something I might regret later on!
I encountered similar problems when upgrading from TFS 2008 to TFS 2010. The issue is probably due to partially merged changesets, i.e. some of the files in the changeset have been merged and some haven't. Or it could be a branch move/rename situation. See the answer here for details of why a branch rename can cause this problem.
In TFS 2008, if you attempted a merge and then unchecked files from the pending changes list, TFS assumed that you didn't want to merge those files ever again, and on subsequent merges you wouldn't see them.
In TFS 2010 or higher, the behaviour changes: if you uncheck files from the pending changes, on the next merge TFS will attempt to merge those files again. I think TFS 201x has the correct behaviour, but it's a pain that MS didn't highlight the change in behaviour.
To check whether this is the case, run the following from the command line:
tf merge $/tp/main $/tp/phase3 /recursive /candidate
The /candidate switch tells TFS to give you a list of changesets it wants to merge without performing the merge. If you see any changesets in the list that have a * next to them, these are partially merged.
To fix it you have two choices.
Merge the files and resolve the conflicts; it might be worth merging on a changeset-by-changeset basis rather than trying to do them all at once. This will probably be a bit painful, but once it's done it's done.
If you're confident that the Phase 3 branch is correct, then you can merge using the command prompt: use tf merge $/tp/main $/tp/phase3 /version:c123~c456 /recursive /discard, where c123 represents the oldest changeset you want to ignore and c456 represents the most recent changeset you want to ignore. The /discard switch tells TFS to update the merge history so that it thinks the merge has been done, but it won't actually perform the merge. This should remove the partially merged changesets from your list of candidates (a command sketch follows below).
If you opt for option 2 then you should do some analysis to make sure that you really don't want to take the partially merged changesets.
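A rough command-line sketch of option 2, assuming the same $/tp/main and $/tp/phase3 paths used above and placeholder changeset numbers:

rem List the merge candidates; entries marked with * are partially merged
tf merge $/tp/main $/tp/phase3 /recursive /candidate
rem Record the unwanted range as merged without taking its content
tf merge $/tp/main $/tp/phase3 /version:c123~c456 /recursive /discard
rem The discard pends merge changes; check them in to update the merge history
tf checkin /comment:"Discard already-applied changesets from merge candidates" /recursive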
If you get to the point where merging is too difficult, or you just don't trust it, then the only practical option is to "stomp over it with a new changeset", i.e. do the merge manually outside of TFS and then check in your new, fixed changeset. Then kill the old branch and start again.
Not an ideal situation to be in, but your source integrity is paramount. Hopefully starting from a fresh branch will prevent issues like this in the future.
We have been using TFS 2010 to manage work items and sprints for a while now and have recently added a dedicated QA person. What I need is either a build definition that I can run on a schedule (Tuesdays at 9pm, for instance) that will only build and/or deploy the work items that are in a state of "ReadyToDeploy", or a way to get a list of files to push based on the TFS API.
My end goal is to automate the release process so that only the items that have passed QA are sent to our staging environment weekly. Then the customer or QA can approve that the items work in staging, which is a mirror of production, and another process or build definition will deploy those items, which will by then be in a different state.
I have modified the work items and work flows to accomplish the different states, but I am having an issue getting either a build of just the fixes or a list of all the files to push based on the state of the work item.
I am open to any ideas or solutions for this, the alternative is that I have to manage the list of files and manually push files every week and I am trying to get away from that.
Thanks,
Edit: The way we have it set up now, each developer has their own branch and own website; our software is server based and has to be run on a particular server. Our trunk is linked to the main dev website. This is where QA initially does their testing to move an item to the ready-to-deploy status. When a dev is ready for QA, they check in their changes on their branch and merge into the trunk. The builds are created off the trunk at the moment. On our deployment nights I open the trunk website in VS and do a publish, then take those files, compare them against a list the developers have given me, and FTP the compiled files to our production server.
I could be wrong, but I am not aware of a way to tell a build to avoid certain changesets based on the state of an attached work item.
I think the only way you could achieve this would be to create a Release branch and perform a daily merge of changesets that are in the "ReadyToDeploy" state. When you merge, you are able to cherry pick changesets, but they must be contiguous. This means that you would have to perform multiple merges to get the Release branch into the appropriate condition.
We have operated something similar for many years and it works well for us. Many people will tell you that it's bad practice, and it probably is, but that is for you to decide.
As for automating this for a build, I don't think it would be a trivial job. The hardest part will be the merge. You might think that there will be no conflicts as it is a one-way merge, but having done this for a number of years, I can tell you that conflicts do arise.
Step-by-step explanation:
1. Create a branch off the trunk called Release.
2. When you want to do a deploy, merge from the trunk into Release, but only cherry-pick the ReadyToDeploy changesets. This may take several merges, as the changesets in each merge must be contiguous (see the command sketch after this list).
3. Fire off a build/deploy of the Release branch.
4. Repeat steps 2-3 as your release schedule allows.
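A minimal command sketch of steps 1-2, assuming hypothetical server paths $/Project/Trunk and $/Project/Release and placeholder changeset numbers:

rem One-time: create the Release branch from the trunk
tf branch $/Project/Trunk $/Project/Release
tf checkin /comment:"Create Release branch" /recursive
rem Cherry-pick contiguous ranges of ReadyToDeploy changesets, one merge per range
tf merge $/Project/Trunk $/Project/Release /version:C101~C103 /recursive
tf merge $/Project/Trunk $/Project/Release /version:C107~C107 /recursive
rem Resolve any conflicts, then check the merges in
tf resolve
tf checkin /comment:"Merge ReadyToDeploy changesets into Release" /recursive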
You could do something automatic, but it would require an extensive bit of coding. It would also require a branching strategy so that Dev, Staging, and Production all come from their own branches.
1. Set up a process that uses the TFS API to scan for work items in that state.
2. Use the API to traverse those items and get the check-ins attached to them.
3. Use the API to get the latest of the source and target branches.
4. Use the API to merge by changeset (identified in step 2). This is non-trivial and you have to handle lots of cases, but the merging should all be one-way with no conflicts, so it's doable (a command-line approximation is sketched after this list).
5. Use the API to check in.
6. Kick off the compile build.
7. If the compile build is successful, kick off the publish build.
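For illustration only, here is a rough command-line approximation of steps 3-5, using the same hypothetical $/Project/Trunk and $/Project/Release paths; the changeset number is a placeholder that would come from the work-item scan in steps 1-2, which has to be done through the API:

rem Step 3: get the latest source and target branches into the workspace
tf get $/Project/Trunk /recursive
tf get $/Project/Release /recursive
rem Step 4: merge a single changeset identified by the work-item scan
tf merge $/Project/Trunk $/Project/Release /version:C2045~C2045 /recursive
rem Step 5: check in the merge; the compile build can then be queued from the build system
tf checkin /comment:"Automated merge of ReadyToDeploy changeset 2045" /recursive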
To the best of my knowledge, there is a major issue with this approach. Say you have two changesets, #1 and #2, both of them containing modifications to the same file. Changeset #2 now contains its own changes plus the changes from #1.
If you decide to pull in changeset #2 and skip #1, guess what happens: you pull in the changes from #1 as well. This is obviously a problem.
Yes, this is one of those "D'oh! I shot myself in the foot" situations. I don't have a lot of experience with TFS in large teams, but I'm facing this issue.
During a transition to new equipment, a developer forgot to check in some code. Work proceeded on the new laptop for several weeks before anyone noticed that the previous work was not checked in. Multiple check-ins have occurred since.
I have recovered the files from the old laptop, and have them on my current laptop. What is the best way to merge in these changes? Do I create a branch, merge in these changes, and then rejoin the branch?
Is there a "cookbook" out there that details what should happen when faced with various situations?
We are using TFS 2010.
Thanks in advance...
Creating a branch here is probably a little bit heavier-weight than what you need for this one-off situation. If it were me, I would do this:
Set up a workspace on your computer with the appropriate mappings.
Do a Get Specific Version to the version that the other computer was at. The best-case scenario is if the user never deleted their workspace on the server; then you can simply specify their workspace as the version and you'll get the files as they existed on the laptop. (You can specify this as Wworkspacename;ownername — see the sketch after this list.) If the user deleted their workspace, you can get based on the changeset number they were at, or based on the date they were working at.
Copy the recovered files on top of the new TFS workspace.
Run tfpt online from the Team Foundation Server Power Tools. This will examine the local filesystem against the server and determine what changes were made. You may wish to examine the options, notably the /diff flag (which performs MD5 comparisons on the files instead of simply examining the read-only bit), and the /deletes and /adds flags, which detect deleted and added files, respectively.
Do a Get Latest on your workspace, resolve any conflicts, and check in.
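A minimal command sketch of that sequence, assuming a hypothetical workspace named OldLaptop owned by jsmith, a local workspace folder of C:\Workspace, and the recovered files in C:\Recovered:

rem Get the sources at the version the old laptop was at (workspace version spec)
tf get C:\Workspace /version:WOldLaptop;jsmith /recursive
rem Copy the recovered files over the new workspace
xcopy C:\Recovered C:\Workspace /E /Y
rem Detect adds, edits, and deletes against the server
tfpt online C:\Workspace /recursive /adds /deletes /diff
rem Get latest, resolve any conflicts, and check in
tf get C:\Workspace /recursive
tf resolve
tf checkin /comment:"Recovered changes from old laptop" /recursive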
You can try the following sequence:
Make a merge branch of your code, based on the timestamp at which the restored laptop code last left the version control system.
Get your branched code to a location on disk.
Perform a check-out for edit of the entire workspace.
Copy the old restored code over the files in this workspace.
Perform a checkin of the local code into the branch.
Merge your latest code (main trunk) into the branch, merging changes and resolving conflicts.
If all build and tests out correctly on the merge branch, merge that branch back into the main.
That should do the job; a rough command sketch of the sequence follows below.
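This is only a sketch under assumptions: the paths $/Project/Main and $/Project/LaptopRecovery, the local folder, and the date are all placeholders.

rem Step 1: branch Main as of the timestamp the laptop code last synced
tf branch $/Project/Main $/Project/LaptopRecovery /version:D2013-01-15
tf checkin /comment:"Create recovery branch as of last laptop sync" /recursive
rem Steps 2-3: get the branch locally and check everything out for edit
tf get $/Project/LaptopRecovery /recursive
tf checkout $/Project/LaptopRecovery /recursive
rem Step 4: copy the restored laptop code over the local folder by hand
rem Step 5: check the restored code in to the branch
tf checkin /comment:"Check in recovered laptop code" /recursive
rem Steps 6-7: merge the latest Main into the branch, resolve conflicts, check in,
rem and once it builds and tests cleanly, merge the branch back into Main
tf merge $/Project/Main $/Project/LaptopRecovery /recursive
tf resolve
tf checkin /comment:"Merge latest Main into recovery branch" /recursive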
I'm a consultant whose client runs a TFS 2005 repository. I manage my own source code in TFS and deliver my releases to their TFS. My source code is around 20,000 files that I maintain.
My normal process:
Detach my solution from my TFS
Connect to their TFS
Checkout the entire project
Overwrite my project files with theirs
Check everything back in
Click the add button and add any new files that have been added
Check everything in
Open the solution file and bind it to TFS
Check everything back in
The main problem I'm seeing with this approach is that if I delete a file on my end, I don't have a way to reflect that change.
I'm also not interested in synchronization tools, because I don't want to sync every check-in, just the current state.
Is there a way I can do this better?
What about maintaining parallel .sln and .proj files with the different bindings? Do they change often?
I think you can maintain change history by using the TFPT ONLINE command from the Team Foundation Power Tools.
Open SLN_A
Make changes (VS auto-checks out against TFS_A)
Before checking in on TFS_A, run TFPT ONLINE against TFS_B. This should pick up adds, edits, and deletes (see the sketch below).
Checkin SLN_A on TFS_A.
Checkin SLN_B on TFS_B.
The only problem with this might be that the SLN_A check-in could mess up the SLN_B pending changes, because the files will be returned to read-only. I'm not sure.
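As a rough sketch under assumptions — two local folders, C:\Work\TFS_A mapped to your server and C:\Work\TFS_B mapped to theirs — the TFS_B side of the sequence could look like this:

rem Mirror the current state from the TFS_A workspace into the TFS_B workspace
rem (/MIR also mirrors deletions so tfpt online /deletes can pick them up; the
rem solution file is excluded so the TFS_B bindings are not overwritten)
robocopy C:\Work\TFS_A C:\Work\TFS_B /MIR /XF *.sln
rem Detect adds, edits, and deletes against TFS_B and pend them
cd /d C:\Work\TFS_B
tfpt online . /recursive /adds /deletes /diff
rem Check in the pending changes on TFS_B
tf checkin /comment:"Sync current state from TFS_A" /recursive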
Why do you need to maintain a parallel TFS? Seems like you ought to be working directly against their TFS, either on a branch, via the Proxy, or both.
Have you looked at TimelyMigration, a TFS to TFS migration tool? (No affiliation, and I've never had need to use it.)