I want to set up an incremental build in TFS, because we want to deploy only the modified files to the physical path, not the entire codebase.
We want the build to compile and deploy only the files that have changed since the previous deployment. This will reduce build and deployment time, and developers won't have to wait as long to see their changes deployed.
What you're describing is not an "incremental build". You are describing something much more complex than an incremental build.
What you're describing has never been an out-of-the-box option, and is in fact incredibly difficult to do properly, and ultimately would probably not impact things as much as you're hoping, anyway.
First of all, it's actually very difficult to determine a subset of files that have changed between deployments. And if you're building and deploying properly, then you're making a single build and deploying it along a pipeline of environments. This means that "what's different" at any given time is potentially different for every environment in your pipeline. Ex: DEV has version 5, QA has version 4, and PROD has version 3. So you have to start by assuming that you're going to use the oldest version. Build systems have no innate knowledge of "releases", so you'd have to build something into your build and release pipelines to track what source version constitutes the latest code in production.
Let's say you've solved that problem. You now have the ability to retrieve just the delta between what's deployed to production and the commit being built.
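For illustration, a rough sketch of how that could look with the TFVC command line (the $/MyApp/Main path, label name, and changeset number here are made-up placeholders): have the release process apply a label when a build reaches production, then list the changesets since that label.

tf label LastProdRelease $/MyApp/Main /version:C4213 /recursive
tf history $/MyApp/Main /version:LLastProdRelease~T /recursive /format:brief

Even then, this only tells you which source files changed; as explained below, it doesn't tell you which build outputs actually differ.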
If you're working with compiled code, then you still need all of the source code, because you're going to have to rebuild the whole thing. Every assembly is going to get regenerated, and different metadata at compilation time is going to mean those assemblies are different even if the code that constitutes those assemblies is the same. And since assemblies can reference other assemblies, you have no straightforward way of determining at build time which assemblies have actually changed and need to be deployed. So you pretty much have no choice but to deploy all compiled assets every time. Note that this still applies to TypeScript or anything else that goes through a compiler/transpiler process; you need all of the code available, and it has to go through the entire build process.
So at this point, you still have to build your entire application to get the deployable output. Build time hasn't gone down at all. You've managed to bring down just a subset of static content (i.e. HTML pages, images, etc) to be deployed, though. That may have sped your deployments up a bit!
However, if the thing that's making your build and deployment process slow is that you have a ton of non-code-related static content, then you've gone through a very long and convoluted process to arrive at a much simpler solution: Move static content to a CDN and get it out of source control, or have a separate process that manages static content so that it can be deployed independently of unrelated application code.
You haven't really provided any information that can be used to provide a recommendation on how to proceed, but hopefully this answer is helpful in understanding why what you want to do is not going to solve your problem, unless you are dealing entirely with static content or scripts that don't require building.
Our project group stored the project's binary files in an SVN repository for over a year. Eventually the repository grew out of control, and taking backups of it became impossible, since each binary that was checked in is around 20 MB.
Now we have switched to TFS. We are no longer responsible for backing the repository up; our IT team takes care of it, and we have more network and storage capacity for backups because of that, but we still need to decide what to do with the binaries. As far as I know, TFS stores deltas, but for binary files the deltas will be huge, and we might end up reaching our disk space quota one day. I would like to plan things properly from the start; I don't want to get caught in a bad situation when it's too late to fix the problem.
I would prefer not to keep builds in source control, but our project group insists on keeping a copy of every binary for reproducing the problems that we see in the production system. I can't get them to fetch the source code from TFS and build the binary themselves, because according to them it is not straightforward.
Does TFS offer a better build versioning method? If someone can share some insight I'd really be grateful.
As a general rule you should not be storing build output in TFS. Occasionally you may want to store binaries for common libraries used by many applications, but tools such as NuGet get around that.
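For example (a rough sketch with made-up package and server names), a common library can be packaged and pushed to an internal NuGet feed instead of being checked in:

nuget pack MyCompany.Common.csproj -Properties Configuration=Release
nuget push MyCompany.Common.1.0.0.nupkg -Source http://nugetserver/nuget -ApiKey <your key>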
Build output has a few phases in its life, and each phase should be stored in a separate place, e.g.:
Build output: When code is built (by TFS / Jenkins / Hudson etc.) the output is stored in a drop location. This storage should be considered volatile as you'll be producing a lot of builds, many of which will be discarded.
Builds that have been passed to testers: These are builds that have passed some very basic QA, e.g. it compiles, static code analysis tools are happy, unit tests pass. Once a build has been deemed good enough to be given to test, it should be moved from the drop location to another area. This could be a network share (non-production, as the build can be reproduced). There may be a number of builds that get promoted during the lifetime of a project, and you will want to keep track of which versions the testers are using in each environment.
Builds that have passed test and are in production: Your test team deem the build to be of a high enough quality to ship. As part of your go live process, you should take the build that has been signed off by test and store it in a 3rd location. In ITIL speak this is a Definitive Media Library. This can be a simple file share, but it should be considered to be "production" and have the same backup and resilience criteria as any other production system.
The DML is the place where you store the binaries that are in production (and associated configuration items such as install instructions, symbol files etc.) The tool producing the build should also have labelled the source in TFS so that you can work out what code was used to produce the binary. Your branching strategy will also help with being able to connect the binary to the code.
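As a rough sketch of that hand-off (the paths, version numbers and label name below are placeholders): copy the signed-off drop to the DML share and make sure the corresponding source is labelled.

robocopy "\\build\drops\MyApp_1.2.3" "\\dml\Releases\MyApp\1.2.3" /E /COPY:DAT /R:2 /W:5
tf label MyApp-1.2.3-RTM $/MyApp/Releases/1.2 /version:C4213 /recursive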
It's also a good idea to have a "live-like" environment; this should be separate from your regular dev and test environments. As the name suggests, it contains only the code that has been released to production. This enables you to quickly reproduce bugs found in production.
Two methods that may help you:
Use Team Foundation Build System. One of the advantages is that you can set up retention periods for finished builds. For example, you can order TFS to store the 10 latest successful builds, and the two latest failed ones. You can also tell TFS to store certain builds (e.g. "production builds"/final releases) indefinitely. These binaries folders can of course also be backed up externally, if needed.
Use a different collection for your binaries, with another (less frequent) backup schedule. TFS needs to backup whole collections, but by separating data that doesn't change as frequently as the source you can lower the backup cost. This of course depends on the frequency you are required to have the binaries backed up.
You might want to look into creating build definitions in TFS to give your project group an easy 'one button' push to grab the source code from a particular branch and then build it and drop it to a location. That way they get to have their binaries, and you don't have to source control them.
If you are using a branching strategy where you create Release or RTM branches when you push something to production, then you can point your build definitions at those branches and they can manually trigger them from the TFS portal or from within Visual Studio.
Some time ago I asked a question about how to build an application with dependencies on a build server, and I got quite satisfying answers. Today I am facing a different case. For a project I have to use non-redistributable dependencies (the RDL object model for SSRS). That means that, out of the box, these assemblies are not meant to be deployed for development purposes. But somehow, I need to...
My first guess was to publish them in the GAC. Fine, it worked and the build server was able to compile the project smoothly. But then I realised that it broke some applications like the Report Server and the Report Builder (probably it would also break BIDS). So publishing in the GAC is definitely not a decent solution.
My second guess was to check the assemblies into source control. Well, it could work if I had only 2 assemblies totalling about 1 MB. But here it is 23 assemblies and 29 MB I would have to check in, so it is not really suitable either.
I don't know much about MSBuild targets; maybe they could be a solution, but I really have no idea how to use them. I have been scratching my head hard, and now I have to choose between breaking my builds or breaking my services!
As some people stated in the comments, we finally decided to source control the assemblies.
But as we work in an environment where we often travel, are not always in the office, and sometimes have to work remotely over an unreliable Internet connection, we decided to set strict conditions on whether we source control the assemblies or deploy them to the build server and development machines.
Assemblies will be source controlled if all of these criteria are met:
Assemblies/Framework is not deployable/redistributable
Assemblies/Framework deployment may interfere with local machine services stability
Total amount of deployed assemblies on the project collection does not exceed 100MB
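If the criteria are met and the assemblies do go into source control, the initial check-in is a couple of commands (the folder layout here is just an example):

tf add Lib\SSRS\*.dll /recursive
tf checkin Lib\SSRS /recursive /comment:"Add non-redistributable SSRS RDL object model assemblies"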
You could try using a different repository just for these assemblies, and do a checkout/update during the build job.
Also, if you want to keep it in the main repo as well, you could use svn:externals (http://svnbook.red-bean.com/en/1.0/ch07s03.html) to automatically update the DLLs when you update your working copy.
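For example, a rough sketch of the svn:externals approach (the repository URL and folder names are hypothetical):

svn propset svn:externals "libs/ssrs https://svn.example.com/repos/vendor/trunk/ssrs" .
svn commit -m "Pull SSRS assemblies in via svn:externals"
svn update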
I've used TFS for about 18 months now and I'm really not excited about it. It seems like the worst of the current versions of SCMs on the market.
I think this thread will help people decide if TFS is for them vs. other source control systems. While TFS does a lot more than that, I think that source control is so critical to software development that any system (or combination thereof) that you pick needs to consider source control first.
What are the good things about TFS vs. other source controls -- what does it do well that no one else does?
What are the things that TFS is bad at that everyone else seems to do just fine?
Pros
Fundamentally it's a sound system. Robust and reliable.
Integrated with work items, reporting, etc.
The power tools are really good.
[edit] It is improving, and has taken good jumps forwards with 2010, 2012, 2013
TFS is highly accessible for custom tools. There's a rich API that makes it so easy to write dashboards and other tools to get at the data in TFS. And as all the data is stored in SQL, you can browse it and query it directly if need be. I've worked with many different SCMs over the years and have never found one that is so open and accessible - everything (user stories, tasks, bugs, issues, test plans, iterations, source code control & branches, builds, unit testing, continuous integration) is just there at your fingertips. This is an awesome feature of TFS. A lot of the UI failings of TFS have been addressed in a few afternoons writing tools and a dashboard for my team to use. And let's face it, if you write your own, it does exactly what you need.
Cons
There is one area where the robustness fails miserably: If you apply several changes to a file (add, rename, edit) in "one go" it gets horribly confused. If you don't check in these actions separately, both TFS2005 and TFS2008 crash when you go to merge those changes across branches. In 2010 onwards it no longer crashes, but it often doesn't correctly check in the changes, so you have to go in and clean up a mess of missing and incorrectly named files.
There is no standalone source control browser. It's integrated into VS, which is really annoying when you want to just work on source control items without needing to run up another copy of VS. Of course, you can give your artist a Team Explorer, but let's ask ourselves if an artist who only ever wants to view the files, check out, check in, and GLV really needs a fully blown complicated VSTS instance running to achieve it? In addition, the integration is so poor that you can't realistically use TFS from the Solution explorer (it simply lies about what you have checked out, and is so unreliable when you apply actions from that window that you soon learn to open the source control window and work in there, which defeats the point of it being integrated in the first place) [edit: The file explorer extension is excellent - close to a standalone browser - and is simple and easy to use. The main drawback of it is lack of proper integration with file commands - to rename or delete files you must remember to use the TFS submenu, or you will rename/delete locally and this screws up source control completely as TFS knows nothing of the changes you have made. This unfortunately means that only 'advanced' TFS users can be trusted to use it. So, essentially, it's still a case of "no stand alone browser" for most users]
The user interface sucks (but is improving, at least on the web-access side). Sure, it works, but there is so much that could be done to make it efficient, pleasant, and more foolproof to use. e.g. [prior to 2012] When you click "check in" it ticks all remaining un-checked-in items so that if you accidentally click Check in again in future, it checks in a load of stuff you didn't want to. And after this, it would be so easy to supply an "undo last checkin" option to quickly roll it back - but there isn't one. [Edit: The UI is improved, but these specific problems are still present in VS2010, although it does now have a check-in confirmation dialog that reduces the risk of accidental checkins][edit: in 2012 it's much better, but they've gone mad and rolled all the separate TFS dialogs into a single window, which was a serious step backwards. The pending changes window doesn't work nearly as well as in 2010 - it is harder to find things, it takes more clicks to achieve the same things, and if you check in a file from anywhere all the currently 'included' files get chucked into 'excluded' so if you have several things on the go they all get mixed together]
Workspaces. In most cases, every team member has to have essentially the same workspace mapping, slaved off a local root folder. We need 7 mappings defined, which takes about 5 minutes to set up. There is no way to push the workspace definition from the server. There is no [edit]easy[/edit] way to duplicate a workspace so you can use an existing one (or another users one) as a starting point. No, you have to manually re-enter all the bindings over and over and over and over. If you change your active workspace in the source control explorer, it doesn't get synced to your pending changes window, so you spend 15 minutes wondering why the file you merged from your other branch just isn't listed. [edit: This is getting better with 2010/2012, as you can see workspaces on other PCs and copy and paste them more easily, but it's still a pretty clumsy UI]
It has changesets, but you can't bundle items into separate changesets in your pending checkins list as you can in Perforce, you can only associate them with a changeset by actually checking them in. You can really only work on one changeset at a time, or you have to separate the files out manually in your pending list as you go to check in. [still very poor in 2012]
The merge tools are terrible. As in: they simply don't work, and unnecessarily introduce bugs into your code if you rely on the automatic merge. These tools are just as bad as they were when I first used SourceSafe in 1994. So the first thing you have to do after buying a very costly VSTS licence is replace the merge tools with something that actually works. And that means that every time you get a merge conflict, you must select each file. Choose to resolve the conflict and ok. Choose to use your 3rd party merge tool and ok. Then merge. Then save. Then choose to accept your merged changes. (You should be able to choose "automatic merge" and have it simply use the third party merge tool that actually works without hitting you with a barrage of pointless and annoying dialogs that always default to the wrong option) [Edit: In VS2010 the merge tools are still awful. But the front-end UI is much improved (merging a conflict now takes a single click rather than 4 or 5 clicks - a massive improvement when you have to merge many files)][In 2012 there have been further improvements, but they are still 'ok' rather than good]
It doesn't sync between running instances of VS. So if you check in a file in one VS, another one will still list that file in your pending checkins. (it's clearly easy to sync it because any changes made by the power tools windows-explorer extension are reflected in VS instantly). [Edit: In 2012 they have fixed this problem. Now every time you switch to the pending changes view it spends 15 seconds refreshing (in 2010 it cached it and showed it instantly but it was occasionally out of date)]
Branching is the standard way of working these days. So you'd expect the branch/merge tools to make this quick and easy. But no. [edit: Big improvements were made in 2010 and 2012, but merging is still poorly supported - it is really labour intensive. Just little things like only being able to merge a contiguous set of changes, so if you want to merge 5 changes that are not contiguous you have to do them one by one, but each time you open the dialog it starts from scratch instead of remembering where you were, what you last merged, the list of available changesets, etc. You should be able to select any changesets you want and it should automate the rest]
If you GLV (get latest version of) a solution, and some of the projects in it have been changed, VS repeatedly asks if you wish to reload each changed project. It is about 10x faster to close your solution, then GLV, then open the solution again than to GLV with it open. If I'm GLV'ing then of course I want to reload the projects! When I buy my food at the supermarket they don't ask me for every item "do you wish to take this item home with you?". [Edit: Still broken in VS2010][Fixed in 2012. Hurrah!]
[edit] If two team members add a new project to a solution, then when the second person goes to check in, they must (obviously) resolve a merge conflict. However, TFS treats the .sln as a text file, and corrupts it (it adds the two project entries but the project count is effectively only incremented once). It would be so easy to fix the sln format to make the files mergeable.
[edit] I don't do any source control operations from within the Solution Explorer window, as it has been rather unreliable ever since "integration" first came along. Even in 2008 it usually has random "checked out" icons on files that are not checked out, and recursive operations sometimes do weird things. Almost every source control 'glitch' we have is a result of someone starting an operation from the Solution Explorer. Luckily, I prefer to work in a Source Control window anyway.[2012: Sorry, can't tell you if this is fixed, as I haven't used this feature since 2008]
[edit] Where to start with the Source Control Bindings window? VS could say "Your Source Control settings have been corrupted again for no obvious reason. I never could get the hang of Thursdays. Shall I fix this for you? [YES]", but instead, it shows a complicated, confusing dialog full of information that makes no sense to anybody, resulting in a UI so scary that it makes junior programmers soil themselves. The trick is to ignore the whole window, hide behind your desk and click the "fix it" button, and it fixes it.
[edit - added 12/2010] When you Get source code, especially when resolving merge conflicts, other windows are often brought to the front (either the Solution Explorer jumps in front of my Pending Changes view, which I have docked in the same tabbed area, or the Source Control window vanishes behind another document window). This is really annoying when you have another file to merge or another folder to Get, as you have to keep "finding" the Source Control/Pending Changes windows. Getting code should not constantly reorder my document/tool windows. [2012: Still broken]
[edit - added 1/2014] With TFS 2012/2013, there is a choice of Server or Local workspaces. Server is the name for the old system where you must be online with the server to check files out. Local is the new default and makes a copy of the entire source repository on your computer, allowing you to make edits to any files without needing to check them out first. TFS then diffs your files against its local copy to work out what you changed. This sounds good, and for many people it probably is good, but it has some serious drawbacks that you should be aware of:
As you no longer check out files, they do not get locked when you edit them, and thus several people can edit any given file simultaneously, requiring a merge operation when they check in. This is fine for text-based source code files, but results in difficult situations or lost work when the files are unmergeable. Unmergeable or non-automatically mergeable files include Solution, Project, Resource (resx), XAML and any other XML files - so this causes a lot of problems in a development environment. If (like us) you also want to store Word and Excel documents and binary files under source control, local workspaces are positively dangerous. We have lost several days of work because someone unwittingly used a local workspace and then it was not practicable to merge their changes. You can reconfigure the TFS server to make Server workspaces the default to defend against this.
With Local workspaces you have to keep two copies of everything on your computer. When we upgraded TFS we suddenly found everyone lost 25GB of disk space, and it took several weeks to work out where the disk space had gone! This was a major problem for us because we all use SSDs and it is only now (2014) that SSDs are getting large/cheap enough that we can afford to be so inefficient with our disk space.
In the few weeks that we used local workspaces we had several incidents where TFS corrupted files or lost changes, presumably due to bugs in the implementation. Quite simply, we cannot accept anything less than 100% reliability for our source control system.
TFS is getting much easier to manage; these days if you don't want to customise anything too much you can set up a server in a very short time (hours), and setting up continuous integration builds and backups etc. is extremely easy. On the flip side, while I found it very easy to set up backups of a TFS database, restoring that database and getting up and running after our server bricked itself was another matter - it took 4 days to work through all the unnecessary blocking problems (e.g. you have to restore the backup from a network drive, the data can't be local; when I tried to restore the image to the rebuilt server, TFS kept telling me there were no databases that could be restored; when I got past that, TFS wouldn't use the databases because they didn't match the host server, because that server was gone and the OS had been reinstalled). It took a lot of searching and fettling to get the backup to restore. Restoring should "just work"!
As you can see, most of the above are just trivial UI gripes. There is such a lot that could be improved about the UI. But the actual underlying product is good. I prefer TFS to pretty much every other SCM I've used over the last 28 years.
I wouldn't even mind the poor UI so much, except that it is one of the core UIs developers have to use on an hour-by hour basis, and they have to pay such a lot to get it. If the subscription money from a single developer was invested on improving the UI it would make a massive difference to the usability of TFS! It's painful to think that TFS is merely good or ok when it could so easily be excellent with a bit of nice UI.
Hates
Doesn't track changes to files unless you've checked them out, so if you edit a file in Notepad++ TFS is unaware that anything changed.
It's very easy for someone to check out a file and lock it so that nobody else can make changes. TFS shouldn't drop this ability, but it certainly should make it much harder to do than it currently is.
The methods for undoing a commit or two are very unclear, so much so that I'm never quite sure whether it worked or not.
The way that TFS makes files read only unless you check them out is obnoxious, though it does help me remember to check files out before I save the edits I've made.
Loves
I suppose built-in integration with Visual Studio is nice, if you like that kind of thing (I don't).
I am a member of the Team Foundation Server team at Microsoft. There are a lot of very valid issues raised here. Some of them are addressed in the 2010 release. Others remain as issues, but we do recognize them and are working to improve the developer experience with the next release. Discussions like this are great for helping us make sure we're solving the right problems.
Here is some info on issues that are at least partially addressed today in the 2010 version:
Stand alone client
Non-developer customers that want to use the product outside of VS can use the Windows shell extension power tool.
If you have users (developers or not) that need to access TFS from non-Windows machines, they can use Team Explorer Everywhere. This is supported on platforms including Mac & Linux.
Copy workspace
There are two ways to copy a workspace today. The first is by using the workspace command's /template option at the command line. For example:
tf workspace /new /template:<workspace name to copy from>[;<owner>]
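So, with placeholder names, copying jsmith's BuildAgentWS workspace into a new workspace of your own might look like:

tf workspace /new MyNewWorkspace /template:BuildAgentWS;jsmith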
Alternatively, you can open a workspace in the UI, select all of the mappings, copy them, & then paste them into a file/email. Someone else can then paste those same mappings into their workspace.
It would definitely be great if you could simply specify a default workspace that clients automatically pick up, but we don't have this today.
Merging robustness
The scenario described, where you do an add, rename, and edit and then have problems when you merge, has been addressed in TFS 2010.
Branch/Merge as a 1st class experience
In TFS 2010, branches are now 1st class objects in TFS. You can visualize your branches & even track changes as they move through the branch. Branching is also now a fast server based operation.
Get Latest Version of multiple projects
You can do this today by choosing the TFS instance node in source control explorer & then selecting get latest. This is the equivalent of the root folder ($).
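From the command line, the rough equivalent (assuming your workspace maps the root) is:

tf get $/ /recursive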
File locking
By default TFS never locks files when users check them out. This is the way we use TFS at Microsoft & how we see the majority of our customers using TFS. It is possible to enable users to explicitly lock files. Some customers find this desirable, but it is not the default experience.
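When an explicit lock really is wanted, it can be taken from the command line; for example (the server path here is just a placeholder):

tf lock $/MyProject/Main/web.config /lock:checkout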
Con: Checkout model. Many applications do not deal well with files that are marked as read-only and then change to writable (Word 2007, Notepad). So you open a file, edit the file, try to save, and then you're told that you can't save because it's read-only. Great, now you have to Save As..., delete the original and rename the new one to the old name. If there's an upside to having local files be read-only I don't see it. I really prefer Subversion's approach to this.
The one upside to making files read-only is that it reminds you to check them out. However that's really just a symptom of the check-out model.
I think that TFS is the single best ALM product on the market today. Looking at it from only a source control platform is slanted. I have used many products in my career to date: VSS, SVN, Git, StarTeam, CC/Harvest, and ClearCase - apart from TFS. Personally, I cringe at the thought of going back to anything other than TFS.
TFS is an extremely powerful platform. My biggest problem with it is often related to people not knowing how to use it or using it incorrectly. It is not meant to be an application that "just works". Sure, you can use it for basic source control without learning much about it - but if that is all you use it for, then you really are better off using one of the less robust tools out there. In reality, what TFS does not give you is the way to interpret features how you want to. It is specifically built from the ground up to support process and not just be a repository.
Con: Timestamps. There's no way to set TFS to use the remote last-modified timestamp as the local last-modified timestamp. The local file's timestamp only tells me when I got the file. If I get a file that's 2 years old, there's no way to know that based on the local timestamp.
Other source controls that I have used have this ability.
Cons:
workspace version: You can't identify the version of a workspace without doing a recursive search.
Terrible offline experience: attrib -r + tfpt online shouldn't be the way to work offline (a rough sketch of that workflow follows this list). Give me something like git that allows me to track status, undo and make changes. I'm even fine if it only stores the difference between the workspace version and the current one.
Merging robustness: A changed file on the server plus a local edit on different lines is not a conflict. A writable file should not be an automatic conflict. The automerge button should NOT exist, because it should never be a scenario.
Workspaces: The idea of being able to rearrange the source structure is just odd, and causes issues. The requirement of having both branches mapped in order to merge is odd. The requirement of having to do an operation multiple times, because my workspace mapping doesn't have a true root folder, is wrong.
Full reliance on remote server: There are some nice things about having all these things stored on the server, but really, you could store information locally and then upload it when needed. Keep pending changes, workspace mappings, basic undo history locally, etc.
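For reference, the offline workflow complained about above goes roughly like this (tfpt comes from the TFS Power Tools; the flags shown are the commonly used ones): make the files writable while disconnected, edit them, then reconcile with the server when you're back online.

attrib -r *.* /s
rem (edit files offline in whatever editor you like)
tfpt online /adds /deletes /diff /recursive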
Pros
Shelvesets: I love these, and wish support for them was brought to the local disk as well (think git stash)
Source control view in VS: It's pretty cool to be able to view the entire repository without downloading it. There are some usability issues, but the overall idea is cool.
Workspaces: yep, both places. While re-arranging a repo is odd, the ability to only download what you need is pretty awesome. I often wish I could choose a root folder and then check box the paths I need, but oh well.
Dislikes:
Using the history to figure out what has been done is cumbersome to say the least. You have to click on every single history entry to see what files were changed, and then you need to go through a context menu to get a diff.
Working while disconnected from the network is a big no-no. Ever heard of working on an airplane?
No Windows Explorer integration for when you work with files outside of VS (think TortoiseSVN).
Process methodologists (configuration managers) love to not allow shared check-outs. This is absolutely horrible for example for config files that you need to modify for testing.
SC gets confused with complex move/delete operations.
SC does not recognize when a checked out file has not changed. For example, service reference updates check out all related files and often regenerate the exact same content. These files should implicitly be removed from check-ins because they just add noise when you look at your changeset later.
Likes:
Shelving.
Anybody guessed which is my favorite SCM system? SVN + TortoiseSVN + VisualSVN :-)
Is search functionality really not implemented in TFS 2010?
In VSS we had search within files; in TFS 2008 we at least had search for files...
Con: If you want to move multiple files to a subfolder of the existing location, you have to do that one at a time. Wow, that's horrible.
The lack of rollback has been my biggest pain point.
The lack of true rollback support and the inability to rename a TFS Project are my two main pet peeves with TFS. Other than that, I've been very happy with it for 2-3 years.
The fact that certain applications do not support in-edit changes from read-only to writable (forcing you to reopen the file in question) is annoying but is really a problem with those specific applications. The fact that a file is read-only while not checked out has certain uses, one of which being that it reminds you to check out the file. It does occasionally, however, lead to confusion when trying to get specific revisions of files. Writable files are not re-downloaded unless you enable a flag, because they're considered local edits.