My company is considering upgrading our on-prem TFS 2017 Update 3 to the latest Azure DevOps Server (notably, the on-prem variety).
During discussions about that possibility, one key stakeholder claimed that if you upgrade, all of your build and release pipelines would have to be rebuilt from scratch. We have a healthy number of build and release definitions in TFS 2017.
I have looked through the Microsoft documentation for what exactly gets upgraded, but unfortunately I can't find the level of granularity that would prove or disprove the above claim. On the surface it would seem like a horrible upgrade story if it were true. But I also understand that designs and architectures change and upgrades aren't always possible.
Could somebody let me know whether the build and release pipelines can survive the upgrade more or less unscathed? Knowing this would be a valuable data point as we work toward a decision.
Thanks in advance!
I would expect the vNext build definitions and the release pipelines to be pretty much lift and shift. Depending on the tasks you have defined, some might no longer be supported or might have newer versions; the UI will let you know when new versions are available.
A lot of the new focus is on building out features for YAML build definitions. If you want to leverage those, you'd have to do more rework converting those vNext definitions into YAML, but converting is not a hard requirement.
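To give a sense of what that conversion involves, here is a minimal sketch of what a typical vNext .NET build might look like once expressed as YAML. The solution path, agent pool name, and trigger branch below are placeholders, not anything taken from your environment:

```yaml
# azure-pipelines.yml - rough YAML equivalent of a classic vNext .NET build
# (solution path, pool name, and trigger branch are placeholders)
trigger:
  - develop

pool:
  name: Default   # your on-prem agent pool

steps:
  - task: NuGetCommand@2
    inputs:
      command: restore
      restoreSolution: 'MySolution.sln'

  - task: VSBuild@1
    inputs:
      solution: 'MySolution.sln'
      configuration: 'Release'
      platform: 'Any CPU'

  - task: VSTest@2
    inputs:
      testAssemblyVer2: '**\*Tests*.dll'

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'drop'
```

The classic definitions keep working after the upgrade either way; the conversion is only worth doing if you want the pipeline definition versioned alongside the code.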
You mentioned that you aren't using XAML build definitions, but if you happened to be using them, I would imagine that is where a lot of the rework would come in. Having done that in the past, I can say it is a pain if you have to do it.
all of your build and release pipelines would have to be rebuilt from scratch.
I've tested it, and you won't lose any data after upgrading. You should still use scheduled backups to ensure that you always have backups in place in case something goes wrong.
We can use that new hardware to do a dry run first, and then wipe everything clean and use it again for the production upgrade.
For our dry run, the steps for our upgrade will be:
Copy recent database backups to our new SQL instance.
Install the new version of Azure DevOps Server on our new application tier.
Use the scheduled backups wizard to restore the database backups.
Run through the upgrade wizard, being sure to use a service account that does not have any permissions in our production environment. See "Protecting production" in the dry-run-in-pre-production documentation for more information.
Optionally configure new features which require changes to our existing projects.
The production upgrade steps will be quite similar. There the steps will be:
Take the production server offline using TFSServiceControl's quiesce command. The goal here is to ensure that the backups we use to move to our new hardware are complete and we don't lose any user data.
Take new backups of each database.
Copy the backups to our new SQL instance.
Install the new version of Azure DevOps Server on our new application tier.
Use the scheduled backups wizard to restore the database backups.
Run through the upgrade wizard, using our desired production service account.
Optionally configure new features which require changes to our existing projects.
You can refer to this doc for more details.
Environment:
TFS 2018 with source code in TFS Git
developers are using gitflow-like workflow (main, develop and short-lived feature branches)
there is a build definition used for CI (off of develop branch)
... and another one for releases (off of main branch)
as the project evolves, build definitions get updated (new steps, etc.)
What is the best approach that allows reproducing previous builds (or, at minimum, release builds), in case a previously made build is lost in a boating accident?
Ideally I need to be able to plug in a version (e.g. 8.5.12345.1) somewhere, press OK, and eventually receive output identical to what the corresponding build produced in the past.
Your best approach is to switch to YAML builds and releases. That way your pipeline is versioned together with the code.
If you don't do that, you may need to clone your builds and releases every time you make a breaking change.
Alternatively, use the version diff view in your pipeline to go back to an older version, or use the JSON to create a new definition through the API.
Upgrading to Azure DevOps Server 2020 will give you more advanced YAML features not yet available in Team Foundation Server 2018.
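As a rough illustration (the branch names and the single build task below are placeholders, assuming something like the gitflow setup you describe), a checked-in azure-pipelines.yml versioned with the code might look like this:

```yaml
# azure-pipelines.yml at the root of the repository.
# Because this file lives in the repo, checking out the commit (or tag)
# that produced version 8.5.12345.1 also gives you the pipeline definition
# exactly as it was for that build.
trigger:
  branches:
    include:
      - develop
      - main

pool:
  name: Default   # placeholder on-prem agent pool

steps:
  - task: VSBuild@1
    inputs:
      solution: '**/*.sln'
      configuration: 'Release'
      platform: 'Any CPU'
```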
Note: for truly reproducible builds, you'll also need to find a way to lock the build tasks themselves; TFS and Azure DevOps automatically roll forward to the latest minor version of a given build task. While task authors should try to prevent breaking changes in those minor upgrades, there are no guarantees. You can also never rely on any tool installer that uses a v2.x notation or on a task that relies on "latest". Azure DevOps isn't ideally suited for fully reproducible builds.
You can pin task versions in YAML now; if I remember correctly, this was added in Azure DevOps Server 2020.
You can set which minor version gets used by specifying the full version number of a task after the @ sign (example: GoTool@0.3.1). You can only use task versions that exist for your organization.
See: https://learn.microsoft.com/en-us/azure/devops/pipelines/process/tasks?view=azure-devops&tabs=yaml#task-versions
The Tasks docs offer special scripts to pin the versions of out-of-the-box tasks as well.
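For example, in YAML the difference looks like this (GoTool and the version numbers are just the example from the docs; the pinned version must actually exist on your server):

```yaml
steps:
  # Floats to the newest minor/patch release of major version 0
  - task: GoTool@0
    inputs:
      version: '1.10'

  # Pinned to an exact task version
  - task: GoTool@0.3.1
    inputs:
      version: '1.10'
```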
We have a TFS server that needs to be moved to new hardware. Please suggest the steps involved and the best approach.
The DB instance will also change. Will simply backing up and restoring the collection DBs, and then attaching these collections to the new TFS setup, be fine?
Thanks & Regards
Just as DaveShaw suggested in the comment, there are official tutorials on MSDN; you could simply follow the guidance there. The steps are very clear and detailed:
Restore data to a different server than the current one for TFS
If you also want to do a version upgrade at the same time, there are a few things to pay attention to:
A pre-production upgrade is just a dry run of your production upgrade, performed in a separate environment.
Usually you use this to test your upgrade. The process test-upgrades the databases, so you can try out TFS on the new hardware while continuing to use your existing, older TFS deployment.
Once you are ready to upgrade for real, restore the databases again and use the Production Upgrade scenario in the server configuration wizard.
Besides, if you plan to apply the change directly to your production environment later, there is a tutorial for an in-place upgrade for your reference.
I assumed this would be easy, but I'm not finding anything on it...
I have a project in TFS 2010, which needs to be moved to a new TFS 2015 server. Apparently the project cannot simply be moved normally because it's using a different project template which is not compatible and causes errors when trying to migrate (so I'm told - I don't have any more details on this).
I'm looking for a way to bring over the changesets, keeping history, to the new server. I assumed there was some kind of "dump" where you could export the TFS changesets, then import them into the new server into an empty project - but I'm not finding that option.
The TFS Integration Platform is deprecated and apparently doesn't work with TFS 2015, with no alternative listed.
I'm open to other creative options like temporarily exporting to a different version control system - for example, I've looked at SVNBridge, but I can't even get that working, let alone figure out if it would help here.
Is there a way to migrate all changesets for a given project and keep history, without migrating the entire project?
There is no built-in way to migrate changesets in TFS; you would need a 3rd-party tool, like OpsHub (some features are not free), to migrate the most commonly requested data. Check: http://www.opshub.com/products/opshub-visual-studio-migration-utility/
Or you may consider doing an upgrade from TFS 2010 to TFS 2015, which is a full data transfer. To understand the factors that affect your upgrade's complexity, check the requirements and review the upgrade process.
Learn whether a dry run makes sense for you, and weigh the benefits and costs of performing a pre-production upgrade.
When you're ready to upgrade, minimize downtime with the TfsPreUpgrade tool - especially for very large TFS collection databases (> 1 TB). Follow these steps for how to upgrade TFS.
I have backups of the TFS databases and I want to get my code files out of them. Is that possible? If so, what exactly do I need to do? TFS version: 11.0.61030.0 (TFS 2012 Update 4)
From the investigation I have done so far, it seems that the only way to restore the files is to install TFS 2012 on another machine and restore the database backups there. Hopefully, afterwards I should be able to download the files from this new TFS. I wanted to verify my procedure because I need to know whether there is something missing in my understanding before I start the task.
Yes, a restore is the way to go, but you must be careful about some important details. I'm writing these from memory:
Use the same version of TFS for the new environment.
Keep the new environment in the same Active Directory domain. If you are in a workgroup, you must take additional steps to make at least some of the accounts match.
Restore from a marked transaction (this is done by the built-in backup/restore tool)
You will have two live systems with the same identifier, which may confuse clients. To avoid this, run the TfsConfig ChangeServerID command.
If you restore the configuration DB, you must run TfsConfig RegisterDB.
For getting the code out this is enough, but keep in mind that the new environment still points to existing resources: build servers, Lab Management, and so on.
If the TFS instance was already in use, more steps are necessary, like cleaning the cache on the application tier.
I do not remember complete guidance; there are many variations on this topic. Make sure to study the content of "Restore a deployment to new hardware".
I'm starting to dive into TFS 2012 and I have a basic understanding of the tiers and how build servers, controllers and agents work and how different build scripts can have different configurations and projects.
However, one of the things I'm struggling with is a requirement for our source control solution that says that I need to be able to prove a particular changeset or shelfset produced a particular build. That is, given a particular binary, I can point to a release changeset that generated that binary. I should also be able to point to the test changeset that was merged into the release branch. The idea here is not just a separation of duty, but validating that because the release and test changesets are identical, no code was injected into a project by a code reviewer.
I've read one blog post that talks about "Binary promotions" -- would that concept be useful in my situation? I'm having a hard time finding how this binary promotion is set up in TFS.
Deployment
Out of the box, TFS doesn't really support deployments; it can deploy to one location on build, which is often a test server (think Lab Management). TFS 2012 has built-in support for Azure deployments, but those still happen at the end of a build, and the build artifacts cannot be automatically deployed to a new location afterwards.
You could modify the build template to allow releasing to different locations, but that would still mean a fresh build for every environment, not true binary promotion.
TFS does, however, have a concept of build quality and actually fires off events when this quality is changed. TFS Deployer is a 3rd-party tool that hooks into the quality-change event and can execute PowerShell scripts. This means that with a simple change of a dropdown value you can automatically kick off a script that releases to any environment you want. You can customize the build quality list (per team project collection) to be a list of environments (dev, UAT, staging, production, etc.), and the script then figures out where to release the specific build to.
VS2012 also has some nice improvements to Web Deploy, which means deployment configurations are stored in source control with the project; in theory they'll be available in the drop folder for TFS Deployer to make use of.
I don't believe TFS keeps a history of build qualities, which means you can't really use the build quality history to maintain a list of what is deployed to which environment. You could fairly easily record this information as part of the deployment script though. Or at the very least add a custom summary node to the build with information about the release.
TFS 2012 does have the ability to mark a build as deployed as part of the Azure deployment functionality, and you can mark TFS Deployer builds as deployed using a script, but it doesn't feel very useful.
Octopus Deploy is another project that's worth checking out, and could be used instead of TFS Deployer if your build template creates NuGet packages. It requires a bit more control over the production hardware as you need to install agents on each environment to handle releases, but it solves a lot of other issues with deployment.
Versioning
Once you have a nice, consistent way of releasing automatically that people don't bypass, you can look at enhancing the build template to inject the build version or changeset number as the assembly version for anything built as part of that automated build. There are a number of different ways to do it, and plenty of blog posts and tools to help you achieve it.
Alternatively you could just use automatic assembly versioning ([assembly: AssemblyVersion("1.0.*")]) to give you the date/time the build occurred, which ends up like 1.0.1234.123, where 1234 is the number of days since January 1st 2000 and 123 is the number of seconds since midnight divided by two.
If you're deploying websites, then I highly recommend injecting the current build version into the html somewhere. This way you can check what version a website is running without needing access to the bin directory. It can also be appended as a querystring to css/js file imports to ensure no browser caching occurs between versions.
Thoughts
Personally, I'm hoping Microsoft realise that the XAML build workflows are trying to do too much, and that they split the different concerns (build, test, deployment...) into separate scriptable parts. Of course, that would not happen until the next major release of TFS, which is years away. With Team Foundation Service they are trying to iterate a lot quicker, though, so they may actually extend the Azure deployment functionality into something more useful in the nearer future.