Migrating TFS 2012 server installation to a new server

We are moving our installation from a hosted server to a VM on our local network. We have a mix of local users and Domain users. I am concerned about the local users that were created on the existing server. What will happen to them in the new environment?
For example, Server1\JohnDoe will not exist on the new server. What is the best practice for this?

What you are effectively doing is a domain change from the perspective of TFS. You need to merge two procedures. First is the environment move:
http://msdn.microsoft.com/en-us/library/ms404883.aspx
And the second is the hardware move:
http://msdn.microsoft.com/en-us/library/ms404869.aspx
What you are doing is both at once. This can be done safely, but you need to be very careful with the accounts issue.
I have done this a whole bunch and documented it:
http://nakedalm.com/in-place-upgrade-of-tfs-2008-to-tfs-2010-with-move-to-new-domain/
That's for an older version, but the principles are the same.


About changing the location of TFS 2017 databases

I am attempting to back up and restore the TFS databases to another server.
Are there any risks?
And how can I configure TFS to use the new location of the TFS databases?
You could move Azure DevOps Server/TFS from one machine to another by restoring it to new hardware (called a restoration-based move).
When you move to a new server you do not lose any of your project history.
One risk:
In some situations you might want to change the domain of an Azure DevOps Server deployment as well as its hardware. Changing the domain is an environment-based move, and you should never combine the two move types. First complete the hardware move, and then change the environment.
Another thing to pay attention to: you must have a complete set of data backups for the SQL Server databases. If the data was encrypted, you must also have the encryption key and its password.
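If your deployment uses SQL Server Reporting Services, the reporting encryption key can be exported with the rskeymgmt tool that ships with SSRS. A minimal sketch, where the output path and password are placeholder values, not names from this thread:

```shell
rem Back up the SSRS encryption key (run on the report server machine).
rem The file path and password below are example values; keep the
rem password safe, as you will need it again on the new server.
rskeymgmt -e -f C:\Backups\rs_encryption_key.snk -p "ExamplePassw0rd!"
```

On the new server, the same tool can re-apply the key with the -a (apply) option and the same password.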
For more information, see Back up Azure DevOps Server
You must back up the TFS_Warehouse and TFS_Analysis databases if your deployment is configured to use SQL Server Reporting Services and you want to restore those databases to a different server. You cannot just rebuild the warehouse, as you can when you restore to the same server or instance.
Once the backup completes, verify that the backup is available on the storage device or network share, and that you can access this backup from the new hardware.
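As a sketch of that verification step (the server name and backup share below are placeholders, not names from this thread), you can ask SQL Server to check each backup file from the new machine with sqlcmd:

```shell
rem Run from the NEW hardware to confirm the backup is reachable and readable.
rem NEWSQLSERVER and \\backupserver\tfsbackups are example names.
rem RESTORE VERIFYONLY reads the backup without actually restoring it.
sqlcmd -S NEWSQLSERVER -E -Q "RESTORE VERIFYONLY FROM DISK = N'\\backupserver\tfsbackups\Tfs_DefaultCollection.bak'"
```

Repeat for each database backup (Tfs_Configuration, each collection database, and TFS_Warehouse/TFS_Analysis if you use reporting).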
There is a detailed, step-by-step official tutorial you can refer to and follow: Move or clone from one hardware to another for Azure DevOps on-premises.

How can I do an environment migration for TFS?

I am in a somewhat confusing situation. The diagrams below are supposed to illustrate the case:
I had two different TFS servers serving two different teams at my company. The users were local on those servers: no Active Directory, no central control. So we decided to merge these two servers into one TFS.
Now I have one big TFS server containing all the collections, without any problem at that level. But as you can see from the user names, the users were imported to this new server with their local computer names - A\user1, A\user2, B\user3, B\user4 - which violates my sense of harmony.
What I want to do is, install an Active Directory service on a new machine, and have all my users included in this new domain as below:
My ultimate goal is to use the same users, only changing their NetBIOS names, like C\user1, C\user2, C\user3, C\user4, in order to keep the old information in TFS valid.
Microsoft's documentation calls the first step I have achieved so far a "hardware migration" and the second step an "environment migration", and says not to do them at the same time. So I completed the hardware migration, and now I need to do the environment migration, but since I don't have deep knowledge of this domain business I am taking it slowly.
I haven't installed Active Directory yet, since I don't know what I need to change beforehand.
My questions are:
Is what I am dreaming of possible?
How can I achieve it?
An environment-based migration means changing the domain of the TFS deployment, whether it's a domain name change or a move from a workgroup to a domain. So what you want can be achieved.
You can use the TFSConfig Identities command to change the security identifiers (SIDs) of users and groups in your deployment of TFS. This command supports the following scenarios:
changing the domain of your deployment
changing from a workgroup to a domain (your scenario) or from a domain to a workgroup
migrating accounts across domains in Active Directory
Command:
TFSConfig Identities /change /fromdomain:DomainName1 /todomain:DomainName2 [/account:AccountName] [/toaccount:AccountName]
More information is described here: https://msdn.microsoft.com/en-us/library/ms404883(v=vs.120).aspx
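As a hedged sketch of how this might look for the scenario above, using the machine names A and B and the new domain name C from the question (substitute your own names, and take a full backup first):

```shell
rem List the identities TFS currently knows about, as a sanity check.
TFSConfig Identities

rem Remap the local accounts from machine A to the new domain C,
rem then repeat for machine B.
TFSConfig Identities /change /fromdomain:A /todomain:C
TFSConfig Identities /change /fromdomain:B /todomain:C

rem If a single user's name differs between the old machine and the
rem new domain, map that one account explicitly (example names):
TFSConfig Identities /change /fromdomain:A /todomain:C /account:user1 /toaccount:user1.new
```

Note that the matching domain accounts must already exist in Active Directory before you run the /change command, and the command is run on the application-tier server.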

AzureWorkerHost get the uri after startup for Neo4jClient

I am trying to create an ASP.NET project with Neo4jClient to be hosted on Azure, and I am unable to grasp how to do the following:
Get hold of the Neo4j REST endpoint address once the worker role has started. I think I am seeing a different address each time the emulator spins up an instance of the worker role. I believe I'll need this to create a client, somewhat like this:
neo4jClient = new GraphClient(new Uri("http://localhost:7474/db/data"));
So, any thoughts on how to get hold of the URI after Neo4j is deployed by AzureWorkerHost?
Also, how is the graph database persisted on the blob store? In the example it is always deploying a new instance of a pristine database from the zip and updating it, which is probably not correct. I am unable to understand where to configure this.
BTW, I am using Neo4j 2.0 M06, and when it runs in the emulator I get an endpoint somewhat like http://127.255.0.1:20000 in the emulator log, but I am unable to access it from my base machine.
Any clue what might be going on here?
Thanks,
Kiran
AzureWorkerHost was a proof of concept that hasn't been touched in a year.
The GitHub readme says:
Just past alpha. Some known deficiencies still. Not quite beta.
You likely don't want to use it.
The preferred way of hosting on Azure these days seems to be IaaS approach inside a VM. (There's a preconfigured one in VM Depot, but that's a little old now too.)
Or, you could use a hosted endpoint from somebody like GrapheneDB.
To answer your question generally though, Azure manages all the endpoints. The worker role says "hey, I need an endpoint to bind to!" and Azure works that out for it.
Then, you query this from the Web role by interrogating Microsoft.WindowsAzure.ServiceRuntime.RoleEnvironment.Roles.
You'll likely not want to use the AzureWorkerHost for a production scenario, as the instances in the deployed configuration will destroy your data when they are re-imaged.
Please review these slides that illustrate step-by-step deployment of a Windows Azure Virtual Machine image of Neo4j community edition.
http://de.slideshare.net/neo4j/neo4j-on-azure-step-by-step-22598695
A Neo4j 2.0 Community Virtual Machine image will be released with the official release build of Neo4j 2.0. If you plan to use more than 30GB of data storage, please be aware that the currently supported VM image in Windows Azure's image depot must be configured from console through remote SSH to Linux.
Continue with your development using http://localhost:7474/ and then setup the VM when you are ready for a staging or production build to be deployed.
Also, you can use Heroku's free Neo4j database deployment, but you must configure basic authentication for your GraphClient connection in Neo4jClient.

TFS and Forms Authentication

I don't know squat about TFS, other than as a user who has performed simple check in/outs.
I just installed it locally and would like to do joint development with a friend.
I was having trouble making my TFS web site on port 8080 visible (the whole scoop is here if you're interested), and I wonder if it could be related to the fact that TFS is probably using Windows Authentication to identify the user.
Can TFS be set up to use forms authentication?
We probably need to set up a VPN, though that's a learning curve too.
To use TFS, do our machines have to belong to a domain?
We're not admin types (though he is better at it than I am), so I would be interested in any feedback or advice on which path is likely to pan out best. I already got Axosoft OnTime working in this type of environment and it suits us well, but I am tempted by all the bells & whistles in TFS and the ability to tie tracked bug items to code changes.
As far as finding a good way to share code, do sites like SourceForge allow one to keep code secure among members only?
It does not need to be installed in a domain. I'm running TFS at home within a workgroup on a virtual machine.
Create a user on the machine that hosts TFS. Let's assume this machine is named TFS-MACHINE. Grant that user appropriate Team and Project rights.
When connecting to TFS from the remote machine, the user should be prompted for a user ID and password. They should use a User ID of TFS-MACHINE\username and the appropriate password.
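As a rough sketch of that connection from the command line (TFS-MACHINE and DefaultCollection are example names; tf.exe ships with Team Explorer/Visual Studio):

```shell
rem From the remote machine, supply the local account that exists on the
rem TFS host explicitly via /login, in MACHINE\username form.
tf workspaces /collection:http://TFS-MACHINE:8080/tfs/DefaultCollection /login:TFS-MACHINE\username,password
```

If the command lists your workspaces (or reports that none exist) rather than failing with an authorization error, the credentials are being accepted.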
Regarding external spots to host code. If you're looking for cheap/free, you can look at something like Unfuddle, which supports SVN and Git.
If you're looking for hosted TFS, the only place I've been able to find thus far is SaaS Made Easy, but they can start getting a bit expensive, depending on the number of users you have.
Keep in mind if you're going to host locally that you'll still need to do things like periodic backups, etc.

TFS remote users... SSL + Password or VPN?

I'm currently tasked with setting up a TFS server for a client. The TFS will mainly be accessed by local (on-site) users through the internal network... Easy!
But what about the few remote users we have? Should they connect via VPN or is it better to make the TFS server public and have the users connect over SSL and provide username and password to the TFS?
Do you have any suggestions on how these solutions will perform compared to each other?
VPN is the way to go if you want the optimal TFS experience with TFS 2005 or TFS 2008. While TFS mainly uses web service based protocols that can all go over SSL, there are a few small things that will not work unless you have proper network access. For example:
Viewing the Build Log (unless worked around)
Access Team Build drops
Publishing Test Results
As well as a few other little niggles. Going the VPN route will also mean that your TFS installation will vary less from a standard base TFS installation which gives you some peace of mind that you won't run into any problems when it comes to upgrading to a new version, applying service packs etc. (or at least any problems you run into will have been run into by many before :-) ). Going the SSL route you are treading a less worn path - though obviously plenty of people do run it that way including CodePlex and all the commercial companies that provide a hosted TFS installation.
The downside of VPN is that you are usually granting users access to an entire section of your network (unless you are running TFS in its own mini private network or something). If you go down the SSL route, be sure to properly test creating new team projects, as this is easy to break, and you might not realise it until you try to create one from either inside or outside the network.
For additional information, see Chapter 17 of the TFS Guide.
I'd start with a few questions: does the client have a VPN? And are the remote consumers on this VPN already? How secure does this need to be?
(In our case, we have lots of outside vendors we don't want on our VPN, so our source control is publicly accessible with SSL)
When I did it, I used a VPN. It was easier to set up, and it ensured that no one could even see the machine without being authenticated via the VPN. This was obviously much better from a security standpoint, which trumped any performance benefit we might have gotten from using SSL, if there even was one...
My previous experience with TFS was in an environment where we had a team of developers staffed out at client sites all over the city. In many situations we still accessed our TFS instance instead of something at the client site. We used SSL with public access to TFS. It worked very well for us.
