Complete MOSS 2007 Migration - sharepoint-2007

The situation at the moment is that we have a SharePoint server which started out as a pilot but now actually runs as the production environment. The server SharePoint runs on is an old machine which doesn't conform to the standard requirements, so I want to move the current environment to the shiny new server.
I've read a lot about migrating the MOSS services, databases, content and so on, but to be honest I am kinda lost in a sea of information and I can't find the right method to do this. I tried installing MOSS 2007 on the new server as a clean install, restored the databases on the new server, then restored the backup (which I made with SharePoint Central Administration) on the new server, but alas, it did not work :-(
Lots of "Can't find this" and "Can't find that" errors...
It should be possible to grab all the data/sites/subsites/databases/content/documents and everything else and restore that to the new server, right?
Anyways, I was hoping for some step-by-step information... :-)
Regards
Erik404

I think I found a way to do a complete migration...
Install a fresh version of MOSS 2007 on the new server (Server_B). Install the features and solutions you have on Server_A. Then use the SPContentDeploymentWizard, which can be downloaded for free from CodePlex, to export all site content and import it on Server_B. Also back up any custom databases needed by features and create them on Server_B.
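For what it's worth, the wizard drives the same content deployment export/import that stsadm exposes; the command-line equivalent looks roughly like this (the URLs and filename are placeholders for your own sites):

stsadm -o export -url http://server_a/sites/yoursite -filename yoursite.cmp -includeusersecurity
stsadm -o import -url http://server_b/sites/yoursite -filename yoursite.cmp -includeusersecurity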
I do have an almost identical server running now, but some funky errors pop up now and then, so I don't think it's the best way to do it...
Also, custom-developed web parts need to be deployed manually to the new server; I didn't find a way to migrate these.

You are on the right track. The CodePlex solution is just a wizard GUI of what you would have to do at the command line via stsadm.
Essentially you:
Build the new server w/ all the patches, service packs, etc. that are on the old one.
You will want to run a command at the cmd line to scan for and resolve any orphaned sites in your databases on Server A (that's stsadm -o databaserepair - you can find the details on TechNet).
Limit access to Server A - so you don't have user changes during your migration.
Prepare the databases for migration - another stsadm command - stsadm -o preparetomove
Detach the databases from Server A. Horribly named command, -o deletecontentdb - it deletes the reference but not the actual database (but still!)
Attach the databases to Server B
You should be good at that point.
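Put together, the sequence looks something like this - a sketch only, with placeholder URLs, database names, and SQL server names to substitute with your own:

stsadm -o databaserepair -url http://server_a -databasename WSS_Content
stsadm -o preparetomove -contentdb SQL_A:WSS_Content
stsadm -o deletecontentdb -url http://server_a -databasename WSS_Content
(back up WSS_Content on SQL_A and restore it onto SQL_B)
stsadm -o addcontentdb -url http://server_b -databasename WSS_Content -databaseserver SQL_B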
As you've discovered - you can't migrate custom code, web parts etc. These need to be reinstalled. That's one of the reasons solutions/features are really recommended for customizations.
Also - the search dbs, indexes etc. need to be recreated. You can't bring those over. But that's fairly straightforward - and makes sense when you think about it.

Use these scripts if you need them.
http://globaldeploymentmoss2007.blogspot.ca/

Related

Umbraco Bi-directional Deployment

I'm using Umbraco 7.4.x. I've been trying to figure out the best way to do bi-directional deployments.
As in, we have more than one dev working locally, plus a dev server and a live server. We have single-click deploys from local to dev, but that's only code. We were copying the databases up to dev, but now we also have people who need to enter content on dev. This leads to making changes on the dev database as well and copying the database back down. We do all this with version control, of course, but still, it's all very inconvenient.
Is there a better approach to this that I'm missing? I tried using uSync a few months ago but we'd often run into crashes.
I have heard of Courier; it seems like it would be good for deploying from dev/stage to production, but would it also work for pushing content/doc type changes to our local machines? I wasn't sure, as they're not web servers on the internet, just local IIS Express instances running through Visual Studio.
Thanks in advance!
We use uSync (uSync + uSync.ContentEdition - https://our.umbraco.org/projects/developer-tools/usync/) for moving everything between instances. Give it another shot, as it has changed since you explored it in the past. It's worth mentioning that it requires careful configuration on the different environments to avoid conflicts etc.
You can also use Courier; its latest version is used by Umbraco Cloud (http://umbraco.io/), which may also interest you, as it gives you full control over deployment processes between multiple Umbraco instances.
One option is to have all of your developers set up to work off of the same dev database. On occasion, your developers might have to "Republish the entire site" or rebuild the Examine indexes to make sure their caches and TEMP files are up to date. Otherwise, this has worked well for us for many years. One frustrating part of this is that media files uploaded by dev A won't immediately be on the file system for dev B. You should be able to move your media to Azure Blob Storage to work around this problem; there is a package that should help set this up.
I wouldn't recommend uSync.ContentEdition. I haven't tried it personally, but I have yet to hear a good report about it. uSync, on the other hand, has been a lifesaver for us, even if it isn't perfect. At this point, we install uSync on every site, even if we never configure it to read in changes. We like that we can record our changes to document types and datatypes in source control. Working with the shared database setup means that we don't need uSync reading in changes on our dev and local environments. However, you will need to make sure that your devs all understand uSync. If dev A adds a doc type, the uSync .def file for that doc type could show up on the file system for dev B. Dev B should not commit that uSync file in that situation.
Courier has been working a lot better recently, but I wouldn't recommend it unless you are running Umbraco 7 and can get the latest version of Courier. Courier is very useful, but you should do a lot of testing with it before you hand it over to a client, because it gives you the ability to shoot yourself in the foot in a big way. It has definitely improved: in Courier for Umbraco 6 I used to have to try really hard to deploy without breaking my site; now, in Courier for Umbraco 7, I have to try really hard to break it. It's now a viable option for deploying content changes to production - just make sure you test it heavily before using it in a production environment.

Deploying Website with Migration using FTP

I have an ASP.NET MVC website making extensive use of EF and migrations.
I have tried deploying it to a system running Windows 10 on a local network, but it seems Microsoft has removed that option from the latest release, and deployment using Web Deploy is now only possible on server OSes.
Now I'm trying to do the same using FTP.
What's the way to deploy using FTP on a local server? I have already set up FTP publishing, but I can't seem to figure out how to deploy the DB and configure the app to run code-first migrations after every deploy.
The accepted answer in this forum post helped me out with this exact issue. I added the code given there to Application_Start in my Global.asax.cs
// Migrate the database to the latest version on startup, then force initialization.
Database.SetInitializer(new MigrateDatabaseToLatestVersion<MyObjectContext, MyObjectContextMigration>());
MyObjectContext contextTest = new MyObjectContext();
contextTest.Database.Initialize(force: true);
Just be aware that this will run your Seed method every time you visit your site (at least it did for me). In my Configuration.cs I put in a check to see whether data already existed in my tables; if a table was empty, it was OK to seed it, etc.
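A minimal sketch of that kind of guard, assuming an EF migrations Configuration class for MyObjectContext (the Customers/Customer names are placeholders, not from the post):

using System.Data.Entity.Migrations;
using System.Linq;

internal sealed class Configuration : DbMigrationsConfiguration<MyObjectContext>
{
    protected override void Seed(MyObjectContext context)
    {
        // Customers/Customer are placeholder names for illustration.
        // Only seed when the table is empty, so repeated runs don't duplicate data.
        if (!context.Customers.Any())
        {
            context.Customers.Add(new Customer { Name = "Example" });
            context.SaveChanges();
        }
    }
}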

Entity framework code first - best practice for migrations on multiple environments

Our team works on a website with ASP.NET MVC5 & EF5. There are 3 machines involved in my question:
Developer's machine
Test server
Production server
So a regular work cycle is: the developer adds features & puts them on the test server; when there are some approved, tested features, they go to the production server.
When there are no model changes - everything works great.
But when there are model changes, sometimes the migration goes wrong,
and the DB gets so messed up that we need to back it up, clear everything & copy the data back from the backup.
What we do is:
Add migration on dev-env.
Copy all the source to the test/production (Model & Migration folders).
Update database.
When doing this basic cycle, sometimes the migration fails to update because a table already exists; I guess I'm doing something wrong in the process.
Another bad thing:
we have a copy of the development environment on the test & production servers - my guess is that we don't need VS on the servers in order to update versions (we use VS to run the EF commands from the Package Manager Console).
My question is
Is there any best practice/manual regarding the dev->test->production process & how to do it right with code first & no Visual Studio on the servers?
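For reference, EF ships a command-line runner, migrate.exe, in the EntityFramework NuGet package's tools folder, so pending migrations can be applied on a server without Visual Studio. The same can be done from code - a minimal sketch, assuming your DbMigrationsConfiguration class is named Configuration:

using System.Data.Entity.Migrations;

// Apply all pending migrations to the database the configuration's
// context points at (run on app start or from a small console tool).
var migrator = new DbMigrator(new Configuration());
migrator.Update();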

Change Lightswitch internal database to external

I have been developing an application in Visual Studio Lightswitch (Silverlight client in VS2010 SP1 if it makes any difference), and so far have done all the database development in the Lightswitch designer.
I now want to use an external database, but don't want to recreate the whole app, or even just the screens. I have scripted the database, and have created a copy in SQL Server, but can't find out how to get Lightswitch to use this external database without starting the whole thing again.
Is there a simple way to change the connection string so that I can carry on from where I am, but have Lightswitch point at the external database instead of the internal one?
Turns out this is a very difficult thing to do. Super Lightswitch-hacker David Baker saw a post I'd made on the subject in the MSDN Lightswitch forum, and kindly offered to have a go at it for me. It took him several goes, but we got there in the end.
I wouldn't recommend this to anyone. I would strongly recommend using an external database right from the start. I can't see much benefit from using the internal one, and if you ever need more control over it, or want to switch to an external one, you've got major problems.
Hope this helps someone.
You must publish your project in order to create a new SQL database. Publishing your project will create DB scripts for your new DB; don't use the internal LS DB to create the scripts for your new DB.

EF Migrations - how to manage during dev and deployment?

We're considering using EF 4.3.1 code-based migrations, but aren't clear about how to integrate Migrations with our present dev/deployment methodology...
The app in question is a desktop WPF app, with each desktop having its own SQL Server instance (each with 4 separate databases). It is deployed into a "field" environment with zero local IT support. Any database migration must be done using SQL scripts executed by the installer (probably InstallShield). There will not be anyone available who can run a command at a PMC prompt to upgrade the db when it is deployed/upgraded at a field location. Thus the ultimate "output" from EF Migrations must be a set of SQL scripts, which the installer will selectively apply.
Also, we have multiple developers making concurrent database changes... there is NO DBA. Each developer simply checks in their code (model) changes to TFS, and the next time they do a get-latest, the changes to the model automatically cause a new database to be created on their dev system. So how can we now have each developer perform their own local migrations (rather than deleting/recreating their local databases), and then manage/consolidate/combine those migrations? And what about collisions?
During dev and unit-testing, each developer may delete their (entire) local database multiple times during a single checkout/checkin iteration. This works great with Code First, since the database gets automatically rebuilt when the app is restarted. But this means that the _MigrationHistory table in the database also gets deleted. How do we handle this? Don't we need the migration history of each dev system? If not, then where/how do we detect the aggregate changes which need to be applied to the delivered system?
I can see the value of using Migrations to deal with the mechanics of migrating a database, but what's not clear is how to take advantage of it without introducing a centralized database "change-control" bottleneck into the dev cycle, and thus losing one of the key benefits of Code First.
Any insight/advice would be greatly appreciated!
DadCat
I know this is an old question but I thought I'd post some of my experiences with EF Migrations (v6.1).
Each dev will be fine. Migrations are put into classes with a timestamp in the name, so no collisions will happen. The DB on the dev's machine will be updated after doing a get latest and running the app (or the update-database command).
Deleting the local db and recreating is fine. Just make sure the dev runs update-database before adding additional migrations or things will get out of sync. I'm a bit confused as to why they'd need to delete the local DB, but that's out of scope. You may find that your process needs to change to accommodate EF Migrations.
I can't help you with the installer question, as a similar question brought me here. The update-database command does have a -script option that will generate the proper change script, but I'm unclear how to automate that on a build server.
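One way to automate that outside the Package Manager Console (a sketch, assuming your DbMigrationsConfiguration class is named Configuration): EF exposes the same scripting through MigratorScriptingDecorator, which a build-server console tool could call:

using System.Data.Entity.Migrations;
using System.Data.Entity.Migrations.Infrastructure;

// Generate the SQL for pending migrations without touching the database.
var migrator = new DbMigrator(new Configuration());
var scripter = new MigratorScriptingDecorator(migrator);

// null source/target = script from the database's current state to the latest migration.
string sql = scripter.ScriptUpdate(sourceMigration: null, targetMigration: null);
System.IO.File.WriteAllText("upgrade.sql", sql);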