I am about to perform database migrations in MVC .NET, and I was wondering how I can unit test this.
For example, I would like to apply the migration in this question: Rename a db column. How could I unit test this?
My thoughts:
Run all existing migrations, except my latest one, which I am going to test
Add data to the context
Apply new migration
Test that the data is still there
If my thoughts make sense, any idea how to apply them in MVC .NET? Thanks!
For an integration test. Just my rough idea - it probably depends on your environment: build server, DB server, deployment process, ...
You always have to have the same (known) starting point for the database. Either you create it manually, commit it into VCS, and always manually update it to M-1 (M = the migration under test), or you just assume all the migrations up to M-1 worked before (because they were tested) and create the database automatically, e.g. using migrate.exe. Then you perform your steps and test, preferably using a different "channel" than EF, that the data is there, that the column is there, etc. Plain SQL through good old ADO.NET is enough here. Because it doesn't need to be versatile, you can create some simple helper(s) that run the query using the well-known Connection-Command-Reader path and return the raw data as a simple IEnumerable (I did that myself, through dynamic, to make it super simple).
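A minimal sketch of such a helper, assuming SQL Server and the Microsoft.Data.SqlClient package (the older System.Data.SqlClient works the same way at this level); the class and method names are just placeholders:

    using System.Collections.Generic;
    using System.Dynamic;
    using Microsoft.Data.SqlClient;

    public static class RawSql
    {
        // Runs a query outside of EF and returns every row as a dynamic object,
        // so a test can assert on columns without any mapping code.
        public static IEnumerable<dynamic> Query(string connectionString, string sql)
        {
            using var connection = new SqlConnection(connectionString);
            using var command = new SqlCommand(sql, connection);
            connection.Open();
            using var reader = command.ExecuteReader();
            while (reader.Read())
            {
                IDictionary<string, object> row = new ExpandoObject();
                for (var i = 0; i < reader.FieldCount; i++)
                {
                    row[reader.GetName(i)] = reader.GetValue(i);
                }
                yield return row;
            }
        }
    }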
My advice is just to keep it simple, nothing fancy and clever. It's just to support testing.
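And a rough sketch of the overall test flow, assuming EF6 code-first migrations with the generated Migrations Configuration class and MSTest; the migration name is a placeholder:

    using System.Data.Entity.Migrations;
    using Microsoft.VisualStudio.TestTools.UnitTesting;

    [TestClass]
    public class RenameColumnMigrationTests
    {
        [TestMethod]
        public void Data_survives_the_rename_migration()
        {
            // Configuration is the class generated by Enable-Migrations
            // (it may need to be made public so the test project can see it).
            var migrator = new DbMigrator(new Configuration());

            // 1. Bring the database to the known state just before the migration under test.
            migrator.Update("NameOfThePreviousMigration");

            // 2. Seed data with plain SQL against the old column name.
            // 3. Apply the latest migration(s).
            migrator.Update();

            // 4. Verify through a different channel than EF (e.g. the RawSql helper above)
            //    that the rows are still there and the renamed column holds the old values.
        }
    }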
Related
I have an app and many historical versions of its database.
Our users are typically "once a year" users, so you can never be sure which version of the database their app is running on.
Now in my new version of the database I need to do some custom migration.
The method I use to do this is described in this tutorial: http://9elements.com/io/index.php/customizing-core-data-migrations/
To summarize: I have to make Custom Mapping Models so that I can write my own migration policies for some fields.
Now when I create a Custom Mapping Model, I have to select a Source "xcdatamodel" and a Destination "xcdatamodel" (where "destination" is the new version of my database).
My question is, if I want to do this custom migration from all possible versions, do I need to create multiple Custom Mapping Models, all with a different source, or is there a smarter way to do this?
Or is CoreData smart enough to recognize this?
The short answer is yes; you need to test every migration from every source model to your current destination model. If that migration requires a custom mapping then you will need to have a mapping for that pair.
Core Data does not understand versions; it only understands source and destination. If there is not a way to get from A to B then it will fail. If it can migrate from A to B automatically and you have the option turned on, then it will. Otherwise a heavy (manual) migration is required.
Keep in mind that heavy migrations are VERY labor intensive and I strongly recommend avoiding them. I have found it is far more efficient to export the data (for example to JSON) and import it back in than it is to do a heavy migration.
It is enough to have a consistent sequential series of migration models up to the current version. Core Data is "smart" enough to run the migrations you tell it to, in the given order.
Perhaps a stupid question, and one without code, so I'm not sure I'm on the right StackExchange site. If so, sorry - please leave me a comment.
I'm beginning to program in Sinatra (only intranet until now), and in the samples they almost always use migrations with ActiveRecord, the same as with RoR.
I have enough experience with ActiveRecord itself and it is very helpful, but I'm not sure why migrations are always used. If I create a project, I just write a SQL script or a Ruby ActiveRecord script that creates the DB for me, and that's it.
If I need the site or script somewhere else, I just execute that script and I'm ready.
Obviously I'm missing a lot here, so can someone explain the other benefits to me or point me to a good explanation?
From Rails docs:
Migrations are a convenient way to alter your database schema over time in a consistent and easy way. They use a Ruby DSL so that you don't have to write SQL by hand, allowing your schema and changes to be database independent.
So, the main two benefits are:
It's like Git for your db schema; you won't know how useful that is until you are in a mid-size or big project with many contributors and someone makes a boo-boo :)
You write Ruby code to build your db schema, so if you decide to move from MySQL to PostgreSQL, for example, you don't need to open up the PostgreSQL manual and check SQL compatibility.
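For example, a migration written in the Ruby DSL (the products table is made up for illustration; newer Rails versions also tag the superclass with a version, e.g. ActiveRecord::Migration[7.0]):

    class CreateProducts < ActiveRecord::Migration
      def change
        # The same Ruby code runs on MySQL, PostgreSQL, SQLite, ...;
        # Active Record translates it into the right SQL for each adapter.
        create_table :products do |t|
          t.string  :name, null: false
          t.decimal :price, precision: 8, scale: 2
          t.timestamps
        end
      end
    end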
Update
In the API docs for migrations you will find many nice use cases (to be honest, I didn't know about half of them) ... check it out: http://api.rubyonrails.org/classes/ActiveRecord/Migration.html
Building a creation script for your database is great, provided two things:
All your database deployments are on new machines
You get your database schema exactly right the first time
In our modern agile environment, we not only believe this is impossible for any project larger than a few lines of code, we don't even encourage aspiring to it.
Active Record's Migrations offer a way to describe your database incrementally. Each "migration" defines how to add its features to the database ("upgrade"), and how to remove them if necessary ("downgrade").
When running migrations, on top of actually running the scripts, Active Record also maintains its own table (schema_migrations), to remember which migrations have been applied to the database, and which are pending.
This enables you to build your database alongside the features as you develop them. It also facilitates working in teams, since each team member develops her own migrations, and Active Record "stitches" everything together, so you don't have to maintain a monolithic creation script.
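For example, a migration with an explicit upgrade ("up") and downgrade ("down") step, using a made-up users table:

    class RenameUsersLoginToUsername < ActiveRecord::Migration
      # "rake db:migrate" runs up; "rake db:rollback" runs down.
      # The schema_migrations table records which of these have already been applied.
      def up
        rename_column :users, :login, :username
      end

      def down
        rename_column :users, :username, :login
      end
    end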
I've been a PHP programmer for over a decade and I'm making the move to RoR. Here is what I'm used to from the PHP world:
Create the DB schema in a tool like MySQL Workbench - and make fields precisely the size I want without wasting space (e.g. varchar(15) if it's ip_address).
Write models using DataMapper and put those exact field lengths and specifications in there so my app doesn't try to put in any larger values.
In the RoR world from what I've seen over the past two days, this seems to be the flow suggested:
Add fields / schema using the command line, which creates a migration script and apparently creates huge fields (e.g. "ip_address string" probably ends up as varchar(255) in the db when I run the migration).
Put in validations during model creation.
Am I missing something here? What's the process in the RoR world for enterprise level applications where you actually want to create a highly customized schema? Do I manually write out migration scripts?
The scaffolding is what you use to get started quickly, but before running the migration you can edit it and add constraints and specific column lengths.
Validations specified in the model (in the Ruby code) do not carry the same level of security as validations/constraints specified on the database, so you still need to define those on the database.
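For example, a generated migration edited by hand before running rake db:migrate (the visits table is made up for illustration):

    class CreateVisits < ActiveRecord::Migration
      def change
        create_table :visits do |t|
          # Edited by hand: varchar(15) instead of the default varchar(255),
          # plus a NOT NULL constraint enforced by the database itself.
          t.string :ip_address, limit: 15, null: false
          t.timestamps
        end
      end
    end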
While it is possible to work with Rails without migrations, I would strongly advise against it. In some cases it cannot be avoided (when working with legacy databases, for instance).
The biggest advantage of using migrations is that your database schema can be kept in sync across different environments and stages, e.g. your development and your production database. When deploying your code, the migrations take care of bringing the database up to date.
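In practice that deploy step is usually just running the pending migrations on the target environment, for example (assuming Rake and an environment named production):

    RAILS_ENV=production bundle exec rake db:migrate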
You can edit the migration scripts before you run the migration in order to customize the fields.
Yes, if you need to tweak the defaults, you edit the migration scripts.
Also note that you don't need to use migrations; they're a "convenience" while iterating on DB development. There's nothing that says you must use them. The Active Record pattern doesn't rely on how the DB tables/fields/etc. are created or defined.
For example, migrations are useless when dealing with legacy DBs, but you can still write a Rails app around them.
I'm switching to RoR from ASP.NET MVC. Yeah, migrations are cool, but I do not need to support different databases in my web applications; PostgreSQL will do just fine.
So is it okay if I use pgAdmin to create and administer my databases and schema, and avoid all this fancy migrate/rake stuff?
Update
Thanks everyone! Now I better understand what migrations are, and why I should use them.
I don't think that's what migration means.
Migrations in Rails (and in other frameworks) are a method you can use to update your database schema when there are multiple versions of the same database running.
For example, you may have two databases: one running on your production server, and another running locally for development. After a few days of coding, your local development database may look a bit different. With migrations, you can simply push your code to the production server and then run the migrations to automatically update your production database so it is up to date with the one you use locally for development.
So, to answer your question: yes, it is OK, but you might miss out on a few of the migration niceties when the time comes that you have to maintain multiple versions of your database.
Have to agree with charkit, but one (or rather two) important notes on why you should use migrations: migrations don't make up the model definitions. The schema is stored separately in a file, schema.rb, which defines the tables and columns of your database. When looking into that file, you find these lines:
This file is auto-generated from the current state of the database. Instead of editing this file, please use the migrations feature of Active Record to incrementally modify your database, and then regenerate this schema definition.
The second reason is testing: you can easily set up a test database to run all your tests against, without needing to touch the "real" database. I know this is not a big problem while developing, but it becomes more important after some time.
So, yes, it is possible to use pgAdmin to create all your database-related stuff, but you should not forget to always keep the schema file up to date and come up with a solution for testing.
With migrations you're able to develop your database schema in Ruby, and this is usually database independent.
In short, spend the 20 minutes or so to really get migrations and the value they add, then determine whether or not you want to ditch them. Strangely enough, I learned Rails before I started my first MVC project, and one of the things I missed most was migrations.
From a technical standpoint you should be fine without them.
I am trying to build a script to automate the update process for Symfony projects, e.g. apply any patches from one revision to another, including any possible model updates and new fixtures, without re-loading the entire database.
I have completed most of it, but I have been unable to find a way to generate an SQL file containing the INSERT statements that get executed by propel load data when it inserts the fixture.
I know that I could dump the database to get the info, but that's not the point and that wouldn't work in my case because I'd need an empty db that has no other data on it.
So in short, my question is: Is there a way to take a fixture YML and generate the SQL inserts necessary?
I know this could be done by processing the YAML and building the logic myself, but the build process already does this somewhere, so there's no reason to write it again.
Any help would be much appreciated.
Perhaps I did not understand your question correctly, but couldn't you just call the task that loads the fixtures into the db, or alternatively write code that calls sfPropelData::loadData() (I'm assuming Propel here)?
    $loader = new sfPropelData();
    $loader->loadData(sfConfig::get('sf_data_dir').'/fixtures');
Note, though, that sfPropelData and its parent sfData do not generate SQL code when loading fixtures; they do it through the ORM (as can be seen in sfPropelData::loadDataFromArray()).
If you explained your problem more, and why this solution isn't enough, I may be able to help.