After some changes I made today in my schema.yml, each followed by the diff, migrate, and build commands, the diff operation stopped working.
The last successful migration was the 243rd.
Now, for every new change I make, running the diff command produces the same result:
/usr/bin/php /.../symfony --color doctrine:generate-migrations-diff
>> doctrine generating migration diff
>> file+ /tmp/doctrine_schema_92228.yml
Done.
No new file is created in lib/migration/doctrine, so I cannot use the migrate command to commit the changes to the db.
I tried to clear the cache, clean model files, build all classes, and also reboot.
Any ideas?
This is the best way I have come across to make migrations succeed every time. It cost me a lot of trial and error, but it works perfectly, even with multiple databases. I may be wrong on some points, so feel free to add to or correct anything you see.
MIGRATIONS, the safest way :)
If you need migrations to work with multiple databases, apply these patches and clear the Symfony cache; they work perfectly:
doctrine_core.r7687
doctrine_manager.r7657
A. BACKUP PROJECT AND DATABASE:
Save the Symfony project files (optional, but the safe way).
Save the database table schemas only.
Save the database table data only.
Save the database table schemas with data.
B. HOW TO MAKE CHANGES TO .yml FILES:
.yml files cannot contain strange symbols, like ñ or ´, or other non-UTF characters.
Always show spaces and tabs in Notepad++ or Sublime. There cannot be tabs!!
You CANNOT have two modules with the same name, even in different databases; doing so will cause a lot of problems.
If you want to work with multiple databases, you must specify the connection attribute at the beginning of your schema.yml file:
connection: doctrine_master
When working with multiple databases, you must also bind each module to the right connection:
Tbtest001:
  connection: doctrine_master
  tableName: tb_test001
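Putting the two settings together, a minimal multi-database schema.yml might look like this (a sketch; the column definitions are placeholders added around the connection and table names from the example above):

```yaml
# schema.yml for the master database
connection: doctrine_master

Tbtest001:
  connection: doctrine_master
  tableName: tb_test001
  columns:
    id:
      type: integer(4)
      primary: true
      autoincrement: true
    name: string(255)
```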
Set the right variable values and types in schema.yml:
Schema Files
Variables, models and types
When working with multiple databases, take care to modify only one schema.yml, for only one database, at a time!
If you are adding a new table with relations to another table, it's recommended to do it in two steps, i.e. two migrations: first add the table and migrate, then add the relation and migrate again. It is the safest way.
You can have different schema.yml files in different places.
C. MIGRATING THE CHANGES:
Install this plugin, because it has fixes and improvements for checking changes:
idlDoctrineMigrationPlugin
Create a new table in each of your project's databases; the plugin needs it to work:
name: migration_version, column: version (int(11), autoincrement=false).
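In SQL (assuming MySQL, matching the int(11) type mentioned above), the table is just:

```sql
CREATE TABLE migration_version (
  version INT(11) NOT NULL
);
```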
In the version column, set the value to the latest migration version you have now. You must do this step in every database that has the migration_version table:
UPDATE databasetest.migration_version SET databasetest.migration_version.version='31';
UPDATE databasetest2.migration_version SET databasetest2.migration_version.version='31';
Clear Symfony cache:
symfony cc
Generate the migration diff (you need the plugin above and the version tables created):
symfony model:diff > migratediff.log
Check that the latest generated changes are right in the following files:
.\lib\migration\doctrine\XXXXXX_versionXXX.php
.\data\migration\history\XXXXXXXXXX.yml
Proceed with the migration UP by specifying a number! NEVER run a bare migrate UP! Also bear in mind the new --connection parameter: it works now if you applied the patches above, and it will migrate only the right databases:
symfony doctrine:migrate 32 --connection=doctrine_master > migrateUP.log
Rebuild the models, forms, and filters, and delete the old model files:
symfony doctrine:build-model
symfony doctrine:build-forms
symfony doctrine:build-filters
symfony doctrine:clean-model-files
symfony cc
Set all databases to the latest migration number in their migration_version table:
UPDATE databasetest.migration_version SET databasetest.migration_version.version='32';
UPDATE databasetest2.migration_version SET databasetest2.migration_version.version='32';
Optional step: if you want to see the latest SQL sent to the database after the migration:
symfony doctrine:build-sql [--application[="..."]] [--env="..."]
D. LINKS AND FILES:
Correct way to do a migrations diff
Doctrine migrations fallback
http://trac.symfony-project.org/ticket/7272
http://trac.symfony-project.org/ticket/7689
http://trac.symfony-project.org/ticket/8604
http://php-opensource-help.blogspot.com/2010/07/how-to-get-connection-from-doctrine-to.html
http://www.doctrine-project.org/documentation/manual/1_1/en/connections
http://forum.symfony-project.org/viewtopic.php?t=29361&p=104098
Main Files involved in migrations:
Migration.php, Builder.php, sfTaskMigration.php
Related
Every time I run migration:generate, it creates a migration that regenerates the entire database schema (rather than a migration for just the recent changes to my entities). I'm using TypeORM version 0.2.7, the latest version.
My ormconfig.json is:
{
"host": "localhost",
"logging": false,
"port": 5432,
"synchronize": false,
"type": "postgres",
"entities": ["build/server/entity/*.js"],
"migrations": ["build/server/migration/*.js"],
"cli": {
"entitiesDir": "build/server/entity",
"migrationsDir": "build/server/migration"
},
"username": "***",
"password": "***",
"database": "***"
}
When I run typeorm migration:generate -n SomeEntityChanges, the new migration file contains instructions for creating and linking up tables for all my entities, even though most of them already have corresponding migrations in build/server/migration.
When I run typeorm migration:run, I can see that there are no pending migrations, and that the migrations that cover the existing entities have been run (i.e. they're in my migrations table).
What am I missing? The docs say that the migration:generate command should just generate a migration with the recent changes.
This might sound really stupid, but I was having the same issue and the problem was the name of my database. My database name was mydbLocal, with a capital L, but when TypeORM read the schema to generate a new migration it looked for mydblocal, and since there was no schema with that name, it regenerated the whole schema. It seems like a bug: when parsing the schema it looked for the lowercase name, but when running the migration it used the real one (with the uppercase L).
Anyway, I solved it by changing my db name to all lowercase and also editing the database name in my ormconfig to be all lowercase.
It is a really strange situation but this solved my problem. Hopefully it will help someone else too.
As mentioned in one of the previous answers, the issue for me was indeed camel-casing in the database name. Changing the database name to all lowercase fixed the migration generation issue.
However, in my new project, I noticed that the entity table name override also seems to have the same behavior. Strangely, this was not an issue in the previous projects for me.
// Previous table name causing the migration file to regenerate
@Entity({
  name: 'TempTable',
})

// New table name which stops the regeneration
@Entity({
  name: 'temp_table',
})
Hope this helps someone facing the same issue.
Removing {schema: 'public'} from all of my @Entity definitions fixed it for me with PostgreSQL.
Previous:
@Entity({schema: 'public'})
Working:
@Entity()
It is probably because your database is empty. TypeORM computes the diff between your codebase's entities and your actual database, and generates the migration from that.
Check your ormconfig.json, since that is what the TypeORM CLI reads to generate the migration; it probably points to an empty database, which is why all the tables end up in the generated migration.
Just run your migrations against that database, then run generate again.
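In other words (assuming the TypeORM CLI picks up your ormconfig.json), bring the database up to date first, then diff:

```shell
# apply pending migrations so the database matches the existing migration files
typeorm migration:run
# now the diff against the populated database contains only the new changes
typeorm migration:generate -n SomeEntityChanges
```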
I kept having this issue on MySQL and after hours of tinkering I finally resolved the problem.
Assuming the { schema: "public" } fix didn't work for you, here are the actions I had to take to clean up a terrible migration generation.
First of all, make sure you're running the latest version of TypeORM. I was behind by a few minor versions, and that alone was enough to give me countless index-based errors.
If you're up to date, great! Here's the bad news: your entities are wrong. The biggest issue I kept running into was the default property on most of the duplicated alterations. From what I've come to understand, both Postgres and MySQL return different-than-expected results when TypeORM tries to compare database defaults to the defined default.
For example: on a "decimal" type with four trailing decimal digits, default: 0 works fine when building your column, but MySQL actually returns "0.0000", meaning that no matter how many times you run the update, the default will never be a literal zero. TypeORM sees the difference and wants to change the existing MySQL default back to a plain zero.
This error spanned across everything from default: null to "tinyint" booleans being listed as int in my schema.
Read the generated output carefully and check each entity for the property being mentioned. Some of this was fixed by updating to the latest version of TypeORM, but I managed to clear almost 250 table alterations by ensuring the default data actually matched what MySQL stores.
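As a tiny sketch of the comparison problem (not TypeORM's actual diffing code, just an illustration):

```javascript
// What the entity declares vs. what MySQL reports back
// for a DECIMAL column with four trailing decimal digits.
const entityDefault = String(0);   // from `default: 0` in the entity
const dbDefault = "0.0000";        // what MySQL actually stores and returns

// Compared as strings, these never match, so every diff
// generates another ALTER to "fix" the default.
console.log(entityDefault === dbDefault); // false
```

Matching the entity's declared default to the stored form (e.g. default: "0.0000") makes the comparison stable.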
I was having the same issue; my solution is not very elegant, but it actually works.
In the migration containing the whole database, search for the tables involved, keep those, and delete all the other queries generated by TypeORM. After that, run the migration command and it will work, creating/altering only the tables you wanted.
It's not scalable, but bear in mind the recommendation to alter only a few tables per migration.
The rule of thumb is to generate a migration after each entity change.
1st: pay attention to lowercase vs. uppercase.
2nd: if you use an uppercase name, put it in quotes ("").
3rd: if you use a schema, add it to your ormconfig, like:
"schema": "public",
Good luck!
In my case, after around four migrations, new migrations kept including old changes. I searched the web and checked/tried the other answers here. The only thing that helped was to remove the project, clone it again, set it up again (adding the .env, running npm install), run the current migrations, and then generate the new migration file, which contained only the new changes as expected. Of course this doesn't reveal the root cause, but after that I could continue working without problems.
rm -rf project
git clone project
cd project
npm ci
ts-node ./node_modules/typeorm/cli migration:run
These two comments (1, 2) made me try this.
You should use migration:create instead of migration:generate. What I recommend is, inside your package.json:
{
...
"scripts": {
"migration:create": "NODE_ENV=local npm run typeorm -- migration:create -n",
"typeorm": "ts-node -r tsconfig-paths/register ./node_modules/.bin/typeorm"
}
}
then you can just run:
$ npm run migration:create NameOfYourMigration
to create your migration successfully.
I want to use the database-migration grails plugin for database migration. When I start my Grails app the first time all the database tables are created automatically. The production setting in my DataSource.groovy is:
production {
dataSource {
dbCreate = "update"
url = "jdbc:mysql://localhost/myapp?useUnicode=yes&characterEncoding=UTF-8"
username = "test"
password = "test"
dialect = org.hibernate.dialect.MySQL5InnoDBDialect
properties {
validationQuery = "select 1"
testWhileIdle = true
timeBetweenEvictionRunsMillis = 60000
}
}
}
In my config.groovy I set:
grails.plugin.databasemigration.updateOnStart = true
grails.plugin.databasemigration.updateOnStartFileNames = ['changelog.groovy']
When I add properties to my domain classes I need to adjust the changelog file.
What is the best way to do database migration in this case? What are the steps I have to do when I add or remove columns?
As you're probably aware, the dbCreate directive is not recommended for production use:
You can also remove the dbCreate setting completely, which is recommended once your schema is relatively stable and definitely when your application and database are deployed in production.
So keep in mind that you will need to remove this (or set to 'none').
Initial Baseline Workflow
Define current state
Create database from change log or mark as up-to-date
Set config options
The first step is to get the changelog to reflect the current state. If you've got an existing database, you want to use that to define the baseline. Otherwise, use GORM to define the tables.
These commands will generate a baseline for your database. Also, I chose to use the Groovy DSL format rather than Liquibase XML, for readability.
Existing Database
If you've got a production database with data already, it's a little bit tricky. You will need to access the database, or a copy of it, from your Grails environment. If you manipulate a copy, you will need to apply the updates back to production (and potentially manage it as a planned outage).
The command is:
grails [environment] dbm-generate-changelog changelog.groovy
...where environment optionally specifies the dev/test/prod/custom environment the database is defined in.
Following that, mark the database as 'up-to-date' with regards to the changelog:
grails [environment] dbm-changelog-sync
Then reapply the database to production, if necessary.
New Database
If you don't have an existing database (or don't care):
grails dbm-generate-gorm-changelog changelog.groovy
Then, to create the database from the changelog:
grails [environment] dbm-update
Configuration
You've already correctly got the options set:
grails.plugin.databasemigration.updateOnStart = true
grails.plugin.databasemigration.updateOnStartFileNames = ['changelog.groovy']
These options simply mean that the plugin will attempt to apply the changes to the database when the application starts.
Development Workflow
Make changes to domains
Generate changelog identifying differences
(Backup and) Update the database
So now you've got a database up-to-date, and you're smashing out changes to the domain classes, adding new ones and changing validation properties.
Each time you want to record your changes, you want to compare your GORM classes to what exists in the database, and create a new changelog file to record the difference:
grails [environment] dbm-gorm-diff [meaningful name].groovy --add
Here environment is the database you are comparing against, and meaningful name should reflect the change being applied in some way (perhaps a JIRA issue key, a version number, or a description).
The --add flag will insert an include statement in changelog.groovy.
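For example, to record a change against production (the change-log name here is just a hypothetical description):

```shell
grails prod dbm-gorm-diff add-email-to-user.groovy --add
```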
If you've configured updateOnStart, then you're done! Otherwise, to manually process the update, reuse the command:
grails [environment] dbm-update
RTFM
Plugin documentation - Getting Started
Plugin documentation - General Usage
Confile's answer above points to a good tutorial that goes into detail about manual changes to changelogs
Liquibase documentation - Changesets (Uses the XML format, but useful for understanding concepts)
The approach I would use is to map every table to a Grails domain with the mapping (very important!) properly set.
Then let Grails create the database the first time, and populate it with a previous backup of the database you want to migrate.
After this, set the Grails config to update the database every time it starts.
I know it seems a little bit messy, but if I had to do it, this is how I would.
Hope it helps :)
I found a very good tutorial, which explains the solution to my problem:
Grails Db Migration Tutorial
Workflow consists of following steps:
1) Install the plugin using the command grails install-plugin database-migration
2) After setting up the plugin run the command:
grails dbm-generate-gorm-changelog changelog.groovy (or changelog.xml)
By default it will generate the file at grails-app/migrations/changelog.groovy (or .xml).
3) Set the dataSource dbCreate = 'none'.
4) Now, run:
grails dbm-changelog-sync
This will create a table named databasechangelog and insert entries according to your existing schema.
That's it.
I use symfony 1.4.11 , I have a project...
I have schema.yml , and I have migrations with tables which are not in the schema. For example I have in my db "pages" table, and it not described in schema. When I get project in first time I make: build --all --and-load --no-confirmation ; and I get my db,I think that it created some tables from Base classes, because there are many tables in my db, but they are not described in schema . So now I need add a few new fields to my page table, I make migration, and it is all ok, I have new fields in my db, but I do not have it in schema.yml, so when I make symfony doctrine:build --all-classes nothing happen it do not generate page class with new column. I do not understand, if it possible to generate new class or changes to class without schema? How people that make project before me , do this?
Thank you! And sorry for my bad English
It's possible. Try using the following command to clean your model files:
./symfony doctrine:clean-model-files
If you want to use migrations, you should not use doctrine:build --all --and-load --no-confirmation anymore. Migrations assume incremental updates; dropping and rebuilding the DB every time defeats their purpose.
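The incremental workflow looks like this instead (standard symfony 1.4 Doctrine tasks):

```shell
# 1. edit schema.yml, then generate a migration class from the diff
./symfony doctrine:generate-migrations-diff
# 2. apply the pending migrations to the database
./symfony doctrine:migrate
# 3. regenerate the model, form, and filter classes from the schema
./symfony doctrine:build --all-classes
```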
Try these resources:
http://www.slideshare.net/weaverryan/the-art-of-doctrine-migrations
http://www.slideshare.net/denderello/symfony-live-2010-using-doctrine-migrations
I ran
./symfony doctrine:build --all --and-load --no-confirmation
with the example's schema.yml file, and all the tables and model classes in symfony populated nicely. However, when I changed that schema.yml file completely, deleting all the example tables and writing my own, the database didn't drop the old tables, nor were any of the old model classes deleted; it just added the new tables to the database.
How can I get Doctrine to "forget" the old schema.yml?
You need to run doctrine:clean-model-files to delete model classes that are not represented in the project or plugin schema.yml files.
.. the old data was stored in an example file in the same directory, project_root/config/doctrine/schema_example.yml.
It turns out Doctrine imports every file in that directory, not just schema.yml. Oops.
I have a migration that runs an SQL script to create a new Postgres schema. When you create a new database in Postgres, by default it creates a schema called 'public', which is the main schema we use. The migration to create the new database schema seems to work fine; however, the problem occurs after the migration has run, when Rails tries to update the 'schema_info' table it relies on: it says the table does not exist, as if it were looking for it in the new schema rather than the default 'public' schema where the table actually is.
Does anybody know how I can tell rails to look at the 'public' schema for this table?
Example of the SQL being executed:
CREATE SCHEMA new_schema;
COMMENT ON SCHEMA new_schema IS 'this is the new Postgres database schema to sit along side the "public" schema';
-- various tables, triggers and functions created in new_schema
Error being thrown:
RuntimeError: ERROR C42P01 Mrelation "schema_info" does not exist
L221 RRangeVarGetRelid: UPDATE schema_info SET version = ??
Thanks for your help
Chris Knight
Well, that depends on what your migration looks like, what your database.yml looks like, and what exactly you are trying to do. More information is needed; change the names if you have to, and post an example database.yml and the migration. Does the migration change the search_path for the adapter, for example?
But know that, in general, Rails and PostgreSQL schemas don't work well together (yet?).
There are a few places that have problems. Try to build an app that uses only one pg database with two non-default schemas, one for dev and one for test, and tell me about it. (From the following, I can already tell you that you will get burned.)
Maybe it has been fixed since the last time I played with it, but when I see http://rails.lighthouseapp.com/projects/8994/tickets/390-postgres-adapter-quotes-table-name-breaks-when-non-default-schema-is-used or http://rails.lighthouseapp.com/projects/8994/tickets/918-postgresql-tables-not-generating-correct-schema-list or this in postgresql_adapter.rb:
# Drops a PostgreSQL database
#
# Example:
# drop_database 'matt_development'
def drop_database(name) #:nodoc:
execute "DROP DATABASE IF EXISTS #{name}"
end
(Yes, this is wrong if you use the same database with different schemas for both dev and test; it would drop both each time you run the unit tests!)
I actually started writing patches. The first one was for the index methods in the adapter, which didn't care about the search_path, ending up with duplicated indexes under some conditions. Then I kept getting hurt by the rest and ended up abandoning the idea of using schemas: I wanted to get my app done, and I didn't have the extra time needed to fix the problems I was having with schemas.
I'm not sure I understand exactly what you're asking, but rake will be expecting to update the Rails schema version in the schema_info table. Check your database.yml config file; this is where rake looks to find the table to update.
Is it possible that you are migrating to a new Postgres schema and rake is still pointing at the old one? I'm not sure then that a standard Rails migration is what you need. It might be best to create your own rake task instead.
Edit: If you're referencing two different databases or Postgres schemas, Rails doesn't support this in standard migrations. Rails assumes one database, so migrating from one database to another is usually not possible. When you run "rake db:migrate", it looks at the RAILS_ENV environment variable to find the correct entry in database.yml. If rake starts the migration using the "development" environment and database config from database.yml, it will expect to update that environment's database at the end of the migration.
So, you'll probably need to do this from outside the Rails stack as you can't reference two databases at the same time within Rails. There are attempts at plugins to allow this, but they're majorly hacky and don't work properly.
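One setting worth checking in database.yml is the PostgreSQL adapter's schema_search_path, which controls where Rails looks for tables such as schema_info (a sketch; the schema and database names are placeholders for your setup):

```yaml
production:
  adapter: postgresql
  database: myapp
  username: myapp
  # look in the new schema first, but fall back to public,
  # where schema_info actually lives
  schema_search_path: "new_schema,public"
```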
You can use pg_power. It provides an additional DSL for migrations to create PostgreSQL schemas, and more.
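For instance, creating the schema from a migration might look like this (a sketch based on pg_power's documented DSL; verify the exact method name against the gem's README):

```ruby
# db/migrate/xxx_create_new_schema.rb (hypothetical file name)
class CreateNewSchema < ActiveRecord::Migration
  def change
    # pg_power adds schema-management helpers to migrations
    create_schema "new_schema"
  end
end
```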