I created something like a JSON backup of my project's database, and then I populate it like this:
Model.find_or_initialize_by(:id => h["id"]).update(h)
where h is a hash of the model attributes for a single instance.
The records are created correctly, but when I later try to create a new record, Rails raises this error:
PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "table_pkey"
What could I be doing wrong? It happens for every model that was created with scaffold; here is one of the migrations as an example.
class CreateModel < ActiveRecord::Migration[6.1]
  def change
    create_table :models do |t|
      t.string :attribute1
      t.string :attribute2
      t.string :attribute3

      t.timestamps
    end
  end
end
According to the migration, you're using sequential integer ids for your table. That works well as long as you let the database assign the ids: every time a new record comes in, the database takes the next number in the sequence and assigns it to that record (simplifying a little).
Let's assume the id sequence is currently at 3 and the records you imported have ids 4, 37 and 143025. You insert a new record; the database assigns id 3, all good, and the sequence moves to 4. You insert another one; the database assigns id 4 and tries to insert it, but there is already a row with id 4 in the database:
PG::UniqueViolation: ERROR: duplicate key value violates unique constraint "table_pkey"
A few possible solutions:
After importing, advance the id sequence to something larger than the largest id you imported. (hacky, but works) Postgres manually alter sequence
Import the items without hardcoding their ids. (complicated)
Change your database to use uuids instead of integer ids. (an architectural change: difficult if the app is live, but the best solution if you're still in development)
Use a proper database backup system rather than building your own: pg_dump
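The first suggestion (advancing the sequence) can be sketched as a one-off SQL statement. This is a hedged example assuming the table is named models; adjust the table and column names to your schema:

```sql
-- Move the id sequence past the largest imported id (PostgreSQL).
-- pg_get_serial_sequence looks up the sequence backing models.id.
SELECT setval(
  pg_get_serial_sequence('models', 'id'),
  (SELECT COALESCE(MAX(id), 1) FROM models)
);
```

Rails' PostgreSQL adapter also exposes ActiveRecord::Base.connection.reset_pk_sequence!("models"), which does the same thing from a Rails console.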
For a project my company is involved in, I need to forward some data numbered consecutively by creation date(time). Some of the data is already stored.
The only way I have found so far is to expose a value similar to the id column on a Ruby on Rails model: consecutive values starting from 1 (up to n, where n is the number of records in the table), ordered by the values of another column.
I don't want to override the default id column, of course.
I just came up with the following code:
class User < ApplicationRecord
  ...
  def id_2
    User.all.order(:created_at).index(self) + 1
  end
end
But I sense there must be a better approach, perhaps a database-oriented one.
Is there a more efficient implementation that doesn't load the entire table, as this code does?
If your database supports it, you can probably simply make the column auto-incrementing:
add_column :table_name, :id_2, :integer, auto_increment: true
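If auto-increment isn't an option, the counting idea can be pushed down to the database: a record's consecutive number is simply the count of records created at or before it (assuming created_at values are unique). In Rails this would be roughly self.class.where("created_at <= ?", created_at).count, which the database can answer with an index instead of loading every row. Here is a plain-Ruby sketch of that logic, with made-up records for illustration:

```ruby
# A record's consecutive number is the count of records whose
# created_at is less than or equal to its own.
records = [
  { name: "c", created_at: 3 },
  { name: "a", created_at: 1 },
  { name: "b", created_at: 2 },
]

def consecutive_number(records, record)
  records.count { |r| r[:created_at] <= record[:created_at] }
end

consecutive_number(records, records[1]) # => 1 (oldest record)
consecutive_number(records, records[0]) # => 3 (newest record)
```

With an index on created_at, the equivalent SQL COUNT avoids a full table scan, unlike the index(self) approach in the question.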
I ran this command:
rails g migration CreateJoinTableUserPloy user ploy
and checked the migration file:
create_join_table :users, :ploys do |t|
  # t.index [:user_id, :ploy_id]
  # t.index [:ploy_id, :user_id]
end
Both index lines are commented out by default.
Then I ran this command:
rake db:migrate
Then I checked my database structure and did not see a primary key. Does this mean a join table doesn't need an index or a primary key?
Consistent with http://meta.serverfault.com/a/1931, I'm going to respond to this as an answer, even though some of the information is in the comment thread.
There is no primary key in a Rails-generated join table (i.e. as created by create_join_table, which is used by migrations with JoinTable in their name), and Rails has no inherent requirement for one. That's because most pure join tables are only accessed by the ids of the joined tables, in which case a primary key is unnecessary. You can, of course, add and designate a primary key column if you wish.
Rails does not support multiple-column primary keys, although there are gems that provide that support, such as https://github.com/composite-primary-keys/composite_primary_keys.
Further, there is no fundamental need to create an index. You can create one if you wish, and it will speed up access to records at the cost of some additional time spent on record creation and update. See https://stackoverflow.com/a/3658980/1008891 for more discussion of this.
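If you do decide you want the lookups to be fast, uncommenting the two t.index lines in the generated migration is enough. For reference, the equivalent raw SQL (assuming the join table is named users_ploys) would be along these lines:

```sql
-- Composite indexes covering both lookup directions on the join table.
CREATE INDEX index_users_ploys_on_user_id_and_ploy_id
  ON users_ploys (user_id, ploy_id);
CREATE INDEX index_users_ploys_on_ploy_id_and_user_id
  ON users_ploys (ploy_id, user_id);
```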
If I have the following line in a migration, will postgresql add an implicit index? Should I explicitly add an index for the foreign key? The models contain has_many and belongs_to as appropriate.
t.integer :club_id,
:null => false,
:options => "CONSTRAINT fk_transactions_club REFERENCES clubs(id)"
From the fine manual:
5.3.5. Foreign Keys
[...]
Since a DELETE of a row from the referenced table or an UPDATE of a referenced column will require a scan of the referencing table for rows matching the old value, it is often a good idea to index the referencing columns. Because this is not always needed, and there are many choices available on how to index, declaration of a foreign key constraint does not automatically create an index on the referencing columns.
Your FK references a PK, so you probably don't have to worry about UPDATEs. If your referencing table (the one with club_id) is going to be large and you expect to delete clubs rows often, then an index of some sort on club_id will make deleting clubs rows faster. Without an index on club_id, attempting to delete a clubs row requires a scan of the table containing club_id, and table scans are not your friend.
So the answer is: maybe. It depends on how clubs will be used.
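If you decide the index is worth it, a later migration with add_index :transactions, :club_id (assuming the referencing table is named transactions) will create it. The underlying SQL is roughly:

```sql
-- Speeds up the scan PostgreSQL performs on the referencing table
-- whenever a row in clubs is deleted or its key is updated.
CREATE INDEX index_transactions_on_club_id ON transactions (club_id);
```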
We have multiple lists of shops from different data sources that have to be matched.
The shops have a composite primary key [source, id]. The matching creates a separate entry in the shops table with source=0 and extracts values (name, url, ...) that can differ a bit from source to source.
Now I could add two more columns, meta_shop_source and meta_shop_id, to shops, and a belongs_to :meta_shop, class_name: "Shop", foreign_key: [:meta_shop_source, :meta_shop_id] to the Shop model. I am using the composite_primary_keys gem.
However, as meta_shop_source is always 0, it would seem like a waste of space. The same process will later on be used for products and there are millions of rows, so optimization will be needed.
So I am looking for something like belongs_to :meta_shop, class_name: "Shop", foreign_key: [0, :meta_shop_id], or a method I can override, so that I don't need the meta_shop_source column in my database.
One alternative is to define the value in the model so that you can access it directly from the controller.
For columns that don't exist in the table, attr_accessor can be used when you don't want to store the value in the database; the value only exists for the lifetime of the object:
attr_accessor :source
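A minimal sketch of the virtual-attribute idea in plain Ruby (the Shop class here is a stand-in, not the actual ActiveRecord model):

```ruby
# attr_accessor defines a reader and a writer backed by an instance
# variable; nothing is persisted to the database.
class Shop
  attr_accessor :source
end

shop = Shop.new
shop.source = 0
shop.source # => 0
```

On an ActiveRecord model the effect is the same: source behaves like an attribute in Ruby code but never appears in the shops table.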
If meta_shop_source will always be 0, then there is no need for that column in the table.
Just use a plain foreign key, meta_shop_id.
Similarly, the primary key of Shop should be id.
That way you never have to worry about the source, since it will always be 0.
You can define the value in the model so that you can access it directly.
UPDATE:
As far as optimization is concerned, do it at the migration level by adding indexes with add_index wherever necessary. Eliminating one column just because one source is always 0 is not a good idea and will not affect performance. It might only be an option if all the other sources have values greater than 0.
Since you have a distinct source field, an index can be created. Creating an index builds an internal lookup structure that MySQL maintains itself:
ALTER TABLE shop ADD INDEX (source);
Once the index is in place, whenever you fetch the rows assigned source 5, the database goes straight to them via the index and returns results much faster than the earlier query.
I have a rails program that is accessing a legacy database with UPPERCASE table columns.
I want to be able to type user.firstname rather than user.FIRSTNAME
How do I make ActiveRecord retrieve the lowercase version of these column names to allow me to use lowercase attributes in the model?
It might be easier to change the column names with migrations. Otherwise you will have to patch the gems you are using and vendor them in vendor/gems so your changes don't get lost on upgrade.
script/generate migration down_case_table_names_and_columns
write the migration
rake db:migrate
For each table:
rename_table :OLD_NAME, :new_name
For each column:
rename_column :table_name, :COLUMN_NAME, :column_name
Problems:
You might not have to change the names of the tables; fyi, you might get an error when changing table names. I have never renamed a table, so I don't know whether that will work. In terms of changing column names there will be no problem.
If changing the column names isn't feasible, this is probably the best way to deal with it: https://github.com/reidmix/legacy_mappings
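Rails also ships with alias_attribute (e.g. alias_attribute :firstname, :FIRSTNAME), which covers the common case without a gem. The metaprogramming idea behind such mappings can be sketched in plain Ruby; the User class and FIRSTNAME accessor below are illustrative stand-ins, not the real model:

```ruby
# Define lowercase reader/writer methods that delegate to the
# uppercase originals, one pair per legacy column.
class User
  attr_accessor :FIRSTNAME

  def self.downcase_alias(upper)
    define_method(upper.downcase) { send(upper) }
    define_method("#{upper.downcase}=") { |value| send("#{upper}=", value) }
  end

  downcase_alias :FIRSTNAME
end

user = User.new
user.firstname = "Ada"
user.FIRSTNAME # => "Ada"
```

In an ActiveRecord model you would generate one alias pair per uppercase column name, which is essentially what a legacy-mapping gem automates.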