Active record querying with store_accessors - ruby-on-rails

I have a database where I need to search for specific records that have a certain output. What is making this tricky for me is that these values are found in a store_accessor, and therefore they aren't always there.
For instance, if I run Team.last.team_configuration, I get the value below, and what I need are only teams that have a specific setting.
#<TeamConfiguration:0x00007123456987> {
  :id => 8,
  :owner_id => 6,
  :team_type => "football",
  :settings => {
    "disable_coach_add" => false,
    "delink_players_at_18" => true
  },
  :type => "TeamConfiguration"
}
My thoughts have been something along these lines, but I keep getting undefined method 'settings' for team_configuration:Symbol
Team.where(:team_configuration.settings['delink_players_at_18'])
Would anyone know what I am doing wrong in this instance? I think the two levels of nesting away from the main model have been causing me some issues. Thanks in advance!

The problem is the way store_accessor works; look at what the documentation says:
Store gives you a thin wrapper around serialize for the purpose of
storing hashes in a single column. It's like a simple key/value store
baked into your record when you don't care about being able to query
that store outside the context of a single record.
https://api.rubyonrails.org/classes/ActiveRecord/Store.html
So a possible solution could be to search on that column directly, converting a hash of what you want into a string first.
Team.where(team_configuration: data.to_s)
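One caveat worth spelling out: `serialize` (which `store` wraps) writes the hash to the column as YAML by default, not as Ruby's `inspect` format, so a plain `.to_s` of the hash will not match the stored text. A minimal sketch of producing the serialized form for an exact-match query (column and model names follow the question; this approach stays fragile, since key order and formatting must match what Rails wrote byte for byte):

```ruby
require 'yaml'

# The settings hash from the question, serialized the way Rails stores it.
data = { 'disable_coach_add' => false, 'delink_players_at_18' => true }
serialized = data.to_yaml

# An exact string comparison against the serialized column:
# Team.where(team_configuration: serialized)
```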

If you're using a Postgres database and the TeamConfiguration#settings column is serialized as a jsonb column, you can get at this with Postgres json operators:
Team.joins(:team_configurations)
.where("team_configurations.settings @> '{\"delink_players_at_18\": true}'")
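Checking that the stored JSON contains a given key/value pair, which is what this answer's where clause is after, can be expressed in plain Ruby terms (a sketch using the stdlib JSON parser, with the sample data from the question):

```ruby
require 'json'

# The stored settings, as Postgres would see the jsonb column.
settings = JSON.parse('{"disable_coach_add": false, "delink_players_at_18": true}')

# The condition: every key/value pair must be present in the stored JSON.
condition = { 'delink_players_at_18' => true }
matches = condition.all? { |key, value| settings[key] == value }
# matches => true
```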

Related

Rails duplicate key error for mongodb in batch insert

On my Rails 4.2 app using MongoDB v5, I have some data in this array format which I have to insert into the database:
array_to_be_inserted = [
  {'unique_key' => '12234'},
  {'unique_key' => '3214'},
  {'unique_key' => '32142'}
]
SomeModel.create(array_to_be_inserted) # for inserting
In the database I already have, let's say, the '12234' unique key, so this throws an exception and the code stops, and the remaining data does not get inserted (i.e. the 3214 and 32142 keys will not get inserted even though they are not present in the database). Even if I do rescue Exception, the code continues but the insert still fails.
Is there any way to get around this that is instant?
I already tried to make an array of those unique keys and did SomeModel.in(array_of_unique_keys) and then filtered out the array so that the new array becomes this:
array_to_be_inserted = [
  {'unique_key' => '3214'},
  {'unique_key' => '32142'}
]
I use this for array filter:
array_to_be_inserted = array_to_be_inserted.select { |x| existing_data.none? { |y| x['unique_key'] == y['unique_key'] } }
But the problem is that the array filter takes time and memory too, and meanwhile some other user may store that unique key, so the code fails once again.
I need something instant that can be done in one single query. For example, in MySQL we can simply do INSERT IGNORE INTO; isn't there something like that which is fast?
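As an aside, the filtering step itself need not be quadratic even before the race condition is solved: building a Set of the existing keys once turns the per-element `none?` scan into a constant-time lookup (a sketch with the sample data from the question):

```ruby
require 'set'

existing_data = [{ 'unique_key' => '12234' }]  # already in the DB
array_to_be_inserted = [
  { 'unique_key' => '12234' },
  { 'unique_key' => '3214'  },
  { 'unique_key' => '32142' }
]

# Build the lookup table once instead of scanning existing_data per element.
existing_keys = existing_data.map { |d| d['unique_key'] }.to_set
filtered = array_to_be_inserted.reject { |d| existing_keys.include?(d['unique_key']) }
# filtered => [{'unique_key' => '3214'}, {'unique_key' => '32142'}]
```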
I have been facing a similar issue with a Rails application and a postgresql db. I need to insert millions of rows and if I try to check the existing primary keys my app crashes.
I am using the bulk_insert gem, which allows the option ignore: true. However, PostgreSQL syntax does not support the 'INSERT IGNORE' statement, so it was quite a useless option in my case.
destination_columns = [:title, :author]
# Ignore bad inserts in the batch
Book.bulk_insert(*destination_columns, ignore: true) do |worker|
  worker.add(...)
  worker.add(...)
  # ...
end
If you are using activerecord-import you can refer to this blog post:
Author.import(
  [:name, :key],
  rows_to_import_second,
  on_duplicate_key_update: [:name],
  validate: false
)
However, it seems that this gem also doesn't have an adapter for mongo (check the source code).
I guess that a combination of these two SO answers might fix your problem:
Insert many mongo - continue on error ruby / Ruby Mongo equivalent for mysql insert ignore
Mongoid ignores collection.insert if at least one duplicate exists
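The gist of those two linked answers, sketched with the Ruby mongo driver (the connection URL and collection name here are assumptions, not from the question): with ordered: false, MongoDB attempts every document and reports duplicate-key failures at the end instead of aborting at the first one, which is the closest equivalent to MySQL's INSERT IGNORE.

```ruby
require 'mongo'  # gem 'mongo'

client = Mongo::Client.new('mongodb://localhost:27017/mydb')  # hypothetical URL
collection = client[:some_models]                             # hypothetical name

docs = [
  { 'unique_key' => '12234' },  # duplicate: skipped with an error
  { 'unique_key' => '3214'  },
  { 'unique_key' => '32142' }
]

begin
  # ordered: false => attempt every document, collect duplicate-key errors
  collection.insert_many(docs, ordered: false)
rescue Mongo::Error::BulkWriteError => e
  # Duplicate-key errors surface here; the non-duplicate documents were still inserted.
end
```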

Rails 4: How to use Store (serialized stored hash) in Where?

I'm using Rails 4's Store feature. I've added a new stored attribute, namely "friends", with four accessors, first ... fourth.
The problem is how to utilize it in the "where" condition. When I use it as:
@persons = Person.where(friends.has_value?(@user.id))
I receive this error:
NameError in UsersController#myfrineds
undefined local variable or method `friends'
I tried some other different ways but still I get error. Could you please help me to solve it? Or please let me know if you have any better idea to implement it (storing a dynamic hash of key/values)?
As stated by Uzbekjon, store is not made for this kind of thing. Some workarounds to your problem:
Use a custom query (would be quite slow depending on table size so be careful):
@persons = Person.where('friends LIKE ? OR friends LIKE ? OR friends LIKE ? OR friends LIKE ?', "%first: #{@person.id}\n%", "%second: #{@person.id}\n%", "%third: #{@person.id}\n%", "%fourth: #{@person.id}\n%")
This assumes you used yaml for the serialization of friends attribute (it's the default). If you used json you'll have to change the query accordingly.
If you're using PostgreSQL you can use array attribute instead of store. Queries would have better timings since PostgreSQL supports this datatype.
Migration:
def change
  add_column :people, :friends, :text, array: true, default: []
  # if you want to add an index:
  add_index :people, :friends, using: 'gin'
end
Creation of records:
Person.create(..., friends: [friend_id_1, friend_id_2, friend_id_3, friend_id_4])
Query:
@persons = Person.where('? = ANY(friends)', @person.id)
You may also need to add to your Gemfile:
gem 'postgres_ext'
Hope it helps!
Short answer - you can't! ActiveRecord stores your "hash" as a string in a single column. The only way I can think of is to use the .where("friends LIKE :friend", friend: 'BFF') syntax. Don't forget to index your column, though.
It is mentioned in the docs as well:
It's like a simple key/value store baked into your record when you don't care about being able to query that store outside the context of a single record.

Using class methods in named_scope and rspec

I may have painted myself into a corner.
In some of my rails (2.3.18) named_scopes I've used class methods to retrieve known rows from the database - for example status values.
However, when I try to use these with rspec, I think I've got a problem because the fixtures (I'm using FactoryGirl) haven't loaded before the app gets loaded - so I get an error when it's parsing the named_scopes (I think).
For example:
named_scope :active_users, :conditions => [ 'status_id = ?', UserStatus.Active.id ]
When the user model is loading it gives an error to effect
app/models/user.rb:34: Called id for nil, which would mistakenly be 4
which is the named_scope line.
user_status.rb
def self.Active
UserStatus.find_by_name('active')
end
So I think I've got two questions:
Is this an abuse of named_scope and if so what would be a better way of writing it?
Is it possible to get rspec to load some key data into the database before it loads the application?
Thanks
Your named scope is written fine. You need to check the output of UserStatus.Active.id. As per naming conventions, a method named 'Active' in UserStatus is wrong; it should be in lowercase.
Also, I do not understand the use of id in UserStatus.Active.id. Can you post that method here?
UserStatus.Active must be giving you nil, so UserStatus.Active.id gives you this error (the "would mistakenly be 4" is because 4 is nil's object_id in Ruby 1.8). Make sure you are getting a record back from the Active method.
Managed to answer my own question and am putting it here in case anyone else has the same issue.
To ensure that the database is not accessed when the named_scope is parsed, I needed to wrap the :conditions in a lambda/proc as below:
named_scope :active_users, lambda {{ :conditions => [ 'status_id = ?', UserStatus.Active.id ] }}
This now allows the application to be loaded, and then the data required for the tests to be loaded into the database ahead of the test as usual.
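The fix works because the class body (and therefore a plain :conditions hash) is evaluated at load time, while a lambda body runs only when the scope is invoked. A minimal sketch of that difference, independent of Rails:

```ruby
# Simulates a lookup that would blow up at load time (e.g. the status row
# does not exist in the test database yet).
calls = []
lookup = lambda do
  calls << :queried
  { :conditions => ['status_id = ?', 1] }
end

# Merely defining the lambda runs nothing, so the app can finish loading:
before = calls.dup  # => []

# Only when the scope is actually used does the lookup execute:
conditions = lookup.call
# calls => [:queried]
```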

How to search for a record and then delete it

How to search for the ttoken record by mobile no. and then delete that record?
User.rb:
field :ttokens, type: Hash, :default => {} # Stored as hash over numbers
Normally the value of ttokens, in the rails console, are as follows:
ttokens: {"919839398393"=>{"atoken"=>"f704e803061e594150b09ad8akabfc6105ac85ab", "confirmed"=>true}, "91812798765"=>{"atoken"=>"255cb70926978b93eik67edb52fa23a163587b4b", "confirmed"=>true}}
I need a mongodb query to search for the ttoken record by mobile number and then delete that record. The DB used is MongoDB. Any guidance or help would be highly appreciated.
You need to use MongoDB 'dot notation' for the embedded element, which means the "key" must be a string type of notation. Also apply $exists to match where the key in the hash is present, and use the .unset() method from Mongoid:
User.where('_id'=> userId, 'ttokens.919839398393'=> { '$exists' => true }).unset(
'ttokens.919839398393'
)
This is effectively the $unset operator of MongoDB, which removes "keys" from the document by the path specified.
From the sample document this would match and remove the first key, leaving only the other.
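For reference, here is roughly the filter/update pair the Mongoid call boils down to, plus the server-side effect sketched on the sample document in plain Ruby (the `_id` value 42 is hypothetical):

```ruby
number = '919839398393'

# The documents the .unset call sends to MongoDB:
filter = { '_id' => 42, "ttokens.#{number}" => { '$exists' => true } }
update = { '$unset' => { "ttokens.#{number}" => '' } }

# Server-side, $unset removes that key, leaving the other entry intact:
ttokens = {
  '919839398393' => { 'atoken' => 'f704...', 'confirmed' => true },
  '91812798765'  => { 'atoken' => '255c...', 'confirmed' => true }
}
ttokens.delete(number)
# ttokens.keys => ['91812798765']
```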

Batch insertion in rails 3

I want to do a batch insert of a few thousand records into the database (Postgres in my case) from within my Rails app.
What would be the "Rails way" of doing it?
Something which is fast, and also the correct way of doing it.
I know I can create the SQL query by string concatenation of the attributes, but I want a better approach.
The ActiveRecord .create method supports bulk creation. The method emulates the feature if the DB doesn't support it, and uses the underlying DB engine if the feature is supported.
Just pass an array of options.
# Create an Array of new objects
User.create([{ :first_name => 'Jamie' }, { :first_name => 'Jeremy' }])
A block is supported, and it's the common way to set shared attributes.
# Creating an Array of new objects using a block, where the block is executed for each object:
User.create([{ :first_name => 'Jamie' }, { :first_name => 'Jeremy' }]) do |u|
  u.is_admin = false
end
I finally reached a solution after the two answers of @Simone Carletti and @Sumit Munot.
Until the Postgres driver supports bulk insertion for the ActiveRecord .create method, I would like to go with the activerecord-import gem. It does a bulk insert, and in a single insert statement at that.
books = []
10.times do |i|
  books << Book.new(:name => "book #{i}")
end
Book.import books
In Postgres it leads to a single insert statement.
Once the Postgres driver supports bulk insertion in a single insert statement for the ActiveRecord .create method, then @Simone Carletti's solution makes more sense :)
You can create a method in your Rails model and write your insert queries in it.
In Rails you can then run it using
rails runner MyModelName.my_method_name
This is the best approach that I used in my project.
Update:
I use the following in my project, but note it is not protected against SQL injection.
If you are not using user input in this query, it may work for you:
user_string = " ('a@ao.in','a'), ('b@ao.in','b')"
User.connection.insert("INSERT INTO users (email, name) VALUES" + user_string)
For Multiple records:
new_records = [
  {:column => 'value', :column2 => 'value'},
  {:column => 'value', :column2 => 'value'}
]
MyModel.create(new_records)
You can do it the fast way or the Rails way ;) The best way in my experience to import bulk data to Postgres is via CSV. What will take several minutes the Rails way will take several seconds using Postgres' native CSV import capability.
http://www.postgresql.org/docs/9.2/static/sql-copy.html
It even triggers database triggers and respects database constraints.
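If the data starts life in Ruby, the CSV can be generated in memory with the stdlib and streamed into COPY; a sketch (the table/column names and sample rows are hypothetical, and the commented part assumes the pg gem's copy_data API):

```ruby
require 'csv'

# Hypothetical sample data.
rows = [['Jamie', 'jamie@example.com'], ['Jeremy', 'jeremy@example.com']]

csv_data = CSV.generate do |csv|
  csv << %w[name email]            # header row, for COPY's CSV HEADER mode
  rows.each { |row| csv << row }
end

# Streaming it into Postgres with the pg gem (sketch, not run here):
#   conn.copy_data("COPY users (name, email) FROM STDIN CSV HEADER") do
#     conn.put_copy_data(csv_data)
#   end
```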
Edit (after your comment):
Gotcha. In that case you have correctly described your two options. I have been in the same situation before: I implemented it using the Rails '1000 save! calls' strategy because it was the simplest thing that worked, and then optimized it to the 'append a huge query string' strategy because it performed an order of magnitude better.
Of course, premature optimization is the root of all evil, so perhaps do it the simple, slow Rails way, and know that building a big query string is a perfectly legit optimization technique at the expense of maintainability. I feel your real question is 'is there a Railsy way that doesn't involve 1000s of queries?' - unfortunately, the answer to that is no.