I'm using SQLite with ActiveRecord and Rails (this is JRuby, so I'm actually using the jdbcsqlite adapter, in case that matters). Now, I'm trying to insert a row into the attention_seekers table, but only if there is no similar existing row. Accordingly:
unless AttentionSeeker.find(:first, :conditions => {:key_id => key.id, :locale_id => l.id})
  item = AttentionSeeker.new(:key_id => key.id, :locale_id => l.id)
  item.save
end
This is the generated output in the log:
CACHE (0.0ms) SELECT * FROM attention_seekers WHERE (attention_seekers.key_id = 318 AND attention_seekers.locale_id = 20)
AttentionSeeker Create (1.0ms) INSERT INTO attention_seekers (key_id, locale_id) VALUES(318, 20)
CACHE (0.0ms) SELECT * FROM attention_seekers WHERE (attention_seekers.key_id = 318 AND attention_seekers.locale_id = 20)
AttentionSeeker Create (2.0ms) INSERT INTO attention_seekers (key_id, locale_id) VALUES(318, 20)
As you can see, for some reason the find is being cached, even though I'm inserting elements which affect it. What am I doing wrong/how can I stop this behaviour?
I did some digging and came across this helpful blog post, with more information available here. My solution (using the validation that Mike Buckbee suggested - thanks!):
AttentionSeeker.uncached do
  item = AttentionSeeker.new(:key_id => key.id, :locale_id => l.id)
  item.save
end
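For reference, the existence check from the question can also live inside the uncached block, so the find itself bypasses the query cache. A sketch using the same key and l variables as above:
AttentionSeeker.uncached do
  # the SELECT below bypasses ActiveRecord's query cache
  unless AttentionSeeker.find(:first, :conditions => {:key_id => key.id, :locale_id => l.id})
    AttentionSeeker.create(:key_id => key.id, :locale_id => l.id)  # create = new + save
  end
end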
Rather than putting this code in your controller (which is where I'm guessing it is), you may want to consider using a validation, which I think would solve the problem:
class AttentionSeeker < ActiveRecord::Base
  validates_uniqueness_of :key_id, :scope => :locale_id
end
Please note the "scope" option in the validation rule.
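With that validation in place, a duplicate simply fails to save, so the explicit find beforehand is no longer needed. A sketch:
item = AttentionSeeker.new(:key_id => key.id, :locale_id => l.id)
item.save  # returns false if a row with this key_id/locale_id pair already exists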
Failing that, you could try wrapping the query in a transaction.
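As a rough, untested sketch of that idea:
AttentionSeeker.transaction do
  unless AttentionSeeker.find(:first, :conditions => {:key_id => key.id, :locale_id => l.id})
    AttentionSeeker.create(:key_id => key.id, :locale_id => l.id)
  end
end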
Failing that, and this seems incredibly janky, you could add a cache buster to the query itself. Something like:
buster = rand(Time.now.to_i)
attention_seeker = AttentionSeeker.find(:first, :conditions => ["#{buster} = #{buster}"])
That should give you a unique query every time through your loop.
Unique indices at the schema level are safer than validates_uniqueness_of.
See http://railswarts.blogspot.com/2007/11/validatesuniquenessof-is-broken-and.html
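For illustration, a migration along these lines adds such an index at the database level (the class name is made up):
class AddUniqueIndexToAttentionSeekers < ActiveRecord::Migration
  def self.up
    # the database itself now rejects duplicate key_id/locale_id pairs
    add_index :attention_seekers, [:key_id, :locale_id], :unique => true
  end

  def self.down
    remove_index :attention_seekers, :column => [:key_id, :locale_id]
  end
end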
Stephan
The model:
class Venue < ActiveRecord::Base
  has_many :free_unids, :class_name => "Unid",
           :conditions => ['id not in (?)',
                           (Spot.where('unid_id is not null').map(&:unid_id) + [-1])]
end
Accessing @venue.free_unids triggers an evaluation of the condition itself, as we can see in the log:
Unid Load (0.4ms) SELECT "unids".* FROM "unids" WHERE "unids"."venue_id" = 79 AND (id not in (4,8723,8889,-1)) ORDER BY id LIMIT 1
The problem is that the subquery (Spot.where('unid_id is not null') / (4,8723,8889,-1)) often does not reflect records inserted into Spot a few seconds earlier. Yet debugging at the line where the relation is accessed (pp Spot.where('unid_id is not null')) yields the correct set of records, including the new ones.
To me it looks as if the result of the subquery expression is cached, but I have to admit that I do not quite understand the logic behind the scenes here...
Is it possible to force the evaluation of the expression on every access? Or do we need another approach here?
I gave up on the association approach and went for a query method instead, since the :conditions array is only built once, when the class is loaded:
def free_unids
  return Unid.where(
    'venue_id = ? and id not in (?)',
    id,
    (Spot.where('unid_id is not null').map(&:unid_id) + [-1])).
    order('id')
end
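With the query method, the condition is rebuilt and executed on every call, so freshly inserted Spot rows show up:
@venue.free_unids  # runs a new query each time instead of reusing conditions baked in at class load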
See my comment to Rob's answer for details on why I chose this solution.
Using:
@venue.reload
before accessing the association should work.
I'm trying to optimise some N+1 queries in active record for the first time. There are 3 to kill - 2 went very easily with a .includes call, but I can't for the life of me figure out why the third is still calling a bunch of queries. Relevant code below - if anyone has any suggestions, I'd be really appreciative.
CONTROLLER:
@enquiries = Comment.includes(:children).faqs_for_project(@project)
MODEL:
def self.faqs_for_project(project)
  Comment.for_project_and_enquiries(project, project.enquiries).where(:published => true).order("created_at DESC")
end
(and the relevant scope)
scope :for_project_and_enquiries, lambda{|p, qs| where('(commentable_type = ? and commentable_id = ?) or (commentable_type = ? and commentable_id IN (?))', "Project", p.id, "Enquiry", qs.collect{|q| q.id})}
VIEW:
...
= render :partial => 'comments/comment', :collection => @enquiries
...
(and that offending line in the partial)
...
= 'Read by ' + pluralize(comment.acknowledgers.count, 'lead')
...
Two SQL queries are executed for each comment:
SQL (2.8ms) SELECT COUNT(*) FROM "users" INNER JOIN "acknowledgements" ON "users".id = "acknowledgements".user_id WHERE (("acknowledgements".feedback_type = 'Comment') AND ("acknowledgements".feedback_id = 177621))
CACHE (0.0ms) SELECT "users".* FROM "users" WHERE "users"."id" = 1295 LIMIT 1
I would have thought that appending (:user, :acknowledgements) to the controller's .includes would have solved the problem, but it doesn't seem to have any effect. If anyone has any suggestions on what I'm missing, I'd be really appreciative.
I believe you want to add an :acknowledgers_count column to your Comment table as a counter cache:
has_many :acknowledgers, ....., counter_cache: true
You will need to create a migration to add the :acknowledgers_count column to the comments table. Rails should take care of the rest.
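As a sketch (the migration class name is just illustrative):
class AddAcknowledgersCountToComments < ActiveRecord::Migration
  def self.up
    # counter cache columns should default to 0 so new comments start with a valid count
    add_column :comments, :acknowledgers_count, :integer, :default => 0
  end

  def self.down
    remove_column :comments, :acknowledgers_count
  end
end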
You can learn more about the ActiveRecord::CounterCache api here.
The count method in comment.acknowledgers.count is overloaded in ActiveRecord to first check if a counter cache column exists, and if it does, it returns that directly from the model (in this case the Comment model) without having to touch the database again.
Finally, there was very recently a great Railscast about a gem called Bullet that can help you identify these query issues and guide you toward a solution. It covers both counter caches and N+1 queries.
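If you want to try Bullet, the setup is roughly the following (option names taken from the gem's README; double-check them against the version you install):
# Gemfile
gem 'bullet', :group => :development

# config/environments/development.rb
config.after_initialize do
  Bullet.enable = true
  Bullet.alert = true          # JavaScript alert in the browser when an issue is detected
  Bullet.bullet_logger = true  # also write findings to log/bullet.log
end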
As @ismaelga pointed out in a comment to this answer, it's generally better practice to call .size instead of .count on a relation. Check out the source for size:
def size
  loaded? ? @records.length : count
end
If the relation is already loaded it will just call length on it, otherwise it will call count. It's an extra check to try and prevent the database from unnecessarily being queried.
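Note that .size only avoids the extra query when the association is already loaded or a counter cache exists. Applied to the partial above, the offending line would become something like:
= 'Read by ' + pluralize(comment.acknowledgers.size, 'lead')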
How do I select a single random record for each user, but order the array by each user's latest record?
If Foo uploads a new painting, I would like to select a single random record from Foo. This way a user who uploads 10 paintings won't monopolize all the space on the front page, but will still get a slot at the top of the page.
This is how I did it with Rails 2.x running on MySQL.
@paintings = Painting.all.reverse
first_paintings = []
@paintings.group_by(&:user_id).each do |user_id, paintings|
  first_paintings << paintings[rand(paintings.size)]
end
@paintings = (first_paintings + (Painting.all - first_paintings).reverse).paginate(:per_page => 9, :page => params[:page])
The example above generates a lot of SQL queries and is probably badly optimized. How would you pull this off with Rails 3.1 running on PostgreSQL? I have 7000 records.
Instead of @paintings = Painting.all.reverse, use @paintings = Painting.order("id desc").
If you really want to reverse the order of the paintings result set, I would set up a scope and then just use it. Something like:
class Painting < ActiveRecord::Base
  scope :reversed, order("id desc")
end
Then you can use Painting.reversed anywhere you need it.
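For example, reusing the pagination call from the question (this assumes will_paginate, as in the original code):
@paintings = Painting.reversed.paginate(:per_page => 9, :page => params[:page])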
You have definitely set up a belongs_to association in your Painting model, so I would do:
# painting.rb
default_scope order('id DESC')

# paintings_controller.rb
first_paintings = User.includes(:paintings).collect do |user|
  user.paintings.sample
end
@paintings = (first_paintings + Painting.where('id NOT IN (?)', first_paintings)).paginate(:per_page => 9, :page => params[:page])
I think this solution results in the fewest SQL queries, and is very readable. Not tested, but I hope you got the idea.
You could use the dynamic finders:
Painting.order("id desc").find_by_user_id!(user.id)
This assumes your paintings table contains a user_id column, or some other way to associate users with paintings, which it appears you have covered since you're calling user_id in your initial code. This isn't random, but using find_all_by_user_id instead returns an array, so you could still call .reverse on it if you wanted, or pick a random painting from it.
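If you still want the random element, a rough, untested sketch combining the find_all_by_user_id variant with sample:
paintings = Painting.find_all_by_user_id(user.id)
random_painting = paintings.sample  # picks one of that user's paintings at random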
I'm getting this error when I deploy my app on Heroku:
Started GET "/collections/transect/search?utf8=%E2%9C%93&search%5Btagged_with%5D=village&commit=Search" for 98.201.59.6 at 2011-03-27 17:02:12 -0700
ActionView::Template::Error (PGError: ERROR: column "photos.custom_title" must appear in the GROUP BY clause or be used in an aggregate function
: SELECT "photos".* FROM "photos" INNER JOIN "taggings" ON "photos"."id" = "taggings"."photo_id" INNER JOIN "tags" ON "tags"."id" = "taggings"."tag_id" WHERE "tags"."name" IN ('village') AND ("photos".collection_id = 1) GROUP BY photos.id LIMIT 20 OFFSET 0):
17:
18: - @bodyclass = 'dark'
19: #search_view.photo_tiles
20: = render :partial => 'collections/photos/alt_tiles', :collection => @photos, :as => :photo
app/views/collections/search.html.haml:20:in `_app_views_collections_search_html_haml__2343730670144375006_16241280__2249843891577483539'
I saw these similar questions (1,2).
The problem is, nothing in this view is asking for the custom_title attribute, nor am I executing a query with a "group_by" clause.
Here's the partial that seems to trigger the error:
- ((photo_counter+1) % 5 == 0) ? @class = 'last' : @class = ''
.photo{ :class => @class }
  .alt_tile
    = link_to( image_tag(photo.file.url(:tile)), collection_photo_path(@collection,photo), :class => 'img_container' )
    .location= photo.location(:min)
    .tags= photo.tag_array.join(' | ')
Here's the collections#search action which is what raised the error:
def search
  @curator_toolbar = true
  @collection = Collection.find(params[:id])
  @search = @collection.photos.search(params[:search])
  @photos = @search.page(params[:page]).per(20)
end
So it looks like maybe this is a plugin issue? I'm using MetaSearch for search functionality and Kaminari for pagination. Does anyone have any ideas or suggestions as to what would cause this specifically and how I can possibly fix it?
--EDIT--
Ok, I seem to have found the real problem:
Using MetaSearch with my keyword tags model, I created a search method that looks like this:
def self.tagged_with( string )
  array = string.split(',').map{ |s| s.lstrip }
  joins(:tags).where('tags.name' => array ).group('photos.id')
end
Now, I was given a lot of help in creating this method -- as I mentioned before I'm a total SQL moron.
This method works on SQLite but not on PostgreSQL, because whenever keywords are included in a search it triggers the GROUP BY problem.
So, this question seems to indicate that I need to put every column of my photo model in the "group" argument or PostgreSQL will break.
That horrifies me for several reasons:
My photo model is pretty complex and has a ton of fields.
My app is still in development and the photo model changes more than any other.
I don't want to have my code breaking every time someone touches the photo model in the future if they forget to add the columns to the group statement on the tag searching argument.
So, can anyone help me understand how to rewrite this method so that it won't break PostgreSQL -- and ideally so that I won't have to include a list of all the fields that belong to this model in the solution, or at least not a manually maintained list?
So, it turns out I could solve this problem by replacing "group" with "select" in my tagged_with method.
def self.tagged_with( string )
  array = string.split(',').map{ |s| s.lstrip }
  select('distinct photos.*').joins(:tags).where('tags.name' => array )
end
Problem solved! See this article for a great explanation as to why this is a better idea anyway. (Sorry, web site was removed later on and I don't recall what it said.) Also, thanks to Mark Westling for his answer on a spinoff question that solved my problem.
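For what it's worth, the rewritten method still returns a chainable relation; the tag names below are made up for illustration:
photos = Photo.tagged_with('village, beach')
photos.to_sql  # a SELECT distinct photos.* ... with no GROUP BY, which PostgreSQL accepts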
I want to grab the most recent entry from a table. If I were just using SQL, I could do:
Select top 1 * from table ORDER BY EntryDate DESC
I'd like to know if there is a good active record way of doing this.
I could do something like:
table.find(:order => 'EntryDate DESC').first
But it seems like that would grab the entire result set, and then use ruby to select the first result. I'd like ActiveRecord to create sql that only brings across one result.
You need something like:
Model.first(:order => 'EntryDate DESC')
which is shorthand for
Model.find(:first, :order => 'EntryDate DESC')
Take a look at the documentation for first and find for details.
The Rails documentation seems to be pretty subjective in this instance. Note that .first is the same as find(:first, blah...)
From: http://api.rubyonrails.org/classes/ActiveRecord/Base.html#M002263
"Find first - This will return the first record matched by the options used. These options can either be specific conditions or merely an order. If no record can be matched, nil is returned. Use Model.find(:first, *args) or its shortcut Model.first(*args)."
Digging into the ActiveRecord code, at line 1533 of base.rb (as of 9/5/2009), we find:
def find_initial(options)
  options.update(:limit => 1)
  find_every(options).first
end
This calls find_every which has the following definition:
def find_every(options)
  include_associations = merge_includes(scope(:find, :include), options[:include])

  if include_associations.any? && references_eager_loaded_tables?(options)
    records = find_with_associations(options)
  else
    records = find_by_sql(construct_finder_sql(options))
    if include_associations.any?
      preload_associations(records, include_associations)
    end
  end

  records.each { |record| record.readonly! } if options[:readonly]

  records
end
Since it's doing a records.each, I'm not sure if the :limit is just limiting how many records it's returning after the query is run, but it sure looks that way (without digging any further on my own). Seems you should probably just use raw SQL if you're worried about the performance hit on this.
You could just use find_by_sql: http://api.rubyonrails.org/classes/ActiveRecord/Base.html#M002267
table.find_by_sql "Select top 1 * from table ORDER BY EntryDate DESC"