Having trouble with Rails query, ambiguous column name - ruby-on-rails

So I'm having a little difficulty using Rails to query some data.
I have two models and I'm attempting to join one onto the other. My issue arises when using the group method: both models have a column with the same name, and this causes an ambiguous column name error. My query is:
Photo.joins(:votes).group(:photo_id, :image, :title, :bytes, :user_id, :public_id).order("count_all desc").limit(10).count
How can I specify which model the grouped attributes should come from?

You can specify the table name like this:
Photo.joins(:votes)
.group(:photo_id, :image, :title, :bytes, 'photos.user_id', :public_id)
.order("count_all desc")
.limit(10)
.count
Assuming user_id is the ambiguous column name and photos is the actual table name.
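For completeness, here is a sketch with every grouped column qualified by a table name (assuming photo_id lives on votes and the remaining columns on photos), which avoids ambiguity even if more shared column names appear later:
Photo.joins(:votes)
     .group('votes.photo_id', 'photos.image', 'photos.title', 'photos.bytes',
            'photos.user_id', 'photos.public_id')
     .order('count_all DESC')
     .limit(10)
     .count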

Related

Cassandra Rails not able to do more than one column filter

Rails with Cassandra does not work when filtering on more than one column.
My model:
class Template
  include Cequel::Record
  key :id, :int, index: true
  column :u_id, :uuid, :auto => true
  column :user_id, :int, index: true
  column :code_type, :text, index: true
  column :name, :text
  column :code_text, :text
  timestamps
end
Template.where(:code_type => "job_html_template", :user_id => 1).allow_filtering!
Cequel::Record::IllegalQuery: Can't scope by more than one indexed column in the same query
I am using cequel (3.0.0).
Can someone please tell me what's wrong with this?
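The error message itself describes the constraint: a single Cequel query can only be scoped by one indexed column. A hedged workaround sketch (only sensible if one index already narrows the result set to something small) is to scope by a single indexed column and filter the second attribute in Ruby:
# Query on one indexed column, then filter the rest client-side
Template.where(:user_id => 1).to_a.select { |t| t.code_type == "job_html_template" }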

Hash with 'nil' value when using `group` and `count` in query

In Rails, I get a hash using includes:
<% @teste = UserProfile.includes(:mobile_models).group(:name).count %>
The problem is that includes generates a hash like the following:
{nil=>4774, "2610"=>7, "2626"=>4, "2630"=>5, "2760"=>4, "3250"=>3, "355"=>5, "3I607 BlackJack"=>5, "5230"=>13, "5235"=>4, "5310"=>5, "5500"=>5, "5800 Xpress Music"=>16, "6020"=>4, "6120c"=>4, "6131"=>4, "7210"=>5, "A1200r"=>5, "A1900"=>5, "AIKO 70"=>5, "B3410W Ch@t"=>4, "beTouch E100"=>4, "BlackBerry 8320 (Curve)"=>10,....
In my database I can't find any mobile record whose name is nil, so I can't see what might be producing this nil key.
The other goal is to sum all values, like this:
<%= sum = @teste.values.sum %>
But when I do this, the count under the nil key is added too.
---Update
models/UserProfile
class UserProfile < ActiveRecord::Base
  has_and_belongs_to_many :mobile_models, join_table: 'user_profiles_mobile_models', order: 'name'
end
models/MobileModel
class MobileModel < ActiveRecord::Base
  belongs_to :mobile_maker
end
Because you are grouping by :name, some of the MobileModel or UserProfile objects have the name attribute set to nil. You will need to check both, since without seeing the model definitions I can't tell which model has the :name property you are grouping on. If you can share the model code, I can be more explicit.
If both models have a name attribute, you can be more explicit in your group statement:
UserProfile.includes(:mobile_models).group('mobile_models.name')
or...
UserProfile.includes(:mobile_models).group('user_profiles.name')
Also, if a number of your users do not have any mobile_models to include, I believe they will get dumped into the nil grouping as well.
You are getting that hash because of group(:name).
That means you have 4774 records whose name is nil.
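If the goal is the total without the nil bucket, one straightforward option (a small sketch using the same @teste hash) is to drop that key before summing:
# Exclude the nil group before summing
sum = @teste.reject { |name, _count| name.nil? }.values.sum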

Assemble a complex SQL query using Activerecord::Relation and/or Arel with 'joins', 'as', and 'like'

I am building a Rails 3.2 app using DataTables (http://datatables.net) with client-side paging and filtering on most HTML tables and server-side paging and filtering on some other HTML tables. I want to do per-column filtering, which is super easy for the client-side tables, but I think I need to construct an SQL query for the database to do per-column filtering for the server-side tables. I closely followed the example from RailsCast #340 on DataTables and got that working.
The challenge is doing sorting and filtering on a column that is really a foreign_key relation to another table. I don't want to sort and filter on the actual contents of the foreign_key values. I want to sort and filter on the '.to_s' values displayed for the linked objects (which is the semantics of using the client-side sort and filter feature). Here is an example:
class Address < ActiveRecord::Base
  attr_accessible :city, :line1, :line2, :state, :zip
  has_many :people

  def to_s
    [line1, line2, city, state, zip].join(' ')
  end
end

class Person < ActiveRecord::Base
  attr_accessible :address, :name
  belongs_to :address
end
so the view displaying the people list has two columns, for name and address
<td><%= p.name %></td>
<td><%= p.address %></td>
and what appears in the index table is
John Smith | 1234 Main St Anywhere City AA 12345
so with client-side sorting and filtering I can search for 'Anywhere' in the address column and get all the rows with that term in the address field. Doing the same thing on the server side seems much more difficult. I think I'm trying to assemble an SQL query that looks something like:
select * from people
join address on people.address_id = address.id
where concat(address.line1,
address.line2,
address.city,
address.state,
address.zip) as spec_address like query_term
order by spec_address
(This is not necessarily correct SQL code.)
I've looked at both the ActiveRecord Query Rails guide and anything I could find on Arel without success.
You can do this with a scope on Address which is then merged into the Person query.
class Address < ActiveRecord::Base
  scope :anywhere, lambda { |search|
    attrs = [:line1, :line2, :city, :state, :zip]
    where(attrs.map { |attr| "addresses.#{attr} LIKE :search" }.join(' OR '), search: "#{search}%").
      order(*attrs)
  }
end
Person.joins(:address).merge(Address.anywhere(query_term))
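If you would rather filter and sort on the concatenated address as a single string, closer to the original SQL idea, a rough sketch (assuming a database that supports CONCAT_WS, e.g. MySQL or PostgreSQL 9.1+) could look like this:
# Match and order on the whole concatenated address (untested sketch)
Person.joins(:address)
      .where("CONCAT_WS(' ', addresses.line1, addresses.line2, addresses.city, addresses.state, addresses.zip) LIKE :q",
             q: "%#{query_term}%")
      .order("CONCAT_WS(' ', addresses.line1, addresses.line2, addresses.city, addresses.state, addresses.zip)")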

mongoid batch update

I'd like to update a massive set of documents on an hourly basis.
Here's the fairly simple model:
class Article
  include Mongoid::Document
  field :article_nr, :type => Integer
  field :vendor_nr, :type => Integer
  field :description, :type => String
  field :ean
  field :stock
  field :ordered
  field :eta
end
So every hour I get a fresh stock list, where :stock, :ordered and :eta "might" have changed,
and I need to update them all.
Edit:
the stock list contains just
:article_nr, :stock, :ordered, :eta
which I parse into a hash.
In SQL I would have taken the route of foreign-keying the article_nr to a "stock" table, dropping the whole stock table, and running a "collection.insert" or something similar.
But that approach does not seem to work with Mongoid.
Any hints? I can't get my head around collection.update,
and changing the foreign key on belongs_to and has_one does not seem to work
(I tried it, but then Article.first.stock was nil).
But there has to be a faster way than iterating over the stocklist array of hashes and doing something like:
Article.where(:article_nr => stocklist['article_nr']).update(stock: stocklist['stock'], eta: stocklist['eta'], ordered: stocklist['ordered'])
UPDATING
You can atomically update multiple documents in the database via a criteria using Criteria#update_all. This will perform an atomic $set on all the attributes passed to the method.
# Update all people with last name Oldman with new first name.
Person.where(last_name: "Oldman").update_all(
first_name: "Pappa Gary"
)
Now I understand a bit more. You can try to do something like the following, assuming that your article_nr is unique.
class Article
  include Mongoid::Document
  field :article_nr
  field :name
  key :article_nr
  has_many :stocks
end

class Stock
  include Mongoid::Document
  field :article_id
  field :eta
  field :ordered
  belongs_to :article
end
Then when you create a stock:
Stock.create(:article_id => "123", :eta => "200")
it will automatically get assigned to the article with article_nr => "123".
So you can always fetch the latest stock:
my_article.stocks.last
If you want to be more precise, you can add a field :article_nr to Stock and then, in an :after_save callback, set new_stock.article_id = new_stock.article_nr (a rough sketch of this follows below).
This way you don't have to do any updates; just create new stocks and they will always be attached to the correct Article on insert, and you will be able to get the latest one.
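A possible sketch of that idea (assuming article_nr is stored on Stock itself; a before_save callback is used here rather than after_save so the assignment is persisted with the document):
class Stock
  include Mongoid::Document
  field :article_nr, :type => Integer
  field :eta
  field :ordered
  belongs_to :article

  # Copy the business key into the association key before the document is
  # saved, so each new stock attaches to the matching Article on insert.
  before_save { self.article_id = article_nr }
end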
If you can extract just the stock information into a separate collection (perhaps with a has_one relationship in your Article), then you can use mongoimport with the --upsertFields option, using article_nr as your upsertField. See http://www.mongodb.org/display/DOCS/Import+Export+Tools.
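For illustration, a hedged sketch of such an import (the database, collection, and file names here are assumptions, not from the question):
# Hypothetical invocation: upsert the hourly CSV stock list, matching existing
# documents on article_nr.
mongoimport --db myapp_production --collection articles \
  --type csv --headerline --file stocklist.csv \
  --upsertFields article_nr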

How can I get a unique :group of a virtual attribute in rails?

I have several similar models ContactEmail, ContactLetter, etcetera.
Each one belongs_to a Contact
Each contact belongs_to a Company
So, what I did was create a virtual attribute for ContactEmail:
def company_name
  contact = Contact.find_by_id(self.contact_id)
  return contact.company_name
end
Question: How can I get an easy list of all company_name values (without duplicates) if I have a set of ContactEmail objects (from a find(:all) method, for example)?
When I try to do a search on ContactEmail.company_name using the statistics gem, for example, I get an error saying that company_name is not a column for ContactEmail.
Assuming your ContactEmail set is in @contact_emails (untested):
@contact_emails.collect { |contact_email| contact_email.company_name }.uniq
You don't need the virtual attribute for this purpose though. ActiveRecord sets up the relationship automatically based on the foreign key, so you could take the company_name method out of the ContactEmail model and do:
@contact_emails.collect { |contact_email| contact_email.contact.company_name }.uniq
Performance could be a consideration for large sets, so you might need to use a more sophisticated SQL query if that's an issue.
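As one rough sketch of that (Rails 2/3-era finder syntax, assuming company_name is a column on the contacts table), you can push the de-duplication into the database:
# De-duplicate in SQL instead of Ruby
ContactEmail.find(:all,
                  :joins  => :contact,
                  :select => 'DISTINCT contacts.company_name').map(&:company_name)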
EDIT to answer your 2nd question
If company_name is a column, you can do:
ContactEmail.count(:all, :joins => :contact, :group => 'contacts.company_name')
On a virtual attribute I think you'd have to retrieve the whole set and use Ruby (untested):
ContactEmail.find(:all, :joins => :contact, :select => 'contacts.company_name').
  group_by(&:company_name).
  inject({}) { |hash, result_set| hash.merge(result_set.first => result_set.last.count) }
but that's not very kind to the next person assigned to maintain your system -- so you're better off working out the query syntax for the .count version and referring to the column itself.
