Can't convert BSON::ObjectId into String - ruby-on-rails

I have a Rails app that uses Mongoid for MongoDB data. When I create Mongo records through web forms, they have IDs of type String. When I import records into Mongo using mongoimport, they have IDs of type BSON::ObjectId.
The Rails app expects the Mongo record IDs to be strings, so when I import the data my app fails: when it looks up the records, it complains that it can't convert type BSON::ObjectId to String.
I'm confused on a number of levels here. BSON::ObjectId is the default type for IDs in Mongo, so I don't understand why the records created through Rails and Mongoid have String IDs. I don't see anywhere that Mongoid specifies the _id field should be a String. Does anybody have any clues?

What version of Mongoid are you using? From this post, it looks like Mongoid used String _ids until about a year ago but now consistently uses the BSON::ObjectId type.
mongodb: converting object ID's to BSON::ObjectId
It references this gist for converting old documents with String _ids to BSON::ObjectId _ids.
When Mongoid inserts a document into a collection, it expects and uses the BSON::ObjectId type. This is an example using the Rails console:
post = Post.new
=> #<Post ...>
post.save
=> true
post._id
=> BSON::ObjectId('4ff5bcb39ef1728393000002')
post._id.class
=> BSON::ObjectId
Mongoid appears to know to look up _id's using the BSON::ObjectId type:
Post.where(:_id => "4ff5bcb39ef1728393000002").count
=> 1
Post.where(:_id => BSON::ObjectId("4ff5bcb39ef1728393000002") ).count
=> 1
Are you, by any chance, manually setting the _ids? If so, perhaps you're not setting them as BSON::ObjectId types.

About your last paragraph: MongoDB's specification expects a 12-byte value for the ObjectId.
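As a plain-Ruby sketch of that layout (no MongoDB driver needed; the id below is the example value from the console session elsewhere on this page): a BSON::ObjectId is 12 bytes, conventionally rendered as 24 hex characters, and its first 4 bytes encode a Unix timestamp.

```ruby
# 24 hex characters = 12 bytes; the first 4 bytes are a big-endian
# Unix timestamp, which is why ObjectIds sort roughly by creation time.
oid = "4ff5bcb39ef1728393000002"

raw = [oid].pack("H*")         # hex string -> 12 raw bytes
puts raw.bytesize              # => 12

seconds = oid[0, 8].to_i(16)   # first 4 bytes as an integer
puts Time.at(seconds).utc      # creation time embedded in the id (mid-2012)
```

This is why a 24-character hex String and a BSON::ObjectId carry the same information but are different types as far as queries are concerned.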

So I figured it out. The issue is the version of Mongoid used by my application: version 1.9.5 uses String as the default type for the _id field, and that's the version I'm on.
I thought about updating Mongoid, but I was afraid the old String IDs would somehow break the application.
The answer was to import the records through the Rails app itself, so that the old version of Mongoid would be in charge of inserting them. I created a rake task that parses the CSV file and inserts its records from within my app environment.
require 'csv'
require 'iconv'

namespace :deadline do
  desc "Import data from CSV file"
  task :import => :environment do
    CSV.parse(File.open("/tmp/deadlines.csv").read).map { |row|
      app_deadline = OrgSpecific::ApplicationDeadline.create!(
        :name => row[2] + " " + row[3],
        :start_date => Date.strptime('1/1/2012', '%m/%d/%Y'),
        :deadline_date => Date.strptime(row[11], '%m/%d/%Y'),
        :term => row[2],
        :year => row[3],
        :comment => Iconv.conv("UTF8", "LATIN1", row[5]) + " : " + Iconv.conv("UTF8", "LATIN1", row[6])
      )
    }
  end
end
And voilà! All the data from my CSV file has been imported through my Rails environment, meaning my Mongo records have an _id of type String. Thanks for the help, guys!

Related

Find all items where match (or like) a text in a Model within Mongoid in Rails 4

I'm working on a project with Rails 4 and MongoDB as the back-end, helped by the wonderful gem 'Mongoid', and I want to find all items of my model 'Item' matching a search term, SQL-LIKE style.
My model looks like:
class Item
  include Mongoid::Document
  field :name, :type => String
  field :importe, :type => BigDecimal
  field :tipo, :type => String
end
Trying to do this in the controller, but it doesn't work correctly:
Item.where(name: Regexp.new(".*" + params[:keywords] + ".*"))
(where params[:keywords] is the search term) because it doesn't return anything even when there are items with a matching name value.
How do I make this query?
In Ruby on Rails, pass a real Regexp object to where (a quoted "/#{params[:keywords]}/i" is just a String, so it would be matched literally, slashes and all):
condition = /#{params[:keywords]}/i
Item.where(:name => condition)
Or inline:
Item.where(:name => /#{params[:keywords]}/i)
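One caveat worth adding (a plain-Ruby sketch, not part of the original answer): interpolating user input straight into a regex means metacharacters in params[:keywords] can change the pattern or even make it invalid, so escaping the input first is safer.

```ruby
keywords = "(c+"                        # hostile or just odd user input
# Interpolating it raw would raise RegexpError (unmatched parenthesis):
#   Regexp.new(keywords)
safe = /#{Regexp.escape(keywords)}/i    # metacharacters treated literally
puts safe.match?("Intro to (C+ tricks)")  # => true
```

With Mongoid this escaped regex can be passed to where(:name => safe) exactly like the unescaped one.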

Upload and import data from CSV file in Rails application?

I am new to Ruby on Rails.
I want to write a module for uploading a CSV file in my application. Also, I want to import the data from that file to one of my tables in my Rails application.
In my application there is a model named "Book" which has four fields: name, author, publication_date and publisher_name.
I want to give user the ability to upload a CSV file with the format:
first column for name
second column for author
third for publication_date
fourth for publisher_name.
Also I want to add the validation so that upload will happen only when the file is of the expected format.
You need the FasterCSV gem to do that (on Ruby 1.9+ it is the standard csv library). First install it, then in your model do something like this:
require 'csv'

validates_format_of :book, :with => /^.+\.(csv)$/,
  :message => 'A .csv file is required.'

records = CSV.foreach('yourpath/tocsvfile/filename.csv').map do |row|
  Book.create!(
    :name => row[0],
    :author => row[1],
    :publication_date => row[2],
    :publisher_name => row[3]
  )
end
For more information about CSV, you can find it here.
Hope it helps!
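A hedged note on the validation regex above: in Ruby, ^ and $ match line boundaries rather than string boundaries, so a multi-line value can slip past the format check; \A and \z anchor the whole string (and newer Rails versions refuse multiline-unsafe regexps in format validations unless you pass multiline: true).

```ruby
# ^/$ match at newlines, so a crafted multi-line value passes the check:
puts /^.+\.(csv)$/.match?("evil\nbooks.csv")   # => true (unsafe)

# \A/\z anchor the entire string:
puts /\A.+\.csv\z/.match?("evil\nbooks.csv")   # => false
puts /\A.+\.csv\z/.match?("books.csv")         # => true
```

So /\A.+\.csv\z/ is the safer pattern for the filename validation.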

Advice on migrating from MongoMapper to Mongoid?

It seems like Mongoid is now the superior ORM for Mongo based on performance and development activity. Unfortunately, we're on MongoMapper and need to migrate.
Are there any concerns or stumbling blocks we should be aware of? We have found a few outdated articles on Google and tried posting on the Mongoid Google Groups (though we were prohibited), but would love thoughts from SO members who have done this in the past.
We're on Rails 3.2.12.
Thanks!
Both are great MongoDB libraries for Ruby. But if you want to switch, here are some notes:
Migrating MongoMapper ORM to Mongoid ORM - Notes
- Configure the database connection.
- Replace the configuration YAML file (including replica configuration).
- Configure Mongoid-specific options, e.g. raise_not_found_error: false if you don't want an error every time a query returns nothing.
- Change all model definitions: include MongoMapper::Document becomes include Mongoid::Document.
- Change the format of all field definitions.
- In Mongoid, you must specify the timestamps explicitly: include Mongoid::Timestamps.
- Change validations. e.g. :in => ARRAY becomes validates :name, presence: true, inclusion: { in: ARRAY }.
- Change indexes.
- Change the order_by format. e.g. MM: Model.all(:order => 'name'). Mongoid: Model.order_by('name ASC').
- Error is a keyword in Mongoid, so if you have a model named Error, you should rename it.
- The pagination format is different and uses another gem.
- The primary key entry in MM is id; in Mongoid it's _id. If you have other code relying on .id in the object's JSON, you can override the as_json method in your model to create the JSON structure you want.
- In MM, Model.fields(:id, :name) limits the fields returned from the database to those supplied. In Mongoid it's Model.only(:id, :name).
- Some query changes:
  - Selecting objects by array: MM accepts both Model.where(:attr.in => [ ]) and Model.where(:attr => [ ]). Mongoid accepts only Model.where(:attr.in => [ ]).
  - MM's map is equivalent to Mongoid's pluck: Model.map(&:name) becomes Model.pluck(:name).
  - Mongoid doesn't support find with nil. e.g. with value = nil, Model.find(value) raises "Calling Document.find with nil is invalid", so in Mongoid you should do Model.find(value || "").
  - In MM: Model.find_or_initialize_by_name("BOB"). In Mongoid: Model.find_or_initialize_by(name: "BOB").
  - MM supports both Model.where(:name => 'BOB').first and Model.first(:name => 'BOB'). Mongoid supports only the first form.
  - In MM, to update multiple objects: Model.set({conditions}, attr_to_update). In Mongoid: Model.where(conditions).update_all(attr_to_update).
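To make the model- and field-definition items concrete, here is a hedged before/after sketch (class and field names are invented for illustration, and each half assumes the respective gem is loaded):

```ruby
# MongoMapper
class User
  include MongoMapper::Document
  key :name, String
  key :age,  Integer
  timestamps!
end

# Mongoid equivalent
class User
  include Mongoid::Document
  include Mongoid::Timestamps   # timestamps must be mixed in explicitly
  field :name, type: String
  field :age,  type: Integer
end
```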

How can I load DataMapper objects in an SQLite database with data from a spreadsheet?

I have a Ruby on Rails app using DataMapper; the database is SQLite and the app is hosted on Heroku. I would like to load the database with data from a spreadsheet, but I don't know the most efficient way. Please help!
As an example, let say I have a User model with fields:
Name
Age
Birthday
Hometown
I had a similar problem importing external data into DataMapper. I did a CSV dump of the data from the external database, then wrote an import that reads the CSV and creates a new record for each row.
class Staff
  include DataMapper::Resource
  property :id, String, :key => true
  property :full_name, String
  property :email, String
  has n, :stages
end
Then:
CSV.parse(staff) do |row|
  @staff = Staff.create(
    :id => row[1],
    :full_name => row[0],
    :email => row[0].downcase.gsub(' ', '.')
  )
  @staff.save
end
Perhaps an approach like this would be suitable?
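A hedged note on deriving the email that way: if you reach for the bang version gsub!, it returns nil when no substitution occurs (e.g. a single-word name), so the non-bang gsub is the safer choice here.

```ruby
# gsub! returns nil if nothing was replaced:
puts "Ann Lee".downcase.gsub!(' ', '.').inspect   # => "ann.lee"
puts "Madonna".downcase.gsub!(' ', '.').inspect   # => nil (no space to replace!)

# gsub always returns a string:
puts "Madonna".downcase.gsub(' ', '.')            # => madonna
```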

Getting types of the attributes in an ActiveRecord object

I would like to know if it is possible to get the types (as known by AR - e.g. in the migration script and database) programmatically (I know the data exists in there somewhere).
For example, I can deal with all the attribute names:
ar.attribute_names.each { |name| puts name }
.attributes just returns a mapping of the names to their current values (e.g. no type info if the field isn't set).
Some places where I have seen the type information:
in script/console, type the name of an AR entity:
>> Driver
=> Driver(id: integer, name: string, created_at: datetime, updated_at: datetime)
So clearly it knows the types. There is also .column_for_attribute, which takes an attribute name and returns a column object - the type is buried in the underlying database column object, but that doesn't appear to be a clean way to get it.
I would also be interested in if there is a way that is friendly for the new "ActiveModel" that is coming (rails3) and is decoupled from database specifics (but perhaps type info will not be part of it, I can't seem to find out if it is).
Thanks.
In Rails 3, for your model "Driver", you want Driver.columns_hash.
Driver.columns_hash["name"].type #returns :string
If you want to iterate through them, you'd do something like this:
Driver.columns_hash.each {|k,v| puts "#{k} => #{v.type}"}
which will output the following:
id => integer
name => string
created_at => datetime
updated_at => datetime
In Rails 5, you can do this independently of the Database. That's important if you use the new Attributes API to define (additional) attributes.
Getting all attributes from a model class:
pry> User.attribute_names
=> ["id",
"firstname",
"lastname",
"created_at",
"updated_at",
"email",...
Getting the type:
pry> User.type_for_attribute('email')
=> #<ActiveRecord::ConnectionAdapters::AbstractMysqlAdapter::MysqlString:0x007ffbab107698
 @limit=255,
 @precision=nil,
 @scale=nil>
That's sometimes more information than needed. There's a convenience function that maps all these types down to a core set (:integer, :string etc.)
> User.type_for_attribute('email').type
=> :string
You can also get all that data in one call with attribute_types, which returns a 'name' => type hash.
You can access the types of the columns by doing this:
#script/console
Driver.columns.each {|c| puts c.type}
If you want to get a list of all column types in a particular Model, you could do:
Driver.columns.map(&:type) #gets them all
Driver.columns.map(&:type).uniq #gets the unique ones
In Rails 5 this will give you a list of all field names along with their data types:
Model_Name.attribute_names.each do |k|
  puts "#{k} = #{Model_Name.type_for_attribute(k).type}"
end
Rails 5+ (works with virtual attributes as well):
Model.attribute_types['some_attribute'].type
This snippet will give you all the attributes of a model with the associated database data types in a hash. Just replace Post with your Active Record Model.
Post.attribute_names.map { |n| [n.to_sym, Post.type_for_attribute(n).type] }.to_h
Will return a hash like this.
=> {:id=>:integer, :title=>:string, :body=>:text, :created_at=>:datetime, :updated_at=>:datetime, :topic_id=>:integer, :user_id=>:integer}
Assuming Foobar is your Active Record model, you can also do:
attributes = Foobar.attribute_names.each_with_object({}) do |attribute_name, hash|
  hash[attribute_name.to_sym] = Foobar.type_for_attribute(attribute_name).type
end
Works on Rails 4 too
In Rails 4 you would use Model.column_types.
