I am using mongoid on a rails project.
I have an API call that fetches client records in JSON format (array of hashes).
users = api.get_users # Returns JSON
To leverage Mongo's search, sort, and pagination, I'd like to store the records I get through API in the database.
Of course I could loop over every record in the JSON and call something like User.create(user), but I would like to import all the records at once and create each record in the database. Perhaps using https://docs.mongodb.com/manual/reference/program/mongoimport ?
Any suggestions?
You're right -- because your data is in JSON format, mongoimport is the tool you want to use. Once you've imported your data, you can set up Mongoid document schemas to match the data you've imported.
Here's a helpful mongoimport tutorial if you want to try it on some sample data.
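As a sketch of that route (assuming api.get_users returns an array of user hashes; the file name and the mydb/users database and collection names are placeholders):

```ruby
require 'json'

# Hypothetical API response: an array of user hashes
users = [
  { "name" => "Ada",  "email" => "ada@example.com" },
  { "name" => "Alan", "email" => "alan@example.com" }
]

# Write the array to a file that mongoimport can consume
File.write('users.json', JSON.generate(users))

# Then, from the shell, import it in one shot. --jsonArray tells
# mongoimport that the file holds a single JSON array rather than
# newline-delimited documents:
#
#   mongoimport --db=mydb --collection=users --file=users.json --jsonArray
```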
For the inventory data in this tutorial, you could set up the schema:
# JSON data: { "item": "journal", "qty": 25, "size": { "h": 14, "w": 21, "uom": "cm" }, "status": "A" }
class Inventory
  include Mongoid::Document
  store_in collection: 'inventory'

  field :item, type: String
  field :qty, type: Integer
  field :size, type: Hash
  field :status, type: String
end
Some things to note here:
Make sure that you import your data to the default collection name for the model you want to create (i.e. import to the "users" collection for a model called User) OR indicate on the model which collection the data is stored in (using the store_in method)
When you import data as a Hash, you have some options on how to represent that in Mongoid. I've just used a Hash in this example, but you could also make size its own embedded document.
I have a jsonb-type column called payload on my Tweet model in my Rails 6.1.1 app. I use a store to access various fields on this attribute directly:
class Tweet < ApplicationRecord
  store :payload, accessors: [:lang, :text, :entities], coder: JSON
end
(Note that the part coder: JSON is necessary to change the serialization from the YAML default to JSON – otherwise you end up with YAML in your jsonb column.)
When I create a new tweet, I see that ActiveRecord erroneously escapes the JSON string during the insertion:
Tweet.create payload: {foo: 'bar'}
TRANSACTION (0.7ms) BEGIN
Tweet Create (2.0ms) INSERT INTO "tweets" ("payload", ...) VALUES ($1, ...) RETURNING "id" [["payload", "\"{\\\"foo\\\":\\\"bar\\\"}\""], ...]
I'm referring to the part "\"{\\\"foo\\\":\\\"bar\\\"}\"". It looks like it's been double escaped. This results in a sort of 'stringception': a string within a string is stored in the payload column, so Postgres cannot recognize the value as JSON and cannot search its fields using the arrow -> syntax. (Seemingly miraculously, however, Rails is able to deserialize the field properly on read operations.)
Another SO user has illustrated the issue here: https://dbfiddle.uk/gcwTQOUm
When I do not use store, I cannot reproduce the issue:
Tweet.create payload: {foo: 'bar'}
TRANSACTION (0.2ms) BEGIN
Tweet Create (1.6ms) INSERT INTO "tweets" ("payload", ...) VALUES ($1, ...) RETURNING "id" [["payload", "{\"foo\":\"bar\"}"], ...]
"{\"foo\":\"bar\"}" is the desired string.
This has led me to believe that I'm using store wrong.
I checked the docs:
NOTE: If you are using structured database data types (e.g. PostgreSQL hstore/json, or MySQL 5.7+ json) there is no need for the serialization provided by .store. Simply use .store_accessor instead to generate the accessor methods. Be aware that these columns use a string keyed hash and do not allow access using a symbol.
In other words, because I was already using a jsonb-type column, Rails already knew to serialize the hash – applying another serialization resulted in a double escape.
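The double serialization can be reproduced with nothing but the json stdlib:

```ruby
require 'json'

payload = { foo: 'bar' }

once  = payload.to_json  # '{"foo":"bar"}'       -- a JSON object
twice = once.to_json     # '"{\"foo\":\"bar\"}"' -- a JSON string containing JSON

# The store coder serializes the hash, then the jsonb column type
# serializes the result again, so Postgres receives a quoted string
# instead of an object and the -> operator has nothing to traverse.
JSON.parse(once)   # => {"foo"=>"bar"} (a Hash)
JSON.parse(twice)  # => '{"foo":"bar"}' (just a String)
```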
So, instead of doing:
store :payload, accessors: [:lang, :text, :entities], coder: JSON
I now do:
store_accessor :payload, :lang, :text, :entities
And it works.
I'm aware of the includes and extract_associated methods.
extract_associated, as used in the Rails docs, only returns the user records:
account.memberships.extract_associated(:user)
# => Returns collection of User records
I'm looking to return the Membership records WITH the User records in the same array. I know that the includes method should do this for me, but my response only includes the user_id and not the actual record, i.e. the use of includes hasn't changed what's returned at all, like so:
account.memberships.includes(:user)
# => Returns collection of Membership records, with user_ids
[{ "id": 3, "account_id": 1, "user_id": 2, "membership_name": 'Annual Membership'}]
My Membership belongs_to a User, and a User has_many Memberships.
What am I missing here?
It's not an option for me to do membership.user in my view, because I'm using VueJS, so I need to pass in all the data up front.
You can include the associated records in the JSON representation with as_json:
account.memberships.includes(:user).as_json(include: :user)
See the API documentation for details:
https://api.rubyonrails.org/classes/ActiveModel/Serializers/JSON.html
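With the user association included, each membership hash then carries a nested user hash, along these lines (the field values here are illustrative):

```json
[
  {
    "id": 3,
    "account_id": 1,
    "user_id": 2,
    "membership_name": "Annual Membership",
    "user": { "id": 2, "name": "..." }
  }
]
```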
I want to delete a field in a document using ROR.
I have already tried
book.remove_attribute(:name)
book.unset(:name)
But they both set the attribute to nil and it is still present in the object.
I want it to vanish from my document. Any help is welcome.
When you access a document via Mongoid, it returns you a Ruby object. You can see the data actually stored in the document only via the mongo shell (just type 'mongo' in your terminal).
The object is created by Mongoid (a MongoDB ODM/wrapper for Rails). This object may occasionally look different from the document.
For example
When you unset a field, that field is entirely removed from that document. BUT, since your model still defines that field, Mongoid returns a nil attribute for it rather than returning objects of the same model with different sets of fields.
Model book.rb
class Book
  include Mongoid::Document
  field :name
  field :author
end
In rails console, type
Book.create(name: "b1", author: "a1")
=> #<Book _id: 555231746c617a1c99030000, name: "b1", author: "a1">
In Mongo shell
db.books.find()
{ "_id" : ObjectId("555231746c617a1c99030000"), "name" : "b1", "author" : "a1" }
Now, we unset.
In rails console
Book.first.unset(:name)
=> #<Book _id: 555231746c617a1c99030000, name: nil, author: "a1">
In Mongo shell
db.books.find()
{ "_id" : ObjectId("555231746c617a1c99030000"), "author" : "a1" }
If, however, you still don't want to see the field in your Rails console (mind you, it is not taking up any extra space in the db), you can always remove the field from the model. If you do that, you will no longer be able to access this field through Rails/Mongoid on any object. It will only be present in the document and accessible through the mongo shell.
I am using Rails 4 with Mongoid for an event based application.
I am trying to create a model with an array field that holds embedded documents. These embedded documents will contain the user's geo coordinates and a timestamp. Every 5 minutes I will push the user's latest coordinates onto the user's locations array. Can someone please help me create that?
My sample model and desired documents are as below.
class User
  include Mongoid::Document
  field :name, type: String
  field :locations, type: Array
end
Here is a sample of the document I want to end up with:
{
  _id : ObjectId(...),
  name : "User_name",
  locations : [
    {
      _id : ObjectId(...),
      time : "....",
      loc : [ 55.5, 42.3 ]
    },
    {
      _id : ObjectId(...),
      time : "...",
      loc : [ -74, 44.74 ]
    }
  ]
}
I was able to add values to the locations array without embedded documents through IRB, but since I will be using MongoDB's geospatial queries later on, I want to use 2d indexes and the rest of what the Mongo documentation mentions.
Hence I believe it needs to be an array of documents containing the latitude & longitude, which will also save me coding time.
Also, can I make the location's time the document's _id? (It could help me reduce query overhead.)
I would really appreciate it if someone could help me with the structure of the model I should write, or point me to references.
P.S.: Let me know if you can suggest extra references about storing geospatial data in MongoDB that could be helpful for me.
Hope this will help somebody.
If you want to embed documents you can use Mongoid's embeds_many association, which handles such relations. It also allows you to define indexes on embedded documents:
http://mongoid.org/en/mongoid/docs/relations.html#embeds_many
Mongoid points out that 2d indexes should be applied to arrays:
http://mongoid.org/en/mongoid/docs/indexing.html
In your case models may look like this:
class User
  include Mongoid::Document

  field :name, type: String

  embeds_many :locations
  index({ "locations.loc" => "2d" })

  accepts_nested_attributes_for :locations # see http://mongoid.org/en/mongoid/docs/nested_attributes.html#common
end

class Location
  include Mongoid::Document

  field :time, type: DateTime # see http://mongoid.org/en/mongoid/docs/documents.html#fields
  field :loc, type: Array

  embedded_in :user
end
But beware of using update with nested attributes: it only lets you update attributes, not delete or reject them. It's preferable to use the (association)_attributes= methods instead:
@user = User.new(name: 'John Doe')
@user.locations_attributes = {
  # _id values are generated automatically for each embedded document
  "0" => { time: "....", loc: [55.5, 42.3] },
  "1" => { time: "...", loc: [-74, 44.74] }
}
@user.save!
It seems like Mongoid is now the superior ORM for Mongo based on performance and development activity. Unfortunately, we're on MongoMapper and need to migrate.
Are there any concerns or stumbling blocks we should be aware of? We have found a few outdated articles on Google and tried posting on the Mongoid Google Groups (though we were prohibited), but would love thoughts from SO members who have done this in the past.
We're on Rails 3.2.12.
Thanks!
Both of them are great MongoDB libraries for Ruby. But if you want to switch, here are some notes:
Migrating MongoMapper ORM to Mongoid ORM - Notes
Configure the database connection.
Replace the configuration YAML file (including replica set configuration).
Configure Mongoid-specific options, e.g. raise_not_found_error: false if you don't want an error every time a query returns nothing.
Change all model definitions: include MongoMapper::Document becomes include Mongoid::Document.
Change the format of all field definitions.
In Mongoid, you must specify timestamps explicitly: include Mongoid::Timestamps.
Change validations, e.g. :in => ARRAY becomes validates :name, presence: true, inclusion: { in: ARRAY }.
Change indexes.
Change order_by format. e.g: MM: Model.all(:order => 'name'). Mongoid: Model.order_by('name ASC')
Error is a keyword in Mongoid. So if you have a model named Error, you should change it.
Pagination format is different, using another gem.
The primary-key accessor in MM is id; in Mongoid it's _id. If you have other code relying on .id in the object's JSON, you can override the as_json method in your model to create the JSON structure you want.
In MM, Model.fields(:id, :name) limits the fields returned from the database to those supplied. In Mongoid it's Model.only(:id, :name).
Some query changes:
Selecting objects by array: MM supports both Model.where(:attr.in => [ ]) and Model.where(:attr => [ ]). Mongoid supports only Model.where(:attr.in => [ ]).
MM's map is equivalent to Mongoid's pluck: Model.all.map(&:name) becomes Model.pluck(:name).
Mongoid doesn't support find with nil. E.g. with value = nil, Model.find(value) will throw the error "Calling Document .find with nil is invalid", so in Mongoid we should do Model.find(value || "").
In MM: Model.find_or_initialize_by_name("BOB"). In Mongoid Model.find_or_initialize_by(name: "BOB").
MM can be used both ways: Model.where(:name => 'BOB').first and also Model.first(:name => 'BOB'). Mongoid supports only the first form.
In MM, to update multiple objects: Model.set({conditions}, attr_to_update). In Mongoid: Model.where(conditions).update_all(attr_to_update).