I have the following array:
@unregistered_users = ['my@email.com', 'your@email.com', ...]
Now, I want to create a document for each array element:
@unregistered_users.each do |email_address|
  Model.create(email: email_address, user: self.user, detail: self)
end
But it only creates a single document (the first element of the array). The other array elements are simply not created. Why?
We're using Ruby 1.9.3-p385, Rails 3.2.12, Mongoid 3.0.0, and MongoDB 2.2.3.
Update #1
So, we had a custom _id field with a custom random token using SecureRandom.hex(64).to_i(16).to_s(36)[0..127].
After I removed it, creation worked normally, but with regular Mongo IDs (which is not what we want).
Update #2
This is how the token is being generated:
class Model
  include Mongoid::Document
  include Mongoid::Timestamps
  ...
  field :_id, default: SecureRandom.hex(64).to_i(16).to_s(36)[0..127]
  ...
  index( { _id: 1 }, { unique: true } )
end
Try something like this to check what the errors are on the Mongoid model:
@unregistered_users.each do |email_address|
  model = Model.create(email: email_address, user: self.user, detail: self)
  puts model.errors.inspect unless model.persisted?
end
Or use create! to raise an exception and see what's happening.
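If the custom token is the culprit, a likely explanation (my assumption, not confirmed in this thread) is that a literal default: value is evaluated once when the class body is loaded, so every new document gets the same _id and the unique index rejects everything after the first insert. A minimal sketch using a lambda default, which is re-evaluated for each new document:
class Model
  include Mongoid::Document
  include Mongoid::Timestamps

  # the lambda runs for every new document, so each one gets its own token
  # instead of sharing the value computed at class-load time
  field :_id, default: -> { SecureRandom.hex(64).to_i(16).to_s(36)[0..127] }

  index( { _id: 1 }, { unique: true } )
end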
Related
We are looking at using the reform gem for validating input.
One of the issues we are facing is that we accept input in this format:
params = {
  records: {
    "record-id-23423424": {
      name: 'Joe Smith'
    },
    "record-id-43234233": {
      name: 'Jane Doe'
    },
    "record-id-345234555": {
      name: 'Fox trot'
    },
    "record-id-34234234": {
      name: 'Alex'
    }
  }
}
So if we were to create a Reform class:
class RecordForm < Reform::Form
  property :records
  validates :records, presence: true
  # ?????????
end
How do we validate the contents of the records to make sure each one has a name? The record-id-values are not known ahead of time.
Reform currently doesn't allow dynamic properties, and actually, it's not planned since Reform is supposed to be a UI-specific form object.
The solution would be to pre-parse your input into something like what Laura suggests (see the parsing sketch after the block below). You could then have nested properties for each field.
collection :records do
  property :id # manually parsed
  property :name
end
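A minimal sketch of that pre-parsing step (the key prefix and names are taken from the example params above; everything else is illustrative):
records = params[:records].map do |key, attrs|
  { id: key.to_s.sub('record-id-', ''), name: attrs[:name] }
end
# => [{ id: "23423424", name: "Joe Smith" }, { id: "43234233", name: "Jane Doe" }, ...]
# then hand it to the form as usual, e.g. form.validate(records: records)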
Let's say we have a Virtus model User
class User
  include Virtus.model

  attribute :name, String, default: 'John', lazy: true
end
Then we create an instance of this model and extend it with Virtus.model to add another attribute on the fly:
user = User.new
user.extend(Virtus.model)
user.attribute(:active, Virtus::Attribute::Boolean, default: true, lazy: true)
Current output:
user.active? # => true
user.name # => 'John'
But when I try to get either attributes or convert the object to JSON via as_json (or to_json) or to a Hash via to_h, I only get the post-extension attribute active:
user.to_h # => { active: true }
What is causing the problem, and how can I convert the object without losing the data?
P.S.
I have found a GitHub issue, but it seems that it was not fixed after all (the approach recommended there doesn't work reliably either).
Building on Adrian's finding, here is a way to modify Virtus to allow what you want. All specs pass with this modification.
Essentially, Virtus already has the concept of a parent AttributeSet, but it's only used when including Virtus.model in a class.
We can extend it to consider instances as well, and even allow multiple extend(Virtus.model) in the same object (although that sounds sub-optimal):
require 'virtus'

module Virtus
  class AttributeSet
    def self.create(descendant)
      if descendant.respond_to?(:superclass) && descendant.superclass.respond_to?(:attribute_set)
        parent = descendant.superclass.public_send(:attribute_set)
      elsif !descendant.is_a?(Module)
        if descendant.respond_to?(:attribute_set, true) && descendant.send(:attribute_set)
          parent = descendant.send(:attribute_set)
        elsif descendant.class.respond_to?(:attribute_set)
          parent = descendant.class.attribute_set
        end
      end
      descendant.instance_variable_set('@attribute_set', AttributeSet.new(parent))
    end
  end
end
class User
  include Virtus.model

  attribute :name, String, default: 'John', lazy: true
end
user = User.new
user.extend(Virtus.model)
user.attribute(:active, Virtus::Attribute::Boolean, default: true, lazy: true)
p user.to_h # => {:name=>"John", :active=>true}
user.extend(Virtus.model) # useless, but to show it works too
user.attribute(:foo, Virtus::Attribute::Boolean, default: false, lazy: true)
p user.to_h # => {:name=>"John", :active=>true, :foo=>false}
Maybe this is worth making a PR to Virtus, what do you think?
I haven't investigated it further, but it seems that every time you include or extend Virtus.model, it initializes a new AttributeSet and sets it to the @attribute_set instance variable of your User class (source). What to_h or attributes do is call the get method of the new attribute_set instance (source). Therefore, you can only get attributes defined after the last inclusion or extension of Virtus.model.
class User
  include Virtus.model

  attribute :name, String, default: 'John', lazy: true
end
user = User.new
user.instance_variables
#=> []
user.send(:attribute_set).object_id
#=> 70268060523540
user.extend(Virtus.model)
user.attribute(:active, Virtus::Attribute::Boolean, default: true, lazy: true)
user.instance_variables
#=> [:@attribute_set, :@active, :@name]
user.send(:attribute_set).object_id
#=> 70268061308160
As you can see, the object_id of the attribute_set instance before and after the extension is different, which means the former and the latter attribute_set are two different objects.
A hack I can suggest for now is this:
(user.instance_variables - [:@attribute_set]).each_with_object({}) do |sym, hash|
  hash[sym.to_s[1..-1].to_sym] = user.instance_variable_get(sym)
end
Each user has one address.
class User
  include Mongoid::Document
  has_one :address
end

class Address
  include Mongoid::Document
  belongs_to :user
  field :street_name, type: String
end
u = User.find(...)
u.address.update(street_name: 'Main St')
If we have a User without an Address, this will fail.
So, is there a good (built-in) way to do u.address.update_or_initialize_with?
Mongoid 5
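For context, the non-atomic workaround the question is trying to avoid looks roughly like this (a sketch; create_address is the standard has_one builder):
u = User.find(...)
if u.address
  u.address.update(street_name: 'Main St')
else
  u.create_address(street_name: 'Main St') # builds and saves the missing Address
end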
I am not familiar with Ruby, but I think I understand the problem. Your schema might look like this:
user = {
  _id: user1234,
  address: address789
}

address = {
  _id: address789,
  street_name: "",
  user: user1234
}
// in mongodb (javascript), you can get/update the address of a user this way
u = User.find({_id: user1234})
u.address // address789
db.address.update({_id: u.address}, {street_name: "new_street_name"})
// but since the address has not been created, the variable u does not even have the address property
u.address // undefined
Perhaps you can try to just create and attach it manually like this:
// create an address document, to get the _id of this address
address = db.address.insert({street_name: "something"});
// link or attach it to u.address
u.update({address: address._id})
I had this problem recently. There is a built-in way, but it differs from ActiveRecord's #find_or_initialize_by or #find_or_create_by methods.
In my case, I needed to bulk insert records and update or create if not found, but I believe the same technique can be used even if you are not bulk inserting.
# builds a bulk update command with one upsert per user
def update_command(users)
  updates = []
  users.each do |user|
    updates << { 'q'      => { 'user_id' => user._id },
                 'u'      => { 'address' => 'address' },
                 'multi'  => false,
                 'upsert' => true }
  end
  { update: Address.collection_name.to_s, updates: updates, ordered: false }
end

def bulk_update(users)
  client = Mongoid.default_client
  command = update_command(users)
  client.command command
  client.close
end
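A hedged usage sketch (users can be any enumerable of User documents; the helper names match the methods above):
users = User.all.to_a
bulk_update(users) # one command round trip containing one upsert per user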
Since you're not bulk updating, and assuming you have a foreign key field called user_id in your Address collection, you might be able to:
Address.collection.update({ 'q'      => { 'user_id' => user._id },
                            'u'      => { 'address' => 'address' },
                            'multi'  => false,
                            'upsert' => true })
which will match against the user_id, update the given fields when found (address in this case) or create a new one when not found.
For this to work, there is one last crucial step though: you must add an index to your Address collection with a special flag. The field you are querying on (user_id in this case) must be indexed with either { unique: true } or { sparse: true }. The unique flag will raise an error if you have 2 or more nil user_id fields; the sparse option won't. Use that if you think you may have nil values.
Access your MongoDB database through the terminal:
show dbs
use your_db_name
Check if the addresses collection already has the index you are looking for:
db.addresses.getIndexes()
If it already has an index on user_id, you may want to remove it:
db.addresses.dropIndex( { user_id: 1 } )
And create it again with the following flag:
db.addresses.createIndex( { user_id: 1 }, { sparse: true } )
https://docs.mongodb.com/manual/reference/method/db.collection.update/
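Alternatively, as a sketch using the Mongoid index DSL that appears elsewhere in this thread, the same index can be declared on the model and built with rake db:mongoid:create_indexes:
class Address
  include Mongoid::Document
  belongs_to :user
  field :street_name, type: String

  # unique so concurrent upserts can't create duplicates for the same user,
  # sparse so several documents without a user_id don't violate the constraint
  index({ user_id: 1 }, { unique: true, sparse: true })
end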
EDIT #1
There seem to have been changes in Mongoid 5. Instead of User.collection.update you can use User.collection.update_one.
https://docs.mongodb.com/manual/reference/method/db.collection.updateOne/
The docs show you need a filter rather than a query as the first argument, but they seem to be the same.
Address.collection.update_one({ user_id: user_id },
                              { '$set' => { address: 'the_address' } },
                              upsert: true)
PS:
If you only write { address: 'the_address' } as your update clause, without including an update operator such as $set, the whole document will get overwritten rather than just the address field being updated.
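To illustrate the difference, a small sketch reusing the update_one call from above (user_id comes from the surrounding code):
# replacement: the matched document is replaced wholesale (only _id is kept)
Address.collection.update_one({ user_id: user_id }, { address: 'the_address' })

# $set: only the address field changes, every other field is preserved
Address.collection.update_one({ user_id: user_id },
                              { '$set' => { address: 'the_address' } })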
EDIT #2
About why you may want to index with unique or sparse
If you look at the upsert section in the link below, you will see:
To avoid multiple upserts, ensure that the filter fields are uniquely indexed.
https://docs.mongodb.com/manual/reference/method/db.collection.updateOne/
Rails 4, Mongoid instead of ActiveRecord (but this shouldn't change anything for the sake of the question).
Let's say I have a MyModel domain class with some validation rules:
class MyModel
  include Mongoid::Document

  field :text, type: String
  field :type, type: String
  belongs_to :parent

  validates :text, presence: true
  validates :type, inclusion: %w(A B C)
  validates_uniqueness_of :text, scope: :parent # important validation rule for the purpose of the question
end
where Parent is another domain class:
class Parent
  include Mongoid::Document

  field :name, type: String
  has_many :my_models
end
Also, I have the related collections in the database populated with some valid data.
Now, I want to import some data from a CSV file, which can conflict with the existing data in the database. The easy thing to do is to create an instance of MyModel for every row in the CSV, verify if it's valid, then save it to the database (or discard it).
Something like this:
csv_rows.each do |data| # simplified
  my_model = MyModel.new(data) # data is the hash with the values taken from the CSV row
  if my_model.valid?
    my_model.save validate: false
  else
    # do something useful, but not interesting for the question's purpose
    # just know that I need to separate validation from saving
  end
end
Now, this works pretty smoothly for a limited amount of data. But when the CSV contains hundreds of thousands of rows, this gets quite slow, because (worst case) there's a write operation for every row.
What I'd like to do, is to store the list of valid items and save them all at the end of the file parsing process. So, nothing complicated:
valids = []
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid? # THE INTERESTING LINE: this "if" checks only against the database; what happens if it conflicts with some other my_models not saved yet?
    valids << my_model
  else
    # ...
  end
end

if valids.size > 0
  # bulk insert of all data
end
That would be perfect, if I could be sure that the data in the CSV does not contain duplicated rows or data that goes against the validation rules of MyModel.
My question is: how can I check each row against the database AND the valids array, without having to repeat the validation rules defined in MyModel (to avoid duplicating them)?
Is there a different (more efficient) approach I'm not considering?
What you can do is validate as a model, save the attributes in a hash, push them to the valids array, then do a bulk insert of the values using MongoDB's insert:
valids = []
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid?
    valids << my_model.attributes
  end
end
MyModel.collection.insert(valids, continue_on_error: true)
This won't, however, prevent NEW duplicates... for that you could do something like the following, using a hash and a compound key:
valids = {}
csv_rows.each do |data|
  my_model = MyModel.new(data)
  if my_model.valid?
    valids["#{my_model.text}_#{my_model.parent_id}"] = my_model.as_document
  end
end
Then either of the following will work, DB Agnostic:
MyModel.create(valids.values)
Or MongoDB'ish:
MyModel.collection.insert(valids.values, continue_on_error: true)
OR EVEN BETTER
Ensure you have a unique index on the collection:
class MyModel
  ...
  index({ text: 1, parent_id: 1 }, { unique: true, dropDups: true })
  ...
end
Then just do the following:
MyModel.collection.insert(csv_rows, continue_on_error: true)
http://api.mongodb.org/ruby/current/Mongo/Collection.html#insert-instance_method
http://mongoid.org/en/mongoid/docs/indexing.html
TIP: If you anticipate thousands of rows, I recommend doing this in batches of 500 or so (see the sketch below).
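A minimal batching sketch along those lines (names follow the snippets above; the batch size is illustrative):
csv_rows.each_slice(500) do |batch|
  docs = batch.map { |data| MyModel.new(data) }
              .select(&:valid?)
              .map(&:as_document)
  MyModel.collection.insert(docs, continue_on_error: true) unless docs.empty?
end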
I'm trying to follow the advice in Mongoid 3 - Check for uniqueness of composite key to have a model with a unique constraint on 2 fields.
The id declaration is this:
field :_id, type: String, default: ->{ generate_id }

private

def generate_id
  user.id.to_s + offering.id.to_s
end
But if I do this, it has a conniption when I instantiate an object via new, because it tries to generate the id before it has a user and offering, and it (rightly) doesn't want to use the id of nil. I can pass in the user and offering as constructor parameters and everything is fine.
My question is, is this the right way of doing this? It feels dirty given all the obtuse wackyness I have to do just to get a unique constraint. The code isn't very intent revealing at all. Is there a better way?
With plain MongoDB you would create this index with JavaScript like so (assuming your collection name is registrations):
db.registrations.ensureIndex( { "user_id": 1, "offering_id": 1 }, { unique: true } )
To generate this index with Mongoid, add this to your model:
index({ user_id: 1, offering_id: 1 }, { unique: true })
And run rake db:mongoid:create_indexes.
If you want to keep generating your _id with generate_id, you could move the generation to a before_validation callback.
field :_id, type: String
before_validation :generate_id

private

def generate_id
  self._id ||= "#{user.id}:#{offering.id}"
end
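A quick usage sketch of the callback approach (the Registration model name is hypothetical, borrowed from the registrations collection above):
reg = Registration.new(user: user, offering: offering) # no _id is generated yet, so no nil error on new
reg.valid? # before_validation fills in _id from the associated documents
reg._id    # => "<user id>:<offering id>"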