How to globally override Mongoid::Document._id generation - ruby-on-rails

I want to override how _id values are generated in Mongoid (another app, which shares the database, uses String instead of ObjectId()).
I can do it for every model by adding this:
field :_id, type: String, default: -> { BSON::ObjectId.new.to_s }
But how do I attach this globally to keep it DRY?

A very valid use case, but looking into the code you might be out of luck at Mongoid::Fields. You could, however, override mongoize, which would go like this:
Item.new.id
# => BSON::ObjectId('56e727892ada693ea8000000')

class BSON::ObjectId
  def self.mongoize(k)
    k.to_s
  end
end

Item.new.id
# => "56e7276f2ada693737000002"

Related

trailblazer reform gem, how to handle this type of input validation?

We are looking at using the reform gem for validating input.
One of the issues we are facing is that we accept input in this format:
params = {
  records: {
    "record-id-23423424": {
      name: 'Joe Smith'
    },
    "record-id-43234233": {
      name: 'Jane Doe'
    },
    "record-id-345234555": {
      name: 'Fox trot'
    },
    "record-id-34234234": {
      name: 'Alex'
    }
  }
}
So if we were to create a Reform form class:
class RecordForm < Reform::Form
  property :records
  validates :records, presence: true
  # ?????????
end
How do we validate the contents of the records to make sure each one has a name? The record-id-values are not known ahead of time.
Reform currently doesn't allow dynamic properties, and actually, it's not planned since Reform is supposed to be a UI-specific form object.
The solution would be to pre-parse your input into something like what Laura suggests. You could then have nested properties for each field:
collection :records do
  property :id # manually parsed
  property :name
end
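The pre-parsing itself is plain Ruby; a minimal sketch assuming the params shape from the question, producing an array that the records collection above can consume:

records = params[:records].map do |record_id, attrs|
  { id: record_id.to_s, name: attrs[:name] }
end
# => [{ id: "record-id-23423424", name: "Joe Smith" }, ...]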

Mongoid Unique Constraint on Composite Key

I'm trying to follow the advice in Mongoid 3 - Check for uniqueness of composite key to have a model with a unique constraint on 2 fields.
The id declaration is this:
field :_id, type: String, default: &method(:generate_id)
private
def generate_id
user.id.to_s + offering.id.to_s
end
But if I do this, it has a conniption when I instantiate an object via new, because it tries to generate the id before it has a user and an offering, and it (rightly) doesn't want to use the id of nil. I can pass in the user and offering as constructor parameters and everything is fine.
My question is: is this the right way of doing this? It feels dirty given all the obtuse wackiness I have to do just to get a unique constraint. The code isn't very intent-revealing at all. Is there a better way?
With plain MongoDB you would create this index with JavaScript like so (assuming your collection name is registrations):
db.registrations.ensureIndex( { "user_id": 1, "offering_id": 1 }, { unique: true } )
To generate this index with Mongoid, add this to your model:
index({ user_id: 1, offering_id: 1 }, { unique: true })
And run rake db:mongoid:create_indexes.
If you want to keep generating your _id with generate_id, you could move the generation to a before_validation callback.
field :_id, type: String

before_validation :generate_id

private

def generate_id
  self._id ||= "#{user.id}:#{offering.id}"
end
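Putting the pieces together, the model might look roughly like this -- a sketch only; the Registration class name is taken from the collection name above, and the belongs_to associations are assumptions:

class Registration
  include Mongoid::Document

  belongs_to :user
  belongs_to :offering

  field :_id, type: String

  index({ user_id: 1, offering_id: 1 }, { unique: true })

  before_validation :generate_id

  private

  def generate_id
    self._id ||= "#{user.id}:#{offering.id}"
  end
end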

Create multiple documents with array

I have the following array:
@unregistered_users = ['my@email.com', 'your@email.com', ...]
Now, I want to create a document for each array element:
@unregistered_users.each do |email_address|
  Model.create(email: email_address, user: self.user, detail: self)
end
But it only creates a single document (the first element of the array). The other array elements are simply not created. Why?
We're using Ruby 1.9.3-p385, Rails 3.2.12, MongoID 3.0.0 and MongoDB 2.2.3
Update #1
So, we had a custom _id field with a custom random token using SecureRandom.hex(64).to_i(16).to_s(36)[0..127].
After I removed it, it worked normally, but with regular Mongo IDs (which is not what we want).
Update #2
This is how the tokens are being generated:
class Model
  include Mongoid::Document
  include Mongoid::Timestamps
  ...
  field :_id, default: SecureRandom.hex(64).to_i(16).to_s(36)[0..127]
  ...
  index( { _id: 1 }, { unique: true } )
end
Try something like this to check what the errors on the Mongoid model are:
@unregistered_users.each do |email_address|
  model = Model.create(email: email_address, user: self.user, detail: self)
  puts model.errors.inspect unless model.persisted?
end
or use create! to raise an exception and see what's happening
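Given Update #2, the errors will most likely be duplicate-key errors on _id: a literal default is evaluated once when the class is loaded, so every document gets the same token and only the first insert succeeds. A minimal sketch of the likely fix, mirroring the lambda default used in the first question above:

# A proc default is evaluated per document, so each one gets a fresh token.
field :_id, type: String,
      default: -> { SecureRandom.hex(64).to_i(16).to_s(36)[0..127] }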

Elasticsearch/tire Nested Queries with persistent objects

I'm trying to use Tire to perform a nested query on a persisted model. The model (Thing) has Tags, and I'm looking to find all Things tagged with a certain Tag:
class Thing
  include Tire::Model::Callbacks
  include Tire::Model::Persistence

  index_name { "#{Rails.env}-thing" }

  property :title, :type => :string
  property :tags, :default => [], :analyzer => 'keyword', :class => [Tag], :type => :nested
end
The nested query looks like this:
class Thing
  def self.find_all_by_tag(tag_name, args)
    self.search(args) do
      query do
        nested path: 'tags' do
          query do
            boolean do
              must { match 'tags.name', tag_name }
            end
          end
        end
      end
    end
  end
end
When I execute the query I get a "not of nested type" error
Parse Failure [Failed to parse source [{\"query\":{\"nested\":{\"query\":{\"bool\":{\"must\":[{\"match\":{\"tags.name\":{\"query\":\"TestTag\"}}}]}},\"path\":\"tags\"}},\"size\":10,\"from\":0,\"version\":true}]]]; nested: QueryParsingException[[test-thing] [nested] nested object under path [tags] is not of nested type]; }]","status":500}
Looking at the source for Tire it seems that mappings are created from the options passed to the "property" method, so I don't think I need a separate "mapping" block in the class. Can anyone see what I am doing wrong?
UPDATE
Following Karmi's answer below, I recreated the index and verified that the mapping is correct:
thing: {
  properties: {
    tags: {
      properties: {
        name: {
          type: string
        }
      },
      type: nested
    },
    title: {
      type: string
    }
  }
}
However, when I add new Tags to Thing
thing = Thing.new
thing.title = "Title"
thing.tags << {:name => 'Tag'}
thing.save
The mapping reverts to "dynamic" type and "nested" is lost.
thing: {
  properties: {
    tags: {
      properties: {
        name: {
          type: string
        }
      },
      type: "dynamic"
    },
    title: {
      type: string
    }
  }
}
The query fails with the same error as before. How do I preserve the nested type when adding new Tags?
Yes, indeed, the mapping configuration in property declarations is passed on in the Persistence integration.
In a situation like this, there's always the one and only first question: what does the mapping look like for real?
So, use e.g. the Thing.index.mapping method or Elasticsearch's REST API (curl localhost:9200/things/_mapping) to have a look.
Chances are that your index was created with a dynamic mapping, based on the JSON you had used, and you changed the mapping definition later. In that case, the index creation logic is skipped, and the mapping is not what you expect.
There's a Tire issue opened about displaying warning when the index mapping is different from the mapping defined in the model.
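If that turns out to be the case here, the usual remedy is to delete and recreate the index with the desired mapping before re-importing the data. A rough sketch -- this drops the index's data, and the exact mapping hash is an assumption based on the property declarations in the question:

Thing.index.delete

Thing.index.create mappings: {
  thing: {
    properties: {
      title: { type: 'string' },
      tags:  {
        type: 'nested',
        properties: { name: { type: 'string', analyzer: 'keyword' } }
      }
    }
  }
}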

Single index multi type - elasticsearch indexing via tire

In my multi-tenant app (account-based, with a number of users per account), how would I update the index for a particular account when a user document is changed?
I have a separate index for each account, in which the mappings for each model (user and comments -- just an example, the actual app has many models) are specified. In this case, if any change is made to the user model or the comment model, the index that has been created for the related account has to be updated. Is this possible? Please let me know if it is.
I guess this is the way I specify the mappings in my case. Correct me if I'm wrong.
Account Model:
include Tire::Model::Search

Tire.index('account_1') do
  create(
    :mappings => {
      :user => {
        :properties => {
          :name => { :type => :string, :boost => 10 },
          :company_name => { :type => :string, :boost => 5 }
        }
      },
      :comments => {
        :properties => {
          :description => { :type => :string, :boost => 5 }
        }
      }
    }
  )
end
The index is getting created correctly with both mappings for the account index. But I don't see a way to update the index when any model specified in the mappings is changed.
Whenever a new user is added, or an existing user is updated, the index created for the corresponding account has to be updated.
This question is cross-posted from Github issue Multiple model single index approach. Crossposting the answer here.
Let's say we have an Account class and we deal in article entities.
In that case, our Account class would have the following:
class Account
  # ...

  # Set index name based on account ID
  #
  def articles
    Article.index_name "articles-#{self.id}"
    Article
  end
end
So, whenever we need to access articles for a particular account, either for searching or for indexing, we can simply do:
@account = Account.find( remember_token_or_something_like_that )

# Instead of `Article.search(...)`:
@account.articles.search { query { string 'something interesting' } }

# Instead of `Article.create(...)`:
@account.articles.create id: 'abc123', title: 'Another interesting article!', ...
Having a separate index per user/account works perfectly in certain cases -- but definitely not well in cases where you'd have tens or hundreds of thousands of indices (or more). Having index aliases, with properly set up filters and routing, would perform much better in this case. We would slice the data not based on the tenant identity, but based on time.
Let's have a look at a second scenario, starting with a heavily simplified curl http://localhost:9200/_aliases?pretty output:
{
  "articles_2012-07-02" : {
    "aliases" : {
      "articles_plan_pro" : { }
    }
  },
  "articles_2012-07-09" : {
    "aliases" : {
      "articles_current" : { },
      "articles_shared" : { },
      "articles_plan_basic" : { },
      "articles_plan_pro" : { }
    }
  },
  "articles_2012-07-16" : {
    "aliases" : { }
  }
}
You can see that we have three indices, one per week, and two similar aliases: articles_plan_pro and articles_plan_basic -- obviously, accounts with the “pro” subscription can search two weeks back, but accounts with the “basic” subscription can search only this week.
Notice also that the articles_current alias points to, ehm, the current week (I'm writing this on Thu 2012-07-12). The index for the next week is just there, lying and waiting -- when the time comes, a background job (cron, Resque worker, custom script, ...) will update the aliases. There's a nifty example with aliases in a “sliding window” scenario in the Tire integration test suite.
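As a rough illustration, the alias-rolling step of such a job could look something like this -- a sketch only; the index names follow the listing above, and the exact Tire::Alias calls are assumptions rather than a verified recipe:

# Roll the articles_current alias forward to the new week's index.
a = Tire::Alias.find('articles_current')
a.indices.delete 'articles_2012-07-09'   # stop pointing at the old week
a.indices.add    'articles_2012-07-16'   # start pointing at the new week
a.save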
Let's not look at the articles_shared alias right now; let's look at what tricks we can play with this setup:
class Account
  # ...

  # Set index name based on account subscription
  #
  def articles
    if plan_code = self.subscription && self.subscription.plan_code
      Article.index_name "articles_plan_#{plan_code}"
    else
      Article.index_name "articles_shared"
    end
    return Article
  end
end
Again, we're setting up an index_name for the Article class, which holds our documents. When the current account has a valid subscription, we get the plan_code out of the subscription and direct searches for this account into the relevant index: “basic” or “pro”.
If the account has no subscription -- it's probably a “visitor” type -- we direct the searches to the articles_shared alias. Using the interface is as simple as before, e.g. in ArticlesController:
@account = Account.find( remember_token_or_something_like_that )
@articles = @account.articles.search { query { ... } }
# ...
We are not using the Article class as a gateway for indexing in this case; we have a separate indexing component, a Sinatra application serving as a light proxy to the elasticsearch Bulk API, providing HTTP authentication and document validation (enforcing rules such as required properties or dates passed as UTC), and using the bare Tire::Index#import and Tire::Index#store APIs.
These APIs talk to the articles_current index alias, which is periodically updated to the current week by said background process. In this way, we have decoupled all the logic for setting up index names into separate components of the application, so we don't need access to the Article or Account classes in the indexing proxy (it runs on a separate server) or in any other component of the application. Whichever component is indexing indexes against the articles_current alias; whichever component is searching searches against whatever alias or index makes sense for that particular component.
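For completeness, indexing against the alias with those Tire APIs looks roughly like this (a sketch; the document attributes and the articles_batch collection are placeholders):

index = Tire::Index.new('articles_current')

# Store a single document; the :type key tells Tire the document type.
index.store(type: 'article', id: 'abc123', title: 'Another interesting article!')

# Import a whole collection through the Bulk API.
index.import(articles_batch)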
You probably want to use another gem like rubberband (https://github.com/grantr/rubberband) to set up the index the way you want it beforehand -- maybe on account creation, in an after_create callback.
Then, when mapping your User and Comment models, you can use Tire to do something like this:
tire.mapping :_routing => { :required => true, :path => :account_id } do
  index_name 'account_name_here'
  ...
  ...
end
The tricky part will be getting the account_id or name into that index_name string/argument; it might be easy or difficult -- I haven't tried dynamically assigning index_name yet.
hope this helps!
