mongodb with rails, find by id in array - ruby-on-rails

I can fetch an element by BSON id from Mongodb with
db.my_collection.find({_id: ObjectId("567bc95ab62c732243123450")})
And it works. But how can I get an array of ids? Something like:
db.my_collection.find({_id: [ObjectId("567bc95ab62c732243123450")]})
I tried different ways, as suggested on MongoDB's website, but the interactive shell complained about the syntax.
EDIT:
Found the problem. It should be:
db.my_collections.find({_id: { $in : [ObjectId("567bc95ab62c732243123450")]}})

And in Rails:
MyCollection.find({'_id' => { "$in" => collection_ids}})
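If MyCollection is a Mongoid model, the same query can also be written with Mongoid's query methods. A sketch, assuming collection_ids holds BSON::ObjectId values (or strings Mongoid can coerce):
# equivalent to the $in query above
MyCollection.in(_id: collection_ids)
# or, with the symbol operator syntax
MyCollection.where(:_id.in => collection_ids)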

Related

Using Atlas Search with Mongoid on Rails

I am trying to get full text search with Atlas to work in my Rails app. I have set up the index following this tutorial in their docs. When I test the query in a vacuum it seems to work as expected: I'm able to query my database and get the results I would expect. But it seems like the documentation around how to do this in Mongoid is lacking. I have found this documentation for running text search in Mongoid, but it explicitly calls out that it isn't Atlas Search.
Has anybody successfully implemented an Atlas Search index/query using Mongoid (or otherwise in a Rails app) and, if so, could you please point me towards the relevant docs?
Alright - after some experimentation with this, I have found that the following code will work with the current version of Mongoid:
TableName.collection.aggregate([{
  '$search' => {
    'index' => 'index_name',
    'text' => {
      'query' => 'some string to search',
      'path' => {
        'wildcard' => '*'
      }
    }
  }
}])
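Note that the aggregate call returns a lazy Mongo::Collection::View::Aggregation, so the $search only runs when the result is iterated. A small usage sketch (TableName, the index name and the printed field are placeholders):
results = TableName.collection.aggregate([{
  '$search' => {
    'index' => 'index_name',
    'text' => { 'query' => 'some string to search', 'path' => { 'wildcard' => '*' } }
  }
}])
results.each do |doc|
  # each result is a BSON::Document (hash-like), not a Mongoid model instance
  puts doc['_id']
end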

Clone a mongodb collection from within Rails Mongoid

I am trying to implement this solution in Rails, using the collection aggregate method, to clone an entire collection within the same database.
In mongo shell, this works perfectly, and a cloned collection is created successfully:
db.source_collection.aggregate([ { $match: {} }, { $out: "target_collection" } ])
The Rails/Mongoid equivalent, according to my research, should be this, which runs without errors:
SourceCollection.collection.aggregate({"$match" => {}, "$out" => "target_collection"})
#<Mongo::Collection::View::Aggregation:0x000000055bced0 #view=#<Mongo::Collection::View:0x44951600 namespace='DB_dev.source_collection' #filter={} #options={}>, #pipeline={"$match"=>{}, "$out"=>"target_collection"}, #options={}>
I also tried with an array
SourceCollection.collection.aggregate([{"$match" => {}}, {"$out" => "target_collection"}])
#<Mongo::Collection::View::Aggregation:0x000000054936d0 #view=#<Mongo::Collection::View:0x44342320 namespace='DB_dev.source_collection' #filter={} #options={}>, #pipeline=[{"$match"=>{}}, {"$out"=>"target_collection"}], #options={}>
UPDATE
This simpler syntax also works in the Mongo console:
db.source_collection.aggregate( { $out: "target_collection" } )
But the equivalent syntax does not seem to work in Ruby:
SourceCollection.collection.aggregate({"$out" => "target_collection"})
Unfortunately, although there are no errors, the collection is not created.
Any clues as to the way I can make this happen?
Mongo gem version 2.5.3
Update2
Apparently $out is not considered part of the pipeline, which renders the aggregation invalid.
This can be fixed with code... I am looking for a module/class/method override, as contacting MongoDB's issue tracking system for a change request might not be as quick.
UPDATE - FINAL
This issue has been solved with the help of Thomas R. Koll (thank you).
I am adding an update to post the response I got from MongoDB's ticketing service, which pretty much describes Thomas's solution.
The reason you're not seeing the results without count is that the aggregate method returns a lazy cursor; that is, the query does not execute until the return value of aggregate is iterated over. Calling count is one way to do this. This is the same behavior you'll see if you call find or if you call aggregate without specifying $out; the difference is that $out has a side-effect beyond just returning the results, so it's more obvious when exactly it occurs.
Found the solution, and I have to explain a few things:
This returns a Mongo::Collection::View::Aggregation object; it won't send a query to the database:
User.collection.aggregate({"$out": "target_collection"})
Only when you call a method like count or to_a on the aggregation object is the query sent to the server. And if you pass a hash you'll get an error, so the pipeline has to be an array of hashes to work:
User.collection.aggregate([{"$out": "target_collection"}]).count
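Putting it together, a small helper sketch for cloning a collection from Rails (the model and target name are placeholders; to_a would force execution just as well as count):
def clone_collection(model, target_name)
  model.collection
       .aggregate([{ '$match' => {} }, { '$out' => target_name }])
       .count # iterating the lazy cursor is what actually runs $out
end
clone_collection(SourceCollection, 'target_collection')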

elasticsearch unable to query path in ruby

I have an Elasticsearch index 'events' - within that index there's a type 'event'.
Event objects have a 'venue', which has various properties including a 'name', so the simplified structure is:
event {
venue {
name: "foo"
}
}
Now, I'm using elasticsearch-rails - everything works fine for listing the events, searching etc. using the query DSL - but what if I want to list all the events at a venue with a particular name?
I'm assuming something like this should be possible:
Event.search "{ 'query': { 'match': { 'venue.name': '#{params[:v]}' }}}
but I get the following error:
Elasticsearch::Transport::Transport::Errors::BadRequest
followed by a substantial stack trace which contains a lot of this sort of thing:
Was expecting one of:\n \"]\" ...\n \"}\" ...\n ];
ParseExceptions suggesting malformed JSON - but I'm not sure why.
The simple search
Event.search '{"query" : { "match_all" : {} }}'
works fine, so I'm guessing it's just the structure of the query that's wrong.
I've tried switching single/double quotes around, tried following more closely the example on this page:
https://www.elastic.co/guide/en/elasticsearch/guide/current/denormalization.html
all to no avail. I wondered if anyone else had encountered this situation and could suggest how to make this work in Ruby.
The JSON you are trying to pass to the search function is not valid JSON. You can pass a hash instead of JSON to the search function. Try the following:
query_hash = {query: {match: {'venue.name' => params[:v] }}}
Event.search query_hash
Elasticsearch's JSON parser won't accept single quotes to delimit strings - while some more lenient parsers may, this isn't part of the standard.
You can of course escape them, although this makes things somewhat less legible, so using an alternative form of quoting may be preferable:
%< {"query": { "match": { "venue.name": "#{params[:v]}"}}} >
However, it's much better to represent the query as a Ruby hash and then convert that to JSON (for example, the snippet above doesn't correctly escape special characters in the submitted value).
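For example, a sketch of the hash form (elasticsearch-rails accepts either the hash itself or the JSON string produced by to_json):
query = { query: { match: { 'venue.name' => params[:v] } } }
Event.search query          # pass the hash directly
Event.search query.to_json  # or serialize it; to_json escapes the value safely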

Cassandra and creating an int column name from Ruby client

I am attempting to create dynamic columns with a comparator/validator that is a 32-bit signed integer. This obviously will save on storage space, amongst other advantages. Currently, this works great if I have a UTF8Type validator (using Twitter's Cassandra client for Ruby):
db.insert(:foo, 'mykey', {'mycol' => 'myval'})
This is where the problem occurs:
db.insert(:foo, 'mykey', {5 => 'myval'})
I think this is more of a Ruby issue than a Cassandra issue. Using the Rails console, I get the following thrown at me:
TypeError: no implicit conversion of Fixnum into String
To clarify further, I can't simply do:
db.insert(:foo, 'mykey', {'5' => 'myval'})
This will trigger a validation failure, since an integer is expected for the column name rather than a string.
Is there a way to make this reasonably work in Ruby so that I don't have to use UTF8Type column names and can stick to int based ones for my Cassandra 1.2 based app?
The Twitter Cassandra library leaves it to you, the developer, to serialize/deserialize all values. It requires everything that you give it to be a binary string representation. So if you want to use ints as your comparator, you need to pack them before inserting and unpack them when fetching them out of Cassandra. Your insert needs to look like this:
db.insert(:foo, 'mykey', {[5].pack('N*') => 'myval'})
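The packing has to be reversed when reading the row back. A sketch, assuming db.get returns the columns keyed by the packed binary strings:
row = db.get(:foo, 'mykey')
row.each do |packed_key, value|
  column_id = packed_key.unpack('N*').first # back to the integer 5
  puts "#{column_id} => #{value}"
end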
@MrYoshiji: Fixnums CAN be used as keys in Ruby hashes. Just use the correct syntax - you're in Ruby, not Python!
irb(main):010:0> { 1 => 'bonjour', 2 => "okay" }
=> {1=>"bonjour", 2=>"okay"}
irb(main):012:0> { 1 => 'bonjour', 2 => "okay" }.keys.map(&:class)
=> [Fixnum, Fixnum]

Getting Mongoid from params array

In order to find a root document that contains an embedded document using Mongoid/Rails 3, I need to do my query this way:
QuoteRequest.where( "order_request_items._id" => BSON::ObjectID(params[:id]) ).first
Is there a way to query without using BSON::ObjectID?
Thanks!
I'm not a Mongoid/Rails user, but my guess is that you can't.
Even in the Mongo shell you have to use ObjectId() if you want to compare ObjectIDs. Something like this won't return any results:
db.foo.find({_id: "4c7ca651db48000000002277"})
You'll have to create an actual ObjectID from the string in order to get results:
db.foo.find({_id: ObjectId("4c7ca651db48000000002277")})
Mongoid apparently doesn't automatically convert your input to ObjectIDs. But perhaps there's a way to tell Mongoid which fields it should always convert to ObjectIDs? Then you would be able to omit the use of BSON::ObjectID.
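In the meantime, converting the string from params explicitly is the way to go. A sketch with the bson gem (the exact helper name can vary by driver version):
oid = BSON::ObjectId.from_string(params[:id])
QuoteRequest.where('order_request_items._id' => oid).first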
This is a bug; the ids should be automagically converted by Mongoid. You should open a ticket on GitHub: http://github.com/mongoid/mongoid/issues