I use Ruby on Rails and Postgres 9.5.
In the database I have a table called events.
I want to find all events based on a particular value in the data field.
events.data can hold the following JSON value:
{ 'transition' => { 'from' => 'bacon', 'to' => 'ham' } }
How can I build a query that will find an event with data => transition => from bacon?
Assuming that your model is called Event, you can do it like this:
Event.where("data -> 'transition' ->> ? = ?", 'from', 'bacon')
Here is the jsonb operators reference.
This query will return all events where data.transition.from is equal to bacon.
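Equivalently, you can address the whole path in one step with Postgres's #>> operator, which extracts a nested value as text; a minimal sketch of the same query:
# same match, using the json "path as text" operator #>>
Event.where("data #>> '{transition,from}' = ?", 'bacon')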
To DRY up your queries, you can extract this into a repository concern, e.g.:
# app/repositories/event_repository.rb
module EventRepository
  extend ActiveSupport::Concern

  included do
    scope :where_transition, ->(field, value) { where("data -> 'transition' ->> ? = ?", field, value) }
  end
end
After that, include it in your model:
include EventRepository
Then you can use it like this:
Event.where_transition('from', 'bacon')
Event.where_transition('to', 'ham')
Each user has one address.
class User
  include Mongoid::Document
  has_one :address
end

class Address
  include Mongoid::Document
  belongs_to :user
  field :street_name, type: String
end
u = User.find(...)
u.address.update(street_name: 'Main St')
If we have a User without an Address, this will fail.
So, is there a good (built-in) way to do u.address.update_or_initialize_with?
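Right now, the manual guard I'd have to write everywhere looks something like this sketch (assuming Mongoid's has_one generates the usual build_address helper):
(u.address || u.build_address).update(street_name: 'Main St')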
Mongoid 5
I am not familiar with Ruby, but I think I understand the problem. Your schema might look like this:
user = {
  _id: user1234,
  address: address789
}

address = {
  _id: address789,
  street_name: "",
  user: user1234
}
// in mongodb (javascript), you can get/update the address of a user this way:
u = User.find({_id: user1234})
u.address // address789
db.address.update({_id: u.address}, {$set: {street_name: "new_street_name"}})
// but since the address has not been created yet, the variable u does not even have an address property:
u.address // => undefined
Perhaps you can try to just create and attach it manually, like this:
// create an address document first, to get the _id of this address
address = {street_name: "something"}
db.address.insert(address)  // the shell assigns address._id on insert
// link (attach) it to u.address
u.update({address: address._id})
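In Mongoid terms, the same create-and-link step can go through the association's builder; a sketch, assuming the standard has_one helper methods:
u.create_address(street_name: 'something')  # inserts the address and sets its user_id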
I had this problem recently. There is a built-in way, but it differs from ActiveRecord's #find_or_initialize_by or #find_or_create_by methods.
In my case, I needed to bulk insert records and update them, or create them if not found, but I believe the same technique can be used even if you are not bulk inserting.
# returns a command hash containing an array of query hashes:
def update_command(users)
  updates = []
  users.each do |user|
    updates << { 'q' => { 'user_id' => user._id },
                 'u' => { 'address' => 'address' },
                 'multi' => false,
                 'upsert' => true }
  end
  { update: Address.collection_name.to_s, updates: updates, ordered: false }
end
def bulk_update(users)
  client = Mongoid.default_client
  command = update_command(users)
  client.command command
  client.close
end
Since you're not bulk updating, and assuming you have a foreign key field called user_id in your Address collection, you might be able to do:
Address.collection.update({ 'q' => { 'user_id' => user._id },
                            'u' => { 'address' => 'address' },
                            'multi' => false,
                            'upsert' => true })
which will match against the user_id, update the given fields when found (address in this case), or create a new document when not found.
For this to work, there is one last crucial step, though:
you must add an index to your Address collection with a special flag.
The field you are querying on (user_id in this case) must be indexed with a flag of either { unique: true } or { sparse: true }. The unique flag will raise an error if you have two or more nil user_id fields; the sparse option won't.
Use that one if you think you may have nil values.
Access your mongo db through the terminal:
show dbs
use your_db_name
Check if the addresses collection already has the index you are looking for:
db.addresses.getIndexes()
If it already has an index on user_id, you may want to remove it:
db.addresses.dropIndex({ user_id: 1 })
and create it again with the following flag:
db.addresses.createIndex({ user_id: 1 }, { sparse: true })
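If you'd rather declare the index in the model and let Mongoid build it, a sketch (model and field names as assumed above):
class Address
  include Mongoid::Document
  belongs_to :user
  field :street_name, type: String

  # sparse index on the foreign key, mirroring the shell command above
  index({ user_id: 1 }, { sparse: true })
end
# then: rake db:mongoid:create_indexes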
https://docs.mongodb.com/manual/reference/method/db.collection.update/
EDIT #1
There seem to be changes in Mongoid 5: instead of User.collection.update you can use User.collection.update_one.
https://docs.mongodb.com/manual/reference/method/db.collection.updateOne/
The docs show that you need a filter rather than a query as the first argument, but they seem to be the same.
Address.collection.update_one({ user_id: user_id },
                              { '$set' => { address: 'the_address' } },
                              upsert: true)
PS:
If you only write { address: 'the_address' } as your update clause, without an update operator such as $set, the whole document will get overwritten rather than just the address field being updated.
EDIT #2
About why you may want to index with unique or sparse
If you look at the upsert section in the link below, you will see:
To avoid multiple upserts, ensure that the filter fields are uniquely
indexed.
https://docs.mongodb.com/manual/reference/method/db.collection.updateOne/
I added the following filter in ActiveAdmin.
filter :roles, as: :select, collection: Model::ROLES, multiple: true
but when I choose a filter value to search the roles, it gives me the following error:
PG::InvalidTextRepresentation: ERROR: malformed array literal: "teacher"LINE 1: ...ted" = $1 AND roles" IN('teacher
DETAIL: Array value must start with "{" or dimension information. ^
Any idea how we can search/filter an ARRAY field using ActiveAdmin filters? I'm using Rails 4.2.4 and Ruby 2.2.2p95.
I came up with a solution slightly different from (and inspired by) this one: https://stackoverflow.com/a/45728004/1170086
Mine involves some changes (and prevents breaking the contains operator in other cases). You're basically going to create two initializer files.
This one is for Arel, in order to support the @> operator (PostgreSQL's array contains operator) for a given table column.
# config/initializers/arel.rb
module Arel
  class Nodes::ContainsArray < Arel::Nodes::Binary
    def operator
      :"@>"
    end
  end

  class Visitors::PostgreSQL
    private

    def visit_Arel_Nodes_ContainsArray(o, collector)
      infix_value o, collector, ' @> '
    end
  end

  module Predications
    def contains(other)
      Nodes::ContainsArray.new self, Nodes.build_quoted(other, self)
    end
  end
end
The other file aims to create a new Ransack predicate, but I also decided to support the :array type (which is not natively supported by Ransack in terms of predicates).
# config/initializers/ransack.rb
module Ransack
  module Nodes
    class Value < Node
      alias_method :original_cast, :cast

      def cast(type)
        return Array(value) if type == :array
        original_cast(type)
      end
    end
  end
end
Ransack.configure do |config|
  config.add_predicate 'contains_array',
                       arel_predicate: 'contains',
                       formatter: proc { |v| "{#{v.join(',')}}" },
                       validator: proc { |v| v.present? },
                       type: :array
end
And in order to use it, all you need to do is:
User.ransack(roles_contains_array: %i[admin manager])
Or as a filter in ActiveAdmin (which is my case):
ActiveAdmin.register User do
  # ...
  filter :roles_contains_array, as: :select, collection: User.roles_for_select
  # ...
end
I hope it works for you as it worked for me. ;)
You can set up a custom ransacker method to first collect the ids you want returned using a regular Postgres search, and then return the results based on those ids:
class User < ApplicationRecord
  ransacker :roles,
            formatter: proc { |str|
              data = where("? = ANY (roles)", str).map(&:id)
              data.present? ? data : nil
            } do |parent|
    parent.table[:id]
  end
end
If your filter is a select drop-down, then this should work fine. If you have a free-form text box, then make sure to use the "in" predicate:
filter :roles_in, as: :string
leandroico's solution works well.
But if you add the predicate with this formatter (note the space after the comma):
formatter: proc { |v| "{#{v.join(', ')}}" },
then you can use the multiple: true keyword in the filter input and filter by more than one value:
filter :roles_contains_array, as: :select, multiple: true, collection: User.roles_for_select
I used the answer from @leandroico to come up with the below wiki-type approach to doing this.
How to Create Custom SQL Searches for ActiveAdmin (using Arel and Ransack)
In ActiveAdmin, filters are declared in app/admin/model.rb like:
ActiveAdmin.register Model do
  filter 'column_name', label: 'column_name', as: :string
end
That will make a search box available on the front-end, with options to choose between:
contains
equals
starts with
ends with
You can even do something like...
filter 'column_name_contains', label: 'column_name', as: :string
...to only have a contains type search available on the front-end.
You can also (after defining some custom methods elsewhere) specify other, non-built-in search methods, like:
filter 'column_name_custom_contains', label: 'column_name', as: :string
The rest of this doc will be about how to define this custom search method, custom_contains
Within config/initializers/arel.rb, define the following:
module Arel
  # this example of custom_contains will cast the SQL column as ::text and then do a wildcard-wrapped ILIKE
  class Nodes::CustomContains < Arel::Nodes::Binary
    def operator
      '::text ILIKE'.to_sym
    end
  end

  class Visitors::PostgreSQL
    private

    def visit_Arel_Nodes_CustomContains(o, collector)
      infix_value o, collector, '::text ILIKE '
    end
  end

  module Predications
    def custom_contains(column_value)
      column_value = self.relation.engine.column_types[self.name.to_s].type_cast_for_database(column_value)
      column_value = "%#{self.relation.engine.send(:sanitize_sql_like, column_value)}%" # wrap escaped value with % wildcard
      column_value = Nodes.build_quoted(column_value, self)
      Nodes::CustomContains.new(self, column_value)
    end
  end
end
module ActiveRecord::QueryMethods
  def custom_contains(predicates)
    return none if predicates.length == 0

    predicates.map { |column_name, column_value|
      column_value = table.engine.column_types[column_name.to_s].type_cast_for_database(column_value)
      column_value = "%#{table.engine.send(:sanitize_sql_like, column_value)}%" # wrap escaped value with % wildcard
      column_value = Arel::Nodes.build_quoted(column_value)
      where Arel::Nodes::CustomContains.new(table[column_name], column_value)
    }.inject(:merge)
  end
end

module ActiveRecord::Querying
  delegate :custom_contains, :to => :all
end
Within config/initializers/ransack.rb, define the following:
Ransack.configure do |config|
  config.add_predicate(
    'custom_contains',
    arel_predicate: 'custom_contains',
    formatter: proc { |v| v.to_s },
    validator: proc { |v| v.present? },
    type: :string
  )
end
The above has accomplished a couple of things:
1) You can use the custom_contains method that was delegated to all ActiveRecord models:
puts Model.custom_contains(column_name: 'search for me').to_sql
2) You can use Ransack to search against the Arel predicates that were defined:
puts Model.ransack(column_name_custom_contains: 'search for me').result.to_sql
However, in order to do the below in ActiveAdmin...
filter 'column_name_custom_contains', label: 'column_name', as: :string
...we must add a scope to Model so that there is a method, column_name_custom_contains, on Model:
scope_name = "#{column_name}_custom_contains".to_sym
unless Model.methods.include?(scope_name)
  Model.scope(
    scope_name,
    ->(value) {
      Model.custom_contains({ column_name.to_sym => value })
    }
  )
end
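For a concrete column, say a hypothetical title column on Model, that boils down to:
Model.scope(:title_custom_contains, ->(value) { Model.custom_contains(title: value) })
# after which this ActiveAdmin filter works:
# filter 'title_custom_contains', label: 'title', as: :string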
Voila!
I'm using the elasticsearch-rails and elasticsearch-model gems and writing a query that happens to be really huge, just because of the way the gem accepts queries.
The query itself isn't very long; it's the filters that are very, very long, and I need to pass variables in to filter the results correctly. Here is an example:
def search_for(input, question_id, tag_id)
  query = {
    :query => {
      :filtered => {
        :query => {
          :match => {
            :content => input
          }
        },
        :filter => {
          :bool => {
            :must => [
              {
                # another nested bool with should
              },
              {
                # another nested bool with must for question_id
              },
              {
                # another nested bool with must for tag_id
              }
            ]
          }
        }
      }
    }
  }

  User.search(query) # provided by the elasticsearch-model gem
end
For brevity's sake, I've omitted the other nested bools, but as you can imagine, this can get quite long quite fast.
Does anyone have any ideas on how to store this? I was thinking of a yml file, but that seems wrong, especially because I need to pass in question_id and tag_id. Any other ideas?
If anyone is familiar with these gems and knows whether the gem's search method accepts other formats, I'd like to know that, too. It looks to me like it just wants something that can be turned into a hash.
I think using a method is fine. I would separate the searching from the query building:
def query_for(input, question_id, tag_id)
  query = {
    :query => {
      ...
end

search query_for(input, question_id, tag_id)
Also, I see that this search functionality is in the User model, but I wonder if it belongs there. Would it make more sense to have a Search or Query model?
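For example, a minimal query-builder object along those lines; a sketch only, with illustrative term filters standing in for the omitted nested bools (UserSearch and its filter fields are not from the gems):

# app/models/user_search.rb (hypothetical)
class UserSearch
  def initialize(input, question_id, tag_id)
    @input, @question_id, @tag_id = input, question_id, tag_id
  end

  # User.search is provided by elasticsearch-model, as in the question
  def results
    User.search(to_query)
  end

  def to_query
    { query: { filtered: { query: match_clause, filter: filter_clause } } }
  end

  private

  def match_clause
    { match: { content: @input } }
  end

  def filter_clause
    { bool: { must: [
      { term: { question_id: @question_id } },  # stand-ins for the nested bools
      { term: { tag_id: @tag_id } }
    ] } }
  end
end

# usage: UserSearch.new('some text', 42, 7).results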
I have a model Event that is connected to MongoDB using Mongoid:
class Event
  include Mongoid::Document
  include Mongoid::Timestamps

  field :user_name, type: String
  field :action, type: String
  field :ip_address, type: String

  scope :recent, -> { where(:created_at.gte => 1.month.ago) }
end
Usually when I use ActiveRecord, I can do something like this to group results:
#action_counts = Event.group('action').where(:user_name =>"my_name").recent.count
And I get results with the following format:
{"action_1"=>46, "action_2"=>36, "action_3"=>41, "action_4"=>40, "action_5"=>37}
What is the best way to do the same thing with Mongoid?
Thanks in advance
I think you'll have to use map/reduce to do that. Look at this SO question for more details:
Mongoid Group By or MongoDb group by in rails
Otherwise, you can simply use the group_by method from Enumerable. It's less efficient, but it should do the trick unless you have hundreds of thousands of documents.
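A sketch of that Enumerable route (it loads the matching documents into memory first):

counts = Event.where(user_name: 'my_name').recent
              .group_by(&:action)
              .map { |action, events| [action, events.size] }
              .to_h
# => {"action_1"=>46, "action_2"=>36, ...}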
EDIT: Example of using map/reduce in this case
I'm not really familiar with it, but by reading the docs and playing around, I couldn't reproduce the exact same hash you want. Try this:
def self.count_and_group_by_action
  map = %Q{
    function() {
      key = this.action;
      value = {count: 1};
      // emit a new document {"_id" => "action", "value" => {count: 1}}
      // for each input document our scope is applied to
      emit(key, value);
    }
  }

  # the idea now is to "flatten" the emitted documents that
  # have the same key. Good, but we need to do something with the values
  reduce = %Q{
    function(key, values) {
      // we prepare a reducedValue
      var reducedValue = {count: 0};
      // we then loop through the values associated to the same key,
      // in this case, the 'action' name
      values.forEach(function(value) {
        // and increment the reducedValue
        reducedValue.count += value.count;
      });
      // finally, we return the 'reduced' value for that key,
      // an 'aggregate' of all the values associated to the same key
      return reducedValue;
    }
  }

  self.map_reduce(map, reduce).out(inline: true)
  # we apply the map_reduce functions
  # inline: true is because we don't need to store the results in a collection
  # we just need a hash
end
So when you call:
Event.where(:user_name =>"my_name").recent.count_and_group_by_action
It should return something like:
[{ "_id" => "action1", "value" => { "count" => 20 }}, { "_id" => "action2" , "value" => { "count" => 10 }}]
Disclaimer: I'm neither a MongoDB nor a Mongoid specialist. I've based my example on what I could find in the referenced SO question and the MongoDB/Mongoid documentation online; any suggestion to make this better would be appreciated.
Resources:
http://docs.mongodb.org/manual/core/map-reduce/
http://mongoid.org/en/mongoid/docs/querying.html#map_reduce
Mongoid Group By or MongoDb group by in rails
I'm trying to use Tire to perform a nested query on a persisted model. The model (Thing) has tags, and I'm looking to find all Things tagged with a certain Tag.
class Thing
  include Tire::Model::Callbacks
  include Tire::Model::Persistence

  index_name { "#{Rails.env}-thing" }

  property :title, :type => :string
  property :tags, :default => [], :analyzer => 'keyword', :class => [Tag], :type => :nested
end
The nested query looks like
class Thing
  def self.find_all_by_tag(tag_name, args)
    self.search(args) do
      query do
        nested path: 'tags' do
          query do
            boolean do
              must { match 'tags.name', tag_name }
            end
          end
        end
      end
    end
  end
end
When I execute the query I get a "not of nested type" error
Parse Failure [Failed to parse source [{\"query\":{\"nested\":{\"query\":{\"bool\":{\"must\":[{\"match\":{\"tags.name\":{\"query\":\"TestTag\"}}}]}},\"path\":\"tags\"}},\"size\":10,\"from\":0,\"version\":true}]]]; nested: QueryParsingException[[test-thing] [nested] nested object under path [tags] is not of nested type]; }]","status":500}
Looking at the source for Tire, it seems that mappings are created from the options passed to the "property" method, so I don't think I need a separate "mapping" block in the class. Can anyone see what I am doing wrong?
UPDATE
Following Karmi's answer below, I recreated the index and verified that the mapping is correct:
thing: {
  properties: {
    tags: {
      type: nested,
      properties: {
        name: {
          type: string
        }
      }
    },
    title: {
      type: string
    }
  }
}
However, when I add new Tags to Thing
thing = Thing.new
thing.title = "Title"
thing.tags << {:name => 'Tag'}
thing.save
The mapping reverts to "dynamic" type and "nested" is lost.
thing: {
  properties: {
    tags: {
      type: "dynamic",
      properties: {
        name: {
          type: string
        }
      }
    },
    title: {
      type: string
    }
  }
}
The query fails with the same error as before. How do I preserve the nested type when adding new Tags?
Yes, indeed, the mapping configuration in property declarations is passed on in the Persistence integration.
In a situation like this, there's always one first question: what does the mapping really look like?
So use, e.g., the Thing.index.mapping method or Elasticsearch's REST API (curl localhost:9200/things/_mapping) to have a look.
Chances are that your index was created with a dynamic mapping based on the JSON you have used, and you changed the mapping later. In that case, the index creation logic is skipped, and the mapping is not what you expect.
There's a Tire issue open about displaying a warning when the index mapping is different from the mapping defined in the model.
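The usual fix in that situation is to delete and recreate the index so the model's mapping is applied, then reindex your documents; a sketch, assuming Tire's index API (destructive, names as in the question):

# WARNING: drops the index and everything in it (hypothetical maintenance snippet)
Thing.index.delete
Thing.index.create :mappings => Thing.mapping_to_hash
# ...then re-save/reimport your Thing documents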