I have been struggling for a few days trying to get queries to work. At the moment my model looks like this:
class Geojson
  include Mongoid::Document

  field :type, type: String, default: 'Point'
  field :coordinates, type: Array

  index({ coordinates: '2dsphere' }, { bits: 12 })
end
The following query returns nil:
Geojson.find(:coordinates => {"$nearSphere" => [-70.1197340629727, 4.67071244438]})
These are the current instances in my database:
[#<Geojson _id: 61b7b21a9eb0c9ef0aa5626d, type: "Point", coordinates: [-74.13041168951031, 4.6638117]>,
#<Geojson _id: 61b7b2619eb0c9ef0aa5626e, type: "Point", coordinates: [-74.1213041168951, 4.5638117]>]
I am able to run similar queries in mongosh with no issues; however, I am not sure where the mistake is when doing it directly in Rails.
I finally managed to make it work the following way (for a 2dsphere index):
Geojson.where(:coordinates => {"$nearSphere" => [long, lat]}).to_a
where long and lat are the longitude and latitude parameters received.
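For example, in a controller action the working query might look like this (a sketch; the param names long and lat are assumptions):

# long/lat arrive as request parameters, e.g. ?long=-74.12&lat=4.56
long = params[:long].to_f
lat  = params[:lat].to_f

# $nearSphere expects coordinates in [longitude, latitude] order
nearby = Geojson.where(:coordinates => { '$nearSphere' => [long, lat] }).to_a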
When retrieving results from Elasticsearch using Chewy, date-typed fields are returned as Ruby strings.
My index is defined like this:
class OrdersIndex < Chewy::Index
  define_type Order do
    field :id, type: "keyword"
    field :created_at, type: "date"
  end
end
When the results are retrieved:
OrdersIndex.order(created_at: :desc).first.created_at.class
# => String
Is there a way to deserialize this date field into a Ruby date object automatically, without having to explicitly map the results using Time.parse?
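For reference, the explicit mapping I'd like to avoid looks roughly like this (a sketch using Ruby's stdlib Time.parse):

require 'time'

# Fetch the hits and manually coerce the string field into a Time object
orders = OrdersIndex.order(created_at: :desc).to_a
first_created_at = Time.parse(orders.first.created_at)
first_created_at.class
# => Time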
I'm working on a Rails app in which I need to receive an array of hashes in an API call with Grape, like:
{
  "tournament_id": 1,
  "match_id": 10,
  "teams_with_scores": [
    { "team_id": 1, "score": 10 },
    { "team_id": 2, "score": 20 }
  ]
}
so that I can receive the score of each team in a single call for a specific match and tournament, instead of multiple calls for each team's score.
I have tried multiple things, like:
group :teams_with_scores, type: Array, desc: "An array of Teams with scores" do
  requires :team_id, type: String, desc: "Team ID"
  requires :score, type: String, desc: "Score"
end
But I don't have a clue how to do it.
You can send this data as a JSON string and then parse it when you receive it:
params do
  requires :scores_info, type: String, desc: 'the scores info'
end

get do
  scores_info = JSON.parse(params[:scores_info])
end
When using the Moped gem, I can store an array of hashes with:
users = [{username: "ben", password: "123456", type: "admin" }, {username: "joe", password: "abcd1234" }]
Mongoid::Sessions.default["collection"].insert(users)
With Mongoid documents it would look like:
class User
  include Mongoid::Document

  field :username, type: String
  field :password, type: String
end
users.each { |user_hash| User.create(user_hash) }
Which means one insert operation per document.
Do you know a way to keep the single operation method? Maybe something like a transaction in ActiveRecord?
You can convert the documents back to hashes and insert them with a single call to .create:
User.create(users.map(&:attributes))
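Note that .create still issues one insert per document under the hood. If a true single database operation is the goal, one option (a sketch, assuming Mongoid 5+ with the underlying mongo driver) is to use the driver collection directly; this bypasses Mongoid callbacks and validations:

users = [
  { username: "ben", password: "123456", type: "admin" },
  { username: "joe", password: "abcd1234" }
]

# One round trip to the server for all documents
User.collection.insert_many(users)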
I'm trying to use Tire to perform a nested query on a persisted model. The model (Thing) has Tags, and I'm looking to find all Things tagged with a certain Tag.
class Thing
  include Tire::Model::Callbacks
  include Tire::Model::Persistence

  index_name { "#{Rails.env}-thing" }

  property :title, :type => :string
  property :tags, :default => [], :analyzer => 'keyword', :class => [Tag], :type => :nested
end
The nested query looks like:
class Thing
  def self.find_all_by_tag(tag_name, args)
    self.search(args) do
      query do
        nested path: 'tags' do
          query do
            boolean do
              must { match 'tags.name', tag_name }
            end
          end
        end
      end
    end
  end
end
When I execute the query, I get a "not of nested type" error:
Parse Failure [Failed to parse source [{\"query\":{\"nested\":{\"query\":{\"bool\":{\"must\":[{\"match\":{\"tags.name\":{\"query\":\"TestTag\"}}}]}},\"path\":\"tags\"}},\"size\":10,\"from\":0,\"version\":true}]]]; nested: QueryParsingException[[test-thing] [nested] nested object under path [tags] is not of nested type]; }]","status":500}
Looking at the source for Tire, it seems that mappings are created from the options passed to the "property" method, so I don't think I need a separate "mapping" block in the class. Can anyone see what I am doing wrong?
UPDATE
Following Karmi's answer below, I recreated the index and verified that the mapping is correct:
thing: {
  properties: {
    tags: {
      properties: {
        name: {
          type: string
        }
      },
      type: nested
    },
    title: {
      type: string
    }
  }
}
However, when I add new Tags to a Thing:
thing = Thing.new
thing.title = "Title"
thing.tags << {:name => 'Tag'}
thing.save
The mapping reverts to a "dynamic" type and "nested" is lost:
thing: {
  properties: {
    tags: {
      properties: {
        name: {
          type: string
        }
      },
      type: "dynamic"
    },
    title: {
      type: string
    }
  }
}
The query fails with the same error as before. How do I preserve the nested type when adding new Tags?
Yes, indeed, the mapping configuration in property declarations is passed on in the Persistence integration.
In a situation like this, the first and only question is always: what does the mapping actually look like?
So, use e.g. the Thing.index.mapping method or Elasticsearch's REST API (curl localhost:9200/things/_mapping) to have a look.
Chances are that your index was created with a dynamic mapping based on the JSON you indexed, and you changed the mapping definition later. In that case, the index creation logic is skipped and the mapping is not what you expect.
There's an open Tire issue about displaying a warning when the index mapping differs from the mapping defined in the model.
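In practice that usually means dropping the stale index and recreating it, so the mapping from the model's property declarations is applied. A sketch, assuming a Tire version that provides the create_elasticsearch_index helper (deleting the index destroys its documents, so reindex afterwards):

# WARNING: this deletes all documents in the index
Thing.index.delete

# Recreate the index using the mappings declared in the model
Thing.create_elasticsearch_index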
I currently have a Mongoid model in a Ruby on Rails application, as follows:
class Listen
  include Mongoid::Document

  field :song_title, type: String
  field :song_artist, type: String
  field :loc, type: Array
  field :listened_at, type: Time, default: -> { Time.now }

  index([[:loc, Mongo::GEO2D]], background: true)
end
When I try to query the collection, for example:
listens = Listen.where(:loc => {"$within" => {"$centerSphere" => [location, (radius.fdiv(6371))]}})
I get the following error back (locations have been blanked out; the X's are not actually returned):
Mongo::OperationFailure (can't find special index: 2d for: { loc: { $within: { $centerSphere: [ [ XX.XXXXXXX, X.XXXXXXX ], 0.0001569612305760477 ] } } }):
I know I can create the indexes through a rake task such as rake db:mongoid:create_indexes, but I don't want to have to do this every time a model is created. Is there any way for the model to create the index automatically on insert into the collection?
Nope, there is no way.
You must create indexes (not just geo indexes) once before you can use them.
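That said, the one-time step can be automated, for example by calling Mongoid's Model.create_indexes at boot (a sketch; the initializer file name is my own choice):

# config/initializers/mongoid_indexes.rb (hypothetical file name)
# Ensure declared indexes exist once at application boot,
# instead of remembering to run the rake task by hand.
Rails.application.config.after_initialize do
  Listen.create_indexes
end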