Search json field with mongoid - ruby-on-rails

I've got the following MongoDB/Mongoid document:
#<Event
_id: 51e406a91d41c89fa2000002,
namespace: :namespace,
bucket: "order_created",
data: {
"order"=>{
"id"=>100,
"deleted_at"=>nil,
"created_at"=>2011-10-06 15:45:04 UTC,
"updated_at"=>2013-07-10 16:37:07 UTC,
"completed_at"=>2013-07-10 16:37:07 UTC
}
}>
Here is the event class definition:
class Event
include Mongoid::Document
field :namespace, type: Symbol
field :bucket, type: String
field :data, type: Hash
end
I want to find and update it using Mongoid's find_and_modify method, but I can't figure out how to structure the search criteria so they match against the data field properly.
Specifically, I want to find data.order.id = 100. I've tried the following with no luck:
Event.where(:data.order.id => 100)
Event.where(:'data["order"]["id"]' => 100)
Event.where( { data: { order: { id: 100 } } } )
Event.where( { data: { :"order" => { :"id" => 100 } } } )
The latter returns nil, but the former (and, from the documentation I've read, the correct way to do it) gives me a SyntaxError: unexpected ':'.
This is with Mongoid 3.1.4 and MongoDB 2.4.5.

Answering my own question. The Event class is not referencing a collection, which is what's critical for the Criteria search to work. I've instantiated a new db object to use against the collection and the find/where methods work. Here's an example:
@db = Mongoid::Sessions.default
@db[:events].find().first['order']
@db[:events].where("data.order.id" => 100).first
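For completeness, Mongoid criteria also accept string keys in MongoDB dot notation, so something like the following sketch may work directly against the model; the find_and_modify call uses the Mongoid 3.x criteria API, and the $set payload is only an illustrative update:
# Not from the original answer, just a sketch: dot-notation string keys are
# passed straight through to MongoDB, so the nested hash can be queried on
# the model itself.
Event.where("data.order.id" => 100).first

# find_and_modify on a criteria (Mongoid 3.x); the update document here is
# only an example.
Event.where("data.order.id" => 100).find_and_modify(
  { "$set" => { "data.order.completed_at" => Time.now.utc } },
  new: true
)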

Related

Defining input types in graphql-ruby

I'm trying to implement an input type for filters with graphql-ruby and rails.
Base input type looks like:
module Types
class BaseInputObject < GraphQL::Schema::InputObject
end
end
My own input type looks like:
module Types
class PhotoFilterType < Types::BaseInputObject
argument :attribution, String, "Filters by submitter", required: false
argument :country, String, "Filters by country", required: false
argument :year, Int, "Filters by year", required: false
end
end
The field definition on my query_type looks like:
field :filtered_photos, [Types::PhotoType], null: true do
argument :filters, Types::PhotoFilterType, 'filters for photo', required: true
end
And the query as follows:
const FILTER_QUERY = gql`
query getFilteredPhotos($filters: PhotoFilterType!) {
filteredPhotos(filters: $filters) {
id
title
width
height
}
}
`
Interacting with the backend using react-apollo as follows:
this.props.client.query({
query: FILTER_QUERY,
variables: {
filters: {
attribution: author.length > 0 ? author : '',
country: country.length > 0 ? country : '',
year: year ? year : 9999
}
}
})
I get the following error
{message: "PhotoFilterType isn't a defined input type (on $filters)",…}
But when I interact with rails console, I see:
irb(main):004:0> Types::PhotoFilterType
=> Types::PhotoFilterType
So I don't think the type being undefined is an issue.
Any idea what is going wrong and how to fix it? Thanks in advance.
The problem was solved by changing:
const FILTER_QUERY = gql`
query getFilteredPhotos($filters: PhotoFilter!) { # not PhotoFilterType
filteredPhotos(filters: $filters) {
id
title
width
height
}
}
`
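For context: graphql-ruby derives the default GraphQL name from the class name, with namespaces and a trailing "Type" removed, which is why the schema only knows this input object as PhotoFilter. If you would rather keep the longer name in query documents, a sketch like this, setting graphql_name explicitly, should also work:
module Types
  class PhotoFilterType < Types::BaseInputObject
    # Override the derived name ("PhotoFilter") so clients can keep writing
    # $filters: PhotoFilterType! in their query documents.
    graphql_name "PhotoFilterType"

    argument :attribution, String, "Filters by submitter", required: false
    argument :country, String, "Filters by country", required: false
    argument :year, Integer, "Filters by year", required: false
  end
end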

Graphql Field doesn't exist on type

After skimming through the GraphQL docs I've started to implement it in a toy Rails/React project. The project allows a user to sign in through Devise and then access a dummy /artist route that displays a list of artists. Everything seems to work fine until I try to consume API data with GraphQL from the React app to fetch the artists and display them.
On the server side, I have a graphql_controller.rb such as:
class GraphqlController < ApiController
rescue_from Errors::Unauthorized do |exception|
render json: {errors: ['Unauthorized.']}, status: :unauthorized
end
def index
result = Schema.execute params[:query], variables: params[:variables], context: context
render json: result, status: result['errors'] ? 422 : 200
end
private
def context
token, options = ActionController::HttpAuthentication::Token.token_and_options request
{
ip_address: request.remote_ip
}
end
end
Then, following my model logic, I have set up GraphQL under graph/ with the following files:
graph/queries/artist_query.rb
ArtistQuery = GraphQL::ObjectType.define do
name 'ArtistQuery'
description 'The query root for this schema'
field :artists, types[Types::ArtistType] do
resolve(->(_, _, _) {
Artist.all
})
end
end
types/artist_type.rb
Types::ArtistType = GraphQL::ObjectType.define do
name 'Artist'
description 'A single artist.'
field :id, !types.ID
field :name, types.String
field :description, types.String
end
schema.rb
Schema = GraphQL::Schema.define do
query ArtistQuery
end
On the client side, for the sake of keeping things organized, I use 3 files to render this artist list:
First, ArtistSchema.js
import { gql } from 'react-apollo';
const artistListQuery = gql`
{
query {
artists {
id
name
description
}
}
}
`;
export default artistListQuery;
Then, an Artist.js
import React, { Component } from 'react';
class Artist extends Component {
render() {
return (
<tr>
<td>{this.props.index + 1}</td>
<td>{this.props.data.name}</td>
<td>{this.props.data.description} min</td>
</tr>
);
}
}
export default Artist;
And finally, wrapping these two together in a larger layout: Artists.jsx:
import React, { Component } from 'react';
import {graphql} from 'react-apollo';
import Artist from './Artist';
import artistListQuery from './ArtistSchema';
class Artists extends Component {
render() {
if(this.props.data.loading) {
return (<p>Loading</p>)
} else {
console.log(this.props.data)
const ArtistsItems = this.props.data.artists.map((data,i) => {
return (<Artist key={i} index={i} data={data}></Artist>);
});
return (
<div>
<h1>Artists</h1>
<table className="table table-striped table">
<thead>
<tr>
<th>#</th>
<th>Name</th>
<th>Description</th>
</tr>
</thead>
<tbody>
{ ArtistsItems }
</tbody>
</table>
</div>
);
}
}
}
export default graphql(artistListQuery)(Artists);
What happens when this code is executed:
On the server side (sorry for the unformatted output, but this is how it appears in the console):
Processing by GraphqlController#index as */*
18:49:46 api.1 | Parameters: {"query"=>"{\n query {\n artists {\n id\n name\n description\n __typename\n }\n __typename\n }\n}\n", "operationName"=>nil, "graphql"=>{"query"=>"{\n query {\n artists {\n id\n name\n description\n __typename\n }\n __typename\n }\n}\n", "operationName"=>nil}}
Followed by the error:
Completed 422 Unprocessable Entity in 36ms (Views: 0.2ms | ActiveRecord: 0.0ms)
On the client side, if I monitor Network > Response for graphql, I (of course) receive a 422 error code and the following error message:
{"errors":[{"message":"Field 'query' doesn't exist on type 'ArtistQuery'","locations":[{"line":2,"column":3}],"fields":["query","query"]}]}
I assume my query is not written correctly. I have been trying various query formats (from docs and gist examples) but I cannot find a correct way to get back my artist data.
What am I doing wrong?
I don't think this is the issue in this particular case, but I was getting this error and it turned out to be due to a simple syntax error: query attributes need to be in camelCase format, not under_score format. Maybe this will help someone else who lands here when searching for this error like I did.
In my case the issue was that I had a structure like this:
module Types
class SomeType < Types::BaseObject
field :comparator,
Types::ComparatorType
field :options,
[Types::OptionType]
end
end
BUT in the query I had another nested structure called data that I had forgotten about:
mutation {
myMutation(
...
) {
someType {
comparator {
...
}
data {
options {
...
}
}
}
}
}
Changing the SomeType class to add that missing field solved my issue. Now it looks like this:
module Types
class SomeType < Types::BaseObject
field :comparator,
Types::ComparatorType
field :data,
Types::DataType
end
end
# New file
module Types
class DataType < Types::BaseObject
field :options,
[Types::OptionType]
end
end
The GraphQL query you sent is malformed: it is asking for a field named query on the query root object. Use this instead:
const artistListQuery = gql`
query UseTheNameYouWantHere {
artists {
id
name
description
}
}
`;
BTW, you can add the graphiql-rails gem (https://github.com/rmosolgo/graphiql-rails) to get a playground for your GraphQL API.
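A sketch of the usual setup for that gem, mounted only in development; the /graphql route here is an assumption matching the GraphqlController#index action above, so adjust it to your actual endpoint:
# config/routes.rb
Rails.application.routes.draw do
  if Rails.env.development?
    # Serves the GraphiQL IDE at /graphiql, pointed at the GraphQL endpoint below.
    mount GraphiQL::Rails::Engine, at: "/graphiql", graphql_path: "/graphql"
  end

  post "/graphql", to: "graphql#index"
end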
You might need to use a newer API endpoint URL version.
Example request URL: https://shopify-graphiql-app.shopifycloud.com/api/2022-01/graphql
All right, the problem is not in the code itself. I changed the query to the one you advised and renamed artists to something else on both the client and server side.
It seems like apollo-rails caches previous API calls, even if they failed, which is why I was always getting the same error. I still have to figure out how to clear this cache. Thanks for your time.
I had this error in RSpec, but not in a real call (e.g. via Postman) for the same query. The solution was to change the query in the spec from
<<~QUERY
query users {
posts {
...
}
}
QUERY
to
<<~QUERY
query users {
users {
posts {
...
}
}
}
QUERY
However, in other specs the first approach worked with a similar-looking mutation, so I am not sure what was wrong.
If you are still experiencing the issue, and you have already double-checked that:
You are making a GraphQL query using camel case. E.g. visitCount instead of visit_count
You have a valid field and it is spelled correctly.
field :visit_count, Integer, :null => false, :description => "The sum of.."
The field is on the correct type.
E.g. if you are writing a query like:
{
user {
nodes {
blogs {
name
visitCount
}
}
}
}
your visit_count field should be in the BlogType.
If you are still experiencing this issue:
Check whether you have a "Looking up hash key" error.
It may look like:
Failed to implement Blog.visitCount, tried:
- `Types::BlogType#visit_count`, which did not exist
- `AppGraphql#visit_count`, which did not exist
- Looking up hash key `:visit_count` or `"visit_count"` on `#<AppGraphql:0x00007f8394982e88>`, but it wasn't a Hash.
To implement this field, define one of the methods above (and check for typos)
If this is the case, it's looking up a key for visit_count instead of visitCount. We need to update the field to include a hash_key.
❌ Before:
field :visit_count, Integer, :null => false
✅ After:
field :visit_count, Integer, :null => false, :hash_key => :visitCount
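An alternative to hash_key, sketched here: define a resolver method on the type. In graphql-ruby, object is the underlying value (the hash in this case), so the method can read whichever key spelling is actually present:
field :visit_count, Integer, :null => false

def visit_count
  # `object` is the underlying hash; the camelCase key is an assumption taken
  # from the error message above.
  object[:visitCount] || object["visitCount"]
end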

Better way to return validation errors for nested models as json in Rails?

I have an Order class that can contain entries, and each entry can be of a complex type, consisting of further entries (members).
class Order < ActiveRecord::Base
accepts_nested_attributes_for :entries
end
class Entry < ActiveRecord::Base
accepts_nested_attributes_for :members, allow_destroy: true
end
In the form I have Rails-generated fields, using fields_for:
<input autocomplete="off" class="string required form-control" id="order_entries_attributes_1459329286687_members_attributes_1459329286739_title" name="order[entries_attributes][1459329286687][members_attributes][1459329286739][title]" placeholder="Title" type="text">
So I submit an order form, for example with 2 entries and 5 members, with some validation errors (2 members without a title), and it is passed to the controller:
class OrdersController
def update
if @order.update(order_params)
render json: @order
else
render json: @order.errors, status: :unprocessable_entity
end
end
end
And it returns this:
{"entries.members.title":["cant be blank"]}
The problem is that I can't tell which entry, and which member within it, has a validation error, so I can't, for example, highlight that field. Moreover, it merges similar errors together, and that is the problem.
On submit I pass a unique index (in the name attributes), and Rails correctly uses it for creating the nested models; it would be nice if the error response contained these indexes.
Is there any other way to return nicely indexed errors from the server and use Rails as a JSON API without pain?
Updated to have the same format as Rails nested params
render json: {
order: {
entries: @order.entries.enum_for(:each_with_index).collect{|entry, index|
{
index => {
id: entry.id,
errors: entry.errors.to_hash,
members: entry.members.enum_for(:each_with_index).collect{|member, index|
{
index => {
id: member.id,
errors: member.errors.to_hash
}
} unless member.valid?
}.compact
}
} unless entry.valid?
}.compact
}
}
You should get a JSON response like:
{
order: {
entries: [
0: {
id: 1, # nil, if new record
errors: {},
members: [
0: {
id: 7, # nil, if new record
errors: {
title: ["cant be blank"]
}
},
1: {
id: 13, # nil, if new record
errors: {
title: ["cant be blank"]
}
}
]
}
]
}
}
P.S. Maybe others know a Rails-integrated way of doing this. Otherwise, I would say this might be a good feature request for Rails on GitHub.
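For what it's worth, newer Rails versions (5+) ship something close to a built-in answer: the index_errors: true option on has_many, which makes nested error keys carry the child's index. A sketch, assuming the associations from the question (the exact key format for doubly nested records is worth verifying):
class Order < ActiveRecord::Base
  # index_errors: true (Rails 5+) turns merged keys like "entries.title"
  # into indexed ones like "entries[0].title".
  has_many :entries, index_errors: true
  accepts_nested_attributes_for :entries
end

class Entry < ActiveRecord::Base
  has_many :members, index_errors: true
  accepts_nested_attributes_for :members, allow_destroy: true
end

# @order.errors.to_hash should then contain keys along the lines of
# "entries[0].members[1].title", which the client can map back to form fields.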

mongoid criteria result doesn't fill all field

I am new to Rails and MongoDB, as well as Mongoid. Here is my model:
class User
include Mongoid::Document
include Mongoid::Timestamps
field :fbid, type: String
field :facebookname, type: String
field :competitorFbid, type: String
field :createdAt, type: DateTime
field :updatedAt, type: DateTime
# building constructor the rails way: http://stackoverflow.com/a/3214293/474330
def initialize(options = {})
@fbid = options[:fbid]
@facebookname = options[:facebookname]
@competitorFbid = options[:competitorFbid]
end
def writeasjson
hash = { :fbid => @fbid,
:facebookname => @facebookname,
:competitorFbid => @competitorFbid,
:createdAt => @createdAt,
:updatedAt => @updatedAt
}
hash.to_json
end
attr_accessor :fbid, :facebookname, :competitorFbid, :createdAt, :updatedAt
end
I am using Mongoid to query my MongoDB database like this:
myuser = User.where(fbid: params[:fbid]).first
render :json => myuser.writeasjson
However, in the result all the fields are null.
If I print the criteria result like this,
render :json => myuser
it prints the _id, authData and bcryptPassword fields, but the rest of the fields have null values.
Here is what I get from the MongoDB database in my app. If I query from MongoHub, all the null values are filled in:
{
"_id": {
"$oid": "56d2872f00af597fa584e367"
},
"authData": {
"facebook": {
"access_token": "yEf8cZCs9uTkrOq0ZCHJJtgPFxPAig9yhW6DhBCLuJqPdMZBLPu",
"expiration_date": "2016-04-17T13:52:12.000Z",
"id": "9192631770"
}
},
"bcryptPassword": "$2a$10$9mUW3JWI51GxM1VilA",
"competitorFbid": null,
"createdAt": null,
"created_at": null,
"facebookname": null,
"fbid": null,
"objectId": "nLurZcAfBe",
"runCount": 2446,
"sessionToken": "0SwPDVDu",
"updatedAt": null,
"updated_at": null,
"username": "XgcWo4iUCK"
}
I have been debugging the whole day without getting anywhere; any help will be greatly appreciated.
EDIT:
Adding the response:
{"_id":{"$oid":"56d2872f00af597fa584e366"},"authData":{"facebook":{"access_token":"[ACCESS_TOKEN_REMOVED]","expiration_date":"2015-12-19T14:17:25.000Z","id":"[ID_REMOVED]"}},"bcryptPassword":"[PASSWORD_REMOVED]","competitorFbid":null,"createdAt":null,"created_at":null,"facebookname":null,"fbid":null,"objectId":"H5cEMtUzMo","runCount":790,"sessionToken":"[SESSION_TOKEN_REMOVED]","updatedAt":null,"updated_at":null,"username":"[USERNAME_REMOVED]"}
A field in the database is declared using the field method:
field :fbid, type: String
This also defines fbid and fbid= methods to work with the fbid attribute.
An instance variable with associated accessor and mutator methods is declared using the attr_accessor method:
attr_accessor :fbid
This will also add fbid and fbid= methods to work with the underlying instance variable.
They're not the same thing. Mongoid only knows about fields; those are what it works with in the database, so your query works. field also defines accessor and mutator methods for your fields.
But you have an attr_accessor call after your field calls, so the methods that field creates (such as fbid and fbid=) are overwritten by those created by attr_accessor, and the attr_accessor versions read plain instance variables that Mongoid never sets. The result is that all your attributes appear to be nil.
The solution is to drop the attr_accessor call from your class. You only need the field calls.
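A minimal sketch of the cleaned-up model, keeping the question's field and method names: the attr_accessor is gone and writeasjson uses the readers that field generates. The custom initialize is also omitted here, on the assumption that Mongoid's default constructor (which takes an attributes hash) is enough:
class User
  include Mongoid::Document
  include Mongoid::Timestamps

  field :fbid, type: String
  field :facebookname, type: String
  field :competitorFbid, type: String
  field :createdAt, type: DateTime
  field :updatedAt, type: DateTime

  def writeasjson
    # The field-generated readers return the values Mongoid loaded from the document.
    {
      :fbid => fbid,
      :facebookname => facebookname,
      :competitorFbid => competitorFbid,
      :createdAt => createdAt,
      :updatedAt => updatedAt
    }.to_json
  end
end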

Mongoid Group By or MongoDb group by in rails

I have a Mongo collection that holds statistical data like the following:
course_id
status which is a string, played or completed
and timestamp information using Mongoid's Timestamping feature
so my class is as follows...
class Statistic
include Mongoid::Document
include Mongoid::Timestamps
include Mongoid::Paranoia
field :course_id, type: Integer
field :status, type: String # currently this is either play or complete
end
I want to get a daily count of the total number of plays for a course. So, for example, 8/1/12 had 2 plays, 8/2/12 had 6 plays, etc. I would therefore be using the created_at timestamp field together with course_id and status. The issue is I don't see a group-by method in Mongoid. I believe MongoDB has one now, but I'm unsure how that would be done in Rails 3.
I could run through the collection using each and hack together a map or hash in Rails with incrementation, but if the course has 1 million views, retrieving and iterating over a million records could be messy. Is there a clean way to do this?
As mentioned in the comments, you can use map/reduce for this purpose. You could define the following method in your model (http://mongoid.org/en/mongoid/docs/querying.html#map_reduce):
def self.today
map = %Q{
function() {
emit(this.course_id, {count: 1})
}
}
reduce = %Q{
function(key, values) {
var result = {count: 0};
values.forEach(function(value) {
result.count += value.count;
});
return result;
}
}
self.where(:created_at.gt => Date.today, status: "played").
map_reduce(map, reduce).out(inline: true)
end
which would produce a result like this:
[{"_id"=>1.0, "value"=>{"count"=>2.0}}, {"_id"=>2.0, "value"=>{"count"=>1.0}}]
where _id is the course_id and count is the number of plays.
There is also a dedicated group method in MongoDB, but I am not sure how to get at the bare MongoDB collection in Mongoid 3; I have not had a chance to dive into the code that much yet.
You may wonder why I emit a document {count: 1}, since I could have emitted an empty document (or anything else) and then always added 1 to result.count for every value. The thing is that reduce is not called if only one emit has been done for a particular key (in my example, a course_id that has been played only once), so it is better to emit documents in the same format as the result.
Using Mongoid
stages = [{
"$group" => {
"_id" => { "date_column_name" => "$created_at" },
"plays_count" => { "$sum" => 1 }
}
}]
@array_of_objects = ModelName.collection.aggregate(stages, {:allow_disk_use => true})
OR
stages = [{
"$group" => {
"_id" => {
"year" => { "$year" => "$created_at" },
"month" => { "$month" => "$created_at" },
"day" => { "$dayOfMonth" => "$created_at" }
},
"plays_count" => { "$sum" => 1 }
}
}]
@array_of_objects = ModelName.collection.aggregate(stages, {:allow_disk_use => true})
Follow the links below for more on group-by queries with Mongoid:
https://taimoorchangaizpucitian.wordpress.com/2016/01/08/mongoid-group-by-query/
https://docs.mongodb.org/v3.0/reference/operator/aggregation/group/
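Putting the question and the aggregation answer together, a sketch of a per-day play count for a single course might look like the following; the daily_play_counts method name and the course id in the usage example are just placeholders, and the aggregate call may differ slightly between Mongoid versions:
class Statistic
  # Hypothetical helper: $match narrows to one course and to "played" records
  # before $group counts them per calendar day of created_at.
  def self.daily_play_counts(course_id)
    pipeline = [
      { "$match" => { "course_id" => course_id, "status" => "played" } },
      { "$group" => {
          "_id" => {
            "year"  => { "$year" => "$created_at" },
            "month" => { "$month" => "$created_at" },
            "day"   => { "$dayOfMonth" => "$created_at" }
          },
          "plays_count" => { "$sum" => 1 }
      } }
    ]
    collection.aggregate(pipeline)
  end
end

# e.g. Statistic.daily_play_counts(100).to_a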
