Mongoid Aggregate result into an instance of a rails model - ruby-on-rails

Introduction
While correcting legacy code, I found an index of LandingPage objects where most columns are supposed to be sortable but aren't. This is mostly fixed now, but a few columns are still giving me trouble.
These columns are the ones that need an aggregation, because they are based on a count of other documents. To simplify the explanation, I will talk only about one of them, called Visit, as the rest of the code is just duplication.
The code fetches sorted and paginated data, then modifies each object using LandingPage methods before sending the JSON back. It was already like this and I can't change it.
Because of that, I need to run an aggregation (to sort LandingPage by Visit counts), then get the results back as LandingPage instances so the legacy code can work on them.
The problem is that I am unable to turn the documents returned by the aggregation into LandingPage instances.
Here is the error I got:
Mongoid::Errors::UnknownAttribute:
Message:
unknown_attribute : message
Summary:
unknown_attribute : summary
Resolution:
unknown_attribute : resolution
Here is my code:
def controller_function
  landing_page_hash = {} # result hash rendered as JSON
  landing_pages = fetch_landing_page
  landing_page_hash[:data] = landing_pages.map do |landing_page|
    landing_page.do_something
    # Do other things
  end
  render json: landing_page_hash
end

def fetch_landing_page
  criteria = LandingPage.where(archived: false)
  columns_name = params[:columns_name]
  column_direction = params[:column_direction]
  case columns_name
  when 'visit'
    order_by_visits(criteria, column_direction)
  else
    criteria.order_by(columns_name => column_direction).paginate(
      per_page: params[:length],
      page: (params[:start].to_i / params[:length].to_i) + 1
    )
  end
end

def order_by_visits(landing_pages, column_direction)
  LandingPage.collection.aggregate([
    { '$match': landing_pages.selector },
    { '$lookup': {
      from: 'visits',
      localField: '_id',
      foreignField: 'landing_page_id',
      as: 'visits'
    }},
    { '$addFields': { 'visits_count': { '$size': '$visits' }}},
    { '$sort': { 'visits_count': column_direction == 'asc' ? 1 : -1 }},
    { '$unset': ['visits', 'visits_count'] },
    { '$skip': params[:start].to_i },
    { '$limit': params[:length].to_i }
  ]).map { |attrs| LandingPage.new(attrs) { |o| o.new_record = false } }
end
What I have tried
Copying and pasting the hash from the console into LandingPage.new(attributes): the instance was created and valid.
Changing the attribute keys from strings to symbols: it still didn't work.
Calling is_a?(Hash) on any element of the returned array returns true.
Converting it to JSON and then back to a hash: I still got a Mongoid::Document.
How can I make the return of the aggregation be valid instances of LandingPage?

The aggregation pipeline is implemented by the Ruby MongoDB driver, not by Mongoid, and as such it does not return Mongoid model instances.
An example of how one might obtain Mongoid model instances from the raw documents is given in the documentation.
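As a sketch of that approach (assuming Mongoid::Factory.from_db is available in your Mongoid version; it is what Mongoid itself uses when loading documents from the database), where pipeline stands for the aggregation array from order_by_visits above:

raw_docs = LandingPage.collection.aggregate(pipeline)

landing_pages = raw_docs.map do |attrs|
  # Builds the model the way Mongoid does when reading from the database:
  # attributes are assigned directly rather than through setters, so extra
  # keys coming out of the pipeline should not raise
  # Mongoid::Errors::UnknownAttribute, and the document is marked as persisted.
  Mongoid::Factory.from_db(LandingPage, attrs)
end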

Related

Products Filter by title in rails app + shopify_app gem

I have tried almost everything. I checked all the docs, Stack Overflow questions, and the Shopify community, for example:
Shopify API how to do a search query with like
https://community.shopify.com/c/Shopify-APIs-SDKs/Shopify-api-search-products-by-title/td-p/341866
How to search products by title using Shopify product search API?
https://github.com/Shopify/shopify_app
https://community.shopify.com/c/Shopify-APIs-SDKs/Search-product-from-title-handle-and-description/td-p/469156
and found out that
#search = "2018";
#products = ShopifyAPI::Product.find(:all, params: { limit: 10,title:#search })
but this is returning empty array although I have may records containing this in title. https://prnt.sc/sx37o6
I want to get records according to #search
I have tried Product.search too but it causes: undefined method `search' for ShopifyAPI::Product:Class
Using the REST API I also failed to do the filtering (with wildcards, for example). But with the GraphQL API the search functionality (see https://shopify.dev/concepts/about-apis/search-syntax) is pretty solid.
Here is an example including auth, filtering by title with wildcard support (for partial matching), and mapping the results to a simple array of hashes:
@responses = []
shopify_session = ShopifyAPI::Session.temp(
  domain: shop.shopify_domain,
  token: shop.shopify_token,
  api_version: ShopifyApp.configuration.api_version
) do
  client = ShopifyAPI::GraphQL.client
  ql_query = <<-GRAPHQL
    {
      products(first: 10, query: "title:*#{query}*") {
        edges {
          node {
            id
            title
            handle
          }
        }
      }
    }
  GRAPHQL
  query_result = client.query(client.parse(ql_query))
  query_result.data.products.edges.each do |result|
    @responses << {
      id: result.node.id,
      title: result.node.title,
      handle: result.node.handle
    }
  end
end
@responses
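Note that query in the snippet above is assumed to be the search term supplied by the user; the surrounding * wildcards are what give the partial-title matching. A hypothetical call site (values are illustrative only):

query = "2018" # hypothetical search term interpolated into "title:*#{query}*"
# After the session block runs, @responses is an array of hashes such as:
# [{ id: "gid://shopify/Product/...", title: "... 2018 ...", handle: "..." }]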

How to use geoNear for List of Articles, when Addresses are in separate table?

I need a RoR MongoDB query to list articles within a given radius, sorted by created_at.
The challenge is that addresses are saved in a separate collection and referenced by key/id from the articles. I don't know how to build a geoNear query for this scenario.
Pagination is also needed, and a performant query is desirable.
My current approach:
1. Get addresses within the defined radius.
2. Get the articles associated with the address results from step 1.
3. sort_by address (the geoNear default).
Pagination makes use of last_address_id. There is also an issue here, as the last page loads in a loop.
# searches_controller.rb
def index
  @addresses = Address.get_addresses_with_radius(article_search_params).to_a
  @address_hash = @addresses.group_by { |a| a['_id'].to_s }
  @articles = Article.includes(:gift, :category)
                     .where(
                       transaction_status: { '$nin' => ["concluded"] },
                       address_id: { :$in => @addresses.map { |a| a['_id'].to_s } }
                     ).to_a
                     .sort_by { |m| @addresses.map { |a| a['_id'] }.index(m['address_id']) }
end
# address.rb
def self.get_addresses_with_radius(params, additional_query = {})
  # raw query for aggregate with geoNear
  last_maximum_distance = params[:last_maximum_distance] || 0 # in meters
  radius = params[:radius] || 5000000 # in meters
  query_params = additional_query
  if params[:last_address_id]
    query_params[:_id] ||= {}
    query_params[:_id] = query_params[:_id].merge(
      '$ne' => BSON::ObjectId(params[:last_address_id])
    )
  end
  addresses_in_radius = Address.collection.aggregate([
    {
      '$geoNear': {
        near: {
          type: "Point",
          coordinates: [params[:lat].to_f, params[:lon].to_f]
        },
        distanceField: "distance_from", # geoNear will automatically store the distance here
        minDistance: last_maximum_distance.to_f,
        maxDistance: radius,
        query: query_params,
        # query: { 'location.0': { '$ne' => params[:last_lat].to_f }, 'location.1': { '$ne' => params[:last_lon].to_f } },
        spherical: true
      }
    },
    { "$limit": params[:per_page].to_i }
  ])
  addresses_in_radius
end
Currently I'm getting the list of articles sorted by address/distance, as per the default geoNear behavior => it should be sorted by created_at.
Pagination is somehow based on addresses => it should ideally be based on articles.
Pagination is buggy, as the last page keeps loading in a loop => this loop bug should go away.
I'm not sure whether it's best to first search for articles and then addresses, or first for addresses and then get the articles; the relevant note is that everything must be within the defined radius.
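One way to at least get the created_at ordering (a sketch building on the controller code above, assuming Article is a Mongoid model): keep the radius filter from step 1, but let the database order the articles by created_at instead of re-sorting them by address order in Ruby.

# Sketch: same radius filter as in the controller above, but ordered by
# created_at (newest first) instead of by geoNear/address order.
@articles = Article.includes(:gift, :category)
                   .where(
                     transaction_status: { '$nin' => ['concluded'] },
                     address_id: { '$in' => @addresses.map { |a| a['_id'].to_s } }
                   )
                   .order_by(created_at: :desc)
                   .to_a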

Looping over Ruby hash and accessing values

I am quite new to Ruby and could not find an appropriate answer to my question. Let's say I have a hash named
users_hsh = {}.
I am looping through all of my users in the DB and creating the following.
users.each do |user|
  users_hsh[user.full_name] = {
    completed_activities: some_integer_value,
    active_activities: some_integer_value,
    future_activities: some_integer_value
  }
end
Now I have created a new hash named
total_sum_not_zero_user_hsh = {}.
I want to loop over all of the users in users_hsh and check, for each user, whether the total sum of completed_activities + active_activities + future_activities is not 0; if this condition holds, I want to add the user to total_sum_not_zero_user_hsh. I have done the following, but it seems that this does not work.
users_hsh.each do |usr|
  if usr.values.sum != 0
    total_sum_not_zero_user_hsh[usr] = {
      completed_activities: some_integer_value,
      active_activities: some_integer_value,
      future_activities: some_integer_value
    }
  end
end
What am I doing wrong? Thanks in advance!
Let's use your example of:
users_hash = {
  "Elvin Jafarli" => {
    completed_activities: 10,
    active_activities: 2,
    future_activities: 0
  }
}
Think carefully about what your data structure actually is: it's a hash that maps a user name to some user attributes. If you loop through it, each element isn't just a usr; it's precisely this name-to-attributes mapping.
It's helpful to name your variables descriptively:
users_hsh.each do |user_name, user_attributes|
  if user_attributes.values.sum != 0
    # ...
  end
end
With your attempt, you would have seen an error like this: NoMethodError: undefined method 'values' for #<Array:0x00007fe14e22f538>. What happened is that each usr was actually an Array such as:
["Elvin Jafarli", {completed_activities: 10, active_activities: 2, future_activities: 0}]

Rails app: I need to build a json object from params inside a loop

I need to build a json object inside a loop using params.
My params look like this...
params[:answers]
returns => {"1"=>"answer1", "2"=>"answer2"}
The keys in this JSON object are the IDs of the survey questions.
So I planned to loop through the keys to build the JSON object like this...
def build_answersheet_json(params[:answers], params[:survey_id])
  params[:answers].keys.each do |question_id|
    current_question = question_id
    current_answer = params[:answers][question_id]
  end
end
Since I'm using "t.json" in my migration to save JSON to Postgres, I wanted to use the extracted question_id and answer to build a JSON object that looks something like this...
{
  survey_id: '1',
  answers: {
    question: [{
      question_id: 1,
      answer: 'answer1'
    }, {
      question_id: 2,
      answer: 'answer2'
    }]
  }
}
I've been trying to do this using a method call that looks something like this...
build_answersheet_json(params[:answers], params[:survey_id])
I've tried JSON.parse() and I've tried to just work through it logically, but I can't seem to figure this out.
Any help is appreciated.
Maybe you can try something like this:
# fake params (to test)
params = {
  survey_id: '1',
  answers: {
    "1" => "answer1",
    "2" => "answer2",
    "3" => "answer3",
    "4" => "answer4"
  }
}

def build_answersheet_json(answers, survey_id)
  {
    survey_id: survey_id,
    answers: answers.map { |k, v| { question_id: k.to_i, answer: v } }
  }
end
survey = build_answersheet_json(params[:answers], params[:survey_id])
puts survey.class
#Hash
puts survey.to_json
# formatted JSON string:
# {
# "survey_id":"1",
# "answers":[
# {"question_id":1,"answer":"answer1"},
# {"question_id":2,"answer":"answer2"},
# {"question_id":3,"answer":"answer3"},
# {"question_id":4,"answer":"answer4"}
# ]
# }
In order to save to a t.json Postgres column type, just pass the survey Hash object, like this:
YourModel.create(survey: survey)
Source: http://edgeguides.rubyonrails.org/active_record_postgresql.html
Try
{
  survey: ¯\_༼◉ل͟◉༽_/¯,
}
JSON may fail to parse if it contains a construction like this:
survey = {
}
JSON cannot contain = or assignments.
Check the real values of your variables with puts varname.inspect near the code lines where you see unexpected behaviour.

retrieve data from database - hash

I have a table called audits which has a column 'changes' storing data in the form of a hash.
I would like to retrieve all entries with the following conditions:
- auditable_type = 'Expression'
- action = 'destroy'
- changes = { :EXP_SUBMISSION_FK =>'9999992642'}
I first tried the following code, which returns nothing:
@deleted_history = Audit.find(:all, :conditions => ["auditable_type = ? AND action = ? AND changes = ?", 'Expression', 'destroy', { :EXP_SUBMISSION_FK => '9999992642' }])
I then tried the following code which retrieves all entries in the 'audits' table with auditable_type = 'Expression' and action = 'destroy'.
I then loop through the result set and discard all entries where EXP_SUBMISSION_FK is not equal to 9999992642. The code below returns 5 entries/records:
@deleted_history = Audit.find(:all, :conditions => ["auditable_type = ? AND action = ?", 'Expression', 'destroy'])
@deleted_history.each do |test|
  if test.changes['EXP_SUBMISSION_FK'] != 9999992642
    @deleted_history = @deleted_history.reject { test }
  end
end
I would like to know where I went wrong with the first code example, and whether there is a simpler way of retrieving all entries with the aforementioned conditions.
Thanks a lot for your help.
I'd do:
@deleted_history.select! { |hist| hist.changes['EXP_SUBMISSION_FK'] == '9999992642' }
One potential cause of failure is that you're comparing against the integer 9999992642, while you stated earlier that the value is the string '9999992642'.
You can use something like the below. I am storing element_values as a hash, and I am selecting records based on the key/value pair.
scope :find_by_field_values, lambda { |field_name, field_value|
  where("element_values like ?", "%\"#{field_name}\":\"%#{field_value}%")
}
Just try this, adapted to your scenario.
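For example, a hypothetical call, assuming a scope like the one above is defined on your model and element_values is your serialized hash column:

# Hypothetical usage of the scope above; the field name and value come from the original question.
YourModel.find_by_field_values('EXP_SUBMISSION_FK', '9999992642')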
