Search in nested Postgresql JSONB column - ruby-on-rails

How do I search a nested jsonb column in Rails?
Model: Shop
jsonb column: shop_data
shop_data: {
  common_data: {
    "image_url" => "https://sample.com/img.jpg",
    "token" => "AOsa2123ASDasdaasasda",
    "uid" => "",
    "expires_at" => ""
  }
}
I want to make scopes that check for records where:
1. shop_data->common_data->expires_at IS NOT NULL
2. shop_data->common_data->image_url IS NULL

1. scope :expired, -> { where("shop_data -> 'common_data' ->> 'expires_at' IS NOT NULL") }
2. scope :null_image, -> { where("shop_data -> 'common_data' ->> 'image_url' IS NULL") }
Source: JSON functions and operators in the PostgreSQL docs
Hope it helps
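Since `->>` always extracts the value as text, the same predicate can be generated for any nested key. A minimal sketch with a hypothetical helper (`jsonb_present_sql` is not part of Rails, just an illustration; note that an empty string `""` is not SQL NULL, so the `expired` scope above also matches shops whose `expires_at` is `""`):

```ruby
# Hypothetical helper: builds a SQL predicate checking that a nested JSONB
# key is present (not SQL NULL) when extracted as text with ->>.
# All keys before the last are traversed with -> (object access),
# and the final key is extracted with ->> (text access).
def jsonb_present_sql(column, *path)
  object_hops = path[0..-2].map { |key| " -> '#{key}'" }.join
  "#{column}#{object_hops} ->> '#{path.last}' IS NOT NULL"
end

jsonb_present_sql("shop_data", "common_data", "expires_at")
# => "shop_data -> 'common_data' ->> 'expires_at' IS NOT NULL"
```

It could then back a scope like `scope :expired, -> { where(jsonb_present_sql("shop_data", "common_data", "expires_at")) }`.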

Related

Exposing a new table and column to rails with elastic search & searchkick

I have added a new field to Searchkick and can get results returned for that model, but I am trying to reach the parent model from the result.
People can have many tags. I can search for People by any attribute that exists on that schema, and I can search for Tags; Searchkick and Elasticsearch return the result.
I want to search by tag name and get back the people associated with that tag.
search_query = if @q.present?
  {
    query: {
      multi_match: {
        query: @q.strip.downcase,
        fields: %w(first_name last_name email_address phone_number address_1 name),
        type: 'cross_fields',
        operator: 'AND'
      }
    }
  }
else
  {
    query: {
      query_string: {
        query: '*',
        default_operator: 'AND'
      }
    },
    # order: { signup_at: :desc },
    # page: search_params[:page],
    # per_page: (Kaminari.config.default_per_page unless request.format == :csv)
  }
end
@results = Person.search search_query
@people = @results.records.where(active: true).order("people.#{sort_column} #{sort_direction}").page(search_params[:page])
This currently works fine for searching people-only attributes. If I replace it with @results = Tag.search search_query and input a tag name, I get the matching tag.
An older query was in place that worked fine, but it had to be changed to allow full-name searching. The old query was:
query: {
  query_string: {
    query: @q.strip.downcase,
    default_operator: 'AND'
  }
},
And that returned the associated tags with the rest of the code remaining unchanged.
Here is the search_data method on the Person model:
# For Searchkick
def search_data
  attributes.
    each { |_k, v| v.downcase if v.is_a? String }.
    merge({
      tag: tags.map { |t| t.name.downcase },
      trait: traits.map { |t| t.name.downcase },
      trait_value: person_traits.map { |pt| pt.value.downcase },
      question: questions.map(&:id),
      answer: answers.map { |a| a.value.downcase },
      last_participated: last_participation_date.to_s,
      signup_at: signup_at.to_s
    })
end
Please let me know if I can provide other information to help.
Since you have tags in the search_data method on Person, you can do:
Person.search("sometag", fields: [:tag])
Make sure your search_data method returns correct data with:
Person.first.search_data
And make sure you have reindexed.
Person.reindex
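If you want a single search to match both person attributes and tag names, another option is to add tag to the multi_match field list. A sketch as a plain hash builder (field names are taken from the question; `build_search_query` is a hypothetical helper, and whether your Searchkick version accepts a raw body hash this way matches the question's existing usage):

```ruby
# Build the Elasticsearch query body, including the indexed :tag field
# from search_data so tag names match alongside person attributes.
def build_search_query(q)
  if q && !q.strip.empty?
    {
      query: {
        multi_match: {
          query: q.strip.downcase,
          fields: %w(first_name last_name email_address phone_number address_1 name tag),
          type: 'cross_fields',
          operator: 'AND'
        }
      }
    }
  else
    # Fall back to match-everything when no term is given.
    { query: { query_string: { query: '*', default_operator: 'AND' } } }
  end
end
```

Used as `@results = Person.search build_search_query(@q)`, the rest of the controller stays unchanged.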

How to query over range of range key of dynamodb?

I'm using 'aws-sdk', '~> 2.6.44'. I have an activities table where I store all the activities performed by a user.
params = {
  table_name: 'activities', # required
  key_schema: [ # required
    {
      attribute_name: 'actor', # required; e.g. User.1
      key_type: 'HASH' # required, accepts HASH, RANGE
    },
    {
      attribute_name: 'created_at', # timestamp
      key_type: 'RANGE'
    }
  ],
  # ...
}
I want to query this table for all the activities performed by a user in the past day. The AWS documentation site seems to cover only SDK version 3.
table_name = 'activities'
params = {
  table_name: table_name,
  key_condition_expression: "#user = :actor and #time between :start_time and :end_time",
  expression_attribute_names: {
    "#user" => 'actor',
    "#time" => "created_at"
  },
  expression_attribute_values: {
    actor: 'User.1',
    ":start_time" => (Time.now - 1.days).to_i,
    ":end_time" => (Time.now + 1.days).to_i
  }
}
DynamodbClient.client.get_item(params)
# Throws: ArgumentError: no such member :key_condition_expression
I tried with a filter expression:
table_name = 'activities'
params = {
  table_name: table_name,
  key: {
    actor: 'User.1'
  },
  filter_expression: "created_at BETWEEN (:id1, :id2)",
  expression_attribute_values: {
    ":id1" => (Time.now - 1.days).to_i,
    ":id2" => (Time.now + 1.days).to_i
  },
  projection_expression: "actor"
}
DynamodbClient.client.get_item(params)
# Throws ArgumentError: no such member :filter_expression
What is the right way to query a DynamoDB table with a range condition on the range key?
Looks like I should use query, since I'm not trying to retrieve one specific record.
The following query worked:
table_name = 'activities'
params = {
  table_name: table_name,
  key_condition_expression: "#user = :actor and #time between :start_time and :end_time",
  expression_attribute_names: {
    "#user" => 'actor',
    "#time" => "created_at"
  },
  expression_attribute_values: {
    ":actor" => 'User.1',
    ":start_time" => (Time.now - 1.days).to_i,
    ":end_time" => (Time.now + 1.days).to_i
  }
}
DynamodbClient.client.query(params)
=> #<struct Aws::DynamoDB::Types::QueryOutput items=[{"actor"=>"User.1", "action"=>"Like", "created_at"=>#<BigDecimal:7fa6418b86e0,'0.150683976E10',18(27)>, "source"=>"FeedSource.661", "body"=>{"id"=>#<BigDecimal:7fa6418b82f8,'0.72E2',9(18)>}, "target"=>"FeedEntry.8419"}], count=1, scanned_count=1, last_evaluated_key=nil, consumed_capacity=nil>
:)

Finding all instances that match value within json stored in Table in Rails

I have a Transfer table in my database, with an archive column in which I store a JSON object.
So I have something like this:
archive: {
  "AuthorId" => "6621381"
}
My goal is to find all the transfers where "AuthorId" => "6621381". Is it possible to do that with Rails?
Something that looks like:
Transfer.where(archive: {"AuthorId" => "6621381"})
Use the ->> operator to access the object field as text:
Transfer.where("archive ->> 'AuthorId' = ?", "123")
=> [
  [0] #<Transfer:0x00000002903a60> {
    :archive => {
      "AuthorId" => "123"
    }
  }
]
It also works with other operators, such as LIKE/ILIKE:
Transfer.where("archive ->> 'AuthorId' ILIKE ?", "12%")
=> [
  [0] #<Transfer:0x00000002893058> {
    :archive => {
      "AuthorId" => "123"
    }
  },
  [1] #<Transfer:0x00000002892c98> {
    :archive => {
      "AuthorId" => "124"
    }
  }
]
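An alternative worth knowing: PostgreSQL's containment operator @> also matches key/value pairs and can use a GIN index on the column, assuming the column type is jsonb (plain json does not support @>). A sketch with a hypothetical helper (`archive_contains` is not a Rails method):

```ruby
require 'json'

# Hypothetical helper: returns a condition array suitable for splatting
# into ActiveRecord's where, using the jsonb containment operator @>.
def archive_contains(attrs)
  ["archive @> ?", JSON.generate(attrs)]
end

archive_contains("AuthorId" => "6621381")
# => ["archive @> ?", "{\"AuthorId\":\"6621381\"}"]
```

Usage: `Transfer.where(*archive_contains("AuthorId" => "6621381"))`.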

Rails update multiple records find based on other id

Using Rails 3.2. As shown in the docs for the update method, it finds records by id:
update(id, attributes)
# id - This should be the id or an array of ids to be updated.
# Updates multiple records
people = { 1 => { "first_name" => "David" }, 2 => { "first_name" => "Jeremy" } }
Person.update(people.keys, people.values)
What if I want to update an array found based on other columns? For example:
people = { 'cook' => { "first_name" => "David" }, 'server' => { "first_name" => "Jeremy" } }
Find people with role = cook and update first_name = David; find people with role = server and update first_name = Jeremy.
I want it to be done in 1 query if possible, and not by SQL. Thanks.
You can achieve this with #update_all, one statement per role (note that update_all does not accept an array of keys and values the way update does):
people = { 'cook' => { "first_name" => "David" }, 'server' => { "first_name" => "Jeremy" } }
people.each do |role, attrs|
  Person.where(role: role).update_all(attrs)
end
In that case I would write my own SQL statement. It depends on which database backend you are using:
http://www.postgresql.org/docs/9.1/static/plpgsql-control-structures.html
https://dev.mysql.com/doc/refman/5.0/en/case.html
The update method doesn't execute one SQL query when passed an array of ids and values. If you look at the source code for update, you will see it loops through the array and executes two queries per record (a find and an update), then returns an array of updated objects.
If you're happy accepting two queries per row, you can use the following code to find people by role:
people = { 'cook' => { "first_name" => "David" }, 'server' => { "first_name" => "Jeremy" } }
updated_people = people.map do |role, attrs|
  person = Person.where(role: role).first!
  person.update(attrs)
  person
end
Note: this code only updates the first record it finds per role, because I've assumed there will be only one cook with the first name 'David'.
If you want a single SQL statement, you should look at doing it in SQL, as devanand suggested.
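For completeness, the single-statement version the linked docs point at is an UPDATE with a CASE expression. A sketch that only builds the SQL string, for illustration (interpolating values like this is unsafe with untrusted input; real code should quote/sanitize):

```ruby
people = { 'cook' => { "first_name" => "David" }, 'server' => { "first_name" => "Jeremy" } }

# Build one UPDATE that maps each role to its new first_name via CASE.
whens = people.map { |role, attrs| "WHEN '#{role}' THEN '#{attrs['first_name']}'" }.join(' ')
roles = people.keys.map { |r| "'#{r}'" }.join(', ')
sql = "UPDATE people SET first_name = CASE role #{whens} END WHERE role IN (#{roles})"
```

The resulting string could then be run with something like ActiveRecord::Base.connection.execute(sql).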

How to filter search by attribute only if it exists using ElasticSearch and Tire?

Right now I have:
Tire.search INDEX_NAME do
  query do
    filtered do
      query { string term }
      filter :or, { missing: { field: :app_id } },
                  { terms: { app_id: app_ids } }
    end
  end
end.results.to_a
Returning items that either have no app_id or one that matches your terms sounds like a job for an or filter. I'd try:
filter :or, [
  { :not => { :exists => { :field => :app_id } } },
  { :terms => { :app_id => app_ids } }
]
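The filter body is plain data, so it can be built and inspected outside the Tire DSL; a small sketch with a hypothetical helper mirroring the arguments above:

```ruby
# Hypothetical helper: the :or filter as a plain structure — match documents
# with no app_id field at all, or whose app_id is in the allowed list.
def app_id_or_filter(app_ids)
  [
    { :not => { :exists => { :field => :app_id } } },
    { :terms => { :app_id => app_ids } }
  ]
end
```

It would be used as `filter :or, app_id_or_filter(app_ids)` inside the filtered block.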
