JSON (not jsonb) column merge for multiple objects in Rails

Initially I have a JSON hash value like:
a = { "1": 1, "2": 2 } (initial JSON hash)
Now I need to add a new key-value pair to the JSON hash:
{ "3": 3 } (new key-value pair)
After merging the new value, my hash should look like:
a = { "1": 1, "2": 2, "3": 3 } (resulting JSON hash)
Can you please share your logic for doing this across multiple objects?
Note: 1. My column is not jsonb. It's a json column.
2. I am using a PostgreSQL database.
3. The key-value pair has to be merged into the column of multiple objects.

That's simple. We can make use of Ruby's Hash#merge to get the expected result.
a = { "1": 1, "2": 2 }
b = { "3": 3 }
result = a.merge(b) # will give you { "1": 1, "2": 2, "3": 3 }
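Since the question mentions multiple objects, here is a minimal read-modify-write sketch from the Rails side; the Setting model and its data json column are hypothetical names, not from the question:

# Hypothetical model "Setting" with a json column "data".
new_pair = { "3" => 3 }

Setting.find_each do |setting|
  setting.update!(data: (setting.data || {}).merge(new_pair))
end

This issues one UPDATE per record; the SQL approach below does it in a single statement.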

If you are using Postgres 9.5+, you can cast the value to jsonb, concatenate with the || operator, and then cast it back to the json type.
UPDATE t
SET json_col = (json_col::jsonb || '{ "3": 3 }'::jsonb)::json;
For older versions, you may have to convert it to text, splice the strings together manually, and cast the result back to json.
UPDATE t
SET json_col = (replace(json_col::text, '}', ',')
                || replace('{ "3": 3 }', '{', ''))::json;
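From Rails, the single-statement version can be issued directly; the settings table and json_col column names below are assumptions for the sketch:

ActiveRecord::Base.connection.execute(<<~SQL)
  UPDATE settings
  SET json_col = (json_col::jsonb || '{ "3": 3 }'::jsonb)::json;
SQL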

Related

KSQL - Select Columns from Array of Struct as Arrays

Similar to KSQL streams - Get data from Array of Struct, my input JSON looks like:
{
  "Obj1": {
    "a": "abc",
    "b": "def",
    "c": "ghi"
  },
  "ArrayObj": [
    {
      "key1": "1",
      "key2": "2",
      "key3": "3"
    },
    {
      "key1": "4",
      "key2": "5",
      "key3": "6"
    },
    {
      "key1": "7",
      "key2": "8",
      "key3": "9"
    }
  ]
}
I have created a stream with:
CREATE STREAM Example1(Obj1 STRUCT<a VARCHAR, b VARCHAR, c VARCHAR>, ArrayObj ARRAY<STRUCT<key1 VARCHAR, key2 VARCHAR, key3 VARCHAR>>) WITH (kafka_topic='sample_topic', value_format='JSON', partitions=1);
However, I would like only a single row of output from each input JSON document, with the data from each column in the array flattened into arrays, like:
a b key1 key2 key3
abc def [1, 4, 7] [2, 5, 8] [3, 6, 9]
Is this possible with KSQL?
At present you can only flatten ArrayObj in the way you want if you know up front how many elements it will have:
CREATE STREAM flatten AS
  SELECT
    Obj1.a AS a,
    Obj1.b AS b,
    ARRAY[ArrayObj[1]['key1'], ArrayObj[2]['key1'], ArrayObj[3]['key1']] AS key1,
    ARRAY[ArrayObj[1]['key2'], ArrayObj[2]['key2'], ArrayObj[3]['key2']] AS key2,
    ARRAY[ArrayObj[1]['key3'], ArrayObj[2]['key3'], ArrayObj[3]['key3']] AS key3
  FROM Example1;
I guess if you knew the array was only ever going to be up to a certain size, you could use a CASE statement to selectively extract the elements, e.g.
-- handles arrays of size 2 or 3 elements, i.e. third element is optional.
CREATE STREAM flatten AS
  SELECT
    Obj1.a AS a,
    Obj1.b AS b,
    ARRAY[ArrayObj[1]['key1'], ArrayObj[2]['key1'], ArrayObj[3]['key1']] AS key1,
    ARRAY[ArrayObj[1]['key2'], ArrayObj[2]['key2'], ArrayObj[3]['key2']] AS key2,
    CASE
      WHEN ARRAY_LENGTH(ArrayObj) >= 3
        THEN ARRAY[ArrayObj[1]['key3'], ArrayObj[2]['key3'], ArrayObj[3]['key3']]
      ELSE null
    END AS key3
  FROM Example1;
If that doesn't suit your needs then the design discussion going on at the moment around lambda function support in ksqlDB may be of interest: https://github.com/confluentinc/ksql/pull/5661

Evaluate stringified json having interpolatable value

Suppose a variable b = 2 and a stringified JSON:
j = '{"b": "#{b}", "c": null}'
The desired result is:
{
  "b" => "2",
  "c" => nil
}
My observation:
Since the JSON string contains null, we cannot eval it, because Ruby will complain about an undefined variable or method null. Also, I don't want to replace null with nil.
The only option left is to parse and evaluate.
So I tried the following:
eval(JSON.parse(j).to_s)
which results in
{
  "b" => "\#{b}"
}
How can I achieve the desired result?
It should be like this: j = "{'b': '#{b}', 'c': null}"
UPDATE:
Sorry, it should be like this:
b = 2
JSON.parse("{ \"b\": \"#{b}\", \"c\": null }")

Parse a complex hash and return changes to keys

I'm using json-compare gem to compare two different json files.
Example file 1:
{"suggestions": [
{
"id1": 1,
"title1": "Test",
"body1": "Test"
}
]
}
Example file 2:
{"suggestions": [
{
"id2": 1,
"title2": "Test",
"body2": "Test"
}
]
}
The gem works well and spits out a hash that looks like this:
{:update=>
  {"suggestions"=>
    {:update=>
      {0=>
        {:append=>
          {"id2"=>1, "title2"=>"Test", "body2"=>"Test"},
         :remove=>
          {"id1"=>1, "title1"=>"Test", "body1"=>"Test"}
        }
      }
    }
  }
}
How can I parse this and return all the places where JSON keys were changed? For the sake of simplicity, how would I print the following to the console:
id1 changed to id2
title1 changed to title2
body1 changed to body2
For the purpose of what I'm building I don't need to know changes to the values. I just need to know that id1 became id2, etc.
Unless you are relying on key ordering, there is no way to tell that id1 got replaced by id2 and title1 by title2, rather than id1 becoming title2 and title1 becoming id2. It sounds like you need logic tied to the actual key names (in this example, matching keys that differ only in their integer suffix).
Maybe this can be enough for the purpose:
def find_what_changed_in(mhash, result = [])
  result << mhash
  # Stop descending once we hit the innermost {:append=>..., :remove=>...} pair
  return if mhash.keys == [:append, :remove]
  mhash.keys.each { |k| find_what_changed_in(mhash[k], result) }
  result.last
end
find_what_changed_in(changes)
#=> {:append=>{"id2"=>1, "title2"=>"Test", "body2"=>"Test"}, :remove=>{"id1"=>1, "title1"=>"Test", "body1"=>"Test"}}
Where:
changes = {:update=>
  {"suggestions"=>
    {:update=>
      {0=>
        {:append=>
          {"id2"=>1, "title2"=>"Test", "body2"=>"Test"},
         :remove=>
          {"id1"=>1, "title1"=>"Test", "body1"=>"Test"},
          ...
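Building on that helper, here is a small sketch of my own for printing the requested output. It assumes, as in this example, that a renamed key keeps its non-numeric prefix (id1/id2, title1/title2), which is the suffix-matching logic suggested above:

changed = find_what_changed_in(changes)

removed_keys  = changed[:remove].keys
appended_keys = changed[:append].keys

removed_keys.each do |old_key|
  prefix  = old_key.sub(/\d+\z/, '')                               # "id1" -> "id"
  new_key = appended_keys.find { |k| k.sub(/\d+\z/, '') == prefix }
  puts "#{old_key} changed to #{new_key}" if new_key
end
# id1 changed to id2
# title1 changed to title2
# body1 changed to body2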

RoR - Find if part of a string matches an array of hashes

I know there are a lot of similar questions but I am struggling to find a specific answer.
I have an array of hashes with a key of symbol and a value of price. I am looking to filter the array to only include the hashes whose symbol ends with the letters ETL.
Data looks like:
[
  {
    "symbol": "ABCDEF",
    "price": "4"
  },
  {
    "symbol": "GHIETL",
    "price": "5"
  }
]
You can use something like this:
array.select { |hash| hash[:symbol].end_with? "ETL" }
From the Ruby docs for select:
Returns an array containing all elements of enum for which the given block returns a true value.
end_with? also accepts multiple suffixes, so you can filter by more than one at a time. For example:
array.select { |hash| hash[:symbol].end_with? "ETL", "DEF" }
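If the data starts out as a JSON string (as the example suggests), parse it first; symbolize_names: true is what makes hash[:symbol] work, otherwise the keys come back as strings. A minimal sketch with made-up sample data:

require 'json'

json = '[{"symbol": "ABCDEF", "price": "4"}, {"symbol": "GHIETL", "price": "5"}]'
array = JSON.parse(json, symbolize_names: true)

array.select { |hash| hash[:symbol].end_with?("ETL") }
# => [{:symbol=>"GHIETL", :price=>"5"}]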

Rails PSQL query JSON for nested array and objects

So I have a JSON document stored in a text field, I'm using PostgreSQL, and I need to query the field, but the part I need is nested a bit deep. Here's the format:
[
  {
    "name": "First Things",
    "items": [
      {
        "name": "Foo Bar Item 1",
        "price": "10.00"
      },
      {
        "name": "Foo Item 2",
        "price": "20.00"
      }
    ]
  },
  {
    "name": "Second Things",
    "items": [
      {
        "name": "Bar Item 3",
        "price": "15.00"
      }
    ]
  }
]
And I need to query the name INSIDE the items node. I have tried some queries, but to no avail, like:
.where('this_json::JSON #> [{"items": [{"name": ?}]}]', "%#{name}%")
How should I go about this? I can query a flat JSON key like this_json::JSON -> 'key' = ?, but I need help with this nested bit.
Here you need to use json_array_elements() twice, since your top-level document is an array of JSON objects and each object's items key holds another array of sub-documents. A sample query could look like the following:
SELECT
  item->>'name'  AS item_name,
  item->>'price' AS item_price
FROM t,
  json_array_elements(t.v) js_val,
  json_array_elements(js_val->'items') item;
where t is the name of your table and v is the name of your JSON column (since yours is stored as text, cast it first: t.v::json).
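To filter records by a nested item name from Rails, one option is to reuse the same double json_array_elements expansion in raw SQL; the Receipt model, receipts table, and this_json column below are assumed names for the sketch:

name = "Foo"
Receipt.find_by_sql([<<~SQL, "%#{name}%"])
  SELECT DISTINCT r.*
  FROM receipts r,
       json_array_elements(r.this_json::json) js_val,
       json_array_elements(js_val->'items') item
  WHERE item->>'name' ILIKE ?
SQL

DISTINCT keeps each matching record once even when several of its nested items match.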
