I want to sort a hash by position. I am using sort_by, but it is not sorting as it should:
hash = {
"a": {"name": "a", "type": "text", "position": 1, "required": "false"},
"b": {"name": "b", "type": "text", "position": 4, "required": "false"},
"c": {"name": "c", "type": "text", "position": 2, "required": "false"},
"d": {"name": "d", "type": "text", "position": 3, "required": "false"}
}
To sort this I am using the following command:
temp = hash.sort_by { |k,v| k[0]['position'] }
There is no error, but I get back the same hash without any sorting. Even when I use temp to create a new hash, it is the same. I want it to be sorted by position 1, 2, 3, 4. This is part of a Ruby on Rails app where I am creating these fields.
The sort_by block receives two arguments, k and v, which refer to each entry's key and value.
Since you want to sort by position, you have to use v[:position]:
hash.sort_by { |k, v| v[:position] }
#=> [[:a, {:name=>"a", :type=>"text", :position=>1, :required=>"false"}],
# [:c, {:name=>"c", :type=>"text", :position=>2, :required=>"false"}],
# [:d, {:name=>"d", :type=>"text", :position=>3, :required=>"false"}],
# [:b, {:name=>"b", :type=>"text", :position=>4, :required=>"false"}]]
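Note that sort_by on a Hash returns an array of [key, value] pairs, as shown above. If you want a Hash back, chain .to_h; Ruby hashes preserve insertion order, so the result stays sorted:
temp = hash.sort_by { |_k, v| v[:position] }.to_h
temp.keys
#=> [:a, :c, :d, :b]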
I have an Array of Hashes like this, which should be "merged" by its unique nested values:
[
  {
    "slug": "color",
    "values": [{ "slug": "amethyst" },
               { "slug": "coral" }]
  },
  {
    "slug": "color",
    "values": [{ "slug": "amethyst" }]
  },
  {
    "slug": "power-source",
    "values": [{ "slug": "110V" }]
  }
]
At the same time it should count the duplicate values, made unique in an items array:
{ "slug": "color",
"items": [
{
"slug": "amethyst",
"count": 2
},
{
"slug": "coral",
"count": 1
}]
},
{
"slug": "power-source",
"items": [
{
"slug": "110V",
"count": 1
}]
}
]
is there a "Rails method" to achieve this?
Thank you
I don't think there is anything built into Rails for such a custom requirement, but you can achieve it by combining a few methods and their return values:
data
  .group_by { |hash| hash[:slug] }
  .transform_values do |values|
    values
      .flat_map { |vals| vals[:values] }
      .group_by { |value| value[:slug] }
      .transform_values(&:count)
  end
  .map do |slug, items|
    [slug, items.map { |item, count| { slug: item, count: count } }]
  end
  .map { |slug, items| { slug: slug, items: items } }
# [{:slug=>"color",
# :items=>[{:slug=>"amethyst", :count=>2}, {:slug=>"coral", :count=>1}]},
# {:slug=>"power-source", :items=>[{:slug=>"110V", :count=>1}]}]
As you can see, you first group every hash in the array by its slug value, then transform each group by flat-mapping its values arrays and grouping those by slug to get the totals.
After that you just build the hashes with the keys/values you need.
It might simplify things a bit if you end up with a single hash whose keys are the "slugs" and whose values are the items.
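For example, here is a minimal sketch of that single-hash variant (assuming Ruby 2.7+ for tally; data is the input array from the question):
data
  .group_by { |hash| hash[:slug] }
  .transform_values do |entries|
    # flatten each group's nested values arrays and count how often each slug occurs
    entries.flat_map { |entry| entry[:values] }.map { |value| value[:slug] }.tally
  end
#=> {"color"=>{"amethyst"=>2, "coral"=>1}, "power-source"=>{"110V"=>1}}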
I need to delete all created_at and updated_at keys in an array of hashes. My hash looks like:
assessment_with_desc = {
"id": 1,
"name": "First assessment",
"created_at": "2020-03-14T20:13:27.006Z",
"updated_at": "2020-03-14T20:13:27.006Z",
"description_with_child_models": [
{
"id": 3,
"title": "First category",
"created_at": "2020-02-20T15:32:46.379Z",
"updated_at": "2020-03-14T20:16:11.530Z",
"accessment_id": 1,
"sub_categories": [
{
"id": 1,
"title": "First sub_category",
"category_id": 3,
"created_at": "2020-02-20T15:40:49.793Z",
"updated_at": "2020-02-20T15:40:49.793Z",
"stages": [
{
"id": 5,
"title": "First stage",
"sub_category_id": 1,
"created_at": "2020-02-20T15:44:10.603Z",
"updated_at": "2020-02-20T15:44:10.603Z"
}
]
}
]
}
]
}
I tried the following, but it only works for the top-level assessment_with_desc.delete('created_at') and assessment_with_desc.delete('updated_at') calls:
assessment_with_desc.delete('created_at')
assessment_with_desc.delete('updated_at')
assessment_with_desc['description_with_child_models'].delete('created_at')
assessment_with_desc['description_with_child_models'].delete('updated_at')
assessment_with_desc['description_with_child_models'][0]['sub_categories'].delete('created_at')
assessment_with_desc['description_with_child_models'][0]['sub_categories'].delete('updated_at')
assessment_with_desc['description_with_child_models'][0]['sub_categories'][0]['stages'].delete('created_at')
assessment_with_desc['description_with_child_models'][0]['sub_categories'][0]['stages'].delete('updated_at')
You can recursively invoke a method that removes the key/value pair whenever the key is :created_at or :updated_at:
def recursively_delete_timestamps(object)
  object.transform_values do |value|
    next value unless value.is_a?(Array)

    value.map do |inner_hash|
      recursively_delete_timestamps(inner_hash)
    end
  end.reject do |key, _|
    key.in?(%i[created_at updated_at])
  end
end
recursively_delete_timestamps(assessment_with_desc)
# {:id=>1,
# :name=>"First assessment",
# :description_with_child_models=>
# [{:id=>3,
# :title=>"First category",
# :accessment_id=>1,
# :sub_categories=>
# [{:id=>1,
# :title=>"First sub_category",
# :category_id=>3,
# :stages=>[{:id=>5, :title=>"First stage", :sub_category_id=>1}]}]}]}
Notice that the original hash remains unchanged: neither transform_values, map, nor reject modifies the original object; they all return new ones.
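If you are on Rails (ActiveSupport) or Ruby 3.0+, where Hash#except is available, an equivalent sketch of the same idea, dropping the timestamps with except instead of reject, could look like this (my variant, not the code above):
def recursively_delete_timestamps(object)
  # drop the timestamp keys first, then recurse into any nested arrays of hashes
  object.except(:created_at, :updated_at).transform_values do |value|
    next value unless value.is_a?(Array)

    value.map { |inner_hash| recursively_delete_timestamps(inner_hash) }
  end
end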
I am trying to load data from a Redis DB. I have an API-only Rails app and I am trying to render the JSON data as required.
Currently I am able to get the data from Redis in the following format:
[
{
"id": 1,
"name": "Stephenie Meyer",
"created_at": "2018-04-17T07:40:50.417Z",
"updated_at": "2018-04-17T07:40:50.417Z"
},
{
"id": 2,
"name": "V.C. Andrews",
"created_at": "2018-04-17T07:40:50.613Z",
"updated_at": "2018-04-17T07:40:50.613Z"
},
{
"id": 3,
"name": "Sophie Kinsella",
"created_at": "2018-04-17T07:40:50.646Z",
"updated_at": "2018-04-17T07:40:50.646Z"
}
]
How can I convert this so that the name, created_at and updated_at key/value pairs become a hash keyed by the id value?
Into this:
{"id": 1,
{
"name": "Stephenie Meyer",
"created_at": "2018-04-17T07:40:50.417Z",
"updated_at": "2018-04-17T07:40:50.417Z"
}
}
Helper method for getting the Redis data:
def fetch_authors
  authors = $redis.get('authors')
  if authors.nil?
    authors = Author.all.to_json
    $redis.set("authors", authors).to_json
    $redis.expire("authors", 5.hour.to_i)
  end
  JSON.load authors
end
And displaying it on the index page using:
def index
  @authors = fetch_authors
  render json: @authors
end
The closest to what you want would probably be:
input = ...
input.map { |hash| [hash.delete(:id) || hash.delete('id'), hash] }.to_h
#⇒ {1=>{:name=>...},
#   2=>{:name=>...},
#   3=>{:name=>...}}
It's not exactly what you want, because what you wrote isn't valid syntax, but you can achieve something similar with group_by:
arr = [
{
"id": 1,
"name": "Stephenie Meyer",
"created_at": "2018-04-17T07:40:50.417Z",
"updated_at": "2018-04-17T07:40:50.417Z"
},
{
"id": 2,
"name": "V.C. Andrews",
"created_at": "2018-04-17T07:40:50.613Z",
"updated_at": "2018-04-17T07:40:50.613Z"
},
{
"id": 3,
"name": "Sophie Kinsella",
"created_at": "2018-04-17T07:40:50.646Z",
"updated_at": "2018-04-17T07:40:50.646Z"
}
]
arr.group_by { |e| e[:id] }
This will return:
{
1 => [
{
:id => 1,
:name => "Stephenie Meyer",
:created_at => "2018-04-17T07:40:50.417Z",
:updated_at => "2018-04-17T07:40:50.417Z"
}
],
2 => [
{
:id => 2,
:name => "V.C. Andrews",
:created_at => "2018-04-17T07:40:50.613Z",
:updated_at => "2018-04-17T07:40:50.613Z"
}
],
3 => [
{
:id => 3,
:name => "Sophie Kinsella",
:created_at => "2018-04-17T07:40:50.646Z",
:updated_at => "2018-04-17T07:40:50.646Z"
}
]
}
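If you also want to drop the :id key from each value and avoid the wrapping arrays that group_by produces, here is a small sketch combining both answers (ids are unique here, so each group holds a single entry; except needs ActiveSupport or Ruby 3.0+):
arr.group_by { |e| e[:id] }
   .transform_values { |entries| entries.first.except(:id) }
#=> {1=>{:name=>"Stephenie Meyer",
#        :created_at=>"2018-04-17T07:40:50.417Z",
#        :updated_at=>"2018-04-17T07:40:50.417Z"},
#    2=>{...}, 3=>{...}}   # ids 2 and 3 follow the same shape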
I have a simple "rss" (ApplicationRecord) table indexed by an id. I would like to have a structured JSON that group each user from a family in an array structure. And then each family in a global array. How can I do that ?
my current plain code to put my data in a json file is :
json.rss @rss do |rs|
  json.id rs.id
  json.name rs.name
  json.family rs.family
  json.lastdate rs.lastdate
  json.last rs.last
  json.s1w rs.s1w
  json.s2w rs.s2w
end
But the target file that I want is this one:
{
"rss": [
{
"familyname": "Smith",
"children": [
{
"id": "1",
"name": "bob",
"lastdate": "2010-09-23",
"last": "0.88",
"s1w": "0.83",
"s2w": "0.88"
},
{
"id": 2,
"name": "Mary",
"lastdate": "2011-09-23",
"last": "0.89",
"s1w": "0.83",
"s2w": "0.87"
}
]
},
{
"familyname": "Wesson",
"children": [
{
"id": "1",
"name": "john",
"lastdate": "2001-09-23",
"last": "0.88",
"s1w": "0.83",
"s2w": "0.88"
},
{
"id": 2,
"name": "Bruce",
"lastdate": "2000-09-23",
"last": "0.89",
"s1w": "0.83",
"s2w": "0.87"
}
]
}
]
}
The grouping you are trying to achieve can be done in Ruby with:
@rss.group_by(&:family).values
This assumes @rss is an array-like collection of objects that respond to .family. The result is an array of arrays of objects grouped by family.
Now it is up to you to use Jbuilder's array! method to build the desired JSON output.
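As a rough sketch of how that could look in a Jbuilder template (this is my hypothetical index.json.jbuilder, assuming each record responds to the attributes used in your current template):
json.rss @rss.group_by(&:family) do |familyname, children|
  json.familyname familyname
  json.children children do |child|
    json.id child.id
    json.name child.name
    json.lastdate child.lastdate
    json.last child.last
    json.s1w child.s1w
    json.s2w child.s2w
  end
end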
I am still trying to create an app where the user can transform his name or a word into chemical elements (like the Breaking Bad logo).
The user will enter a word in a text field, and on submit it will return the word with the corresponding chemical symbols where they match, or display the "raw" letters where they don't.
Example: if no symbol matches I keep the initial letters, so it could be: hello => He ll O (the bold characters represent the existing chemical symbols).
I know this could be done in JS, but the challenge is doing it in Ruby on Rails (btw, I don't know any JS...).
In an earlier question I had just a hash like:
symbols =
{"cr" => "Cr",
"sb" => "Sb",
"ag" => "Ag",
"ar" => "Ar",
"as" => "As",
"at" => "At",
"n" => "N",
"ba" => "Ba",
"bk" => "Bk"}
and I was using name.downcase.gsub!(Regexp.union(symbols.keys), symbols) to transform the user entry. Actually I need more data... that's why I chose the JSON file.
As in this picture, I will need to use:
"number"
"small"
"molar"
(and the "name" will appear in a caption below)
I have organized a .json file with all the symbols I may need in the app and stored it in config/periodic_table.json (I pasted just a sample because it's very long).
1°) If a user enters "hello", how do I loop to search for the "he" hash and print the "name", "number", "small" and the "molar"?
2°) I will use the JSON as a database (I will deploy to Heroku), so do I have to transform anything to use JSON and PG together?
[ "symbols"
{
"h": {
"name": "Hydrogen",
"number": 1,
"small": "H",
"molar": 1.00794
},
"he": {
"name": "Helium",
"number": 2,
"small": "He",
"molar": 4.002602
},
"b": {
"name": "Boron",
"number": 5,
"small": "B",
"molar": 10.811
},
"c": {
"name": "Carbon",
"number": 6,
"small": "C",
"molar": 12.0107
},
"n": {
"name": "Nitrogen",
"number": 7,
"small": "N",
"molar": 14.0067
}
}
]
I will need to loop first over the symbols that contain 3 chars, then 2, then 1... Shall I change anything in the JSON, like an array for the hashes that contain 3 chars, another for 2 chars, and another for 1 char?
Correct JSON format:
[{
"symbols":{
"h":{
"name": "Hydrogen",
"number": 1,
"small": "H",
"molar": 1.0079
},
"he":{
"name": "Helium",
"number": 2,
"small": "He",
"molar": 4.002602
},
"b": {
"name": "Boron",
"number": 5,
"small": "B",
"molar": 10.811
},
"c": {
"name": "Carbon",
"number": 6,
"small": "C",
"molar": 12.0107
},
"n": {
"name": "Nitrogen",
"number": 7,
"small": "N",
"molar": 14.0067
}
}
}]
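To illustrate question 1°), here is a minimal, hedged sketch (it assumes the corrected JSON above is stored at config/periodic_table.json; elementize and the longest-symbol-first matching are my own illustration, not a library method):
require 'json'

data    = JSON.parse(File.read(Rails.root.join('config', 'periodic_table.json')))
symbols = data.first['symbols']  # { "h" => {...}, "he" => {...}, ... }

# Look up one symbol and print its details:
he = symbols['he']
puts "#{he['name']} (number #{he['number']}): #{he['small']}, molar mass #{he['molar']}"
# Helium (number 2): He, molar mass 4.002602

# Replace matching letters with their "small" symbol, trying longer symbols first
# (so "he" wins over "h"); letters without a match are kept as-is:
def elementize(word, symbols)
  pattern = Regexp.union(symbols.keys.sort_by { |key| -key.length })
  word.downcase.gsub(pattern) { |match| symbols[match]['small'] }
end

elementize('hello', symbols)
# => "Hello" with the sample data (only "he" matches); with the full table "o" would also match Oxygen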