I started using Redis (with Ruby on Rails) recently and I want to know the best way to store this kind of data:
data1 = {
'name1' : {
'age' : xxx,
'height' : xxx,
},
'name2' : {
'age' : xxx,
'weight' : xxx,
}
}
data2 = {
'class1' : {
'num' : xxx,
'location' : xxx,
'teacher' : xxx,
},
'class2' : {
'num' : xxx,
'location' : xxx,
'teacher' : xxx,
}
}
I have tried using the hash commands (hset, hmset, hget, hmget), but they don't seem to work with the sub-keys like "age" and "height".
It appears that you are trying to store some JSON in Redis. Using the redis-rb gem this is straightforward, as long as you serialize the hash to JSON first (a plain redis.set("key", data1) would store the hash's to_s output, which JSON.parse can't read):
redis = Redis.new
redis.set("key", data1.to_json)
Then, when you want to retrieve the data, parse it back:
data = JSON.parse(redis.get("key"))
This retrieves the JSON string stored in Redis under the key "key" and parses it into a Ruby hash. I hope this helps!
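The round trip above can be sketched end to end. This is a minimal sketch: the sample values are made up, and a plain Hash stands in for the Redis connection so it runs without a server; with a real redis-rb client the set/get calls take the same string arguments.

```ruby
require 'json'

# Sample nested data (concrete values substituted for the xxx placeholders)
data1 = {
  'name1' => { 'age' => 30, 'height' => 180 },
  'name2' => { 'age' => 25, 'weight' => 70 }
}

# Stand-in for Redis.new; a real redis-rb client would be used the same
# way: store['key'] = ... becomes redis.set("key", ...), and reading
# back becomes redis.get("key").
store = {}

# Serialize the nested hash to a JSON string before storing it.
store['key'] = data1.to_json

# Read it back and parse the JSON string into a Ruby hash.
data = JSON.parse(store['key'])

data['name1']['age']  # => 30
```

Note that JSON.parse returns string keys, which is why the sample data uses string keys to begin with; with symbol keys the round trip would not compare equal.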
Redis hashes can't store nested elements directly; that's why those commands don't work with the sub-keys.
If you want to access items like data1.name1 or data2.class2 directly, then hashes are the right choice, and you can store everything inside data1.name1 as JSON:
HSET data1 name1 '{"age": xxx, "height": xxx}'
and to load the data it would be:
HGET data1 name1
But if you don't need to access those fields directly and can load everything inside data1 or data2 at once, then vaughanj's answer is what you need.
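The hash-field pattern can be sketched as follows. This is a sketch with made-up values; a nested plain Hash mimics one Redis hash per top-level key so it runs without a server, and the HSET/HGET comments show the corresponding Redis commands (redis-rb exposes them as hset/hget).

```ruby
require 'json'

# Stand-in for a Redis server: one top-level key per Redis hash,
# one field per entry.
hashes = Hash.new { |h, k| h[k] = {} }

# HSET data1 name1 '{"age": 30, "height": 180}'
hashes['data1']['name1'] = { 'age' => 30, 'height' => 180 }.to_json

# HGET data1 name1 -> the JSON string for just that one entry,
# without loading the rest of data1
person = JSON.parse(hashes['data1']['name1'])
person['height']  # => 180
```

The trade-off is exactly the one described above: each field is independently readable, but the value inside a field is still an opaque JSON blob.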
In Neo4j, is there an APOC function that I can use to get a value from a JSON object by passing a key? For example:
My JsonObject is :
{ "masterName" : {"name1" : "A1" , "name2" : "A2", "name3": "A3", "name4" : "A4"}}
While importing my CSV, which has a "name" field (values: name1, name2, name3, etc.), I want to look up the above JSON object for each row and get the respective value to create nodes.
Assuming that you have your JSON in a field called myJSON, you could do:
WITH 'name1' AS key, line
WITH apoc.convert.fromJsonMap(line.myJSON)[key] AS name
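What apoc.convert.fromJsonMap(...)[key] does — parse a JSON string into a map, then index it by key — can be sketched in plain Ruby. This is a sketch only: it assumes, as the Cypher above implies, that myJSON holds the inner map itself (without the "masterName" wrapper from the question; if the wrapper is present, one more index step is needed).

```ruby
require 'json'

# The JSON stored in the myJSON field (assumed to be the inner map)
my_json = '{"name1" : "A1", "name2" : "A2", "name3" : "A3", "name4" : "A4"}'

# In the CSV import, this would come from the row's "name" field
key = 'name1'

# Equivalent of apoc.convert.fromJsonMap(line.myJSON)[key]
name = JSON.parse(my_json)[key]
name  # => "A1"
```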
I want to create dummy data in the Realtime Database with this structure:
{
"0" : {
"0c1592ca-0fa5-43b9-88d2-c9cd77b30611" : {
"token" : "0cu9CJPb_DIUfbr-Ay8vh6:-KQXn....",
"member_id" : "123456789102",
"update_at" : "2021/06/14 08:08:08"
},
"<uid>" : {
"token" : "167 random characters",
"member_id" : "12 random numbers",
"update_at" : "YYYY/mm/DD HH:mm:ss"
}
},
"1" : {
....
},
"2" : {
....
},
.....
"9" : {
....
}
}
A record is like this:
"36 random characters" : {
"token" : "167 random characters",
"member_id" : "12 random numbers",
"update_at" : "YYYY/mm/DD HH:mm:ss"
}
I've tried importing JSON files from the Firebase console, a million records per node. But from the second node onward I got a crash, like the image below, and I can't import as easily as before.
Is there any other way that I can dummy 10 million child nodes like above, faster and stable?
https://i.stack.imgur.com/nuBoM.png
The error message says that the JSON is invalid, so you might want to pass it through a JSON validator like https://jsonlint.com/.
Aside from that, I can imagine that your browser, the console, or the server runs into memory problems with this number of nodes in one write (see limits). I recommend instead using the API to read the JSON file locally and then add it to Firebase in chunks, or using a tool like https://github.com/FirebaseExtended/firebase-import.
Also see https://www.google.com/search?q=firebase+realtime+database+upload+large+JSON
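The "add it to Firebase in chunks" idea can be sketched as below. This is a sketch, not a working uploader: upload_chunk is a hypothetical placeholder (with a real client it would be, for example, an HTTP PATCH of the chunk to the Realtime Database REST endpoint), and the record shape follows the question's token/member_id/update_at layout with random sample values.

```ruby
require 'json'
require 'securerandom'

# Build dummy records locally instead of one giant JSON import.
records = {}
1_000.times do
  records[SecureRandom.uuid] = {
    'token'     => SecureRandom.hex(16),
    'member_id' => rand(10**12).to_s,
    'update_at' => Time.now.strftime('%Y/%m/%d %H:%M:%S')
  }
end

CHUNK_SIZE = 250

# Hypothetical uploader: serializes one chunk; a real implementation
# would send this JSON to the database instead of just building it.
def upload_chunk(chunk)
  chunk.to_h.to_json
end

# each_slice yields arrays of [key, value] pairs, so each chunk is a
# small, independently writable batch.
chunks = records.each_slice(CHUNK_SIZE).to_a
chunks.each { |chunk| upload_chunk(chunk) }
```

Keeping each write small is what avoids the single-write size limit the answer links to; the chunk size itself is a tunable assumption.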
I have a Neo4j DB with relationships that have properties such as [:FRIENDS {since: "11/2015"}]. I need to represent the "since" property in the GraphQL schema. Relay has something called "edges" and apparently this is how it implements this feature, but I am not using Relay. I didn't see anything in Apollo (maybe I missed it). Can someone show me how to do this?
OK, so in order to get what I wanted, which was to present both the node and the relationship (edge) to GraphQL, I did what I would call a work-around: returning Object.assign(node, relationship) to GraphQL. The downside is that I have to define a type nodeRel {} to receive the combined objects, but it works. Also, the node and relationship objects can't have identically named properties. I can now answer questions like how long John and Mary have been friends, or what groups John belongs to and how long he has been a member. Schema snippet:
... memberOf : [Group]
groupStatus : [MemberProfile]
attended : [Meeting]
submittedReport : [Report]
post : [Post]
}
type MemberProfile {
name : String
location : String
created : String
since : String
role : String
financial : Boolean
active : Boolean
}
Resolver:
groupStatus(voter) {
let session = driver.session(),
params = { voterid: voter.voterid },
query = `
MATCH (v:Voter)-[r:MEMBER_OF]->(g:Group)
WHERE v.voterid = $voterid
RETURN g AS group,r AS rel;
`
return session
.run(query, params)
.then(result => {
session.close() // release the driver session once the result is materialized
return result.records.map(record => {
// merge the relationship's properties into a copy of the node's,
// rather than mutating the node's properties object in place
return Object.assign({}, record.get("group").properties, record.get("rel").properties)
})
})
},
I hope this helps someone else.
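The merge at the heart of the work-around can be sketched with plain hashes. This is a sketch with made-up property values; Hash#merge plays the role of Object.assign, and the same caveat applies: a key present in both objects ends up with the relationship's value.

```ruby
# Properties as they might come back from the two Cypher columns
# (RETURN g AS group, r AS rel)
group_props = { 'name' => 'Chess Club', 'location' => 'Berlin', 'created' => '01/2014' }
rel_props   = { 'since' => '11/2015', 'role' => 'member', 'active' => true }

# Same idea as Object.assign(node, relationship): one combined object
# to hand to the MemberProfile type. Later hashes win on key collisions,
# which is why the node and relationship must not share property names.
profile = group_props.merge(rel_props)

profile['name']   # => "Chess Club"
profile['since']  # => "11/2015"
```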
I tried many things, but to no avail.
I have already raised a question on Stack Overflow earlier, but I am still facing the same issue.
Here is the link to the old Stack Overflow question:
creating multiple nodes with properties in json in neo4j
Let me try to explain with a small example.
This is the query I want to execute
{
"params" : {
"props" : [
{
"LocalAsNumber" : 0,
"NodeDescription" : "10TiMOS-B-4.0.R2 ",
"NodeId" : "10.227.28.95",
"NodeName" : "BLR_WAO_SARF7"
}
]
},
"query" : "MATCH (n:Router) where n.NodeId = {props}.NodeId RETURN n"}
For simplicity I have added only one entry in the props array; in reality there are around 5000. Now I want to execute the query above, but it fails.
I tried using {props.NodeId} and {props[NodeId]}, but everything fails.
Is it possible to access an individual property in Neo4j?
My program is in C++, and I am using jsoncpp and curl to fire my queries.
If you use {props}.NodeId in the query, then the props parameter must be a map, but you pass in an array. Do:
"props" : {
"LocalAsNumber" : 0,
"NodeDescription" : "10TiMOS-B-4.0.R2 ",
"NodeId" : "10.227.28.95",
"NodeName" : "BLR_WAO_SARF7"
}
You can use an array of maps as a parameter, either with a simple CREATE statement:
CREATE ({props})
or by looping through the array to access the individual maps:
FOREACH (prop IN {props} |
MERGE (i:Interface {nodeId:prop.nodeId})
ON CREATE SET i = prop
)
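The distinction the answer draws — {props} as a single map versus an array of maps — comes down to the shape of the JSON payload the client sends. A sketch of the two payload shapes (in Ruby rather than the question's C++/jsoncpp, but the JSON on the wire is the same):

```ruby
require 'json'

# Shape 1: props is a single map, so {props}.NodeId works directly.
map_payload = {
  'params' => {
    'props' => {
      'LocalAsNumber'   => 0,
      'NodeDescription' => '10TiMOS-B-4.0.R2 ',
      'NodeId'          => '10.227.28.95',
      'NodeName'        => 'BLR_WAO_SARF7'
    }
  },
  'query' => 'MATCH (n:Router) WHERE n.NodeId = {props}.NodeId RETURN n'
}

# Shape 2: props is an array of maps, so the query must iterate it
# (e.g. with FOREACH) instead of dereferencing a single field.
array_payload = {
  'params' => { 'props' => [map_payload['params']['props']] },
  'query'  => 'FOREACH (prop IN {props} | MERGE (i:Interface {nodeId: prop.NodeId}))'
}

JSON.parse(map_payload.to_json)['params']['props']['NodeId']  # => "10.227.28.95"
```

The question's original payload was shape 2 with a shape-1 query, which is exactly the mismatch the answer points out.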
Does this query string work for you?
"MATCH (n:Router) RETURN [p IN {props} WHERE n.NodeId = p.NodeId | n];"
I have a Record model with many dynamic attributes. I want to query the model and send the response as JSON to the client, but I want to exclude fields like _id and all foreign keys in this model.
I found an interesting answer on how to exclude the values of some keys: How do I exclude fields from an embedded document in Mongoid?, but the keys still exist in the response.
I got:
{
"_id": 1,
"name": "tom"
}
And the without method gives:
{
"_id": nil,
"name": "tom"
}
But i want:
{
"name": "tom"
}
Is it possible to remove or exclude some keys and their values from the result?
You don't want to remove fields from the Mongoid document; what you want to do is remove fields from the generated JSON.
In your controller, do:
render :json => @model.to_json(:except => :_id)
Documentation for the to_json method: http://apidock.com/rails/ActiveRecord/Serialization/to_json
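The effect of :except can be sketched on a plain hash. Note the hedge: stdlib Hash#to_json ignores an :except option — that option belongs to the Rails/Mongoid model serializers — so this sketch filters the keys explicitly before serializing; the key names and the foreign-key naming pattern are assumptions for illustration.

```ruby
require 'json'

# A record as a plain hash, with an assumed *_id foreign-key convention
record = { '_id' => 1, 'group_id' => 7, 'name' => 'tom' }

# Drop _id and anything ending in _id before serializing; only the
# remaining keys reach the client, rather than appearing with nil values.
excluded = ['_id'] + record.keys.grep(/_id\z/)
json = record.reject { |k, _| excluded.include?(k) }.to_json
json  # => '{"name":"tom"}'
```

This is why the answer filters at serialization time: the keys are removed from the output entirely, instead of being set to nil as the without method does on the document itself.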
Taken from the MongoDB documentation at http://docs.mongodb.org/manual/reference/method/db.collection.find/:
Exclude Certain Fields from the Result Set
The following example selects documents that match a selection criteria and excludes a set of fields from the resulting documents:
db.products.find( { qty: { $gt: 25 } }, { _id: 0, qty: 0 } )
The query returns all the documents from the collection products where qty is greater than 25. The documents in the result set will contain all fields except the _id and qty fields, as in the following:
{ "item" : "pencil", "type" : "no.2" }
{ "item" : "bottle", "type" : "blue" }
{ "item" : "paper" }
I suppose Mongoid is setting the _id attribute to nil since Mongoid models have a defined set of attributes (even if they are dynamic, _id, _type, etc. are defined). Maybe you can try it with the MongoDB driver.
But I think RedXVII's answer is the more practical way to go.