Can anyone tell me how to run a PROFILE'd query using the Neo4j REST API, such as
PROFILE MATCH (n:LABEL) return n;
When I run this, either in Java using the RestCypherQueryEngine or even using a raw HTTP POST directly, I get:
message: "Invalid input 'P': expected SingleStatement (line 1, column 1) "PROFILE MATCH (n:LABEL) return n;" ^"
exception: "SyntaxException"
I thought I had read somewhere that this is possible, and not only through the server console.
The old cypher endpoint (i.e. /db/data/cypher) had a ?profile=true query parameter that adds profiling information to the result, e.g.:
curl -H accept:application/json -H content-type:application/json \
-d '{"query":"MATCH (n) RETURN count(*)","params":{}}' \
http://localhost:7474/db/data/cypher?profile=true
{
"columns" : [ "count(*)" ],
"data" : [ [ 0 ] ],
"plan" : {
"args" : {
"returnItemNames" : [ "count(*)" ],
"_rows" : 1,
"_db_hits" : 0,
"symKeys" : [ " INTERNAL_AGGREGATE75acebd9-82d7-4a65-921c-2049c4bde4e7" ]
},
"dbHits" : 0,
"name" : "ColumnFilter",
"children" : [ {
"args" : {
"keys" : [ ],
"_rows" : 1,
"aggregates" : [ "( INTERNAL_AGGREGATE75acebd9-82d7-4a65-921c-2049c4bde4e7,CountStar())" ],
"_db_hits" : 0
},
"dbHits" : 0,
"name" : "EagerAggregation",
"children" : [ {
"args" : {
"_rows" : 0,
"_db_hits" : 0,
"identifier" : "n"
},
"dbHits" : 0,
"name" : "AllNodes",
"children" : [ ],
"rows" : 0
} ],
"rows" : 1
} ],
"rows" : 1
}
}
Is it possible to create a SalesReceipt without a product/service value through the QBO API? I have tried through the API, but it does not reflect the rate value and stores only the description value.
If I remove the ItemRef attribute (in the request body), then it does reflect the rate and amount values, but it assigns some default, random product/service.
It is possible to do this directly in the QBO UI.
Request body where only the description value is stored:
{
"TxnDate" : "2016-05-27",
"Line" : [ {
"Amount" : 2222.00,
"Description" : "hi chk",
"DetailType" : "ItemReceiptLineDetail",
"ItemReceiptLineDetail" : {
"ItemRef" : { },
"Qty" : 1,
"UnitPrice" : 2222
} }
],
"CustomerRef" : {
"value" : "67"
},
"CustomerMemo" : {
"value" : "Thanks for your business! We appreciate referrals!"
},
"TotalAmt": 2222.00,
"PrivateNote" : "",
"CustomField" : [ {
"DefinitionId" : "1",
"Type" : "StringType",
"StringValue" : ""
} ]
}
Request body where a default product/service is assigned:
{
"TxnDate" : "2016-05-27",
"Line" : [ {
"Amount" : 2222.00,
"Description" : "hi chk",
"DetailType" : "ItemReceiptLineDetail",
"ItemReceiptLineDetail" : {
"Qty" : 1,
"UnitPrice" : 2222
} }
],
"CustomerRef" : {
"value" : "67"
},
"CustomerMemo" : {
"value" : "Thanks for your business! We appreciate referrals!"
},
"TotalAmt": 2222.00,
"PrivateNote" : "",
"CustomField" : [ {
"DefinitionId" : "1",
"Type" : "StringType",
"StringValue" : ""
} ]
}
No.
QuickBooks Online does not support this.
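For reference, the rate and amount only stick when the line references an actual item. Below is a minimal sketch of such a line; the SalesItemLineDetail detail type and the ItemRef value "1" are assumptions here, so substitute an item that actually exists in your company file:
"Line" : [ {
"Amount" : 2222.00,
"Description" : "hi chk",
"DetailType" : "SalesItemLineDetail",
"SalesItemLineDetail" : {
"ItemRef" : { "value" : "1" },
"Qty" : 1,
"UnitPrice" : 2222
} }
]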
My problem is the following:
I run an Elasticsearch query in a Rails app using specific settings for my index and my search analyzer. The problem is that it doesn't return any results in the app; on the other hand, when I run the analyzer directly against my Elasticsearch Docker container, I get tokens back. If I use those tokens in my app query, I get results...
This is my Elasticsearch _analyze call and its output:
curl -XGET 'localhost:9200/development-stoot-services/_analyze?analyzer=search_francais' -d 'cours de guitare'
{"tokens":[{"token":"cour","start_offset":0,"end_offset":5,"type":"<ALPHANUM>","position":1},{"token":"guitar","start_offset":9,"end_offset":16,"type":"<ALPHANUM>","position":3}]}
Here is the query my Rails app sends to Elasticsearch:
query = {
"query" : {
"bool" : {
"must" : [
{
"range" : {
"deadline" : {
"gte" : "2016-05-26T10:27:19+02:00"
}
}
},
{
"terms" : {
"state" : [
"open"
]
}
},
{
"query_string" : {
"query" : "cours de guitare",
"default_operator" : "AND",
"fields" : [
"title",
"description",
"brand",
"category_name"
]
}
}
]
}
},
"filter" : {
"and" : [
{
"geo_distance" : {
"distance" : "40km",
"location" : {
"lat" : 48.855736,
"lon" : 2.32927300000006
}
}
}
]
},
"sort" : [
{
"created_at" : "desc"
}
]
}
The last query does not return any results, but if I try a query with the tokens returned by Elasticsearch ('cour', 'guitar'), I get the expected results. So I guess there is a problem between Rails and Elasticsearch that I can't find...
Can anyone help with that?
Try to modify your query like this, i.e. you need to specify the search_francais analyzer in your query_string in order to analyze cours de guitare the same way you did with the _analyze endpoint:
...
{
"query_string" : {
"query" : "cours de guitare",
"default_operator" : "AND",
"analyzer": "search_francais", <--- add this line
"fields" : [
"title",
"description",
"brand",
"category_name"
]
}
},
...
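To sanity-check the change outside of Rails, you can run just the query_string part (with the analyzer added) directly against the same index you used with _analyze. This is only a sketch, assuming the same local host, port and index name as in your question:
curl -XGET 'localhost:9200/development-stoot-services/_search?pretty' -d '{
"query" : {
"query_string" : {
"query" : "cours de guitare",
"default_operator" : "AND",
"analyzer" : "search_francais",
"fields" : [ "title", "description", "brand", "category_name" ]
}
}
}'
If this returns hits, any remaining difference comes from the other clauses (range, terms, geo filter) in the Rails query.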
I have a nested collection: I add "Data" nodes, and for each "Data" I add its own "Tags". I found UNWIND for "Data" with FOREACH for "Tags".
I want to add a person whose info I import manually, from outside the collection.
I execute the Cypher query below via the statements endpoint.
I have imitated the examples from:
Cypher Import Statement
AND
Cypher Unwind
I checked my JSON with an online validator and it is reported as valid.
{ "statements": [ { "statement": " WITH { "categories": [ {"dataid" : "11" , "dataname" : "data1" , "datalanguage" : "en" , "datatype" : "type1" ,"content" : "content1" , "tags" : [{"myid" : "11" , "tagid" : 10 , "tagname" : "tag1" }] } , {"dataid" : "22" , "dataname" : "data2" , "datalanguage" : "en" , "datatype" : "type2" ,"content" : "content2" , "tags" : [{"myid" : "22" , "tagid" : 20 , "tagname" : "tag2" }] } ] } AS document UNWIND document.categories AS category MERGE (dt:Data {name: category.dataname}) ON CREATE SET dt.id = category.dataid , dt.type = category.datatype , dt.language = category.datalanguage , dt.content = category.datacontent MERGE (p:Person { name : 'Mahsa' , lastname : 'Mahsa' } ) ON CREATE SET p.id =1 MERGE (p)-[r:owner]->(dt) FOREACH (mytag IN category.tags | MERGE (t:Tag {name: mytag.tagname}) ON CREATE SET t.id = mytag.tagid MERGE (dt)-[r2:tagged { Freq : 12 ]->(t) )" } ] }
But it returns this result (I checked many times for the "Unexpected character" but I could not find it):
{"results":[],"errors":[{"code":"**Neo.ClientError.Request.InvalidFormat**","message":"**Unable to deserialize request: Unexpected character ('c' (code 99)): was expecting comma to separate OBJECT entries**\n at [Source: HttpInputOverHTTP#132e16b; line: 1, column: 48]"}]}
I made my nested collection as:
string dataCollection2 = "{ \"categories\": [ {\"dataid\" : \"11\" , \"dataname\" : \"data1\" , \"datalanguage\" : \"en\" , \"datatype\" : \"type1\" ," +
"\"content\" : \"content1\" , \"tags\" : [{\"myid\" : \"11\" , \"tagid\" : 10 , \"tagname\" : \"tag1\" }] }" +
" , {\"dataid\" : \"22\" , \"dataname\" : \"data2\" , \"datalanguage\" : \"en\" , \"datatype\" : \"type2\" ," +
"\"content\" : \"content2\" , \"tags\" : [{\"myid\" : \"22\" , \"tagid\" : 20 , \"tagname\" : \"tag2\" }] } ] }";
var obj1 = JValue.Parse(@"'" + dataCollection2 + "'");
I pasted my JSON into http://jsonlint.com and it reports that it is valid:
{
"categories": [
{
"dataid": 11,
"dataname": "data1",
"datalanguage": "en",
"datatype": "type1",
"content": "content1",
"tags": [
{
"myid": 11,
"tagid": 10,
"tagname": "tag1"
}
]
},
{
"dataid": 22,
"dataname": "data2",
"datalanguage": "en",
"datatype": "type2",
"content": "content2",
"tags": [
{
"myid": 22,
"tagid": 20,
"tagname": "tag2"
}
]
}
]}
The fundamental problem with the string you are passing to the REST API is that you are not passing a legal Cypher query. Cypher property "maps", which look a bit like JSON, are NOT JSON.
In your case, the important difference is that property names must NOT be delimited by double-quotes. Only string property values can be delimited by double-quotes.
So, categories, dataId, dataName, etc., must not be surrounded by double-quotes.
You also have a typo near the end of the query. [r2:tagged { Freq : 12 ] should be [r2:tagged { Freq : 12} ].
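For example, with the double quotes removed from the keys (and single quotes used for the string values, so the statement can still be embedded in the outer JSON payload without escaping), the start of your statement would look roughly like this, showing only the first category:
WITH {categories: [
{dataid: '11', dataname: 'data1', datalanguage: 'en', datatype: 'type1',
content: 'content1', tags: [{myid: '11', tagid: 10, tagname: 'tag1'}]}
]} AS document
UNWIND document.categories AS category
MERGE (dt:Data {name: category.dataname})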
For adding a nested collection to Neo4j:
Firstly: to get the right input to Neo4j as a parameter, I had to remove the wrapping "{ "categories": [ ] }" from the beginning and the end of the JSON.
Secondly: I pass this JSON as a parameter to the statement.
Thirdly: I use two UNWIND/WITH steps (instead of FOREACH) to separate each row of the collection, once for the parent and once for the child.
FOREACH does something similar to UNWIND with WITH.
My final collection and query are:
Collection:
{"id" : 1, "name" : "Data1", "language" : "en", "tags": {"id" : 11, "name": "tag 11" } } , {"id" : 2, "name": "Data2" , "tags": [ {"id" : 33, "name": "tag 33"} , {"id" : 44, "name": "tag44"} ] }
Query:
{ "statements": [ { "statement": " UNWIND { datas } AS data MERGE (p:Person { name : 'God' , lastname : 'God' } ) ON CREATE SET p.id =2 MERGE (d:Data {name: data.name}) ON CREATE SET d.id = data.id , d.language = data.language MERGE (p)-[r:owner]->(d) WITH d, data.tags AS mytags UNWIND mytags AS mytag MERGE (t:Tag {name: mytag.name}) ON CREATE SET t.id = mytag.id MERGE (d)-[r2:tagged { Freq : 12 } ]->(t) " , "parameters": { "datas" : [{"id" : 1, "name" : "Data1", "language" : "en", "tags": {"id" : 11, "name": "tag 11" } } , {"id" : 2, "name": "Data2" , "tags": [ {"id" : 33, "name": "tag 33"} , {"id" : 44, "name": "tag44"} ] }] } } ] }
How do I map a JSON like this in RestKit 0.20? I can't find a tutorial, and all the tutorials out there are for previous versions. I know how it is done in 0.10.0, but I have no idea how to map this array and the nested arrays in 0.20.0.
{
"days" : [
{
"day" : 1,
"id" : 1,
"set1" : [
{
"exercise_id" : 1,
"exerciseunits" : [
{
"count" : 3,
"id" : 1,
"weight" : 60
}
],
"id" : 1,
"name" : null,
"subbodypart_id" : 1,
"subbodypartname" : "Chest"
}
]
}
],
"description" : "desc",
"id" : 1,
"name" : "asdfg"
}
Use the updated documentation in the wiki about object mapping. Beyond that, you can download the current RestKit master with its examples (e.g. Twitter Core Data), where nested objects are used as well.
I was following the Elasticsearch guide online to represent coordinates as "lat, lng", but it doesn't seem to work until I flip everything around to "lng, lat". I even have to flip around top_left and bottom_right in order for the query to work.
Is anyone experiencing the same problem? Clearly this is not how the documentation says to use it, but it only works when I format it this way.
Rails format
def self.search(params)
tire.search( page: params[:page], per_page: 2 ) do
query { all }
filter :geo_bounding_box, location: { top_left: " -121.88596979687497, 37.33588487375733", bottom_right: " -122.43528620312497, 37.553946238118264" }
end
end
CURL format
curl -X GET "http://localhost:9200/articles/article/_search?page=&per_page=2&size=2&pretty=true" -d '{"query":{"match_all":{}},"facets":{"condition":{"terms":{"field":"condition","size":10,"all_terms":false}}},"filter":{"geo_bounding_box":{"location":{"top_left":" -121.88596979687497, 37.33588487375733","bottom_right":" -122.43528620312497, 37.553946238118264"}}},"size":2}'
Console response
{
"took" : 1,
"timed_out" : false,
"_shards" : {
"total" : 5,
"successful" : 5,
"failed" : 0
},
"hits" : {
"total" : 169,
"max_score" : 1.0,
"hits" : [ {
"_index" : "articles",
"_type" : "article",
"_id" : "4f72bc7d0bdb820f02000002",
"_score" : 1.0, "_source" : {"content":"words here!","location":[37.444995,-122.160628],"name":"harro"}
}, {
"_index" : "articles",
"_type" : "article",
"_id" : "4fdf0cf20bdb82336c000002",
"_score" : 1.0, "_source" : {"content":"Run of the mill","location":[37.33588487375733,-121.88596979687497],"name":"Billy Bob"}
} ]
},
"facets" : {
"condition" : {
"_type" : "terms",
"missing" : 5597,
"total" : 0,
"other" : 0,
"terms" : [ ]
}
}
}
When a geo point is specified as a string, it should be in "lat,lon" format. When it is specified as an array, it should be in [lon, lat] format.
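For example, the same point from the first hit above can be written either way; the first line below is the string form (lat,lon) and the second is the array form (lon, lat):
"location" : "37.444995,-122.160628"
"location" : [ -122.160628, 37.444995 ]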