Neo4j 2.0: Matching nodes in a matrix to create relationships errors out - neo4j

When I run this Cypher query from the browser console, I get an Unknown error, and I'm not sure how to troubleshoot it.
MATCH (s:ContactMembership)
MATCH (contact:Contact {ContactId: s.ContactId})
MATCH (contactmembershiptype:ContactMembershipType {ContactMembershipTypeId: s.ContactMembershipTypeId})
MERGE (contact)-[:CONTACT_CONTACTMEMBERSHIPTYPE {ContactId: s.ContactId, ContactMembershipTypeId: s.ContactMembershipTypeId}]->(contactmembershiptype)
ContactMembership has about 52k nodes
Contact has 42k
ContactMembershipType has 6
Each contact can have multiple membership types, so there can be multiple relationships, but each ContactMembership node has a single ContactId.
Should I be using CREATE instead of MERGE? I'm not sure how to get more detail on the Unknown error.

Turns out that the Unknown Error was just a timeout on the console window. When running the command in the shell, I can see that it completes in just over 154 seconds.
It would be nice if the 2.0 browser console gave a better description of the error, like 'Timed out waiting for a response'.

Try the query below:
MATCH (s:ContactMembership),
      (contact:Contact {ContactId: s.ContactId}),
      (contactmembershiptype:ContactMembershipType {ContactMembershipTypeId: s.ContactMembershipTypeId})
WITH s, contact, contactmembershiptype
MERGE (contact)-[:CONTACT_CONTACTMEMBERSHIPTYPE {ContactId: s.ContactId, ContactMembershipTypeId: s.ContactMembershipTypeId}]->(contactmembershiptype)

Related

What is "Custom Program Error 0xa7" in anchor?

My Anchor program is giving me a `Transaction simulation failed: Error processing Instruction 1: custom program error: 0xa7` with nothing useful in the logs.
How do I even begin to debug this?
Custom program error 0xa7 is error 167: "The given account is not owned by the executing program."
This happens if you pass in an account that is expected to be owned by a program but isn't.
It can also happen accidentally if you forget to set declare_id!(/* ... */) to the program id you're trying to hit.
Consider logging the program id that you're using in your JavaScript client:
console.log(program.programId)
Then check whether it matches the public key that's in your target/idl/yourprogram.json file.
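The same comparison can be sketched outside the JS client as well. A minimal Python sketch, assuming an IDL layout where the deployed address sits under metadata.address (the file contents and program id below are hypothetical placeholders, and the exact IDL field layout varies by Anchor version):

```python
import json

# Hypothetical IDL contents; a real Anchor build writes a file like
# target/idl/yourprogram.json, whose field layout depends on the
# Anchor version (older versions store the address under metadata.address).
idl_json = '{"name": "yourprogram", "metadata": {"address": "Fg6PaFpoGXkYsidMpWTK6W2BeZ7FEfcYkg476zPFsLnS"}}'
idl = json.loads(idl_json)
idl_address = idl["metadata"]["address"]

# The id the client is actually using (in JS this is program.programId).
client_program_id = "Fg6PaFpoGXkYsidMpWTK6W2BeZ7FEfcYkg476zPFsLnS"

if client_program_id == idl_address:
    print("program ids match")
else:
    print(f"mismatch: client uses {client_program_id}, IDL has {idl_address}")
```

If the two ids differ, error 167 is the expected outcome, since the accounts you create will be owned by a different program than the one the client targets.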

Neo4j to Gephi import : Failed to invoke procedure Invalid UTF-8

I tried to import my data from Neo4j into Gephi, but it doesn't work.
I get the following result in Neo4j:
Failed to invoke procedure apoc.gephi.add: Caused by: com.fasterxml.jackson.core.JsonParseException: Invalid UTF-8 start byte 0xfb at [Source: (apoc.export.util.CountingInputStream); line: 1, column: 136]
As previously mentioned, it looks like Neo4j is not exporting using UTF-8, so I would check how Neo4j is generating the output.
Another possibility is that something went slightly wrong while Neo4j was writing the output.
I ran into this very same problem in the past when managing content in a file concurrently: a thread crashed and did not close the file cleanly. When reviewing the file, everything looks normal, but some characters have been introduced which are not UTF-8. A tool like Atom can help you find them.
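To locate such bytes without eyeballing the file in an editor, a short script can report the first offending offset. This is a generic sketch, not APOC-specific; the sample bytes are made up, with 0xFB chosen because it is the start byte Jackson complained about above:

```python
# Scan a byte string (e.g. a file read in binary mode) for the first
# sequence that is not valid UTF-8 and report its offset and value.
def find_invalid_utf8(data: bytes):
    try:
        data.decode("utf-8")
        return None  # everything decoded cleanly
    except UnicodeDecodeError as e:
        # e.start is the offset of the first undecodable byte
        return e.start, data[e.start]

sample = b"valid text \xfb more text"
offset, byte = find_invalid_utf8(sample)
print(f"invalid byte 0x{byte:02x} at offset {offset}")
```

Running this against the real export file (read with `open(path, "rb")`) points you straight at the corrupted spot instead of column 136 of a one-line JSON stream.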
Best

java.lang.ArrayIndexOutOfBoundsException: -1 when calling algo.trianglecount

I am trying to follow Mark and Amy's Medium post about link prediction with Neo4j, "Link Prediction with Neo4j".
It works great until I need to run the triangle count algorithm:
CALL algo.triangleCount('Author', 'CO_AUTHOR_EARLY', {
  write: true,
  writeProperty: 'trianglesTrain',
  clusteringCoefficientProperty: 'coefficientTrain'})
I get the following error:
`ERROR` Neo.ClientError.Procedure.ProcedureCallFailed
Failed to invoke procedure `algo.triangleCount`: Caused by: java.lang.ArrayIndexOutOfBoundsException: -1
As far as I can tell, I have run everything else correctly (I get the same results in the Python part, etc.).
I was running Neo4j Enterprise 3.5.11 when I first got the error and upgraded to 3.5.12, but I still have the issue. I have the APOC and Graph Algorithms plugins installed.

Read a list from stream using Yap-Prolog

I want to run a (python3) process from my (yap) prolog script and read its output formatted as a list of integers, e.g. [1,2,3,4,5,6].
This is what I do:
process_create(path(python3),
               ['my_script.py', MyParam],
               [stdout(pipe(Out))]),
read(Out, OutputList),
close(Out).
However, it fails at the read/2 predicate with the error:
PL_unify_term: PL_int64 not supported
Exception ignored in: <_io.TextIOWrapper name='<stdout>' mode='w' encoding='UTF-8'>
BrokenPipeError: [Errno 32] Broken pipe
I am sure that the process itself runs correctly, because with [stdout(std)] passed to process_create the program outputs [1,2,3,4,5,6] as expected.
The weird thing is that when I change the process to output some constant term (such as constant_term), I still get the same PL_int64 error. Appending a dot to the process's output ([1,2,3,4,5,6].) doesn't solve the error, and read_term/3 gives the same error. read_string/3 is undefined in YAP-Prolog.
How can I solve this problem?
After asking on the yap-users mailing list, I got the solution:
I re-compiled YAP Prolog 6.2.2 with the libGMP option and now it works. The issue may also occur in 32-bit YAP.
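Independent of the libGMP fix, read/2 consumes a complete Prolog term terminated by a full stop, so the producing side should emit exactly that and flush the pipe. A minimal sketch of what my_script.py might print (the script itself is hypothetical; only the term format follows from how read/2 works):

```python
import sys

# Emit the integers as a single Prolog-readable list term.
# read/2 on the Prolog side expects the term to end with a full stop.
values = [1, 2, 3, 4, 5, 6]
term = "[" + ",".join(str(v) for v in values) + "]."
print(term)
sys.stdout.flush()  # make sure the pipe actually receives the term
```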

i4gl :- SQL statement error number -201

In the i4gl IDE I am facing this exception in a report program that prints pay deductions for all personnel of a particular category.
The program works fine for employees of one category but fails for employees of another category.
Please suggest a solution. Thanks in anticipation.
I have included the lines of code that the error message points to, along with the error message itself:
let _stmt = "select * from pay_slip where per_no = '", r_hist.per_no,
            "' and yrmn = '", _yrmn, "' and govt_str = '", __hdr clipped, "'"
prepare prep_stmt from _stmt
execute prep_stmt into r_slip.*
error message:
Program stopped at "DED_SUM_EX.4gl", line number 109.
SQL statement error number -201.
A syntax error has occurred.
I would suggest:
DISPLAY your SQL statement before running it.
Run the SQL statement in DB-Access to see what is going on; the error message will be more explicit there.
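Since error -201 is a plain syntax error and the statement is built by splicing values between single quotes, a value in the failing category that contains a quote character would break the statement. A language-agnostic sketch of that failure mode in Python (the column and values are hypothetical):

```python
def build_stmt(value: str) -> str:
    # Mirrors the 4GL concatenation: the value is spliced directly
    # between single quotes with no escaping.
    return "select * from pay_slip where govt_str = '" + value + "'"

ok = build_stmt("CAT1")
broken = build_stmt("O'BRIEN")  # embedded quote ends the literal early

print(ok)
print(broken)
# The second statement has unbalanced quotes, which a SQL parser
# rejects with a syntax error -- the same class of failure as -201.
print(broken.count("'") % 2)  # odd quote count signals the problem
```

DISPLAYing the prepared statement, as suggested above, would make such a value visible immediately.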
