Is there a way to POST graphML to gremlin/neo4j? - neo4j

So it looks like the Gremlin API requires a URL to import a GraphML file to the server (http://docs.neo4j.org/chunked/stable/gremlin-plugin.html#rest-api-load-a-sample-graph). I was hoping there'd be some API where you could just POST the GraphML to it. Does something like this exist?
I realise I could write a Neo4j extension to essentially do this, but I was wondering if one already existed...

There is a shell extension at https://github.com/jexp/neo4j-shell-tools#graphml-import that provides this feature. It should not be too hard to convert it into a server extension.
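If you do want a true POST endpoint, below is a minimal sketch of an unmanaged server extension that accepts GraphML in the request body. The package, class and path names are placeholders, and it assumes the Blueprints classes (Neo4jGraph, GraphMLReader) are on the server's classpath:

package org.example.extension;

import com.tinkerpop.blueprints.impls.neo4j.Neo4jGraph;
import com.tinkerpop.blueprints.util.io.graphml.GraphMLReader;
import org.neo4j.graphdb.GraphDatabaseService;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Context;
import javax.ws.rs.core.Response;
import java.io.ByteArrayInputStream;

@Path("/graphml")
public class GraphMLImportResource {

    private final GraphDatabaseService db;

    public GraphMLImportResource(@Context GraphDatabaseService db) {
        this.db = db;
    }

    // POST the raw GraphML document as the request body, e.g.
    // curl -X POST --data-binary @graph.xml http://localhost:7474/unmanaged/graphml
    @POST
    public Response importGraphML(String graphMlBody) throws Exception {
        // Wrap the server's database as a Blueprints graph and stream the posted GraphML into it
        Neo4jGraph graph = new Neo4jGraph(db);
        GraphMLReader.inputGraph(graph, new ByteArrayInputStream(graphMlBody.getBytes("UTF-8")));
        graph.commit(); // flush any outstanding batch
        return Response.ok("GraphML imported").build();
    }
}

You would still have to register the package under org.neo4j.server.thirdparty_jaxrs_classes in neo4j-server.properties (the /unmanaged mount point above is just an example) and restart the server.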

If the graph is not huge, you could try passing the file contents as a string to the Gremlin extension and reuse the script from the doc you cited. Your Gremlin script then expects a String variable that contains your graph and writes that string out to a file:
def fos = new FileOutputStream('path_to_my_file.xml')
fos.write(myGraphAsString.getBytes('UTF-8'))
fos.close()
You can then load this file:
g.clear()
g.loadGraphML('file:/path_to_my_file.xml')
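You can send the whole thing in one call to the Gremlin plugin's execute_script endpoint (the one documented in the page linked in the question), passing the GraphML as the myGraphAsString parameter. Roughly (GraphML payload abbreviated):
curl -X POST http://localhost:7474/db/data/ext/GremlinPlugin/graphdb/execute_script \
  -H "Content-Type: application/json" \
  -d "{\"script\": \"new File('/tmp/graph.xml').text = myGraphAsString; g.clear(); g.loadGraphML('file:/tmp/graph.xml')\", \"params\": {\"myGraphAsString\": \"<graphml>...</graphml>\"}}"
Here the script writes the posted string to a file on the server and then loads it, so the file path must be writable by the Neo4j process.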

Related

How to import InfluxDB tables into QuestDB?

I am trying to move data from InfluxDB to QuestDB. I was able to export my tables as JSON by following: https://stackoverflow.com/a/27913640/1267728
How do I now import these JSON files into QuestDB?
Convert from JSON to CSV
QuestDB supports importing data via CSV file, so first you would need to flatten the JSON and ensure that column names are modified to reflect nested properties.
There is a Java library called Json2Flat that already does this.
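Usage is roughly as below; the class and method names are from the Json2Flat README as I recall it, so verify them against the version you pull in:
String json = new String(Files.readAllBytes(Paths.get("influx-export.json")));
JFlat flat = new JFlat(json);        // com.github.opendevl.JFlat
flat.json2Sheet();                   // flatten nested JSON into a 2D sheet
flat.write2csv("influx-export.csv");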
Import the CSV file
Using the REST API, you can import the data into QuestDB
curl -F data=@file.csv http://localhost:9000/imp
For more details of how to use the REST API, please go to the official documentation.
Check the data
To verify that the import was successful, you can check via the Web Console or via curl:
curl -G --data-urlencode "query=select * from 'file.csv'" http://localhost:9000/exp
Just adding here that QuestDB recently improved the performance of CSV ingestion. More info at https://questdb.io/docs/guides/importing-data/
If you want to avoid converting from JSON (and this will probably be faster than going via JSON for large tables), you can use the influxd inspect export-lp command, which exports all your data as ILP (InfluxDB line protocol) points. You can choose to export a single bucket.
Once you have the ILP files, you can import them as explained in this other Stack Overflow post: What's the best way to upload an ILP file into QuestDB?
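For reference, ILP points are plain text, one point per line: measurement, optional tags, fields, and a nanosecond timestamp (values below are illustrative):
cpu,host=server01,region=eu-west usage_user=12.5,usage_system=3.2 1638202821000000000
cpu,host=server02,region=eu-west usage_user=7.1,usage_system=1.9 1638202822000000000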

how to let neo4j NSMNTX respect rdf:ID when importing rdf from multiple sources

I am trying to import multiple rdf files into neo4j as described here
My problem is that even though elements have the same rdf:ID they end up being imported as different neo4j nodes with different uris prefixed by the different file names like file:/x.xml#_00141f6c-69b1-4a1a-a83b-333d0bb9d586 and file:/y.xml#_00141f6c-69b1-4a1a-a83b-333d0bb9d586.
I have tried to use:
call semantics.addNamespacePrefix("local","file:/x.xml#")
call semantics.addNamespacePrefix("local","file:/y.xml#")
before importing but to no avail. I have additionally tried to set handleVocabUris: "MAP" as an option for the import function.
Is there an import option that I am missing which allows these nodes to be unified? Is there generally an elegant way to reunify them after importing?
My current workaround is to copy each file into a temp file before loading so that the prefixes are the same. Neo4j joins the nodes with the same uri into one, which is exactly what I need.
Still happy to hear about a more elegant way to do this, though...
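If you do want to merge after the fact, one option is a post-import merge, sketched below. It assumes APOC is installed and that the importer created the default :Resource nodes with a uri property; it groups nodes by the fragment after the # and merges each group:
MATCH (r:Resource) WHERE r.uri CONTAINS '#'
WITH split(r.uri, '#')[1] AS id, collect(r) AS nodes
WHERE size(nodes) > 1
CALL apoc.refactor.mergeNodes(nodes, {properties: 'combine', mergeRels: true}) YIELD node
RETURN count(node);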

Create RDF File Using SOAF Ontology

With Apache Jena, we can generate a FOAF file like this:
model.createResource("http://example.org/alice", FOAF.Person)
.addProperty(FOAF.name, "Alice")
.addProperty(FOAF.mbox, model.createResource("mailto:alice@example.org"))
.addProperty(FOAF.knows, model.createResource("http://example.org/bob"));
I want to generate a SOAF file (extension of FOAF).
Is there any method or API to do this?
Jena has a utility, "schemagen", that generates vocabulary files from RDFS; it is how FOAF.java is made. There is nothing special about vocabulary classes; they don't have to be installed in a particular package. Make a SOAF.java and compile it into your program, using FOAF.java as a model.
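A minimal hand-rolled vocabulary class in the same style as Jena's FOAF.java could look like the sketch below; the namespace URI and term names are placeholders to be replaced with the ones from the SOAF spec:

import org.apache.jena.rdf.model.Property;
import org.apache.jena.rdf.model.Resource;
import org.apache.jena.rdf.model.ResourceFactory;

public class SOAF {
    // Placeholder namespace; substitute the real SOAF namespace URI
    public static final String NS = "http://example.org/soaf/0.1/";
    public static final Resource NAMESPACE = ResourceFactory.createResource(NS);
    // Illustrative terms only; take the real class and property names from the SOAF vocabulary
    public static final Resource Person = ResourceFactory.createResource(NS + "Person");
    public static final Property follows = ResourceFactory.createProperty(NS, "follows");
}

You can then use SOAF.Person and SOAF.follows exactly the way FOAF.Person and FOAF.knows are used in the question.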

Is there a way to use parameters in a LOAD CSV command?

I have a Cypher script for populating a Neo4j (2.2.3) database. Currently, the names of all the CSV files are hard coded. Is there a way to parameterize the CSV files, in case I'd like to switch to a different web server or switch to using the local file system?
Update
I forgot to mention that my use case is via neo4j-shell. Is there also a way to define parameters for use by the shell or can that only be done through the REST API? Thanks!
You can use parameters in the shell; just export them as "environment" variables.
List them with env:
export name=Tim
env
match (p:Person {firstName:{name}}) return p;
Yes, the URL for the CSV file is just a string in the Cypher query, so you can parameterize it like any other Cypher parameter. Check out the docs here and here.
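Putting the two together in neo4j-shell (the URL and column name here are illustrative):
export csvurl=http://example.com/people.csv
USING PERIODIC COMMIT
LOAD CSV WITH HEADERS FROM {csvurl} AS row
CREATE (:Person {name: row.name});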

Custom configuration file in MVC4

I'm building an ASP.NET MVC4 application, and the customer wants to be able to supply an XML configuration file to configure a vendor list in the application, something like this:
<Vendors>
<Vendor name="ABC Computers" deliveryDays="10"/>
<Vendor name="XYZ Computers" deliveryDays="15"/>
</Vendors>
The file needs to be dropped onto a network location (i.e. not on the web server), and I don't have a database into which to import and store the data.
The customer also wants the ability to update it daily. So I'm thinking I'll have to do some kind of import (and validate the file) when the application starts up.
Any good ideas on the best way to accomplish this?
- The data needs to be quickly accessible
- Ideally I just want to import/store it once, or be able to access it quickly
- I need to be able to validate the file, so it might be prudent to be able to switch to a backup
One thought was to use something like Entity Framework and simply read the file whenever I needed it, but if possible I'd hold it in memory in the application.
Cheers
Vincent
No need to import it into a database or use Entity Framework. You can simply use .NET XML serialization to accomplish this.
The command-line tool xsd.exe will generate C# classes from your XML file. From the command line:
xsd.exe myfile.xml
xsd.exe /c myfile.xsd
The first command will infer and create an XML schema file (myfile.xsd) from your XML. The second command will convert the schema file to C# classes.
Then use the XmlSerializer class to deserialize your xml file into objects (assuming multiple objects in one file):
MyCollection myObjects = null;
string path = "mydata.xml";
XmlSerializer serializer = new XmlSerializer(typeof(MyCollection));
using (StreamReader reader = new StreamReader(path))
{
    myObjects = (MyCollection)serializer.Deserialize(reader);
}
You can use the .xsd file generated above to validate your xml files. Here's a link showing how: http://msdn.microsoft.com/en-us/library/ms162371.aspx.
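For the validation step, a short sketch using XmlReaderSettings with the generated schema (file names are placeholders):
XmlReaderSettings settings = new XmlReaderSettings();
settings.Schemas.Add(null, "myfile.xsd");   // the schema generated by xsd.exe above
settings.ValidationType = ValidationType.Schema;

using (XmlReader reader = XmlReader.Create("mydata.xml", settings))
{
    XmlSerializer serializer = new XmlSerializer(typeof(MyCollection));
    // Deserialize throws an XmlSchemaValidationException if the file does not match the schema
    MyCollection myObjects = (MyCollection)serializer.Deserialize(reader);
}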
