I am using DSE 5.0.5 and graphloader to load data from GraphML. When I run the command:
graphloader ./scripts/graphml2Vertex/recipeMappingGraphML.groovy -graph testGraphML -address 172.31.35.238 -load_failure_log /home/centos/DSE/dse-graph-loader-5.0.5/scripts/graphml2Vertex/loadfailure.log -dryrun true
I get the following error:
groovy.lang.MissingMethodException: No signature of method: com.datastax.dsegraphloader.api.GraphSource$VerticesBuilder.withVertexId() is applicable for argument types: () values: []
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:58)
at org.codehaus.groovy.runtime.callsite.PojoMetaClassSite.call(PojoMetaClassSite.java:49)
at org.codehaus.groovy.runtime.callsite.CallSiteArray.defaultCall(CallSiteArray.java:48)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:113)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:117)
at Script1.run(Script1.groovy:19)
at com.datastax.dsegraphloader.cli.GroovyScriptExecutor.evaluate(GroovyScriptExecutor.java:106)
at com.datastax.dsegraphloader.cli.Executable.execute(Executable.java:72)
at com.datastax.dsegraphloader.cli.Executable.main(Executable.java:171)
I exported the data as GraphML from OrientDB (using the g.saveGraphML(filename.xml) function) and am now trying to load the same data into DSE Graph using graphloader (GraphML import). Can you please tell me the cause of this kind of error?
-Varun
I just discovered yesterday that the withVertexId() call needs to be removed in order for graphloader to run the mapping script; a sketch of the change is below. I'll be changing the documentation.
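For illustration, here is a hedged sketch of the change in a GraphML mapping script. The source name recipeInput and the labelField line follow the documented mapping-script style and are placeholders, not lines from the script above:

// Hypothetical mapping-script excerpt; the GraphML source definition is omitted.
// Form that fails on graphloader 5.0.5 with the MissingMethodException:
//   load(recipeInput).asVertices { labelField "labelV" }.withVertexId()
// Working form, with the trailing withVertexId() call removed:
load(recipeInput).asVertices {
    labelField "labelV"
}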
I am trying to read an Avro file with embedded schema using the following command:
avro-tools tojson data.avro
However, I am getting the following exception:
22/11/08 14:34:56 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Exception in thread "main" org.apache.avro.AvroRuntimeException: java.io.EOFException
at org.apache.avro.file.DataFileStream.next(DataFileStream.java:238)
at org.apache.avro.tool.DataFileReadTool.run(DataFileReadTool.java:98)
at org.apache.avro.tool.Main.run(Main.java:67)
at org.apache.avro.tool.Main.main(Main.java:56)
Caused by: java.io.EOFException
at org.apache.avro.io.BinaryDecoder.ensureBounds(BinaryDecoder.java:542)
at org.apache.avro.io.BinaryDecoder.readInt(BinaryDecoder.java:173)
at org.apache.avro.io.BinaryDecoder.readBytes(BinaryDecoder.java:332)
at org.apache.avro.io.ResolvingDecoder.readBytes(ResolvingDecoder.java:242)
at org.apache.avro.generic.GenericDatumReader.readBytes(GenericDatumReader.java:544)
at org.apache.avro.generic.GenericDatumReader.readBytes(GenericDatumReader.java:535)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:194)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.readField(GenericDatumReader.java:260)
at org.apache.avro.generic.GenericDatumReader.readRecord(GenericDatumReader.java:248)
at org.apache.avro.generic.GenericDatumReader.readWithoutConversion(GenericDatumReader.java:180)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:161)
at org.apache.avro.generic.GenericDatumReader.read(GenericDatumReader.java:154)
at org.apache.avro.file.DataFileStream.next(DataFileStream.java:251)
at org.apache.avro.file.DataFileStream.next(DataFileStream.java:236)
... 3 more
The file is supposed to have the schema embedded and one object inside.
Conduktor is able to read the file, but avro-tools isn't.
The file was generated using the following code:
val outputStream = ByteArrayOutputStream()
// AvroDataOutputStream comes from the avro4k library; nullCodec() writes an uncompressed container file
val writer = AvroDataOutputStream<GenericRecord>(outputStream, { it }, data.schema, CodecFactory.nullCodec())
writer.write(data)
writer.flush()
writer.close()
return outputStream.toByteArray()
How can I view this file from the command line?
It would also be nice to suppress or fix the avro-tools warning.
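As a sanity check, the same container file can be read with the plain Avro Java API. Below is a minimal diagnostic sketch (the class name and argument handling are mine, assuming Avro 1.x on the classpath); if this also hits an EOFException, the file itself is truncated rather than avro-tools being at fault:

import java.io.File;

import org.apache.avro.file.DataFileReader;
import org.apache.avro.generic.GenericDatumReader;
import org.apache.avro.generic.GenericRecord;

public class AvroDump {
    public static void main(String[] args) throws Exception {
        // DataFileReader reads the schema from the container-file header,
        // then iterates over the records in each data block.
        try (DataFileReader<GenericRecord> reader =
                new DataFileReader<>(new File(args[0]), new GenericDatumReader<>())) {
            System.out.println("Schema: " + reader.getSchema());
            for (GenericRecord record : reader) {
                System.out.println(record); // GenericRecord#toString() renders as JSON
            }
        }
    }
}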
We are running Micronaut with Thymeleaf views and the Layout dialect (we add it manually by overriding Micronaut's ThymeleafFactory). Below are the dependencies (Micronaut version is 3.2.7):
implementation 'io.micronaut.views:micronaut-views-core:3.1.2'
implementation 'io.micronaut.views:micronaut-views-thymeleaf:3.1.2'
implementation 'nz.net.ultraq.thymeleaf:thymeleaf-layout-dialect:3.0.0'
The problematic code is this:
<html layout:decorate="~{/layout-top}">
This seems to work fine when running with ./gradlew run, but crashes when running from a fat (shadow) jar using java -jar .... This points to a classpath issue, but we couldn't figure out what it would be.
Below is the error message when running the shadow jar:
Caused by: groovy.lang.MissingMethodException: No signature of method: io.micronaut.views.thymeleaf.WebEngineContext.getOrCreate() is applicable for argument types: (String, nz.net.ultraq.thymeleaf.layoutdialect.context.extensions.IContextExtensions$_getPrefixForDialect_closure1) values: [DialectPrefix::org.thymeleaf.standard.StandardDialect, nz.net.ultraq.thymeleaf.layoutdialect.context.extensions.IContextExtensions$_getPrefixForDialect_closure1#26b0c4d0]
at org.codehaus.groovy.runtime.ScriptBytecodeAdapter.unwrap(ScriptBytecodeAdapter.java:70)
at org.codehaus.groovy.runtime.callsite.PojoMetaClassSite.call(PojoMetaClassSite.java:46)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:148)
at nz.net.ultraq.thymeleaf.layoutdialect.context.extensions.IContextExtensions.getPrefixForDialect(IContextExtensions.groovy:54)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.codehaus.groovy.runtime.metaclass.ReflectionMetaMethod.invoke(ReflectionMetaMethod.java:54)
at org.codehaus.groovy.runtime.metaclass.NewInstanceMetaMethod.invoke(NewInstanceMetaMethod.java:54)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:247)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:139)
at nz.net.ultraq.thymeleaf.layoutdialect.models.extensions.IProcessableElementTagExtensions.equalsIgnoreXmlnsAndWith(IProcessableElementTagExtensions.groovy:60)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at org.codehaus.groovy.runtime.metaclass.ReflectionMetaMethod.invoke(ReflectionMetaMethod.java:54)
at org.codehaus.groovy.runtime.metaclass.NewInstanceMetaMethod.invoke(NewInstanceMetaMethod.java:54)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite$PojoMetaMethodSiteNoUnwrapNoCoerce.invoke(PojoMetaMethodSite.java:247)
at org.codehaus.groovy.runtime.callsite.PojoMetaMethodSite.call(PojoMetaMethodSite.java:56)
at org.codehaus.groovy.runtime.callsite.AbstractCallSite.call(AbstractCallSite.java:148)
at nz.net.ultraq.thymeleaf.layoutdialect.decorators.DecorateProcessor.doProcess(DecorateProcessor.groovy:103)
at org.thymeleaf.processor.element.AbstractAttributeModelProcessor.doProcess(AbstractAttributeModelProcessor.java:77)
We debugged this and isolated the failing code in nz.net.ultraq.thymeleaf.layoutdialect.context.extensions.IContextExtensions:
static String getPrefixForDialect(IContext self, Class<IProcessorDialect> dialectClass) {
    return self.getOrCreate(DIALECT_PREFIX_PREFIX + dialectClass.name) { ->
        def dialectConfiguration = self.configuration.dialectConfigurations.find { dialectConfig ->
            return dialectClass.isInstance(dialectConfig.dialect)
        }
        return dialectConfiguration?.prefixSpecified ?
            dialectConfiguration?.prefix :
            dialectConfiguration?.dialect?.prefix
    }
}
It seems that the IContext argument is not what it's supposed to be, but we couldn't really find the root cause, nor why the behavior differs between the two ways of running the same code.
Upon further investigation, we discovered that this is related to this bug in the shadow jar plugin: https://github.com/johnrengelman/shadow/issues/490
The thymeleaf-layout-dialect library depends on nz.net.ultraq.extensions:groovy-extensions:1.1.0, which in turn registers some Groovy extensions through META-INF/services/org.codehaus.groovy.runtime.ExtensionModule.
The shadow jar plugin doesn't handle these correctly (it only handles META-INF/groovy/... paths).
As per the ticket comments at https://github.com/johnrengelman/shadow/issues/490, there is a workaround (sketched below), but it's deeply unpleasant.
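One shape the workaround can take is sketched below in the Gradle Groovy DSL. This is untested here: the GroovyExtensionModuleTransformer class does ship with shadow, but whether it covers the META-INF/services path depends on your plugin version, so treat this as an assumption to verify rather than a recipe:

shadowJar {
    mergeServiceFiles()
    // Shadow's transformer for Groovy extension module descriptors; on the
    // affected versions it only merges META-INF/groovy/... entries, which is
    // exactly the bug described above.
    transform(com.github.jengelman.gradle.plugins.shadow.transformers.GroovyExtensionModuleTransformer)
}

If your shadow version doesn't cover that path, hand-merging the META-INF/services/org.codehaus.groovy.runtime.ExtensionModule descriptors into your own src/main/resources is another (equally unpleasant) option.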
I am trying to use Beam SQL to unnest a nested field of a PCollection. Assume the PCollection holds employees and their details, where the details are a nested collection. If I run Beam SQL like "SELECT PCOLLECTION.details FROM PCOLLECTION", I get the nested details as an array in a separate PCollection. However, when I try to select a specific column from the nested details collection, I get an error saying the column name cannot be found. When I tried Beam SQL similar to BigQuery SQL, "SELECT X.address FROM PCOLLECTION, UNNEST(details) AS X", I got a NullPointerException instead. I am using Apache Beam version 2.12.0.
I would appreciate it if someone could help with this.
Below is sample data for the nested details value (details has email and phone columns, and each row can hold any number of details entries; this one has two):
WARNING: printValue:Row:[[Row:[lourdurajan#gmail.com, 9840618047], Row:[lourdurajan#sanmina.com, 9840618047]]]
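For reference, the query is applied through SqlTransform roughly as sketched below; input and its schema wiring are placeholders of mine, not code from the original job:

import org.apache.beam.sdk.extensions.sql.SqlTransform;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.Row;

// 'input' is a PCollection<Row> whose schema contains the nested 'details' array.
// Applying SqlTransform to a single PCollection registers it under the name PCOLLECTION.
PCollection<Row> result = input.apply(
    SqlTransform.query(
        "SELECT `X`.`email` "
            + "FROM PCOLLECTION, "
            + "UNNEST(PCOLLECTION.`details`) AS `X`"));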
Here is the Java stack trace for the second SELECT statement:
SELECT `X`.`email`
FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`,
UNNEST(`PCOLLECTION`.`details`) AS `X`
May 08, 2019 11:23:30 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: SQLPlan>
LogicalProject(email=[$3])
LogicalCorrelate(correlation=[$cor0], joinType=[inner], requiredColumns=[{2}])
BeamIOSourceRel(table=[[beam, PCOLLECTION]])
Uncollect
LogicalProject(details=[$cor0.details_2])
LogicalValues(tuples=[[{ 0 }]])
May 08, 2019 11:23:30 AM org.apache.beam.sdk.extensions.sql.impl.BeamQueryPlanner convertToBeamRel
INFO: BEAMPlan>
BeamCalcRel(expr#0..4=[{inputs}], email=[$t3])
BeamUnnestRel(unnestIndex=[2])
BeamIOSourceRel(table=[[beam, PCOLLECTION]])
[WARNING]
java.lang.NullPointerException
at org.apache.beam.sdk.extensions.sql.impl.utils.CalciteUtils.toSchema(CalciteUtils.java:171)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel$Transform.expand(BeamUnnestRel.java:93)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamUnnestRel$Transform.expand(BeamUnnestRel.java:87)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:488)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:66)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.lambda$buildPCollectionList$0(BeamSqlRelUtils.java:47)
at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
at java.util.Iterator.forEachRemaining(Iterator.java:116)
at java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1801)
at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.buildPCollectionList(BeamSqlRelUtils.java:48)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:64)
at org.apache.beam.sdk.extensions.sql.impl.rel.BeamSqlRelUtils.toPCollection(BeamSqlRelUtils.java:36)
at org.apache.beam.sdk.extensions.sql.SqlTransform.expand(SqlTransform.java:111)
at org.apache.beam.sdk.extensions.sql.SqlTransform.expand(SqlTransform.java:79)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:537)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:488)
at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:370)
at com.sanmina.BeamSQLUnnest.main(BeamSQLUnnest.java:217)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run(ExecJavaMojo.java:282)
at java.lang.Thread.run(Thread.java:748)
You can achieve this using BigQueryIO.
String query = "SELECT `X`.`email` "
    + "FROM `beam`.`PCOLLECTION` AS `PCOLLECTION`, "
    + "UNNEST(`PCOLLECTION`.`details`) AS `X`";

BigQueryIO.readTableRows().fromQuery(query).usingStandardSql();
I just started using scio and Dataflow. I tried my code on one input file and it worked fine, but when I added more files to the input, I got the following exception:
java.lang.RuntimeException: org.apache.beam.sdk.coders.CoderException: cannot encode a null String
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:280)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:609)
at com.spotify.scio.util.Functions$$anon$3.processElement(Functions.scala:158)
Caused by: org.apache.beam.sdk.coders.CoderException: cannot encode a null String
at org.apache.beam.sdk.coders.StringUtf8Coder.getEncodedElementByteSize(StringUtf8Coder.java:136)
at org.apache.beam.sdk.coders.StringUtf8Coder.getEncodedElementByteSize(StringUtf8Coder.java:37)
at org.apache.beam.sdk.coders.Coder.registerByteSizeObserver(Coder.java:291)
at com.spotify.scio.coders.RecordCoder.registerByteSizeObserver(Coder.scala:279)
at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:564)
at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.registerByteSizeObserver(WindowedValue.java:480)
at org.apache.beam.runners.dataflow.worker.IntrinsicMapTaskExecutorFactory$ElementByteSizeObservableCoder.registerByteSizeObserver(IntrinsicMapTaskExecutorFactory.java:399)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputObjectAndByteCounter.update(OutputObjectAndByteCounter.java:125)
at org.apache.beam.runners.dataflow.worker.DataflowOutputCounter.update(DataflowOutputCounter.java:64)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:43)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn$1.output(SimpleParDoFn.java:272)
at org.apache.beam.runners.core.SimpleDoFnRunner.outputWindowedValue(SimpleDoFnRunner.java:309)
at org.apache.beam.runners.core.SimpleDoFnRunner.access$700(SimpleDoFnRunner.java:77)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:621)
at org.apache.beam.runners.core.SimpleDoFnRunner$DoFnProcessContext.output(SimpleDoFnRunner.java:609)
at com.spotify.scio.util.Functions$$anon$3.processElement(Functions.scala:158)
at com.spotify.scio.util.Functions$$anon$3$DoFnInvoker.invokeProcessElement(Unknown Source)
at org.apache.beam.runners.core.SimpleDoFnRunner.invokeProcessElement(SimpleDoFnRunner.java:275)
at org.apache.beam.runners.core.SimpleDoFnRunner.processElement(SimpleDoFnRunner.java:240)
at org.apache.beam.runners.dataflow.worker.SimpleParDoFn.processElement(SimpleParDoFn.java:325)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ParDoOperation.process(ParDoOperation.java:44)
at org.apache.beam.runners.dataflow.worker.util.common.worker.OutputReceiver.process(OutputReceiver.java:49)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.runReadLoop(ReadOperation.java:201)
at org.apache.beam.runners.dataflow.worker.util.common.worker.ReadOperation.start(ReadOperation.java:159)
at org.apache.beam.runners.dataflow.worker.util.common.worker.MapTaskExecutor.execute(MapTaskExecutor.java:76)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:394)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:363)
at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:291)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:135)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:115)
at org.apache.beam.runners.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:102)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
I guess one of my input files may include some malformed data. But how can I bypass the bad data? There is a similar question for Java Beam: com.google.cloud.dataflow.sdk.coders.CoderException: cannot encode a null String
So I tried this:
val scText = sc.textFile(input)
scText.setCoder(NullableCoder.of(StringUtf8Coder.of()))
It didn't help. Can someone help me with this? Thanks.
The scio team provided a solution to this problem: add --nullableCoders=true as a command-line argument.
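For example, an invocation might look like the following; the main class and the other flags are placeholders, not taken from the original job:

sbt "runMain com.example.MyScioJob --project=my-project --runner=DataflowRunner --nullableCoders=true"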
I'm trying to open a Neo4j database using the Blueprints implementation, but I get the following exceptions:
Neo4jGraph graph = new Neo4jGraph("/Users/pipe/Dev/neo4j-community-2.1.0-M01/data/graph.db");
This causes:
Caused by: javax.faces.el.EvaluationException: java.lang.RuntimeException: Bad value '-192M' for setting 'neostore.propertystore.db.strings.mapped_memory': value does not match expression:\d+[kmgKMG]?
at javax.faces.component.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:102)
at com.sun.faces.application.ActionListenerImpl.processAction(ActionListenerImpl.java:101)
... 32 more
Caused by: java.lang.RuntimeException: Bad value '-192M' for setting 'neostore.propertystore.db.strings.mapped_memory': value does not match expression:\d+[kmgKMG]?
at com.tinkerpop.blueprints.impls.neo4j.Neo4jGraph.<init>(Neo4jGraph.java:165)
at com.tinkerpop.blueprints.impls.neo4j.Neo4jGraph.<init>(Neo4jGraph.java:135)
at org.pipe.java.web.netnografica.persistenza.graphdb.DAONodo.toGraphml(DAONodo.java:204)
at org.pipe.java.web.netnografica.controllo.ControlloGenerale.esportaGraphml(ControlloGenerale.java:133)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.el.parser.AstValue.invoke(AstValue.java:278)
at org.apache.el.MethodExpressionImpl.invoke(MethodExpressionImpl.java:274)
at com.sun.faces.facelets.el.TagMethodExpression.invoke(TagMethodExpression.java:105)
at javax.faces.component.MethodBindingMethodExpressionAdapter.invoke(MethodBindingMethodExpressionAdapter.java:88)
... 33 more
Caused by: java.lang.IllegalArgumentException: Bad value '-192M' for setting 'neostore.propertystore.db.strings.mapped_memory': value does not match expression:\d+[kmgKMG]?
at org.neo4j.helpers.Settings$DefaultSetting.apply(Settings.java:782)
at org.neo4j.helpers.Settings$DefaultSetting.apply(Settings.java:702)
at org.neo4j.graphdb.factory.GraphDatabaseSetting$SettingWrapper.apply(GraphDatabaseSetting.java:215)
at org.neo4j.graphdb.factory.GraphDatabaseSetting$SettingWrapper.apply(GraphDatabaseSetting.java:189)
at org.neo4j.kernel.configuration.ConfigurationValidator.validate(ConfigurationValidator.java:50)
at org.neo4j.kernel.configuration.Config.applyChanges(Config.java:121)
at org.neo4j.kernel.InternalAbstractGraphDatabase.create(InternalAbstractGraphDatabase.java:339)
at org.neo4j.kernel.InternalAbstractGraphDatabase.run(InternalAbstractGraphDatabase.java:253)
at org.neo4j.kernel.EmbeddedGraphDatabase.<init>(EmbeddedGraphDatabase.java:106)
at org.neo4j.kernel.EmbeddedGraphDatabase.<init>(EmbeddedGraphDatabase.java:81)
at org.neo4j.kernel.EmbeddedGraphDatabase.<init>(EmbeddedGraphDatabase.java:63)
at com.tinkerpop.blueprints.impls.neo4j.Neo4jGraph.<init>(Neo4jGraph.java:155)
... 44 more
There seems to be a need to provide a properties file. Is that correct?
Edited to answer Michael Hunger:
Well, I changed the Blueprints version to 2.5.0-SNAPSHOT, but nothing changed. So I provided the config using the map accepted by the constructor:
Map<String, String> configurazione = new HashMap<String, String>();
configurazione.put("neostore.propertystore.db.strings.mapped_memory", "250M");
configurazione.put("neostore.propertystore.db.arrays.mapped_memory", "100M");
configurazione.put("neostore.relationshipstore.db.mapped_memory", "3845M");
configurazione.put("neostore.nodestore.db.mapped_memory", "350M");
configurazione.put("neostore.propertystore.db.mapped_memory", "350M");
configurazione.put("neostore.nodestore.db.mapped_memory", "769M");
Neo4j2Graph grafo = new Neo4j2Graph("/Users/pipe/Dev/neo4j-community-2.1.0-M01/data/graph.db", configurazione);
Now the exception has changed, and I really don't know what is wrong. I posted the complete stack trace on Pastebin:
http://pastebin.com/XpipSysp
At the end, a NoSuchMethodError is thrown. What am I missing?
Thanks a lot.
Which blueprints version are you using?
Blueprints 2.5-SNAPSHOT is compatible with Neo4j 2.0.0.
Please note there is a separate module for Neo4j 2.0 called blueprints-neo4j2, and the classes are called Neo4j2Graph, Neo4j2Vertex, etc.
You should also be able to provide config to the Neo4j2Graph; a minimal sketch follows below.
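Here is a minimal sketch of that (the database path is made up and the config map is trimmed to one entry; Neo4j2Graph comes from the blueprints-neo4j2-graph module, so verify the coordinates against your Blueprints version):

import java.util.HashMap;
import java.util.Map;

import com.tinkerpop.blueprints.impls.neo4j2.Neo4j2Graph;

public class OpenGraph {
    public static void main(String[] args) {
        // Same config style as in the question, trimmed to a single entry.
        Map<String, String> config = new HashMap<String, String>();
        config.put("neostore.propertystore.db.strings.mapped_memory", "250M");
        Neo4j2Graph graph = new Neo4j2Graph("/path/to/graph.db", config);
        try {
            System.out.println("Opened: " + graph);
        } finally {
            graph.shutdown(); // always release the embedded database
        }
    }
}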