I am trying to use Spring Data Elasticsearch.
I have also read Spring Data Elastic Search with Nested Fields and mapping.
I am getting the following exception:
Caused by: org.elasticsearch.action.search.SearchPhaseExecutionException: Failed to execute phase [dfs], all shards failed; shardFailures {[-EbBygdLRkKs49wqsf5ewQ][support_team_idx][0]: RemoteTransportException[[Artie][local[6]][indices:data/read/search[phase/dfs]]]; nested: SearchParseException[[support_team_idx][0]: from[0],size[10]: Parse Failure [Failed to parse source [{"from":0,"size":10,"query":{"nested":{"query":{"bool":{"must":[{"term":{"supportteam.name":"test"}},{"term":{"supportteam.description":"test"}}]}},"path":"SupportTeam"}}}]]]; nested: QueryParsingException[[support_team_idx] [nested] failed to find nested object under path [SupportTeam]]; }
Entity:
@Document(indexName = "support_team_idx", type = "support_team_type", indexStoreType = "memory", shards = 1, replicas = 0, refreshInterval = "-1")
public class SupportTeam extends BaseEntity implements Serializable {
    private static final long serialVersionUID = 1L;

    @Field(type = FieldType.String, index = FieldIndex.analyzed)
    private String name;

    @Field(type = FieldType.String, index = FieldIndex.analyzed)
    private String description;

    @ManyToOne(fetch = FetchType.EAGER, cascade = {CascadeType.MERGE})
    @JoinColumn(name = "org_unit_id", unique = false)
    @Field(type = FieldType.Nested)
    private OrgUnit orgUnit;

    @OneToMany(mappedBy = "supportTeam", fetch = FetchType.EAGER)
    @Field(type = FieldType.Nested)
    private Set<SupportTeamMember> supportTeamMembers;

    public SupportTeam() {}
}
log.info("Initializing indexes");
RootConfig.elasticsearchTemplate().deleteIndex(SupportTeam.class);
RootConfig.elasticsearchTemplate().createIndex(SupportTeam.class);
RootConfig.elasticsearchTemplate().putMapping(SupportTeam.class);
RootConfig.elasticsearchTemplate().refresh(SupportTeam.class, true);
....
SearchQuery searchQuery = new NativeSearchQueryBuilder()
        .withQuery(nestedQuery("SupportTeam",
                boolQuery().must(termQuery("supportteam.name", searchText))
                           .must(termQuery("supportteam.description", searchText))))
        .withIndices("support_team_idx").build();
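For what it's worth, the exception says Elasticsearch cannot find a nested object under the path [SupportTeam]: the nested path must name a field that is mapped as FieldType.Nested (here orgUnit or supportTeamMembers), not the document type. Since name and description are top-level fields, one possible fix (a sketch, not a verified solution) is to drop the nested wrapper:

SearchQuery searchQuery = new NativeSearchQueryBuilder()
        // name and description live at the top level of the mapping,
        // so no nestedQuery(...) wrapper is needed for them.
        .withQuery(boolQuery()
                .must(termQuery("name", searchText))
                .must(termQuery("description", searchText)))
        .withIndices("support_team_idx").build();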
My contract model class:
@Data
@Document(indexName = "contract", type = "contract")
public class Contract implements Serializable
{
    @JsonProperty("contract_number")
    @Id
    @Parent(type = "p")
    @Field(type = FieldType.Text, index = true)
    private String contract_number;
    private String startDate;
    private String endDate;
    private String supportTypeCode;
    @Field(type = FieldType.Nested, searchAnalyzer = "true")
    private List<Product> products;
}
My product class:
@Data
public class Product implements Serializable
{
    @Field(type = FieldType.Keyword)
    private String baseNumber;
    @Field(type = FieldType.Keyword)
    private String rowId;
    @Field(type = FieldType.Keyword)
    private String effectiveDate;
}
Using Spring Data, I'm trying to fetch data based on baseNumber, which is a field of the Product class, but I'm not able to get any data back.
I tried the JPA-style derived query method below, but it is not working.
Optional<Contract> findByProducts_BaseNumber(String s)
I am quite confused about how to maintain the mapping between the Contract and Product classes.
That should be either
findByProductsBaseNumber(String s);
or
findByProducts_BaseNumber(String s);
as explained in the documentation.
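For illustration, a minimal repository sketch (the interface name and ID type are assumptions, not taken from the question):

public interface ContractRepository extends ElasticsearchRepository<Contract, String> {

    // Derived query: traverses the products property and matches its baseNumber field.
    Optional<Contract> findByProducts_BaseNumber(String baseNumber);
}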
For me, the solution below worked. I'm using the Elasticsearch 7.6 Java API (high-level REST client).
// This snippet lives inside a method that returns a Contract.
SearchSourceBuilder searchSourceBuilder = new SearchSourceBuilder();
// Match on the field path "products.baseNumber".
MatchQueryBuilder matchQueryBuilder = QueryBuilders.matchQuery("products.baseNumber", baseNumber);
searchSourceBuilder.query(matchQueryBuilder);
searchSourceBuilder.from(0);
searchSourceBuilder.size(5);

SearchRequest searchRequest = new SearchRequest();
searchRequest.indices(INDEX);
searchRequest.source(searchSourceBuilder);

SearchHits hits = null;
try
{
    hits = restHighLevelClient.search(searchRequest, RequestOptions.DEFAULT).getHits();
    // Map each hit's source back to a Contract via Jackson.
    final List<Contract> collect = Arrays.stream(hits.getHits())
            .map(hit -> objectMapper.convertValue(hit.getSourceAsMap(), Contract.class))
            .collect(Collectors.toList());
    return collect.get(0);
}
catch (IOException e)
{
    e.printStackTrace();
}
return null; // reached only if the search failed
I am reading from a Pub/Sub topic, which is running fine; now I need to insert the data into a table on ClickHouse.
I am still learning, so please excuse any beginner mistakes.
PipelineOptions options = PipelineOptionsFactory.create();
//PubSubToDatabasesPipelineOptions options;
Pipeline p = Pipeline.create(options);
PCollection<String> inputFromPubSub = p.apply(namePrefix + "ReadFromPubSub",
PubsubIO.readStrings().fromSubscription("projects/*********/subscriptions/crypto_bitcoin.dataflow.bigquery.transactions").withIdAttribute(PUBSUB_ID_ATTRIBUTE));
PCollection<TransactionSmall> res = inputFromPubSub.apply(namePrefix + "ReadFromPubSub", ParDo.of(new DoFn<String, TransactionSmall>() {
@ProcessElement
public void processElement(ProcessContext c) {
String item = c.element();
//System.out.print(item);
Transaction transaction = JsonUtils.parseJson(item, Transaction.class);
//System.out.print(transaction);
c.output(new TransactionSmall(new Date(),transaction.getHash(), 123));
}}));
res.apply(ClickHouseIO.<TransactionSmall>write("jdbc:clickhouse://**.**.**.**:8123/litecoin?password=*****", "****"));
p.run().waitUntilFinish();
My TransactionSmall.java
import java.io.Serializable;
import java.util.Date;
public class TransactionSmall implements Serializable {
private Date created_dt;
private String hash;
private int number;
public TransactionSmall(Date created_dt, String hash, int number) {
this.created_dt = created_dt;
this.hash = hash;
this.number = number;
}
}
My table definition:

CREATE TABLE litecoin.saurabh_blocks_small
(
    `created_date` Date DEFAULT today(),
    `hash` String,
    `number` In
)
ENGINE = MergeTree(created_date, (hash, number), 8192)
I am getting an error like:
java.lang.IllegalArgumentException: Type of @Element must match the DoFn type saurabhReadFromPubSub2/ParMultiDo(Anonymous).output [PCollection]
at org.apache.beam.sdk.transforms.ParDo.getDoFnSchemaInformation (ParDo.java:577)
at org.apache.beam.repackaged.direct_java.runners.core.construction.ParDoTranslation.translateParDo (ParDoTranslation.java:185)
at org.apache.beam.repackaged.direct_java.runners.core.construction.ParDoTranslation$ParDoTranslator.translate (ParDoTranslation.java:124)
at org.apache.beam.repackaged.direct_java.runners.core.construction.PTransformTranslation.toProto (PTransformTranslation.java:155)
at org.apache.beam.repackaged.direct_java.runners.core.construction.ParDoTranslation.getParDoPayload (ParDoTranslation.java:650)
at org.apache.beam.repackaged.direct_java.runners.core.construction.ParDoTranslation.isSplittable (ParDoTranslation.java:665)
at org.apache.beam.repackaged.direct_java.runners.core.construction.PTransformMatchers$6.matches (PTransformMatchers.java:269)
at org.apache.beam.sdk.Pipeline$2.visitPrimitiveTransform (Pipeline.java:282)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:665)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit (TransformHierarchy.java:657)
at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$600 (TransformHierarchy.java:317)
at org.apache.beam.sdk.runners.TransformHierarchy.visit (TransformHierarchy.java:251)
at org.apache.beam.sdk.Pipeline.traverseTopologically (Pipeline.java:460)
at org.apache.beam.sdk.Pipeline.replace (Pipeline.java:260)
at org.apache.beam.sdk.Pipeline.replaceAll (Pipeline.java:210)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:170)
at org.apache.beam.runners.direct.DirectRunner.run (DirectRunner.java:67)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:315)
at org.apache.beam.sdk.Pipeline.run (Pipeline.java:301)
at io.blockchainetl.bitcoin.Trail.main (Trail.java:74)
at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke (Method.java:498)
at org.codehaus.mojo.exec.ExecJavaMojo$1.run (ExecJavaMojo.java:282)
at java.lang.Thread.run (Thread.java:748)
What would be the best and cleanest way to achieve this without explicitly creating objects?
Thanks!
This is likely happening because Beam relies on the coder specification for a PCollection when it infers the schema for it. It seems to be having trouble inferring the input schema for your ClickhouseIO transform.
You can compel Beam to have a schema by specifying a coder with schema inference, such as AvroCoder. You'd do:
@DefaultCoder(AvroCoder.class)
public class TransactionSmall implements Serializable {
private Date created_dt;
private String hash;
private int number;
public TransactionSmall(Date created_dt, String hash, int number) {
this.created_dt = created_dt;
this.hash = hash;
this.number = number;
}
}
Alternatively, you can set the coder for the PCollection in your pipeline:
PCollection<TransactionSmall> res = inputFromPubSub.apply(namePrefix + "ReadFromPubSub", ParDo.of(new DoFn<String, TransactionSmall>() {
@ProcessElement
public void processElement(ProcessContext c) {
String item = c.element();
Transaction transaction = JsonUtils.parseJson(item, Transaction.class);
c.output(new TransactionSmall(new Date(),transaction.getHash(), 123));
}}))
.setCoder(AvroCoder.of(TransactionSmall.class));
res.apply(ClickHouseIO.<TransactionSmall>write("jdbc:clickhouse://**.**.**.**:8123/litecoin?password=*****", "****"));
I am trying to use DynamicDestinations to write to a partitioned table in BigQuery, where the partition name is mytable$yyyyMMdd. If I bypass DynamicDestinations and supply a hard-coded table name in .to(), it works; with DynamicDestinations, however, I get the following exception:
java.lang.IllegalArgumentException: unable to serialize org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite$1#6fff253c
at org.apache.beam.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:53)
at org.apache.beam.sdk.util.SerializableUtils.clone(SerializableUtils.java:90)
at org.apache.beam.sdk.transforms.ParDo$SingleOutput.<init>(ParDo.java:591)
at org.apache.beam.sdk.transforms.ParDo.of(ParDo.java:435)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite.expand(PrepareWrite.java:51)
at org.apache.beam.sdk.io.gcp.bigquery.PrepareWrite.expand(PrepareWrite.java:36)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:514)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:473)
at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:297)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expandTyped(BigQueryIO.java:987)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:972)
at org.apache.beam.sdk.io.gcp.bigquery.BigQueryIO$Write.expand(BigQueryIO.java:659)
at org.apache.beam.sdk.Pipeline.applyInternal(Pipeline.java:514)
at org.apache.beam.sdk.Pipeline.applyTransform(Pipeline.java:454)
at org.apache.beam.sdk.values.PCollection.apply(PCollection.java:284)
at com.homedepot.payments.monitoring.eventprocessor.MetricsAggregator.main(MetricsAggregator.java:82)
Caused by: java.io.NotSerializableException: com.google.api.services.bigquery.model.TableReference
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
And here is the code:
PCollection<Event> rawEvents = pipeline
.apply("ReadFromPubSub",
PubsubIO.readProtos(EventOuterClass.Event.class)
.fromSubscription(OPTIONS.getSubscription())
)
.apply("Parse", ParDo.of(new ParseFn()))
.apply("ExtractAttributes", ParDo.of(new ExtractAttributesFn()));
EventTable table = new EventTable(OPTIONS.getProjectId(), OPTIONS.getMetricsDatasetId(), OPTIONS.getRawEventsTable());
rawEvents.apply(BigQueryIO.<Event>write()
.to(new DynamicDestinations<Event, String>() {
private static final long serialVersionUID = 1L;
@Override
public TableSchema getSchema(String destination) {
return table.schema();
}
@Override
public TableDestination getTable(String destination) {
return new TableDestination(table.reference(), null);
}
@Override
public String getDestination(ValueInSingleWindow<Event> element) {
String dayString = DateTimeFormat.forPattern("yyyyMMdd").withZone(DateTimeZone.UTC).toString();
return table.reference().getTableId() + "$" + dayString;
}
})
.withFormatFunction(new SerializableFunction<Event, TableRow>() {
public TableRow apply(Event event) {
TableRow row = new TableRow();
Event evnt = (Event) event;
row.set(EventTable.Field.VERSION.getName(), evnt.getVersion());
row.set(EventTable.Field.TIMESTAMP.getName(), evnt.getTimestamp() / 1000);
row.set(EventTable.Field.EVENT_TYPE_ID.getName(), evnt.getEventTypeId());
row.set(EventTable.Field.EVENT_ID.getName(), evnt.getId());
row.set(EventTable.Field.LOCATION.getName(), evnt.getLocation());
row.set(EventTable.Field.SERVICE.getName(), evnt.getService());
row.set(EventTable.Field.HOST.getName(), evnt.getHost());
row.set(EventTable.Field.BODY.getName(), evnt.getBody());
return row;
}
})
.withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
.withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND)
);
Any pointers in the correct direction would be greatly appreciated.
Thanks!
From inspecting the exception message and the code above, it seems that the EventTable field used within your anonymous DynamicDestinations class contains a TableReference field which is not serializable.
One workaround would be to convert the anonymous DynamicDestinations to a static inner class and define a constructor which stores only the serializable pieces of the EventTable needed to implement the interface.
For example:
private static class EventDestinations extends DynamicDestinations<Event, String> {
private final TableSchema schema;
private final TableDestination destination;
private final String tableId;
private EventDestinations(EventTable table) {
this.schema = table.schema();
this.destination = new TableDestination(table.reference(), null);
this.tableId = table.reference().getTableId();
}
// ..
}
Looks like you're trying to fill a specific partition based on the event. Why not use a
SerializableFunction<ValueInSingleWindow<Event>, TableDestination>?
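A minimal sketch of that approach (the table spec string is a placeholder, and deriving the day from the element's timestamp is an assumption):

rawEvents.apply(BigQueryIO.<Event>write()
        .to(new SerializableFunction<ValueInSingleWindow<Event>, TableDestination>() {
            @Override
            public TableDestination apply(ValueInSingleWindow<Event> value) {
                // Derive the yyyyMMdd partition decorator from the element's timestamp.
                String day = DateTimeFormat.forPattern("yyyyMMdd")
                        .withZone(DateTimeZone.UTC)
                        .print(value.getTimestamp());
                return new TableDestination("my-project:my_dataset.mytable$" + day, null);
            }
        })
        // Keep the same withFormatFunction and dispositions as in the question.
        .withCreateDisposition(BigQueryIO.Write.CreateDisposition.CREATE_IF_NEEDED)
        .withWriteDisposition(BigQueryIO.Write.WriteDisposition.WRITE_APPEND));

Because the function captures only serializable state, the non-serializable TableReference never ends up in the closure.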
I'm doing a test movie project to learn Neo4j and SDN, and here is a problem I'm facing:
As you know, a movie director may also be a producer, a writer, or even an actor. In my Java class hierarchy I have a Person superclass with the children Producer, Director, Actor, and Writer. All these children are siblings, and thus incompatible types.
On the other hand, in Neo4j I have nodes that are a Producer, a Director, and a Writer at the same time.
So I have a problem when I want to get all directors via the repository.findAll() method (or via a custom method with a Cypher query). Spring tells me:
java.lang.IllegalArgumentException: Can not set java.util.Set field
com.test.db.domain.Producer.producedMovies to
com.test.db.domain.Director
I use Neo4j 2.0.1 and Spring Data Neo4j 3.0.0.RELEASE.
What is the right way to solve such kind of issue?
Update:
Here is the code:
public class App {
public static void main(String[] args) {
ApplicationContext context = new ClassPathXmlApplicationContext("service-beans.xml");
Neo4jTemplate neo4jTemplate = context.getBean(Neo4jTemplate.class);
DirectorRepository repository = context.getBean(DirectorRepository.class);
try (Transaction transaction = neo4jTemplate.getGraphDatabase().beginTx()) {
EndResult<Director> all = repository.findAll_Upd(); //repository.findAll();
Iterator<Director> iterator = all.iterator();
while(iterator.hasNext()) {
System.out.println(iterator.next());
}
transaction.success();
}
}
}
@Transactional
public interface DirectorRepository extends GraphRepository<Director> {
    @Query("match(n:Director) return n")
    EndResult<Director> findAll_Upd();
}
@NodeEntity
public class Person implements Comparable<Person> {
    @GraphId
    Long nodeId;
    // @Indexed(unique=true)
    String id;
    @Indexed(indexType = IndexType.FULLTEXT, indexName = "people")
    String name;
    private Short born;
    private Date birthday;
    private String birthplace;
    private String biography;
    private Integer version;
    private Date lastModified;
    private String profileImageUrl;
    ...
}

public class Director extends Person {
    @Fetch @RelatedTo(elementClass = Movie.class, type = RelationshipConstants.DIRECTED)
    private Set<Movie> directedMovies = new HashSet<>();
}

public class Producer extends Person {
    @Fetch @RelatedTo(elementClass = Movie.class, type = RelationshipConstants.PRODUCED)
    private Set<Movie> producedMovies = new HashSet<>();
}

public class Actor extends Person {
    @RelatedToVia(type = RelationshipConstants.ACTED_IN)
    List<Role> roles;
}

@NodeEntity
public class Movie implements Comparable<Movie> {
    @GraphId
    Long nodeId;
    @Indexed(unique = true)
    String id;
    @Indexed(indexType = IndexType.FULLTEXT, indexName = "search")
    String title;
    int released;
    String tagline;
    @Fetch @RelatedTo(type = RelationshipConstants.ACTED_IN, direction = INCOMING)
    Set<Actor> actors;
    @Fetch @RelatedTo(type = RelationshipConstants.PRODUCED, direction = INCOMING)
    Set<Producer> producers;
    @RelatedToVia(type = RelationshipConstants.ACTED_IN, direction = INCOMING)
    List<Role> roles;
}
And the full Exception:
Exception in thread "main" org.springframework.data.mapping.model.MappingException: Setting property producedMovies to null on Rob Reiner [null]
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.setProperty(SourceStateTransmitter.java:85)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.copyEntityStatePropertyValue(SourceStateTransmitter.java:91)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.access$000(SourceStateTransmitter.java:40)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter$2.doWithAssociation(SourceStateTransmitter.java:61)
    at org.springframework.data.mapping.model.BasicPersistentEntity.doWithAssociations(BasicPersistentEntity.java:291)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.copyPropertiesFrom(SourceStateTransmitter.java:57)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityFetchHandler.fetchValue(Neo4jEntityFetchHandler.java:75)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityFetchHandler.fetch(Neo4jEntityFetchHandler.java:60)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl$1.doWithAssociation(Neo4jEntityConverterImpl.java:135)
    at org.springframework.data.mapping.model.BasicPersistentEntity.doWithAssociations(BasicPersistentEntity.java:291)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl.cascadeFetch(Neo4jEntityConverterImpl.java:125)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl.loadEntity(Neo4jEntityConverterImpl.java:114)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl.read(Neo4jEntityConverterImpl.java:104)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityPersister$CachedConverter.read(Neo4jEntityPersister.java:170)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityPersister.createEntityFromState(Neo4jEntityPersister.java:189)
    at org.springframework.data.neo4j.support.Neo4jTemplate.createEntityFromState(Neo4jTemplate.java:223)
    at org.springframework.data.neo4j.fieldaccess.RelationshipHelper.createEntitySetFromRelationshipEndNodes(RelationshipHelper.java:150)
    at org.springframework.data.neo4j.fieldaccess.RelatedToFieldAccessor.createEntitySetFromRelationshipEndNodes(RelatedToFieldAccessor.java:86)
    at org.springframework.data.neo4j.fieldaccess.RelatedToCollectionFieldAccessorFactory$RelatedToCollectionFieldAccessor.getValue(RelatedToCollectionFieldAccessorFactory.java:76)
    at org.springframework.data.neo4j.fieldaccess.DefaultEntityState.getValue(DefaultEntityState.java:97)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.copyEntityStatePropertyValue(SourceStateTransmitter.java:90)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.access$000(SourceStateTransmitter.java:40)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter$2.doWithAssociation(SourceStateTransmitter.java:61)
    at org.springframework.data.mapping.model.BasicPersistentEntity.doWithAssociations(BasicPersistentEntity.java:291)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.copyPropertiesFrom(SourceStateTransmitter.java:57)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl.loadEntity(Neo4jEntityConverterImpl.java:112)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityConverterImpl.read(Neo4jEntityConverterImpl.java:104)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityPersister$CachedConverter.read(Neo4jEntityPersister.java:170)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityPersister.createEntityFromState(Neo4jEntityPersister.java:189)
    at org.springframework.data.neo4j.support.mapping.Neo4jEntityPersister.projectTo(Neo4jEntityPersister.java:216)
    at org.springframework.data.neo4j.support.Neo4jTemplate.projectTo(Neo4jTemplate.java:240)
    at org.springframework.data.neo4j.support.conversion.EntityResultConverter.doConvert(EntityResultConverter.java:73)
    at org.springframework.data.neo4j.conversion.DefaultConverter.convert(DefaultConverter.java:44)
    at org.springframework.data.neo4j.support.conversion.EntityResultConverter.convert(EntityResultConverter.java:165)
    at org.springframework.data.neo4j.conversion.QueryResultBuilder$1.convert(QueryResultBuilder.java:103)
    at org.springframework.data.neo4j.conversion.QueryResultBuilder$1.access$300(QueryResultBuilder.java:81)
    at org.springframework.data.neo4j.conversion.QueryResultBuilder$1$1.underlyingObjectToObject(QueryResultBuilder.java:121)
    at org.neo4j.helpers.collection.IteratorWrapper.next(IteratorWrapper.java:47)
    at com.test.util.App.main(App.java:34)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:606)
    at com.intellij.rt.execution.application.AppMain.main(AppMain.java:120)
Caused by: java.lang.IllegalArgumentException: Can not set java.util.Set field com.test.db.domain.Producer.producedMovies to com.test.db.domain.Director
    at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:164)
    at sun.reflect.UnsafeFieldAccessorImpl.throwSetIllegalArgumentException(UnsafeFieldAccessorImpl.java:168)
    at sun.reflect.UnsafeFieldAccessorImpl.ensureObj(UnsafeFieldAccessorImpl.java:55)
    at sun.reflect.UnsafeObjectFieldAccessorImpl.set(UnsafeObjectFieldAccessorImpl.java:75)
    at java.lang.reflect.Field.set(Field.java:741)
    at org.springframework.util.ReflectionUtils.setField(ReflectionUtils.java:102)
    at org.springframework.data.mapping.model.BeanWrapper.setProperty(BeanWrapper.java:90)
    at org.springframework.data.mapping.model.BeanWrapper.setProperty(BeanWrapper.java:68)
    at org.springframework.data.neo4j.support.mapping.SourceStateTransmitter.setProperty(SourceStateTransmitter.java:83)
    ... 43 more
Update
I have also tried the projectTo() method (projections), but I still get the same exception:
Director director = neo4jTemplate.projectTo(person, Director.class);
I'm not sure about the Neo4j mapping, but your Java model doesn't match your business description: you really need a Person class with a Set<MovieJob> member, where MovieJob is an abstract class that Actor, Director, and Producer subclass.
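For example, a rough sketch of that shape (all names here are illustrative, not taken from your code):

import java.util.HashSet;
import java.util.Set;

// One Person node carries all of its movie-related jobs, so the same
// person can be a director, a producer, and a writer at the same time.
public class Person {
    String name;
    Set<MovieJob> jobs = new HashSet<>();
}

// Common supertype for the jobs a person can hold on a movie.
abstract class MovieJob {
    Movie movie;
}

class Directed extends MovieJob {}
class Produced extends MovieJob {}
class Wrote extends MovieJob {}

class Movie {
    String title;
}

With this shape, loading a person never forces the mapper to coerce a Director into a Producer field: the roles are values attached to the person rather than subtypes of it.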
EntityManager em = getEntityManager();
EntityTransaction etx = em.getTransaction();
etx.begin();
Query query = em.createNamedQuery("login_procedure").setParameter("param1","user").setParameter("param2", "pw");
Integer result = 23;
try {
System.out.println("query = " + query.getSingleResult());
} catch (Exception e) {
result = null;
e.printStackTrace();
}
etx.commit();
em.close();
...executing this code I get
[EL Warning]: 2011-02-10 17:32:16.846--UnitOfWork(1267140342)--Exception [EclipseLink-4002] (Eclipse Persistence Services - 1.2.0.v20091016-r5565): org.eclipse.persistence.exceptions.DatabaseException
Internal Exception: org.firebirdsql.jdbc.FBSQLException: GDS Exception. 335544569. Dynamic SQL Error
SQL error code = -104
Token unknown - line 1, column 36
=
Error Code: 335544569
Call: EXECUTE PROCEDURE LOGIN_PROCEDURE(USER_NAME = ?, USER_PASSWORD = ?)
bind => [user, pw]
Query: DataReadQuery(name="login_procedure")
The -104 SQL error usually indicates a SQL syntax error.
Everything is processed without any error until query.getSingleResult() is called. Calling query.getResultList() doesn't change anything. I've tried several 1.x and 2.x EclipseLink versions. The Firebird DB version is 2.1.
The JPA2 declaration is:
@Entity
@NamedStoredProcedureQuery(
    name = "login_procedure",
    resultClass = void.class,
    procedureName = "LOGIN_PROCEDURE",
    returnsResultSet = false,
    parameters = {
        @StoredProcedureParameter(queryParameter = "param1", name = "USER_NAME", direction = Direction.IN, type = String.class),
        @StoredProcedureParameter(queryParameter = "param2", name = "USER_PASSWORD", direction = Direction.IN, type = String.class)
    }
)
@Table(name = "USERS")
public class Login implements Serializable {
    @Id
    private Long id;
}
UPDATE:
After tinkering a little more, I believe there might be an error in the EclipseLink implementation, as EXECUTE PROCEDURE LOGIN_PROCEDURE(USER_NAME = ?, USER_PASSWORD = ?) isn't valid Firebird 2.1 syntax for calling procedures.
By specifying name = "USER_NAME", you are making EclipseLink use the USER_NAME = ? syntax instead of just passing the parameter unnamed. Try removing the name definition.
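For example, a sketch of the annotation with the name attributes removed (same procedure as in the question; that EclipseLink then emits EXECUTE PROCEDURE LOGIN_PROCEDURE(?, ?) is the expectation to verify):

@NamedStoredProcedureQuery(
    name = "login_procedure",
    resultClass = void.class,
    procedureName = "LOGIN_PROCEDURE",
    returnsResultSet = false,
    parameters = {
        // No name attribute, so the parameters are passed positionally.
        @StoredProcedureParameter(queryParameter = "param1", direction = Direction.IN, type = String.class),
        @StoredProcedureParameter(queryParameter = "param2", direction = Direction.IN, type = String.class)
    }
)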
Inspired by this post, I've found a solution/workaround:
public class JPATest {
final Session session;
JPATest() {
final String DATABASE_USERNAME = "SYSDBA";
final String DATABASE_PASSWORD = "masterkey";
final String DATABASE_URL = "jdbc:firebirdsql:dbServer/3050:e:/my/db.fdb";
final String DATABASE_DRIVER = "org.firebirdsql.jdbc.FBDriver";
final DatabaseLogin login = new DatabaseLogin();
login.setUserName(DATABASE_USERNAME);
login.setPassword(DATABASE_PASSWORD);
login.setConnectionString(DATABASE_URL);
login.setDriverClassName(DATABASE_DRIVER);
login.setDatasourcePlatform(new FirebirdPlatform());
login.bindAllParameters();
final Project project = new Project(login);
session = project.createDatabaseSession();
session.setLogLevel(SessionLog.FINE);
((DatabaseSession) session).login();
}
public static void main(String[] args) {
final JPATest jpaTest = new JPATest();
jpaTest.run();
}
protected void run() {
testProcCursor();
}
/*
* Run Proc with scalar input and cursor output
*/
@SuppressWarnings("unchecked")
private void testProcCursor() {
final StoredProcedureCall call = new StoredProcedureCall();
call.setProcedureName("LOGIN");
call.addUnamedArgument("USER_NAME"); // .addNamedArgument doesn't work
call.addUnamedArgument("USER_PASSWORD");
final DataReadQuery query = new DataReadQuery();
query.setCall(call);
query.addArgument("USER_NAME");
query.addArgument("USER_PASSWORD");
final List<String> queryArgs = new ArrayList<String>();
queryArgs.add("onlinetester");
queryArgs.add("test");
final List outList = (List) session.executeQuery(query, queryArgs);
final ListIterator<DatabaseRecord> listIterator = ((List<DatabaseRecord>) outList).listIterator();
while (listIterator.hasNext()) {
final DatabaseRecord databaseRecord = listIterator.next();
System.out.println("Value -->" + databaseRecord.getValues());
}
}
}
Apparently named parameters aren't supported in my specific configuration; using unnamed parameters in the annotations didn't solve the problem either. However, using unnamed parameters as shown above solved the problem for me.