I am using version 1.5.1 of the Cassandra adapter for Java, and I am attempting to insert a record into a table that has a Map column whose values are a UDT (the key is just a varchar). I created the table using cqlsh and can insert into it with CQL just fine. When I try to use a CassandraTemplate...insert(), though, all I get is:
UserTypeResolver must not be null
My UDT class is defined as:
@UserDefinedType("iss_type")
public class IssueType {
    @CassandraType(type = DataType.Name.VARCHAR)
    private String issue_code;
    @CassandraType(type = DataType.Name.VARCHAR)
    private String issue_name;
    @CassandraType(type = DataType.Name.TIMESTAMP)
    private Date issue_start;
    @CassandraType(type = DataType.Name.TIMESTAMP)
    private Date issue_end;
}
And my table is defined as
@Table
public class IssueMap {
    @PrimaryKeyColumn(type = PrimaryKeyType.PARTITIONED)
    private UUID map_id;
    @Column
    private Map<String, IssueType> issue_map;
}
In my config, I have:
@Bean
public CassandraMappingContext cassandraMapping() throws ClassNotFoundException {
    BasicCassandraMappingContext mappingContext = new BasicCassandraMappingContext();
    mappingContext.setUserTypeResolver(new SimpleUserTypeResolver(cluster().getObject(), "campaign_management"));
    return mappingContext;
}
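For reference, here is a minimal sketch of how that mapping context usually gets plugged into the rest of the stack in spring-data-cassandra 1.5.x. The session() bean is an assumption (something like a CassandraSessionFactoryBean defined elsewhere in the config), so treat this as an illustration rather than the fix:

@Bean
public CassandraConverter converter() throws ClassNotFoundException {
    // The converter has to be built on the mapping context that carries the UserTypeResolver.
    return new MappingCassandraConverter(cassandraMapping());
}

@Bean
public CassandraTemplate cassandraTemplate() throws Exception {
    // The template must use that converter; session() is assumed to return a CassandraSessionFactoryBean.
    return new CassandraTemplate(session().getObject(), converter());
}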
Is there something I am missing here? Or is it just not possible to define columns this way? I realize I could just use a query builder to create the CQL and do the insert, but I was hoping to be able to make full use of the Cassandra template features.
Related
I am learning Elasticsearch with Spring Data, so can someone help me better understand what the Elasticsearch query is doing here? I am trying to return only the results that match a certain value, in this case env. It seems to me that this query is not restricting the results to what I ask for. I have also used an @Query with no difference.
-- Here is part of my repository class
public interface MyFormRepo extends ElasticsearchRepository<MyForm, String> {
    //??? these method calls are not affecting my results
    @Query("{\"bool\": {\"must\": [{\"match\": {\"env\": \"?0\"}}]}}")
    Page<MyForm> getAllByEnv(String env, Pageable pageable);
    Page<MyForm> findAllByEnv(String env, Pageable pageable);
}
-- Here is part of my entity class
@Document(indexName = "my_form")
public class MyForm {
    @Id
    private String id;
    @Field(type = Text)
    private String schema;
    @Field(type = Long)
    private long version;
    @Field(type = Text)
    private String env;
}
Here is what I understand. Elasticsearch has a concept called fuzziness (https://www.elastic.co/guide/en/elasticsearch/reference/current/common-options.html#fuzziness) in its searches, which is based on the Levenshtein distance (https://en.wikipedia.org/wiki/Levenshtein_distance). Spring Data does not allow us to modify this out of the box, and it defaults to Fuzziness.AUTO (https://docs.spring.io/spring-data/elasticsearch/docs/current/reference/html/#elasticsearch.misc).
As for the queries, neither of them will do anything different: both findAllByEnv and getAllByEnv use Fuzziness.AUTO. As for the @Query, I found a good reason stated at this site: What is difference between match and bool must match query in Elasticsearch. What I ended up finding is that I must implement a custom repository; for that I found this example/explanation: How to query Elastic with Spring-data-elastic.
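For what it's worth, a rough sketch of such a custom repository fragment is below, assuming the classic ElasticsearchTemplate / NativeSearchQueryBuilder API; the MyFormRepoCustom and MyFormRepoImpl names are mine, and MyFormRepo would additionally extend MyFormRepoCustom so Spring Data picks up the implementation:

import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;
import org.springframework.data.elasticsearch.core.query.SearchQuery;

interface MyFormRepoCustom {
    Page<MyForm> findByEnvExact(String env, Pageable pageable);
}

class MyFormRepoImpl implements MyFormRepoCustom {

    @Autowired
    private ElasticsearchTemplate template;

    @Override
    public Page<MyForm> findByEnvExact(String env, Pageable pageable) {
        // Build the query by hand so nothing (e.g. fuzziness) is added on our behalf.
        SearchQuery query = new NativeSearchQueryBuilder()
                .withQuery(QueryBuilders.matchQuery("env", env))
                .withPageable(pageable)
                .build();
        return template.queryForPage(query, MyForm.class);
    }
}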
I have a Neo4j database, and one of the node entities in the database has the following:
public class Location {
    ...
    @Property(name = "zone_quantities")
    private List<ZoneQuantity> zoneQuantityList;
}
When I try to use ogm to map the node into the object, I get the following error:
Caused by: org.neo4j.ogm.exception.core.InvalidPropertyFieldException: 'com.livspace.atp.domain.InvSku#zoneQuantityList' is not persistable as property but has not been marked as transient.
Neo4j version - 3.1.6
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public class ZoneQuantity {
    private String zone;
    private Integer quantity;
}
Neo4j-OGM does not know what to do with the custom type property ZoneQuantity.
Either it is a relationship and has to be described as such, e.g.
@Relationship("HAS_ZONE")
private List<ZoneQuantity> zoneQuantityList;
or you need to provide an AttributeConverter for your custom type (collection) that converts it into a database-compatible type, for example something like this converter: https://github.com/neo4j/neo4j-ogm/blob/master/core/src/main/java/org/neo4j/ogm/typeconversion/NumberCollectionStringConverter.java#L37
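To illustrate the converter route, here is a rough sketch that stores the list as a single JSON string property; it assumes Jackson is on the classpath, and the converter name is illustrative:

import java.io.IOException;
import java.util.Collections;
import java.util.List;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.neo4j.ogm.typeconversion.AttributeConverter;

public class ZoneQuantityListConverter implements AttributeConverter<List<ZoneQuantity>, String> {

    private final ObjectMapper mapper = new ObjectMapper();

    @Override
    public String toGraphProperty(List<ZoneQuantity> value) {
        try {
            // Store the whole list as one JSON string property on the node.
            return value == null ? null : mapper.writeValueAsString(value);
        } catch (IOException e) {
            throw new IllegalArgumentException("Could not serialize zone quantities", e);
        }
    }

    @Override
    public List<ZoneQuantity> toEntityAttribute(String value) {
        try {
            return value == null
                    ? Collections.emptyList()
                    : mapper.readValue(value, new TypeReference<List<ZoneQuantity>>() {});
        } catch (IOException e) {
            throw new IllegalArgumentException("Could not deserialize zone quantities", e);
        }
    }
}

The field would then be annotated with @Convert(ZoneQuantityListConverter.class) instead of @Property.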
I'm trying to get the deletion of a single document working with a Spring Data Elasticsearch repository, and I can't find a way to solve this error:
[userindex] RoutingMissingException[routing is required for
[userindex]/[address]/[12]
I have two linked documents:
@Document(indexName = "userindex", type = "user")
public class User {
    @Field(index = FieldIndex.not_analyzed, type = FieldType.Long)
    private Long userId;
    ...
}

@Document(indexName = "userindex", type = "address")
public class Address {
    @Field(type = FieldType.String)
    private String name;
    @Field(index = FieldIndex.not_analyzed, type = FieldType.String)
    private String addressId;
    @Field(type = FieldType.String, store = true)
    @Parent(type = "user")
    private String parentId;
    ...
}
When I try to delete one address via ElasticsearchCrudRepository<Address, Long> using the standard method delete(Long id), I receive the RoutingMissingException mentioned above.
If I do it using the Elasticsearch client, like this:
client.prepareDelete().setIndex("userindex")
.setType("address")
.setParent("user")
.setId(id.toString())
.execute().get();
everything works fine, but it seems to me that working directly with the client is not the Spring Data way.
Also, I can't find any way to customize the delete method with the annotation org.springframework.data.elasticsearch.annotations.Query.
I checked the sources of org.springframework.data.elasticsearch.core.ElasticsearchTemplate and can't find any way to add support for a delete query.
Does anybody know how to solve this without using the client directly?
The version of spring-data-elasticsearch is 2.0.1
Update 03.05.2017
First of all, there was an error in my deletion code. I don't know how it worked before, but it should be:
client.prepareDelete().setIndex("userindex")
.setType("address")
.setParent("500")
.setId(id.toString())
.execute().get();
Here 500 is the parent id, not the type name.
And now about the Spring Data way: there is no Spring Data way for this in the Elasticsearch integration.
Proof:
DATAES-257
DATAES-331
If you want to do it the Spring way, you can use ElasticsearchTemplate, which is similar in spirit to RestTemplate.
ElasticsearchTemplate has a deleteIndex() method that deletes the whole index, and you can do lots of other things with the template as well.
Example project with delete index is here: https://github.com/TechPrimers/spring-data-elastic-example-4/blob/master/src/main/java/com/techprimers/elastic/resource/SearchResource.java
Code:
#Autowired
ElasticsearchTemplate template;
template.deleteIndex(Users.class);
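For context, a self-contained version of that snippet might look roughly like the sketch below (the component and method names are mine; Users is the entity class from the linked example):

import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.stereotype.Component;

@Component
public class IndexCleaner {

    @Autowired
    private ElasticsearchTemplate template;

    // Drops the whole index backing the Users mapping; note this is an
    // index-level operation, not a per-document delete.
    public void deleteUsersIndex() {
        template.deleteIndex(Users.class);
    }
}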
I have a procedure which returns rows.
Each row has one column, which is a string, like so:
create procedure myproc(IN var1 varchar(255), IN var2 varchar(255))
begin
    select col3 from table1 where col1 = var1 and col2 = var2;
end;
/
I want to invoke this procedure using Spring Data.
The Spring Data manual says I can invoke this procedure from a method in a repository that is annotated with @Procedure, like so:
@Procedure(procedureName = "myproc")
List<String> myproc(String val1, String val2);
My question is: how should the repository, which will contain the annotated method, be declared?
Should it be like this:
class MyRepository extends JpaRepository<T,ID>{
}
If yes, what type parameters should I use for T and ID?
Do I necessarily have to create an entity class to use as T above?
If yes, what is a suitable type for ID in this case?
And would I have to create a table in the database to hold this entity?
Spring Data JPA follows domain-driven design, so I think it is mandatory to provide a domain class.
Your domain class would look like this:
@Entity
public class Person {
    @Id
    @GeneratedValue
    private Long id;
    // ... getters and setters ...
}
Your repository should look like this:
public interface PersonRepository extends JpaRepository<Person, Long> {
    @Procedure(procedureName = "myproc")
    List<String> myproc(String val1, String val2);
}
You need to pass your domain class as T, and the data type of the id field of the Person class as ID; in this case it is Long.
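As a hypothetical usage example (the service and method names are mine, not from the question):

import java.util.List;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

@Service
public class MyProcService {

    @Autowired
    private PersonRepository personRepository;

    public List<String> lookup(String val1, String val2) {
        // Delegates to the stored procedure "myproc" through the @Procedure method.
        return personRepository.myproc(val1, val2);
    }
}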
For more info, see this example: https://dzone.com/articles/calling-stored-procedures-from-spring-data-jpa
I have an entity Place:
@NodeEntity
@TypeAlias(value = "Place")
public class Place implements Serializable {

    private static final long serialVersionUID = 1L;

    @GraphId
    private Long nodeId;

    @JsonProperty("id")
    @Indexed(unique = true)
    private String id;

    //...
}
And I try to get a node based on its id attribute this way:
String pfc = "1234";
(Node)template.getIndex(Place.class, "id").get("id", pfc).getSingle()
But I'm getting this exception:
java.lang.IllegalStateException: Index name for class java.lang.String id rel: false idx: true must differ from the default name: Place
Must I necessarily add a name to the index?
If yes, what should I do about the existing data?
Which version are you using? In SDN 3.x for Neo4j 2.x the indexes and constraints are used automatically.
I would use a PlaceRepository with a findById() method.
Or, in general, Cypher queries that access Place like this:
MATCH (p:Place {id:{id}})-->(x)
RETURN x
You can also access them manually with template.merge() or template.findByLabelAndProperty(), which I would only recommend when you know what you're doing.
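As a sketch of the suggested repository approach (assuming SDN's GraphRepository base interface; not taken verbatim from the answer):

import org.springframework.data.neo4j.repository.GraphRepository;

public interface PlaceRepository extends GraphRepository<Place> {

    // Derived finder on the unique, indexed "id" property.
    Place findById(String id);
}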