How do I create a query using QueryBuilders for a nested List?
Something like this:
import org.elasticsearch.index.query.QueryBuilders;
...
QueryBuilders.matchQuery(...);
I need to find documents where code equals "test" and value contains the text "nested".
Sample data in a nested class:
code = "test"
value = "my nested value"
Below are the class definitions.
import java.util.List;
import lombok.Data;
import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
@Data
@Document(indexName = "MY_DOCUMENT")
public class MyDocument {
    @Id
    @Field(type = FieldType.Keyword)
    private String id;

    @Field(type = FieldType.Nested, name = "SUB_DOC")
    private List<SubDocument> subDoc;
    ....
}
import lombok.Data;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldType;
@Data
class SubDocument {
    @Field(type = FieldType.Text, name = "CODE")
    private String code;

    @Field(type = FieldType.Text, name = "VALUE")
    private String value;
}
Dependency from pom.xml:
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-elasticsearch</artifactId>
    <version>2.7.1</version>
</dependency>
Additionally, is it possible to create this type of query in a form acceptable to Lucene?
Something like this:
import org.elasticsearch.index.query.QueryBuilders;
import org.elasticsearch.index.query.AbstractQueryBuilder;
...
String stringQuery = "+SUB_DOC.VALUE:... +SUB_DOC.CODE:...";
AbstractQueryBuilder<?> q = QueryBuilders.queryStringQuery(stringQuery);
As you are explicitly asking for a solution using Elasticsearch QueryBuilders, this is not really related to Spring Data Elasticsearch; you are just using Spring Data Elasticsearch to pass the Elasticsearch query down to the Elasticsearch client.
If you use the Spring Data Elasticsearch classes instead, you can build the query like this:
var query = new CriteriaQuery(
    new Criteria("subDoc.code").is("test")
        .and("subDoc.value").contains("nested"));
var searchHits = operations.search(query, MyDocument.class);
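If you want to stay with QueryBuilders, keep in mind that fields inside a nested object only match from within a nested query, so the conditions have to be wrapped in QueryBuilders.nestedQuery. A minimal sketch, assuming the index field names from the @Field annotations above (SUB_DOC, SUB_DOC.CODE, SUB_DOC.VALUE) and reusing the operations bean from the previous snippet:

import org.apache.lucene.search.join.ScoreMode;
import org.elasticsearch.index.query.QueryBuilders;
import org.springframework.data.elasticsearch.core.query.NativeSearchQueryBuilder;

// Both conditions must match on the same nested SubDocument
var nested = QueryBuilders.nestedQuery("SUB_DOC",
        QueryBuilders.boolQuery()
                .must(QueryBuilders.matchQuery("SUB_DOC.CODE", "test"))
                .must(QueryBuilders.matchQuery("SUB_DOC.VALUE", "nested")),
        ScoreMode.None);
var nativeQuery = new NativeSearchQueryBuilder().withQuery(nested).build();
var searchHits = operations.search(nativeQuery, MyDocument.class);

The Lucene query-string form from your question should work the same way once it is wrapped, e.g. QueryBuilders.nestedQuery("SUB_DOC", QueryBuilders.queryStringQuery("+SUB_DOC.CODE:test +SUB_DOC.VALUE:nested"), ScoreMode.None).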
Related
I have a REST endpoint which uses a class as the OpenAPI schema:
import javax.enterprise.context.RequestScoped;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;
import org.eclipse.microprofile.openapi.annotations.media.Content;
import org.eclipse.microprofile.openapi.annotations.media.Schema;
import org.eclipse.microprofile.openapi.annotations.responses.APIResponse;

@Path("/orders")
@RequestScoped
public class OrdersRest {
    @APIResponse(responseCode = "200", description = "Create a new order", content = {
            @Content(mediaType = "application/json", schema = @Schema(implementation = OrderDto.class)) })
    public Response create(OrderDto request) throws SessionNotFound {
The OrderDto class has an attribute that refers to another class that exists in my project:
public class SessionDto {
    private SessionSettingsProperties[] sessionSettingsProperties;
When I access the swagger-ui, I am receiving the error:
Errors
Resolver error at paths./session.get.responses.200.content.application/json.schema.properties.sessionSettingsProperties.items.$ref
Could not resolve reference: Could not resolve pointer: /components/schemas/SessionSettingsProperties does not exist in document
I'm using Quarkus 1.13.1.Final.
I've found an issue in the Quarkus project that looks similar, but I believe it is not exactly the same problem I have.
I was able to fix my problem by adding an org.eclipse.microprofile.openapi.annotations.media.Schema annotation on the field of the SessionDto class, like this:
import org.eclipse.microprofile.openapi.annotations.media.Schema;

public class SessionDto {
    @Schema(implementation = SessionSettingsProperties[].class)
    private SessionSettingsProperties[] sessionSettingsProperties;
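An alternative that should be equivalent is the explicit array form from the MicroProfile OpenAPI annotations, where SchemaType.ARRAY plus implementation describes the item type; a sketch, assuming the same classes:

import org.eclipse.microprofile.openapi.annotations.enums.SchemaType;
import org.eclipse.microprofile.openapi.annotations.media.Schema;

public class SessionDto {
    // Declares an array whose items are SessionSettingsProperties, so the
    // item schema gets registered under components/schemas
    @Schema(type = SchemaType.ARRAY, implementation = SessionSettingsProperties.class)
    private SessionSettingsProperties[] sessionSettingsProperties;
}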
I have the following mapping
@Document(indexName = "some-index")
@Data
public class ElasticDocument {
    @Id
    @Field(type = FieldType.Text)
    private String id;

    @Field(type = FieldType.Date, format = DateFormat.custom)
    @JsonFormat(shape = JsonFormat.Shape.STRING, pattern = "uuuu-MM-dd'T'HH:mm:ss.SSS")
    private LocalDateTime issuedTimestamp;
}
The following repository
@Repository
public interface DocumentRepository extends ElasticsearchRepository<ElasticDocument, String> {
}
But the following query from Spring Data Elasticsearch 4.0.3.RELEASE throws a conversion error:
Page<ElasticDocument> elasticDocuments = documentRepository.findAll(PageRequest.of(0, 10));
[MappingElasticsearchConverter.java:290] [Type LocalDateTime of
property ElasticDocument.issuedTimestamp is a TemporalAccessor
class but has neither a @Field annotation defining the date type nor a
registered converter for reading! It cannot be mapped from a complex
object in Elasticsearch!
[No converter found capable of converting from type [java.lang.Long]
to type [java.time.LocalDateTime]]
[org.springframework.core.convert.ConverterNotFoundException: No
converter found capable of converting from type [java.lang.Long] to
type [java.time.LocalDateTime]
I'm using Elasticsearch 7.9.1 and Spring Data Elasticsearch 4.0.3.RELEASE. From what I understood, starting with Spring Data Elasticsearch 4.x we no longer need to create a custom converter as long as the @Field annotation is added to the mapping.
You need to add the pattern for your custom format in the @Field annotation:
@Field(type = FieldType.Date, format = DateFormat.custom, pattern = "uuuu-MM-dd'T'HH:mm:ss.SSS")
Since Spring Data Elasticsearch version 4.2, DateFormat.custom is deprecated. You can use any of the date_hour_minute_second_xxx values in the org.springframework.data.elasticsearch.annotations.DateFormat enum, e.g.
@Field(type = FieldType.Date, format = DateFormat.date_hour_minute_second_millis)
private LocalDateTime createdDate;
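A related detail, assuming the 4.2+ annotation signature: from that version on, the format attribute of @Field accepts an array, so several accepted date formats can be declared at once. A sketch:

// Accepts both the millisecond-precision string format and epoch millis,
// which is useful when an existing index stores dates as longs
@Field(type = FieldType.Date, format = {DateFormat.date_hour_minute_second_millis, DateFormat.epoch_millis})
private LocalDateTime createdDate;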
I'm just starting to learn how to write procedures. My simple proof of concept still isn't passing muster when Neo4j starts up. Here is the code:
import java.util.ArrayList;
import java.util.stream.Stream;
import org.neo4j.graphdb.GraphDatabaseService;
import org.neo4j.logging.Log;
import org.neo4j.procedure.Context;
import org.neo4j.procedure.Mode;
import org.neo4j.procedure.Procedure;
public class Procedures {
    @Context
    public GraphDatabaseService db;

    @Context
    public Log log;

    @Procedure(name = "create_user", mode = Mode.WRITE)
    public Stream<Create_user_response> create_user() {
        ArrayList<Create_user_response> myList = new ArrayList<>();
        Create_user_response res1 = new Create_user_response();
        res1.out = 1;
        myList.add(res1);
        Stream<Create_user_response> myStream = myList.stream();
        return myStream;
    }
}
Here's my Create_user_response class:
public class Create_user_response {
    public int out;
}
When Neo4j starts up it complains that my procedure needs to return a stream of records. I'm new to streams so I must be doing something wrong but just can't figure it out.
I appreciate any help. Thanks.
Turns out I had a bad dependency:
<dependency>
    <groupId>org.neo4j.test</groupId>
    <artifactId>neo4j-harness</artifactId>
    <version>3.2.3</version>
    <scope>test</scope>
</dependency>
wasn't working. I found it in a tutorial and it's in Maven Central, but for some reason I must not have been using it correctly.
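For reference, procedure projects are typically compiled against the core org.neo4j:neo4j artifact in provided scope (the server supplies it at runtime), with neo4j-harness used only for tests; a sketch, matching the 3.2.3 version above:

<dependency>
    <groupId>org.neo4j</groupId>
    <artifactId>neo4j</artifactId>
    <version>3.2.3</version>
    <scope>provided</scope>
</dependency>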
I would like to have a Groovy script that can access my domain classes and extract all their properties.
I have not written any Groovy scripts within my Grails application so far.
How do I do this?
I am thinking of something like
run-script <scriptname>
In the script I would like to:
For all domain classes
    For all fields
        println(<database-table-name>.<database-field-name>)
What would be the easiest approach to achieve this?
Below is a script that lists all the domain classes with their properties. It builds a Map containing the DB mapping for each domain class and its properties; if you have a different requirement, you can adapt the same approach.
import org.codehaus.groovy.grails.commons.DefaultGrailsDomainClass
import org.codehaus.groovy.grails.commons.DomainClassArtefactHandler
import org.codehaus.groovy.grails.orm.hibernate.persister.entity.GroovyAwareSingleTableEntityPersister as GASTEP
import org.hibernate.SessionFactory
//Include script dependencies required for task dependencies
includeTargets << grailsScript("Bootstrap")
target(grailsDomainMappings: "List down field details for all grails domain classes") {
    //Task dependencies required for initialization of the app, e.g. initialization of the sessionFactory bean
    depends(compile, bootstrap)
    System.out.println("Running script...")

    //Fetch session factory from application context
    SessionFactory sessionFactory = appCtx.getBean("sessionFactory")

    //Fetch all domain classes
    def domains = grailsApp.getArtefacts(DomainClassArtefactHandler.TYPE)
    GASTEP persister
    List<String> propertyMappings = []
    Map<String, List<String>> mappings = [:]

    //Iterate over domain classes
    for (DefaultGrailsDomainClass domainClass in domains) {
        //Get class metadata
        persister = sessionFactory.getClassMetadata(domainClass.clazz) as GASTEP
        propertyMappings = []

        //Fetch table name mapping
        String mappedTable = persister.tableName

        //Fetch all properties for the domain class
        String[] propertyNames = persister.propertyNames
        propertyNames += persister.identifierPropertyName

        //Fetch column name mappings for the properties
        propertyNames.each {
            propertyMappings += persister.getPropertyColumnNames(it).first()
        }
        mappings.put(mappedTable, propertyMappings)
    }

    //Print data
    mappings.each { String table, List<String> properties ->
        properties.each { String property ->
            System.out.println("${table}.${property}")
        }
        System.out.println("++++++++++++++++++++++++++++++++++++++++++++++")
    }
}
setDefaultTarget(grailsDomainMappings)
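A usage note: since this is a Gant script with a target, it is meant to live under scripts/ in the application and be run as a Grails command rather than through run-script; with a hypothetical file name of scripts/GrailsDomainMappings.groovy it would be invoked as grails grails-domain-mappings.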
I have the following classes
@XmlRootElement(name = "ExecutionRequest")
@XmlAccessorType(XmlAccessType.FIELD)
public class ExecutionRequest {
    @XmlElement(name = "Command")
    private String command;

    @XmlElementWrapper(name = "ExecutionParameters")
    @XmlElement(name = "ExecutionParameter")
    private ArrayList<ExecutionParameter> ExecutionParameters;
}
@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
public class ExecutionParameter {
    @XmlElement(name = "Key")
    private String key;

    @XmlElement(name = "Value")
    private String value;
}
and when I marshal the ExecutionRequest object, I get the following XML:
<ExecutionRequest>
    <Command>RetrieveHeader</Command>
    <ExecutionParameters>
        <ExecutionParameter>
            <Key>tid</Key>
            <Value>ASTLGA-ALTE010220101</Value>
        </ExecutionParameter>
        <ExecutionParameter>
            <Key>ctag</Key>
            <Value>dq</Value>
        </ExecutionParameter>
    </ExecutionParameters>
</ExecutionRequest>
This works fine as per the JAXB binding.
But I want the XML to have the whole key/value collection within one ExecutionParameter, like:
<ExecutionRequest>
    <Command>RetrieveHeader</Command>
    <ExecutionParameters>
        <ExecutionParameter>
            <Key>tid</Key>
            <Value>ASTLGA-ALTE010220101</Value>
            <Key>ctag</Key>
            <Value>dq</Value>
        </ExecutionParameter>
    </ExecutionParameters>
</ExecutionRequest>
Is there any way to obtain XML like this by changing the annotations?
Let me know in case clarification is needed.
Thanks in advance.
There isn't metadata for that. You could get a compact XML representation (that is easily parseable) by mapping key and value with @XmlAttribute:
<ExecutionParameters>
    <ExecutionParameter Key="a" Value="b"/>
    <ExecutionParameter Key="c" Value="d"/>
</ExecutionParameters>
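A minimal sketch of that attribute mapping (same class as in the question, only the annotations change):

@XmlRootElement
@XmlAccessorType(XmlAccessType.FIELD)
public class ExecutionParameter {
    // Mapped as XML attributes instead of child elements
    @XmlAttribute(name = "Key")
    private String key;

    @XmlAttribute(name = "Value")
    private String value;
}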
UPDATE
If you have to support this XML format, then you could use JAXB with XSLT to get the desired result:
// Create Transformer
TransformerFactory tf = TransformerFactory.newInstance();
StreamSource xslt = new StreamSource("src/example/stylesheet.xsl");
Transformer transformer = tf.newTransformer(xslt);

// Source
JAXBContext jc = JAXBContext.newInstance(ExecutionRequest.class);
JAXBSource source = new JAXBSource(jc, request);

// Result
StreamResult result = new StreamResult(System.out);

// Transform
transformer.transform(source, result);
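The stylesheet itself is not shown above; a possible stylesheet.xsl that collapses all parameters into a single ExecutionParameter element (assuming the element names from the question) could look like this:

<xsl:stylesheet version="1.0" xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
    <xsl:template match="/ExecutionRequest">
        <ExecutionRequest>
            <xsl:copy-of select="Command"/>
            <ExecutionParameters>
                <!-- Copy the Key/Value children of every ExecutionParameter
                     into one merged ExecutionParameter element -->
                <ExecutionParameter>
                    <xsl:copy-of select="ExecutionParameters/ExecutionParameter/*"/>
                </ExecutionParameter>
            </ExecutionParameters>
        </ExecutionRequest>
    </xsl:template>
</xsl:stylesheet>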
For More Information
http://blog.bdoughan.com/2012/11/using-jaxb-with-xslt-to-produce-html.html