index percolate queries using spring data jpa - spring-data-elasticsearch

Here is my DTO class for the percolator query.
@Data
@Document(indexName = "#{#es.indexName}")
@Builder(builderClassName = "RuleBuilder")
public class Rule {

    @Id
    private String id = UUID.randomUUID().toString();

    private QueryBuilder query;

    private RuleDataDto data;

    public static class RuleBuilder {
        private String id = UUID.randomUUID().toString();
    }
}
Index Mapping
{
  "mappings": {
    "properties": {
      "query": {
        "type": "percolator"
      },
      "data": {
        "properties": {
          "subType": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          },
          "type": {
            "type": "text",
            "fields": {
              "keyword": {
                "type": "keyword",
                "ignore_above": 256
              }
            }
          }
        }
      },
      "content": {
        "type": "text"
      }
    }
  }
}
Based on certain criteria I generate queries and try to index them into the percolator, but I get the exception below:
query malformed, no start_object after query name
What should the type of the query field be? Can someone help me with this?

You are trying to store an Elasticsearch object (QueryBuilder) in Elasticsearch without specifying the mapping type.
You need to annotate your query property with the percolator type and may need to change the type of the property to a String:
@Document(indexName = "#{#es.indexName}")
public class Rule {

    @Id
    private String _id;

    @Field(type = FieldType.Percolator)
    private String query;

    // ...
}
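For example, a minimal sketch of indexing a rule this way (assuming the Rule entity keeps a generated setQuery(String) setter, e.g. via Lombok's @Data, and that a RuleRepository extending ElasticsearchRepository<Rule, String> exists in your project):
import org.elasticsearch.index.query.QueryBuilder;
import org.elasticsearch.index.query.QueryBuilders;

// Build the query to be percolated and serialize it to JSON before saving.
QueryBuilder queryBuilder = QueryBuilders.matchQuery("content", "some text");

Rule rule = new Rule();
rule.setQuery(queryBuilder.toString()); // QueryBuilder.toString() renders the query as its JSON representation

ruleRepository.save(rule); // the JSON string is indexed into the percolator-typed "query" field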
Or, if you want to use some other class for the query property, you'll need a custom converter that converts your object into a valid JSON query; see the documentation for the percolator mapping.

Related

guidance with iterating over nested JSON and extracting values

I am new to coding and Java, so I am trying to figure out how to iterate over this JSON and extract the nested values:
{
  "prop1": {
    "description": "",
    "type": "string"
  },
  "prop2": {
    "description": "",
    "type": "string"
  },
  "prop3": {
    "description": "",
    "type": "string"
  },
  "prop4": {
    "description": "",
    "type": "string"
  }
}
So far I have this:
import java.io.FileReader;
import java.util.Map;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;
import org.json.simple.parser.JSONParser;

public class JSONReadExample {
    public static void main(String[] args) throws Exception {
        Object procedure = new JSONParser().parse(new FileReader("myFile.json"));
        ObjectMapper objectMapper = new ObjectMapper();
        String procedureString = objectMapper.writeValueAsString(procedure);
        Map<String, Object> map =
                objectMapper.readValue(procedureString, new TypeReference<Map<String, Object>>() {});
        map.forEach((key, value) -> System.out.println(key + ":" + value));
    }
}
How can I retrieve the nested "type" value (such as "string" or "number") for each key?
Try specifying the type of the Map, like this:
Map<String, LinkedHashMap<String, String>> map = objectMapper.readValue(procedureString, new TypeReference<>() {});
And then you can retrieve "type" by using:
map.forEach((key, value) -> System.out.println(key + ":" + value.get("type")));
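Putting it together, a minimal end-to-end sketch (reading the file directly with Jackson, so the json-simple parsing step is no longer needed; assumes myFile.json contains the JSON shown above):
import java.io.File;
import java.util.LinkedHashMap;
import java.util.Map;
import com.fasterxml.jackson.core.type.TypeReference;
import com.fasterxml.jackson.databind.ObjectMapper;

public class JSONReadExample {
    public static void main(String[] args) throws Exception {
        ObjectMapper objectMapper = new ObjectMapper();
        // Read the file into a map of property name -> {description, type}.
        Map<String, LinkedHashMap<String, String>> map = objectMapper.readValue(
                new File("myFile.json"),
                new TypeReference<Map<String, LinkedHashMap<String, String>>>() {});
        // Prints "prop1:string", "prop2:string", ...
        map.forEach((key, value) -> System.out.println(key + ":" + value.get("type")));
    }
}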

Boolean fields prefixed with "is" are not stored in the index after upgrading to spring-data-elasticsearch:3.2.5.RELEASE

After upgrading to spring-boot-starter:2.2.5.RELEASE, spring-cloud-dependencies:Hoxton.SR3, spring-cloud-stream-dependencies:Horsham.SR3 and spring-data-elasticsearch:3.2.5.RELEASE, the boolean fields are no longer stored in the index/document. This worked with Spring Boot 2.1.11.
I tried creating the document manually using the Elasticsearch REST API; when done directly through the REST API, the boolean fields are stored in the index.
Have there been any changes in how the mappings are declared for boolean fields?
I am using the ElasticsearchTemplate.index(IndexQuery) API to index a document, where the IndexQuery is built from a document object that has some boolean fields.
The following are the boolean fields in the CatalogIndex.java file.
@Document(indexName = "catalogentity")
public class CatalogIndex {

    private boolean isType;
    private boolean isAbstract;
    private boolean isFinal;
    private String stateId;
    private String stageId;
    // some other fields

    public boolean isType() {
        return isType;
    }

    public void setType(final boolean type) {
        isType = type;
    }

    public boolean isAbstract() {
        return isAbstract;
    }

    public void setAbstract(final boolean anAbstract) {
        isAbstract = anAbstract;
    }

    public boolean isFinal() {
        return isFinal;
    }

    public void setFinal(final boolean aFinal) {
        isFinal = aFinal;
    }

    // some other setters and getters
}
The mappings are as follows:
{
  "properties": {
    "type": {
      "type": "boolean"
    },
    "abstract": {
      "type": "boolean"
    },
    "final": {
      "type": "boolean"
    },
    "stateId": {
      "type": "text"
    },
    "stageId": {
      "type": "keyword"
    }
  }
}
Thanks in advance,
Santhosh
The following is the working configuration for boolean fields. I am not sure why it worked before the upgrade.
The following are the boolean fields in the CatalogIndex.java file.
@Document(indexName = "catalogentity")
public class CatalogIndex {

    private boolean isType;
    private boolean abstract1;
    private boolean final1;
    private String stateId;
    private String stageId;
    // some other fields

    public boolean isType() {
        return isType;
    }

    public void setType(final boolean type) {
        isType = type;
    }

    public boolean isAbstract1() {
        return abstract1;
    }

    public void setAbstract1(final boolean abstract1) {
        this.abstract1 = abstract1;
    }

    public boolean isFinal1() {
        return final1;
    }

    public void setFinal1(final boolean final1) {
        this.final1 = final1;
    }

    // some other setters and getters
}
The mappings are as follows:
{
  "properties": {
    "type": {
      "type": "boolean"
    },
    "abstract1": {
      "type": "boolean"
    },
    "final1": {
      "type": "boolean"
    },
    "stateId": {
      "type": "text"
    },
    "stageId": {
      "type": "keyword"
    }
  }
}
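One quick way to see which JSON property names the getters/setters actually produce is to serialize the entity with plain Jackson (a hedged sketch; it illustrates why the "is" prefix of the field name never appears in the document, independent of which entity mapper Spring Data Elasticsearch uses):
import com.fasterxml.jackson.databind.ObjectMapper;

public class PropertyNameCheck {
    public static void main(String[] args) throws Exception {
        CatalogIndex index = new CatalogIndex();
        index.setType(true);
        index.setAbstract1(true);
        index.setFinal1(false);
        // Prints something like {"type":true,"abstract1":true,"final1":false,...}
        // because property names are derived from the isXxx()/setXxx() accessors, not the field names.
        System.out.println(new ObjectMapper().writeValueAsString(index));
    }
}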

WireMock: match a POST request by params

I have a simple POST request sending params using application/x-www-form-urlencoded encoding.
Looking at the WireMock docs, I can't find a way to match the request by the param values, something like the query string matching.
It also seems impossible to use a contains match on the body, or to match the entire body in plain text (only as base64).
Is there a way to match this kind of request?
Another option I found was to use contains when stubbing Content-Type: application/x-www-form-urlencoded:
{
  "request": {
    "method": "POST",
    "url": "/oauth/token",
    "basicAuthCredentials": {
      ...
    },
    "bodyPatterns": [
      {
        "contains": "username=someuser"
      }
    ]
  },
  "response": {
    ....
  }
}
With classic WireMock you can use bodyPatterns matchers and regular expressions, for example:
...
"request": {
  "method": "POST",
  "url": "/api/v1/auth/login",
  "bodyPatterns": [
    {
      "matches": "(.*&|^)username=test($|&.*)"
    },
    {
      "matches": "(.*&|^)password=123($|&.*)"
    }
  ]
},
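The same regex-based matching with the Java DSL might look like this (a sketch; the endpoint and regexes mirror the JSON stub above, and the 200 response is a placeholder):
import static com.github.tomakehurst.wiremock.client.WireMock.*;

// Each withRequestBody(...) adds another pattern that the form-encoded body must satisfy.
stubFor(post(urlEqualTo("/api/v1/auth/login"))
        .withRequestBody(matching("(.*&|^)username=test($|&.*)"))
        .withRequestBody(matching("(.*&|^)password=123($|&.*)"))
        .willReturn(aResponse().withStatus(200)));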
I had a similar problem - I wanted to check the exact parameters, but without pattern magic (so it is easier to maintain). As a workaround, I created a helper class:
import java.util.Iterator;
import java.util.LinkedHashMap;

public class WireMockUtil {

    public static String toFormUrlEncoded(LinkedHashMap<String, String> map) {
        if (map == null) {
            return "";
        }
        StringBuilder sb = new StringBuilder();
        Iterator<String> it = map.keySet().iterator();
        while (it.hasNext()) {
            String key = it.next();
            String value = map.get(key);
            appendFormUrlEncoded(key, value, sb);
            if (it.hasNext()) {
                sb.append('&');
            }
        }
        return sb.toString();
    }

    public static String toFormUrlEncoded(String key, String value) {
        StringBuilder sb = new StringBuilder();
        appendFormUrlEncoded(key, value, sb);
        return sb.toString();
    }

    public static void appendFormUrlEncoded(String key, String value, StringBuilder sb) {
        sb.append(key).append('=');
        if (value != null) {
            sb.append(value);
        }
    }
}
Inside the Wiremock test you can use it via:
LinkedHashMap<String, String> map = new LinkedHashMap<>();
map.put("key1", "value1");
map.put("key2", "value2");
...
withRequestBody(equalTo(WireMockUtil.toFormUrlEncoded(map))).
Or check only dedicated parts by containing:
withRequestBody(containing(WireMockUtil.toFormUrlEncoded("key","value1"))).
You could try https://github.com/WireMock-Net/WireMock.Net
Matching query parameters and the body can be done with this example JSON:
{
  "Guid": "dae02a0d-8a33-46ed-aab0-afbecc8643e3",
  "Request": {
    "Url": "/testabc",
    "Methods": [
      "put"
    ],
    "Params": [
      {
        "Name": "start",
        "Values": [ "1000", "1001" ]
      },
      {
        "Name": "end",
        "Values": [ "42" ]
      }
    ],
    "Body": {
      "Matcher": {
        "Name": "WildcardMatcher",
        "Pattern": "test*test"
      }
    }
  }
}

How to parse JSON in an MVC view

I am getting the values of some data types in Umbraco into my MVC view in cshtml, and the result comes back as JSON. How can I parse the JSON to bind it to a drop-down list, and what are the possible ways to do this? I have also installed Newtonsoft JSON in the project.
if (home.GetProperty("residentsLogin") != null && !string.IsNullOrEmpty(home.GetPropertyValue("residentsLogin")))
{
    var residentslog = home.GetPropertyValue("residentsLogin");
}
My JSON is in the following format:
[
  {
    "name": "Property1",
    "url": "http://www.google.com",
    "target": "_blank",
    "icon": "icon-link"
  },
  {
    "name": "Property2",
    "url": "http://www.google.com",
    "target": "_blank",
    "icon": "icon-link"
  }
]
Working code should look like this:
public class MyJsonObject
{
    public string name { get; set; }
    public string url { get; set; }
    public string target { get; set; }
    public string icon { get; set; }
}

var residentslog = @"[
    {
        'name': 'Property1',
        'url': 'http://www.google.com',
        'target': '_blank',
        'icon': 'icon-link'
    },
    {
        'name': 'Property2',
        'url': 'http://www.google.com',
        'target': '_blank',
        'icon': 'icon-link'
    }
]";

List<MyJsonObject> myJsonObjectList = JsonConvert.DeserializeObject<List<MyJsonObject>>(residentslog);
ViewBag.MySelectList = new SelectList(myJsonObjectList, "name", "url");

MultiField mapping using spring-data-elasticsearch annotations

I am trying to use spring-data-elasticsearch to set up mappings for a type that is equivalent to the JSON configuration below, and I am running into issues.
{
  "_id": {
    "type": "string",
    "path": "id"
  },
  "properties": {
    "addressName": {
      "type": "multi_field",
      "fields": {
        "addressName": {
          "type": "string"
        },
        "edge": {
          "analyzer": "edge_search_analyzer",
          "type": "string"
        }
      }
    },
    "companyName": {
      "type": "multi_field",
      "fields": {
        "companyName": {
          "type": "string"
        },
        "edge": {
          "analyzer": "edge_search_analyzer",
          "type": "string"
        }
      }
    }
  }
}
Here is the entity class as it stands now:
@Document(indexName = "test", type = "address")
@Setting(settingPath = "/config/elasticsearch/test.json") // Used to configure custom analyzer/filter
public class Address {

    @Id
    private Integer id;

    @MultiField(
        mainField = @Field(type = FieldType.String),
        otherFields = {
            @NestedField(dotSuffix = "edge", type = FieldType.String, indexAnalyzer = "edge_search_analyzer")
        }
    )
    private String addressName;

    @MultiField(
        mainField = @Field(type = FieldType.String),
        otherFields = {
            @NestedField(dotSuffix = "edge", type = FieldType.String, indexAnalyzer = "edge_search_analyzer")
        }
    )
    private String companyName;

    public Address() {}

    public Address(AddressDTO dto) {
        id = dto.getId();
        addressName = dto.getAddressName();
        companyName = dto.getCompanyName();
    }

    public Integer getId() {
        return id;
    }

    public void setId(Integer id) {
        Preconditions.checkNotNull(id);
        this.id = id;
    }

    public String getAddressName() {
        return addressName;
    }

    public void setAddressName(String addressName) {
        this.addressName = addressName;
    }

    public String getCompanyName() {
        return companyName;
    }

    public void setCompanyName(String companyName) {
        this.companyName = companyName;
    }
}
What I am finding is that this results in a mapping like:
"address" : {
"properties" : {
"addressName" : {
"type" : "string",
"fields" : {
"addressName.edge" : {
"type" : "string",
"index_analyzer" : "edge_search_analyzer"
}
}
},
"companyName" : {
"type" : "string",
"fields" : {
"companyName.edge" : {
"type" : "string",
"index_analyzer" : "edge_search_analyzer"
}
}
}
}
}
This results in two issues:
Id mapping is not done so Elasticsearch generates the ids
Searches on the properties with the custom analyzer do not work properly
If I add the following to override the use of the annotations, everything works fine:
@Mapping(mappingPath = "/config/elasticsearch/address.json")
Where "address.json" contains the mapping configuration from the top of this post.
So, can anyone point out where I might be going wrong with what I'm trying to achieve?
Is it possible using the available annotations, or am I going to have to stick with the JSON configuration?
Also, is there a "correct" way to set up index-level configuration via spring-data-elasticsearch, or is that not a supported approach? Currently, I am using the @Setting annotation for this purpose.
