How to check elasticsearch tokens after running a query in Rails?

My problem is the following:
I run an Elasticsearch query in a Rails app using specific settings for my index and my search analyzer. The problem is that it doesn't return any results in the app; on the other hand, when I run the analyzer directly against my Elasticsearch Docker container, tokens are returned. If I use these tokens in my app query, I get results...
So this is the _analyze call I run against Elasticsearch:
curl -XGET 'localhost:9200/development-stoot-services/_analyze?analyzer=search_francais' -d 'cours de guitare'
{"tokens":[{"token":"cour","start_offset":0,"end_offset":5,"type":"<ALPHANUM>","position":1},{"token":"guitar","start_offset":9,"end_offset":16,"type":"<ALPHANUM>","position":3}]}
Here is the query my Rails app sends to Elasticsearch:
query = {
  "query" : {
    "bool" : {
      "must" : [
        {
          "range" : {
            "deadline" : {
              "gte" : "2016-05-26T10:27:19+02:00"
            }
          }
        },
        {
          "terms" : {
            "state" : [
              "open"
            ]
          }
        },
        {
          "query_string" : {
            "query" : "cours de guitare",
            "default_operator" : "AND",
            "fields" : [
              "title",
              "description",
              "brand",
              "category_name"
            ]
          }
        }
      ]
    }
  },
  "filter" : {
    "and" : [
      {
        "geo_distance" : {
          "distance" : "40km",
          "location" : {
            "lat" : 48.855736,
            "lon" : 2.32927300000006
          }
        }
      }
    ]
  },
  "sort" : [
    {
      "created_at" : "desc"
    }
  ]
}
This query does not return any results, but if I run it with the tokens returned by Elasticsearch ('cour', 'guitar'), I get the expected results. So I guess there is a mismatch between Rails and Elasticsearch that I can't find...
Can anyone help with that?

Try modifying your query like this, i.e. specify the search_francais analyzer in your query_string so that cours de guitare is analyzed the same way as with the _analyze endpoint:
...
{
  "query_string" : {
    "query" : "cours de guitare",
    "default_operator" : "AND",
    "analyzer": "search_francais",        <--- add this line
    "fields" : [
      "title",
      "description",
      "brand",
      "category_name"
    ]
  }
},
...
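On the Rails side this just means adding the "analyzer" key to the hash the app builds for its query_string clause. A minimal sketch of that part of the hash, written with Ruby hash-rocket syntax and with everything else from the original query left unchanged:

query = {
  "query" => {
    "bool" => {
      "must" => [
        # ... range and terms clauses as before ...
        {
          "query_string" => {
            "query"            => "cours de guitare",
            "default_operator" => "AND",
            "analyzer"         => "search_francais",  # analyze the input like _analyze did
            "fields"           => ["title", "description", "brand", "category_name"]
          }
        }
      ]
    }
  }
}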

Related

How to boost the closest created_at field in Elasticsearch?

I want to sort my query results following some boost rules and, at the same time, have them sorted as much as possible by creation date. If I add a created_at sort, it changes everything and my results are not relevant anymore. So I guess the only way to do that is to boost the created_at field (the newest gets the biggest bonus in the score calculation), but I don't know how to implement it. This is my query:
query = {
  "query" : {
    "bool" : {
      "must" : [
        {
          "range" : {
            "deadline" : {
              "gte" : "2016-05-30T11:39:10+02:00"
            }
          }
        },
        {
          "terms" : {
            "state" : [
              "open"
            ]
          }
        },
        {
          "query_string" : {
            "query" : "chant",
            "default_operator" : "AND",
            "analyzer" : "search_francais",
            "fields" : [
              "title^6",
              "description",
              "brand",
              "category_name"
            ]
          }
        }
      ]
    }
  },
  "filter" : {
    "and" : [
      {
        "geo_distance" : {
          "distance" : "40km",
          "location" : {
            "lat" : 48.855736,
            "lon" : 2.32927300000006
          }
        }
      }
    ]
  },
  "sort" : [
    {
      "_score" : "desc"
    },
    # {
    #   "created_at" : "desc" ==> i tried this but it doesnt change results
    # }
  ]
}
Try adding your condition in a should block.
i) If the created date should be close to some value in the search query, or you have an idea of how close the date should be, use a range query.
ii) If you are not sure of those values, a decay function can be used. In that case, the query has to be changed to a function_score query (see the sketch after the query below).
{
  "query" : {
    "bool" : {
      "must" : [
        {
          "range" : {
            "deadline" : {
              "gte" : "2016-05-30T11:39:10+02:00"
            }
          }
        },
        {
          "terms" : {
            "state" : [
              "open"
            ]
          }
        },
        {
          "query_string" : {
            "query" : "chant",
            "default_operator" : "AND",
            "analyzer" : "search_francais",
            "fields" : [
              "title^6",
              "description",
              "brand",
              "category_name"
            ]
          }
        }
      ],
      "should": [
        {"created_at" : "condition here .. "}
      ]
    }
  },
  "filter" : {
    "and" : [
      {
        "geo_distance" : {
          "distance" : "40km",
          "location" : {
            "lat" : 48.855736,
            "lon" : 2.32927300000006
          }
        }
      }
    ]
  }
}
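For option ii), a rough sketch of the decay-function variant, written as the Ruby hash the app already builds. The gauss parameters (origin, scale, decay) and boost_mode are placeholder values to tune, not values taken from the question:

query = {
  "query" => {
    "function_score" => {
      # wrap the existing bool query unchanged
      "query" => {
        "bool" => {
          "must" => [
            # ... range / terms / query_string clauses as above ...
          ]
        }
      },
      "functions" => [
        {
          # newest documents get the biggest score bonus, decaying
          # smoothly as created_at gets older
          "gauss" => {
            "created_at" => {
              "origin" => "now",
              "scale"  => "7d",
              "decay"  => 0.5
            }
          }
        }
      ],
      "boost_mode" => "sum"   # add the decay score to the text-relevance score
    }
  },
  "sort" => [
    { "_score" => "desc" }
  ]
}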

Is it possible to create Salesreceipt without product/service value through QBO API?

Is it possible to create a SalesReceipt without a product/service value through the QBO API? I have tried through the API, but it does not reflect the rate value and stores only the description value.
If I remove the ItemRef attribute (in the request body), then it reflects the rate and amount values but assigns some default, random product/service.
It is possible directly in the QBO UI.
Request body where only the description value is stored:
{
  "TxnDate" : "2016-05-27",
  "Line" : [ {
    "Amount" : 2222.00,
    "Description" : "hi chk",
    "DetailType" : "ItemReceiptLineDetail",
    "ItemReceiptLineDetail" : {
      "ItemRef" : { },
      "Qty" : 1,
      "UnitPrice" : 2222
    } }
  ],
  "CustomerRef" : {
    "value" : "67"
  },
  "CustomerMemo" : {
    "value" : "Thanks for your business! We appreciate referrals!"
  },
  "TotalAmt": 2222.00,
  "PrivateNote" : "",
  "CustomField" : [ {
    "DefinitionId" : "1",
    "Type" : "StringType",
    "StringValue" : ""
  } ]
}
Request body where a default product/service is assigned:
{
  "TxnDate" : "2016-05-27",
  "Line" : [ {
    "Amount" : 2222.00,
    "Description" : "hi chk",
    "DetailType" : "ItemReceiptLineDetail",
    "ItemReceiptLineDetail" : {
      "Qty" : 1,
      "UnitPrice" : 2222
    } }
  ],
  "CustomerRef" : {
    "value" : "67"
  },
  "CustomerMemo" : {
    "value" : "Thanks for your business! We appreciate referrals!"
  },
  "TotalAmt": 2222.00,
  "PrivateNote" : "",
  "CustomField" : [ {
    "DefinitionId" : "1",
    "Type" : "StringType",
    "StringValue" : ""
  } ]
}
No.
QuickBooks Online does not support this.

Escaping the @ symbol in the Ruby Elasticsearch gem?

I have the following code in the custom ES 'where' wrapper method
filter: { term: params }
Then we have a sample ES document that contains:
"emails" => { "email" => "johndoe#email.com" }
It is returned when my search is:
query.where("emails.email" => "johndoe")
but I get no results when:
query.where("emails.email" => "johndoe#email.com")
It seems like I have to escape the @ symbol somehow when using the ES gem?
It's probably because your field is analyzed using the default standard analyzer and is thus tokenized at the @ sign.
You can see what ES has indexed by running the command below:
curl -XGET 'localhost:9200/_analyze?analyzer=standard&pretty' -d 'johndoe@email.com'
And the result is
{
  "tokens" : [ {
    "token" : "johndoe",
    "start_offset" : 0,
    "end_offset" : 7,
    "type" : "<ALPHANUM>",
    "position" : 1
  }, {
    "token" : "email.com",
    "start_offset" : 8,
    "end_offset" : 17,
    "type" : "<ALPHANUM>",
    "position" : 2
  } ]
}
As you can see, your email field has been tokenized into two different tokens, and that's probably why searching for johndoe works while searching for the full email address doesn't.
There are a few ways out from here, but one that would work is to create your own analyzer based on a pattern_capture token filter and use it as the index_analyzer for your emails.email field.
{
  "settings" : {
    "analysis" : {
      "filter" : {
        "email" : {
          "type" : "pattern_capture",
          "preserve_original" : 1,
          "patterns" : [ "([^@]+)", "(\\p{L}+)", "(\\d+)", "@(.+)" ]
        }
      },
      "analyzer" : {
        "email" : {
          "tokenizer" : "uax_url_email",
          "filter" : [ "email", "lowercase", "unique" ]
        }
      }
    }
  },
  "mappings": {
    "emails": {
      "properties": {
        "email": {
          "type": "string",
          "analyzer": "email"          <-- use the analyzer here
        }
      }
    }
  }
}
At indexing time, that analyzer will produce all of the following tokens, which will allow you to search for any part of your email address:
johndoe@email.com
johndoe
email.com
email
com
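If you want to apply these settings from Ruby rather than curl, a minimal sketch with the elasticsearch-ruby client could look like this (the index name my_index is hypothetical; the settings and mapping are the ones shown above):

require 'elasticsearch'

client = Elasticsearch::Client.new(host: 'localhost:9200')

# Create the index with the custom email analyzer and apply it to emails.email
client.indices.create index: 'my_index', body: {
  settings: {
    analysis: {
      filter: {
        email: {
          type: 'pattern_capture',
          preserve_original: 1,
          patterns: ['([^@]+)', '(\p{L}+)', '(\d+)', '@(.+)']
        }
      },
      analyzer: {
        email: {
          tokenizer: 'uax_url_email',
          filter: ['email', 'lowercase', 'unique']
        }
      }
    }
  },
  mappings: {
    emails: {
      properties: {
        email: { type: 'string', analyzer: 'email' }
      }
    }
  }
}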

filtered query using NativeSearchQueryBuilder in spring-data elasticsearch

I am new to Elasticsearch. I have a filtered query which gives me correct results when run from the console:
GET _search
{
  "query": {
    "filtered": {
      "query": {
        "bool" : {
          "should" : [
            {
              "match" : { "name" : "necklace" }
            },
            {
              "match" : { "skuCode" : "necklace" }
            }
          ]
        }
      },
      "filter": {
        "bool" : {
          "must" : [
            {
              "term" : { "enabled" : true }
            },
            {
              "term" : { "type" : "SIMPLE" }
            },
            {
              "term" : { "tenantCode" : "Triveni" }
            }
          ]
        }
      }
    }
  }
}
I am unable to get the corresponding spring-data version going. Here is what I tried:
SearchQuery searchQuery = new NativeSearchQueryBuilder()
    .withQuery(boolQuery()
        .should(matchQuery("skuCode", keyword))
        .should(matchQuery("name", keyword)))
    .withFilter(boolFilter()
        .must(termFilter("enabled", true),
              termFilter("type", "SIMPLE"),
              termFilter("tenantCode", "Triveni")))
    .build();
This query gives me no results.
Can somebody please help me with this?
NativeSearchQueryBuilder.withFilter is converted to a so-called post_filter. See Post Filter for more details. So the query you ran on the console differs from the one generated by spring-data elasticsearch. To mimic the query from the console, you have to use a filtered query instead.
Change your query building to this:
QueryBuilder boolQueryBuilder = boolQuery()
    .should(matchQuery("skuCode", keyword))
    .should(matchQuery("name", keyword));
FilterBuilder filterBuilder = boolFilter()
    .must(termFilter("enabled", true),
          termFilter("type", "SIMPLE"),
          termFilter("tenantCode", "Triveni"));
SearchQuery searchQuery = new NativeSearchQueryBuilder()
    .withQuery(QueryBuilders.filteredQuery(boolQueryBuilder, filterBuilder))
    .build();
That said, as long as you do not use aggregations, this difference should not affect the (hits) results.

Bulk Data Delete in elasticsearch

This is my code:
HTTParty.delete("http://#{SERVER_DOMAIN}:9200/monitoring/mention_reports/_query?q=id:11321779,11321779", {
})
I want to delete data in bulk by id, but this query is not deleting any data from Elasticsearch.
Can anyone help me figure out how I can delete data in bulk?
index_name should be replaced with the index name used in your code. Provide the ids to be deleted in the array ([1,2,3] below). CGI::escape is used as the URL encoder:
HTTParty.delete "http://#{SERVER_DOMAIN}:9200/index_name/_query?source=#{CGI::escape("{\"terms\":{\"_id\":[1,2,3]}}")}"
This actually uses the delete-by-query API of Elasticsearch.
In case you are using the Tire Ruby client to connect to Elasticsearch:
id_array = [1,2,3]
query = Tire.search do |search|
search.query { |q| q.terms :_id, id_array }
end
index = Tire.index('<index_name>') # provide the index name as you have in your code
Tire::Configuration.client.delete "#{index.url}/_query?source=#{Tire::Utils.escape(query.to_hash[:query].to_json)}"
Reference: https://github.com/karmi/tire/issues/309
Bulk deletes (along with indexing and updates) are also supported via the Bulk API: https://www.elastic.co/guide/en/elasticsearch/reference/1.4/docs-bulk.html
{ "index" : { "_index" : "test", "_type" : "type1", "_id" : "1" } }
{ "field1" : "value1" }
{ "delete" : { "_index" : "test", "_type" : "type1", "_id" : "2" } }
{ "create" : { "_index" : "test", "_type" : "type1", "_id" : "3" } }
{ "field1" : "value3" }
{ "update" : {"_id" : "1", "_type" : "type1", "_index" : "index1"} }
{ "doc" : {"field2" : "value2"} }
OR
curl -XPOST 'localhost:9200/customer/external/_bulk?pretty' -d '
{"update":{"_id":"1"}}
{"doc": { "name": "John Doe becomes Jane Doe" } }
{"delete":{"_id":"2"}}
'
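To do the same from the Ruby code in the question, a sketch using HTTParty against the Bulk API could look like this (index and type taken from the question; the ids array is a placeholder):

require 'httparty'

ids = [11321779, 11321780]   # ids of the documents to delete (placeholder values)

# The bulk body is newline-delimited JSON: one delete action per line,
# terminated by a final newline.
bulk_body = ids.map { |id| { delete: { _id: id } }.to_json }.join("\n") + "\n"

HTTParty.post(
  "http://#{SERVER_DOMAIN}:9200/monitoring/mention_reports/_bulk",
  body:    bulk_body,
  headers: { 'Content-Type' => 'application/x-ndjson' }
)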
Refer to: How to handle multiple updates / deletes with Elasticsearch?
