I am using this gem for the Elasticsearch API.
I am trying to convert the following curl statement into an equivalent API call:
curl -X GET 'localhost:9200/_search?pretty=true' -d '{
"size": 100,
"fields": [
"#message",
"#timestamp"
],
"query": {
"term": {
"#message": "drop"
}
}
}'
I tried the following, but I am not getting the intended results.
Elasticsearch API
@esearch = Elasticsearch::Client.new log: true
@data2 = @esearch.search q: {
    term: {
      "@message" => "drop"
    }
  },
  size: '100',
  fields: '["@message", "@timestamp"]'
Transport API
client = Elasticsearch::Client.new
@data = client.perform_request 'GET', '_search', {
  :size => 100,
  :query => {
    :term => {
      "message" => "drop"
    }
  },
  :fields => [
    '@message',
    '@timestamp'
  ]
}
Please help
You need to wrap all of those parameters in a body element:
@data2 = @esearch.search body: {
  query: { term: { "@message" => "drop" } },
  size: 100,
  fields: ["@message", "@timestamp"]
}
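The same applies to the low-level transport call: the query, size and fields belong in the body argument rather than in the URL parameters. A rough sketch, assuming the perform_request(method, path, params, body) signature of the transport client:

client = Elasticsearch::Client.new
@data = client.perform_request 'GET', '_search', {}, {
  size: 100,
  fields: ['@message', '@timestamp'],
  query: { term: { '@message' => 'drop' } }
}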
I want to use an Elasticsearch mapping to display each user's location and direction to the admin in my web app, so I created an index in Elasticsearch like this:
{
"settings": {
"index": {
"number_of_shards": 5,
"number_of_replicas": 1
},
"analysis": {
"analyzer": {
"analyzer-name": {
"type": "custom",
"tokenizer": "keyword",
"filter": "lowercase"
}
}
}
},
"mappings": {
"properties": {
"driver_id": { "type": "integer" },
"email": { "type": "text" },
"location": { "type": "geo_point" },
"app-platform": { "type": "text" },
"app-version": { "type": "text" },
"created_at": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis"}
}
}
}
and started inserting user locations into Elasticsearch with this curl:
{
"driver_id": 357,
"driver_email": "Andrew#mailinatior.com",
"location": {
"lat": 37.3,
"lon": 59.52
},
"created_at": "2021-06-04 00:09:00"
}
This structure comes from the user's mobile app into my Elasticsearch. After that, I wrote this service to fetch the data for the web front end:
module Api
module V1
module Drivers
module Elastic
class LiveLocation
include Peafowl
attribute :driver_id, ::Integer
def call
@driver = ::Driver.find(driver_id) if driver_id.present?
result = []
options = {
headers: {
'Content-Type' => 'application/json'
},
body: @driver.present? ? options_with_driver : options
}
begin
response = HTTParty.get(elasticsearch_url.to_s, options)
records = JSON.parse(response.body)['hits']['hits']
if records.present?
records.group_by { |r| r['_source']['driver_id'] }.to_a.each do |record|
driver = ::Driver.where(id: record[0]).first
if driver.present?
location = record[1][0]['_source']['location']
app_platform = record[1][0]['_source']['app-platform']
app_version = record[1][0]['_source']['app-version']
result.push(driver_id: driver.id, driver_email: driver.profile.email, location: location, app_platform: app_platform, app_version: app_version)
end
end
end
rescue StandardError => error
Rails.logger.info "Error => #{error}"
result = []
end
context[:response] = result
end
def elasticsearch_url
"#{ENV.fetch('ELASTICSEARCH_BASE_URL', 'http://127.0.0.1:9200')}/#{ENV.fetch('ELASTICSEARCH_DRIVER_POSITION_INDEX', 'live_location')}/_search"
end
def options
{
query: {
bool: {
filter: [
{
range: {
created_at: {
gte: (Time.now.beginning_of_day.strftime '%Y-%m-%d %H:%M:%S')
}
}
}
]
}
},
sort: [
{
created_at: {
order: 'desc'
}
}
]
}.to_json
end
def options_with_driver
{
query: {
bool: {
must: [
{
term: {
driver_id: {
value: @driver.id
}
}
}
],
filter: [
{
range: {
created_at: {
gte: (Time.now.beginning_of_day.strftime '%Y-%m-%d %H:%M:%S')
}
}
}
]
}
},
sort: [
{
created_at: {
order: 'desc'
}
}
]
}.to_json
end
end
end
end
end
end
This structure works perfectly, but Elasticsearch keeps saving the user's location even when the user has stopped moving. I need to filter the data so that, if a user stays in one place for an hour, Elasticsearch recognizes this and does not save the data. Is that possible?
I am using Elasticsearch 7.1 and Ruby 2.5.
I know this is possible in Kibana, but I cannot use Kibana at this time.
I am not sure if this can be done with a single ES query.
However, you could use two queries:
one to check whether the user's location over the last hour is the same
a second to skip the insert if it is
But I don't recommend that.
What you could do instead:
Use Redis or any in-memory cache to keep track of how long the user has been at their last geo-location
Based on that, update or skip the update to Elasticsearch (a rough sketch follows below)
PS: I am not familiar with the ES geo-location API.
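A minimal sketch of that cache check, assuming the redis gem is available; save_location stands in for the existing Elasticsearch insert and is purely illustrative:

require 'redis'
require 'json'

REDIS = Redis.new

# Skip the Elasticsearch write when the driver has already reported these exact
# coordinates within the last hour; otherwise index them and refresh the cache.
def track_location(driver_id, lat, lon)
  key  = "driver:#{driver_id}:last_location"
  last = REDIS.get(key)
  return if last && JSON.parse(last) == { 'lat' => lat, 'lon' => lon }

  save_location(driver_id, lat, lon) # hypothetical: your existing ES insert
  REDIS.set(key, { 'lat' => lat, 'lon' => lon }.to_json, ex: 3600)
end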
I am using the Graph API to add a message rule that forwards mail from a user's inbox. The rule is getting added, but mail is not being forwarded to the specified address.
Here are some details:
var data = {
"displayName": "From partner",
"sequence": 1,
"isEnabled": true,
"conditions": {
"isAutomaticForward": true
},
"actions": {
"forwardTo": [
{
"emailAddress": {
"name": "recipient name ",
"address": "email address"
}
}
],
"stopProcessingRules": true
}
}
axios.post("https://graph.microsoft.com/v1.0/users/{userId}/mailFolders/inbox/messageRules", data,
{
headers: {
"Authorization": "Bearer " + access_token
}
}
)
.then(response => {
console.log(response.data)
})
.catch(err => {
console.log(err.response)
})
The response is as expected, but mail is not being forwarded.
I tried the above payload and steps, and it works for me.
(1) Create a new rule using Graph API
POST https://graph.microsoft.com/v1.0/me/mailFolders/inbox/messageRules
Content-type: application/json
{
"displayName": "From partner",
"sequence": 2,
"isAutomaticForward": true,
"actions": {
"forwardTo": [
{
"emailAddress": {
"name": "Alex Wilbur",
"address": "AlexW#contoso.onmicrosoft.com"
}
}
],
"stopProcessingRules": true
}
}
(2) Test whether the rule works.
Result: it works as expected.
(3) Check whether the above rule shows up in the rules section of Outlook.office.com (the Outlook UI).
(A snapshot captured from the Outlook.office.com mailbox settings confirmed the rule appears there.)
I have this HTTP call code; the request type is form:
param = {
form: {
"creatives[]" => [
{
is_visible: params[:creative_banner_is_visible],
type: "banner",
value_translations: {
id: params[:creative_banner_value_id],
en: params[:creative_banner_value_en]
}
},
{
is_visible: params[:creative_video_is_visible],
type: "video",
value_translations: {
id: params[:creative_video_value_id],
en: params[:creative_video_value_en]
}
}
]
}
}
http = HTTP.headers(headers)
http.put(base_url, param)
But somehow this is translated to the following on the target server:
"creatives"=>[
"{:is_visible=>\"true\", :type=>\"banner\", :value_translations=>{:id=>\"Banner URL ID\", :en=>\"Banner URL EN\"}}",
"{:is_visible=>\"true\", :type=>\"video\", :value_translations=>{:id=>\"12345ID\", :en=>\"12345EN\"}}"
]
Do you know how to make this HTTP call so the payload is not stringified? I used the same schema in Postman and it works just fine:
"creatives": [
{
"is_visible": true,
"type": "banner",
"value_translations": {
"id": "http://schroeder.info/elinore",
"en": "http://wehner.info/dusti"
}
},
{
"is_visible": true,
"type": "video",
"value_translations": {
"id": "85177e87-6b53-4268-9a3c-b7f1c206e002",
"en": "5134f3ca-ead7-4ab1-986f-a695e69ace96"
}
}
]
I'm using this gem: https://github.com/httprb/http
EDIT
First, replace your "creatives[]" => [ ... with creatives: [ ... so the end result should be the following.
creatives = [
{
is_visible: params[:creative_banner_is_visible],
type: "banner",
value_translations: {
id: params[:creative_banner_value_id],
en: params[:creative_banner_value_en]
}
},
{
is_visible: params[:creative_video_is_visible],
type: "video",
value_translations: {
id: params[:creative_video_value_id],
en: params[:creative_video_value_en]
}
}
]
http = HTTP.headers(headers)
http.put(base_url, json: { creatives: creatives })
Second, I don't see any problem with what you get on your target server; you just have to parse it as JSON, so if you also have a Rails app there, use JSON.parse on the body.
Somehow this approach fixed the issue:
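# compare_by_identity compares hash keys by object identity, so the repeated
# string keys below do not overwrite each other: the banner and video entries
# all survive as separate "creatives[][...]" form fields.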
create_params = {}.compare_by_identity
create_params["creatives[][is_visible]"] = params[:creative_banner_is_visible]
create_params["creatives[][type]"] = 'banner'
create_params["creatives[][value_translations][id]"] = params[:creative_banner_value_id]
create_params["creatives[][value_translations][en]"] = params[:creative_banner_value_en]
create_params["creatives[][is_visible]"] = params[:creative_video_is_visible]
create_params["creatives[][type]"] = 'video'
create_params["creatives[][value_translations][id]"] = params[:creative_video_value_id]
create_params["creatives[][value_translations][en]"] = params[:creative_video_value_en]
I make this call to map all datatypes as strings by default:
curl -XPUT 'http://localhost:9200/_all/_default_/_mapping' -d '
{
"mappings": {
"_default_": {
"dynamic_templates": [
{
"match": "*",
"mapping": {
"type": "string"
}
}
]
}
}
}
'
The mapping does not work, so I make this call to verify:
curl -XGET 'http://localhost:9200/_all/_mapping'
{
"logstash-2014.02.05": {
"_default_": {
"properties": {}
}
}
}
Why is the properties part empty?
You should delete the mappings key from your PUT request. You only specify mappings when you are creating an index, not when updating mappings.
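For reference, a sketch of the corrected update, written here with the elasticsearch-ruby client used elsewhere on this page (an assumption; the raw curl body would be the same JSON minus the "mappings" wrapper). The template name "strings" is illustrative, since dynamic_templates entries are named objects:

require 'elasticsearch'

client = Elasticsearch::Client.new
client.indices.put_mapping index: '_all', type: '_default_', body: {
  _default_: {
    dynamic_templates: [
      {
        strings: {
          match:   '*',
          mapping: { type: 'string' }
        }
      }
    ]
  }
}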
We have an index of domain names in Elasticsearch (we use the tire gem with Ruby to connect to and maintain it), but we are having trouble with exact searches.
If I search the domains for the term google.com, it brings back google.com, but it also brings back any domain containing a dash (-), such as in-google.com. Research leads me to believe that - is a wildcard in ES and that all I need to do is set not_analyzed, but that doesn't work.
:domain => { :type => 'string' , :analyzer => 'whitespace' },
:domain_2 => { :type => 'string' , :analyzer => 'pattern' },
:domain_3 => { :type => 'string', :index => 'not_analyzed' },
:domain_4 => { :type => 'string', :analyzer => 'snowball' }
I've tried different analysers, as you can see above, but they all have the same issue when searched using the 'head' plugin.
https://gist.github.com/anonymous/8080839 is the code I'm using to generate the dataset to test with. What I'm looking for is the ability to search for JUST google, and if I want *google I can implement my own wildcard.
I'm resigned to the fact that I'm going to have to delete and regenerate my index, but no matter what analyser or type I choose, I still cannot get an exact match.
You're not showing the sample queries you are using. Are you sure your queries and your indexing use the same text processing?
Also, you may want to check out the multi_field approach to analyzing things in multiple ways.
I've made a runnable example with a bunch of different queries that illustrate this. Note that the domain has been indexed in two ways, and note which field each query hits: https://www.found.no/play/gist/ecc52fad687e83ddcf73
#!/bin/bash
export ELASTICSEARCH_ENDPOINT="http://localhost:9200"
# Create indexes
curl -XPUT "$ELASTICSEARCH_ENDPOINT/play" -d '{
"mappings": {
"type": {
"properties": {
"domain": {
"type": "multi_field",
"fields": {
"domain": {
"type": "string",
"analyzer": "standard"
},
"whitespace": {
"type": "string",
"analyzer": "whitespace"
}
}
}
}
}
}
}'
# Index documents
curl -XPOST "$ELASTICSEARCH_ENDPOINT/_bulk?refresh=true" -d '
{"index":{"_index":"play","_type":"type"}}
{"domain":"google.com"}
{"index":{"_index":"play","_type":"type"}}
{"domain":"in-google.com"}
'
# Do searches
# Matches both
curl -XPOST "$ELASTICSEARCH_ENDPOINT/_search?pretty" -d '
{
"query": {
"match": {
"_all": "google.com"
}
}
}
'
# Also matches "google.com". in-google.com gets tokenized to ["in", "google.com"]
# and the default match operator is `or`.
curl -XPOST "$ELASTICSEARCH_ENDPOINT/_search?pretty" -d '
{
"query": {
"match": {
"domain": {
"query": "in-google.com"
}
}
}
}
'
# What terms are generated? (Answer: `google.com` and `in`)
curl -XPOST "$ELASTICSEARCH_ENDPOINT/_search?pretty" -d '
{
"size": 0,
"facets": {
"domain": {
"terms": {
"field": "domain"
}
}
}
}
'
# This should just match the second document.
curl -XPOST "$ELASTICSEARCH_ENDPOINT/_search?pretty" -d '
{
"query": {
"match": {
"domain.whitespace": {
"query": "in-google.com"
}
}
}
}
'
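Since the index is maintained through the tire gem, the multi_field mapping above could be expressed on the Ruby side roughly like this (a sketch assuming Tire's index DSL; the index and type names are illustrative):

require 'tire'

Tire.index 'domains' do
  delete
  create mappings: {
    domain: {
      properties: {
        domain: {
          type: 'multi_field',
          fields: {
            # ordinary matching via the standard analyzer...
            domain:     { type: 'string', analyzer: 'standard' },
            # ...and a whitespace-analyzed sub-field where "in-google.com" stays one token
            whitespace: { type: 'string', analyzer: 'whitespace' }
          }
        }
      }
    }
  }
end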