elasticsearch-model - how to set TTL option?

I am using elasticsearch-model (https://github.com/elastic/elasticsearch-rails/tree/master/elasticsearch-model) but I can't find a way to set the TTL on the documents. I have tried a few ways but without success.
I have a model called "log" and an index called "logs". In the model I have the following mappings:
class Log < ActiveRecord::Base
  include Elasticsearch::Model
  mappings _ttl: { enabled: true, default: '7d' }
end
I was hoping this would behave like the _ttl field described at https://www.elastic.co/guide/en/elasticsearch/reference/current/mapping-ttl-field.html#mapping-ttl-field
In the console, this is what I can see:
> Log.mappings
=> #<Elasticsearch::Model::Indexing::Mappings:0x007fcefd985368 @mapping={}, @options={:_ttl=>{:enabled=>true, :default=>"7d"}}, @type="log">
However, when using cURL, I can only see the model properties and not the TTL option:
curl -XGET 'http://localhost:9200/logs/_mappings'
{
  "logs": {
    "mappings": {
      "log": {
        "properties": {
          "created_at": {
            "format": "dateOptionalTime",
            "type": "date"
          },
          "id": {
            "type": "long"
          },
          "operation": {
            "type": "string"
          },
          "order_number": {
            "type": "string"
          },
          "request_payload": {
            "type": "string"
          },
          "response_payload": {
            "type": "string"
          },
          "updated_at": {
            "format": "dateOptionalTime",
            "type": "date"
          }
        }
      }
    }
  }
}
I've also tried to pass TTL as an option to the bulk import (to achieve this: https://www.elastic.co/guide/en/elasticsearch/reference/1.4/docs-bulk.html#bulk-ttl), but that option is rejected:
> Log.import({
    query: -> { where(id: ids) },
    refresh: true,
    return: 'errors',
    ttl: '1d'
  })
Unknown key: :ttl. Valid keys are: :start, :batch_size : {"ids":[358]}
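One possible workaround (a hedged sketch, not something I have confirmed): elasticsearch-model's import also accepts a transform: option that lets you build each bulk action yourself, and the 1.x bulk API linked above accepts a per-document _ttl in the action metadata, so something like this might get the TTL into the bulk request:

Log.import query: -> { where(id: ids) },
           transform: lambda { |log|
             # Build the bulk action by hand so a per-document _ttl can
             # ride along in the action metadata (Elasticsearch 1.x bulk API).
             { index: { _id: log.id, _ttl: '1d', data: log.__elasticsearch__.as_indexed_json } }
           }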
Any idea on how I can do this?
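One thing worth checking first (a sketch, assuming the same behaviour as the analyzer answer at the bottom of this page): the options visible in Log.mappings live only on the Ruby side until the index is created, so an index that already exists keeps whatever mapping it was created with. Recreating the index forces the _ttl option to be sent. Note that _ttl was deprecated in Elasticsearch 2.0 and removed in 5.0, so this only applies to 1.x clusters.

# Drop and recreate the "logs" index so the model's mapping options,
# including _ttl, are actually sent to Elasticsearch, then reindex.
Log.__elasticsearch__.create_index! force: true
Log.import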

Related

Logstash output to ES gives me error code '400'

I am creating an ELK stack to fetch tweets and analyse them. When I start my ELK stack, I get this error message from Logstash:
Failed to install template {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://elasticsearch:9200/_index_template/twitter'", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:84:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:324:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:311:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:398:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:310:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:318:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:412:in `template_put'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:85:in `template_install'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:29:in `install'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:17:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch.rb:578:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch.rb:344:in `finish_register'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/outputs/elasticsearch.rb:300:in `block in register'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-output-elasticsearch-11.9.3-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:154:in `block in after_successful_connection'"]}
I think there is an error inside my index template, but even searching the internet I couldn't find what is wrong with it.
I am using:
logstash:8.5.3
elasticsearch:8.5.3
kibana:8.5.3
This is my template:
{
  "template": "twitter-*",
  "settings": {
    "number_of_shards": 1,
    "number_of_replicas": 0,
    "index.mapping.total_fields.limit": 2000
  },
  "mappings": {
    "_default_": {
      "_all": {
        "enabled": true
      },
      "properties": {
        "@timestamp": {
          "type": "date",
          "format": "dateOptionalTime"
        },
        "created_at": {
          "type": "date",
          "format": "EEE MMM dd HH:mm:ss Z YYYY"
        },
        "text": {
          "type": "text"
        },
        "user": {
          "type": "object",
          "properties": {
            "description": {
              "type": "text"
            }
          }
        },
        "coordinates": {
          "type": "object",
          "properties": {
            "coordinates": {
              "type": "geo_point"
            }
          }
        },
        "entities": {
          "type": "object",
          "properties": {
            "hashtags": {
              "type": "object",
              "properties": {
                "text": {
                  "type": "text",
                  "fielddata": true
                }
              }
            }
          }
        },
        "retweeted_status": {
          "type": "object",
          "properties": {
            "text": {
              "type": "text"
            }
          }
        }
      },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": {
              "type": "keyword"
            }
          }
        }
      ]
    }
  }
}
And this is my Logstash config; I send my tweets via TCP because I have a Python bot that fetches them for me.
input {
  tcp {
    port => 50000
  }
}
filter {
}
output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    user => "elastic"
    password => "${LOGSTASH_INTERNAL_PASSWORD}"
    index => "twitter-%{+yyyy.MM.dd}"
    document_type => "tweets"
    template => "./templates/twitter.json"
    template_name => "twitter"
    template_overwrite => true
  }
}
Thanks for any help <3
What did you try?
I tried to modify some attributes of my template, but I still got this message.
What were you expecting?
I was expecting Logstash to create my index template for incoming tweets.
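For comparison, here is a hedged sketch of the same template rewritten in the composable index template format. The error URL shows the plugin installing to _index_template, which expects this format; mapping types such as _default_ and the _all field were removed in Elasticsearch 7.x, which is a likely cause of the 400. In the composable format, index_patterns replaces the template key, and settings/mappings move under a template wrapper; the created_at format is adjusted for Java time (yyyy instead of YYYY), and the legacy dateOptionalTime format is dropped in favour of the default:

{
  "index_patterns": ["twitter-*"],
  "template": {
    "settings": {
      "number_of_shards": 1,
      "number_of_replicas": 0,
      "index.mapping.total_fields.limit": 2000
    },
    "mappings": {
      "properties": {
        "@timestamp": { "type": "date" },
        "created_at": { "type": "date", "format": "EEE MMM dd HH:mm:ss Z yyyy" },
        "text": { "type": "text" },
        "user": { "properties": { "description": { "type": "text" } } },
        "coordinates": { "properties": { "coordinates": { "type": "geo_point" } } },
        "entities": {
          "properties": {
            "hashtags": {
              "properties": {
                "text": { "type": "text", "fielddata": true }
              }
            }
          }
        },
        "retweeted_status": { "properties": { "text": { "type": "text" } } }
      },
      "dynamic_templates": [
        {
          "string_template": {
            "match": "*",
            "match_mapping_type": "string",
            "mapping": { "type": "keyword" }
          }
        }
      ]
    }
  }
}

The document_type option in the Logstash output may also need to be removed, since mapping types no longer exist in Elasticsearch 8.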

LogicApps / Graph API - delete an email from a shared mailbox

as per title.
Whilst dealing with my own mailbox is fine, ideally I want to process a shared mailbox in Logic Apps. I can read the emails, but I want to clean up by deleting the messages.
Logic Apps doesn't seem to offer that action, and unless I'm mistaken the Graph API doesn't either?
Has anyone managed this?
This has been possible since the update on 6th May 2020. A number of the actions now support an Original Mailbox Address optional parameter that you can set to access the shared mailbox:
As of May 6, 2020, shared mailbox support was added for certain operations with an optional 'Mailbox address' parameter, allowing you to specify a shared mailbox address for your operation to access.
The Delete Email (V2) action supports this parameter. You then fill in the email address of your shared mailbox, and the action will successfully find your message id and delete the email for you.
For this requirement, I wrote a logic app sample for your reference. The whole logic app is described step by step below:
1. In the first "HTTP" action, I request the access token.
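A sketch of what that token request looks like (the tenant id, client id and secret are placeholders; this assumes the client-credentials flow against Azure AD):

POST https://login.microsoftonline.com/{tenant-id}/oauth2/v2.0/token
Content-Type: application/x-www-form-urlencoded

client_id={client-id}&client_secret={client-secret}&scope=https://graph.microsoft.com/.default&grant_type=client_credentials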
2. Then I add a "Parse JSON" action to parse the response from the first "HTTP" action to get the access token.
The schema should be:
{
  "properties": {
    "access_token": {
      "type": "string"
    },
    "expires_in": {
      "type": "integer"
    },
    "ext_expires_in": {
      "type": "integer"
    },
    "token_type": {
      "type": "string"
    }
  },
  "type": "object"
}
3. In the second "HTTP" action, I request all of the messages in the shared mailbox, using the access token from the steps above.
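A sketch of that request (the shared mailbox address is a placeholder):

GET https://graph.microsoft.com/v1.0/users/{shared-mailbox-address}/messages
Authorization: Bearer {access_token from the Parse JSON action}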
4. After that, we need to use another "Parse JSON" action to parse the response from "HTTP 2".
The schema should be:
{
  "properties": {
    "@@odata.context": {
      "type": "string"
    },
    "value": {
      "items": {
        "properties": {
          "@@odata.etag": {
            "type": "string"
          },
          "bccRecipients": {
            "type": "array"
          },
          "body": {
            "properties": {
              "content": {
                "type": "string"
              },
              "contentType": {
                "type": "string"
              }
            },
            "type": "object"
          },
          "bodyPreview": {
            "type": "string"
          },
          "categories": {
            "type": "array"
          },
          "ccRecipients": {
            "type": "array"
          },
          "changeKey": {
            "type": "string"
          },
          "conversationId": {
            "type": "string"
          },
          "conversationIndex": {
            "type": "string"
          },
          "createdDateTime": {
            "type": "string"
          },
          "flag": {
            "properties": {
              "flagStatus": {
                "type": "string"
              }
            },
            "type": "object"
          },
          "from": {
            "properties": {
              "emailAddress": {
                "properties": {
                  "address": {
                    "type": "string"
                  },
                  "name": {
                    "type": "string"
                  }
                },
                "type": "object"
              }
            },
            "type": "object"
          },
          "hasAttachments": {
            "type": "boolean"
          },
          "id": {
            "type": "string"
          },
          "importance": {
            "type": "string"
          },
          "inferenceClassification": {
            "type": "string"
          },
          "internetMessageId": {
            "type": "string"
          },
          "isDeliveryReceiptRequested": {
            "type": "boolean"
          },
          "isDraft": {
            "type": "boolean"
          },
          "isRead": {
            "type": "boolean"
          },
          "isReadReceiptRequested": {
            "type": "boolean"
          },
          "lastModifiedDateTime": {
            "type": "string"
          },
          "parentFolderId": {
            "type": "string"
          },
          "receivedDateTime": {
            "type": "string"
          },
          "replyTo": {
            "type": "array"
          },
          "sender": {
            "properties": {
              "emailAddress": {
                "properties": {
                  "address": {
                    "type": "string"
                  },
                  "name": {
                    "type": "string"
                  }
                },
                "type": "object"
              }
            },
            "type": "object"
          },
          "sentDateTime": {
            "type": "string"
          },
          "subject": {
            "type": "string"
          },
          "toRecipients": {
            "items": {
              "properties": {
                "emailAddress": {
                  "properties": {
                    "address": {
                      "type": "string"
                    },
                    "name": {
                      "type": "string"
                    }
                  },
                  "type": "object"
                }
              },
              "required": [
                "emailAddress"
              ],
              "type": "object"
            },
            "type": "array"
          },
          "webLink": {
            "type": "string"
          }
        },
        "required": [
          "@@odata.etag",
          "id",
          "createdDateTime",
          "lastModifiedDateTime",
          "changeKey",
          "categories",
          "receivedDateTime",
          "sentDateTime",
          "hasAttachments",
          "internetMessageId",
          "subject",
          "bodyPreview",
          "importance",
          "parentFolderId",
          "conversationId",
          "conversationIndex",
          "isDeliveryReceiptRequested",
          "isReadReceiptRequested",
          "isRead",
          "isDraft",
          "webLink",
          "inferenceClassification",
          "body",
          "sender",
          "from",
          "toRecipients",
          "ccRecipients",
          "bccRecipients",
          "replyTo",
          "flag"
        ],
        "type": "object"
      },
      "type": "array"
    }
  },
  "type": "object"
}
5. Then use a "For each" action to loop over the value array from "Parse JSON 2".
6. In the "For each", we need to add a third "HTTP" action that deletes the current message.
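A sketch of that delete request (placeholders as before; the id comes from the current item of the loop):

DELETE https://graph.microsoft.com/v1.0/users/{shared-mailbox-address}/messages/{id from the current "For each" item}
Authorization: Bearer {access_token}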
7. Run the logic app; it will delete all of the messages in the shared mailbox.
By the way:
Before running the logic app, you need to search for the client_id in your Azure AD app registrations to find the application and add the Mail.ReadWrite permission to it. Also, don't forget to grant admin consent for it.

Owner_id field does not pass validation error. What is wrong with my schema?

I am looking into what is wrong with my schema. I'm attempting to insert an entry into my collection. I have gotten a slew of errors as I've changed things around, but this seems to be the closest I have gotten to successfully inserting a document. I am using the mongodb-stitch-browser-sdk in a React Ionic project and I have a valid user logged in.
I am using the StitchUser.id which is a string as my owner_id (matches the id of my valid user in users collection).
Here is my schema, followed by the error in the Stitch logs. I was simply trying to insert a document into my Goals collection. Also, there are no filters on this collection, and there is only one role with the following rule.
{
  "owner_id": "%%user.id"
}
This gives the user read and write permissions on the documents that they created.
{
  "bsonType": "object",
  "required": [
    "goalTitle",
    "startDate",
    "endDate",
    "owner_id"
  ],
  "properties": {
    "_id": {
      "bsonType": "objectId"
    },
    "owner_id": {
      "bsonType": "string",
      "validate": {
        "%or": [
          {
            "%%prevRoot.owner_id": {
              "%exists": false
            }
          },
          {
            "%%prevRoot.owner_id": "%%this"
          }
        ]
      }
    },
    "goalTitle": {
      "bsonType": "string",
      "minLength": {
        "$numberInt": "1"
      },
      "maxLength": {
        "$numberInt": "30"
      }
    },
    "goalDescription": {
      "bsonType": "string",
      "minLength": {
        "$numberInt": "0"
      },
      "maxLength": {
        "$numberInt": "600"
      }
    },
    "startDate": {
      "bsonType": "string"
    },
    "endDate": {
      "bsonType": "string"
    }
  }
}
Error:
role "owner" in "todo_list.Goals" does not have insert permission for document with _id: ObjectID("5e6aa8d11d233536e3ea8604"): could not validate document:
owner_id: Does not pass validation
Stack Trace:
StitchError: insert not permitted
Details:
{
  "serviceAction": "insertOne",
  "serviceName": "mongodb-atlas",
  "serviceType": "mongodb-atlas"
}
{
  "arguments": [
    {
      "collection": "Goals",
      "database": "todo_list",
      "document": {
        "goalTitle": "Test Goal",
        "goalDescription": "Test Description",
        "endDate": "2020-03-11",
        "startDate": "2020-03-10",
        "owner_id": "5e6891382e6039c1c32f7d46",
        "_id": {
          "$oid": "5e6aa8d11d233536e3ea8604"
        }
      }
    }
  ],
  "name": "insertOne",
  "service": "mongodb-atlas"
}
I've created another collection with no schema and the same rule checking for owner_id, and documents in that collection can be inserted just fine. I'd have to imagine it is a schema error.

ElasticSearch: Indexing with multiple mapping types

I am trying to fully comprehend indexing with multiple mapping types in ElasticSearch. In the docs it gives example code:
PUT my_index
{
  "mappings": {
    "user": {
      "_all": { "enabled": false },
      "properties": {
        "title": { "type": "string" },
        "name": { "type": "string" },
        "age": { "type": "integer" }
      }
    },
    "blogpost": {
      "properties": {
        "title": { "type": "string" },
        "body": { "type": "string" },
        "user_id": {
          "type": "string",
          "index": "not_analyzed"
        },
        "created": {
          "type": "date",
          "format": "strict_date_optional_time||epoch_millis"
        }
      }
    }
  }
}
With this mapping, how would I then create and search on an object?
For create, would it be:
POST my_index/user/blogpost
or
POST my_index/user,blogpost
For searching, would it be:
GET my_index/user/blogpost
or
GET my_index/user,blogpost
or something else?
An example of a POST and GET with multiple mapping types would really help me out. Thank you so much!
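For reference, a hedged sketch of how this works on the multi-type Elasticsearch versions (2.x) that the quoted docs describe: a document is created against exactly one type, while a search can target one type or a comma-separated list of types.

# Create a user document under the "user" type
curl -XPOST 'localhost:9200/my_index/user' -d '{ "name": "John", "age": 42 }'

# Create a blogpost document under the "blogpost" type
curl -XPOST 'localhost:9200/my_index/blogpost' -d '{ "title": "Hello", "user_id": "1" }'

# Search a single type
curl -XGET 'localhost:9200/my_index/blogpost/_search?q=title:hello'

# Search both types at once
curl -XGET 'localhost:9200/my_index/user,blogpost/_search?q=title:hello'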

ElasticSearch Rails - Setting a Custom Analyzer

I'm using ElasticSearch in Rails 4 through elasticsearch-rails (https://github.com/elasticsearch/elasticsearch-rails)
I have a User model, with an email attribute.
I'm trying to use the 'uax_url_email' tokenizer described in the docs:
class User < ActiveRecord::Base
  include Elasticsearch::Model
  include Elasticsearch::Model::Callbacks

  settings analysis: { analyzer: { whole_email: { tokenizer: 'uax_url_email' } } } do
    mappings dynamic: 'false' do
      indexes :email, analyzer: 'whole_email'
    end
  end
end
I followed examples in the wiki (https://github.com/elasticsearch/elasticsearch-rails/wiki) and the elasticsearch-model docs (https://github.com/elasticsearch/elasticsearch-rails/wiki) to arrive at this.
It doesn't work. If I query elasticsearch directly:
curl -XGET 'localhost:9200/users/_mapping'
It returns:
{
  "users": {
    "mappings": {
      "user": {
        "properties": {
          "birthdate": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "created_at": {
            "type": "date",
            "format": "dateOptionalTime"
          },
          "email": {
            "type": "string"
          },
          "first_name": {
            "type": "string"
          },
          "gender": {
            "type": "string"
          },
          "id": {
            "type": "long"
          },
          "last_name": {
            "type": "string"
          },
          "name": {
            "type": "string"
          },
          "role": {
            "type": "string"
          },
          "updated_at": {
            "type": "date",
            "format": "dateOptionalTime"
          }
        }
      }
    }
  }
}
This ended up being an issue with how I was creating the index. I was trying:
User.__elasticsearch__.client.indices.delete index: User.index_name
User.import
I expected this to delete the index, then re-import the values. However, I needed to do:
User.__elasticsearch__.create_index! force: true
User.import
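A plausible explanation (hedged, based on the behaviour described above) is that deleting the index and calling import alone lets the bulk request auto-create the index with default settings, whereas create_index! force: true deletes and recreates the index with the settings and mappings defined on the model, so the whole_email analyzer is actually sent. To verify the analyzer reached the cluster after recreating the index, you can inspect the index directly (this assumes the default index name of users):

curl -XGET 'localhost:9200/users/_settings?pretty'
curl -XGET 'localhost:9200/users/_mapping?pretty'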
