Create calendar event using ruby_outlook - ruby-on-rails

I am trying to create an event in the calendar.
I am able to get all the data (calendar, contacts, and emails) by following the documentation below:
https://learn.microsoft.com/en-us/outlook/rest/ruby-tutorial
But when I try to create an event using ruby_outlook, I get the following error:
{"ruby_outlook_error"=>401,
"ruby_outlook_response"=>{"error"=>{"code"=>"InvalidAudience", "message"=>"The audience claim value is invalid 'aud'.",
"innerError"=>{"requestId"=>"75984820-5241-11ea-b6fc-fc4dd44c1550", "date"=>"2020-02-18T11:26:08"}}}}
Below is the code for creating the event:
def index
  token = get_access_token # fetch the access token
  if token
    outlook_client = RubyOutlook::Client.new
    event_payload = {
      "Subject": "Discuss the Calendar REST API",
      "Body": {
        "ContentType": "HTML",
        "Content": "I think it will meet our requirements!"
      },
      "Start": {
        "DateTime": "2020-03-03T18:00:00",
        "TimeZone": "Pacific Standard Time"
      },
      "End": {
        "DateTime": "2020-03-03T19:00:00",
        "TimeZone": "Pacific Standard Time"
      },
      "Attendees": [
        {
          "EmailAddress": {
            "Address": "john@example.com",
            "Name": "John Doe"
          },
          "Type": "Required"
        }
      ]
    }
    outlook_client.create_event(token, event_payload, nil, 'user@domain.com')
  end
end

Your issue is that the token you fetched was issued for the Microsoft Graph API, but you are now trying to create an event through the Outlook API. You cannot use a token issued for Graph ("aud": "graph.microsoft.com") against the Outlook endpoint; you need a token with "aud": "outlook.office.com". A better option is to use the Graph API itself (via the Graph gem) to create the event, since you already have a token fetched for it.
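One quick way to confirm which audience your token was issued for is to decode the JWT payload locally. A minimal sketch using only the standard library (the helper name is mine, and no signature verification is done, so use this only for inspection):

```ruby
require 'base64'
require 'json'

# Decode the JWT payload (the second dot-separated segment) without
# verifying the signature, purely to inspect the "aud" claim.
def token_audience(token)
  payload = token.split('.')[1]
  JSON.parse(Base64.urlsafe_decode64(payload))['aud']
end
```

If this returns "https://graph.microsoft.com" (or Graph's application ID), the token is for Graph and will be rejected by outlook.office.com with exactly the InvalidAudience error shown above.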
To do that, first create the MicrosoftGraph object:
def create_service_auth
  access_token = get_access_token
  callback = Proc.new do |r|
    r.headers['Authorization'] = "Bearer #{access_token}"
    r.headers['Content-Type'] = 'application/json'
    r.headers['X-AnchorMailbox'] = email_of_calendar_for_which_to_create_the_event
  end
  @graph = ::MicrosoftGraph.new(base_url: 'https://graph.microsoft.com/v1.0',
                                cached_metadata_file: File.join(MicrosoftGraph::CACHED_METADATA_DIRECTORY, 'metadata_v1.0.xml'),
                                &callback)
end
Then create the event:
def create_event
  event = {
    subject: summary,
    body: {
      content_type: "HTML",
      content: description
    },
    start: {
      date_time: start_time,
      time_zone: timezone
    },
    end: {
      date_time: end_time,
      time_zone: timezone
    },
    response_requested: true,
    organizer: {
      email_address: {
        name: organizer.full_name,
        address: email_of_calendar_for_which_to_create_the_event
      }
    },
    attendees: [
      {
        email_address: {
          address: attendee_email,
          name: attendee.full_name
        },
        type: "required"
      }
    ]
  }
  result = @graph.me.events.create(event)
end

Related

Elasticsearch: find out whether a user stops or is moving - possible?

I want to use an Elasticsearch mapping to display a user's location and direction to the admin in my web app, so I created an index in Elasticsearch like:
{
  "settings": {
    "index": {
      "number_of_shards": 5,
      "number_of_replicas": 1
    },
    "analysis": {
      "analyzer": {
        "analyzer-name": {
          "type": "custom",
          "tokenizer": "keyword",
          "filter": "lowercase"
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "driver_id": { "type": "integer" },
      "email": { "type": "text" },
      "location": { "type": "geo_point" },
      "app-platform": { "type": "text" },
      "app-version": { "type": "text" },
      "created_at": { "type": "date", "format": "yyyy-MM-dd HH:mm:ss||yyyy-MM-dd||epoch_millis" }
    }
  }
}
and started inserting user locations into Elasticsearch with this curl request body:
{
  "driver_id": 357,
  "driver_email": "Andrew@mailinator.com",
  "location": {
    "lat": 37.3,
    "lon": 59.52
  },
  "created_at": "2021-06-04 00:09:00"
}
This structure comes from the user's mobile app into my Elasticsearch. After that, I wrote this service to fetch the data for the web front end:
module Api
  module V1
    module Drivers
      module Elastic
        class LiveLocation
          include Peafowl
          attribute :driver_id, ::Integer

          def call
            @driver = ::Driver.find(driver_id) if driver_id.present?
            result = []
            # Named request_options so it does not shadow the `options` method below.
            request_options = {
              headers: {
                'Content-Type' => 'application/json'
              },
              body: @driver.present? ? options_with_driver : options
            }
            begin
              response = HTTParty.get(elasticsearch_url, request_options)
              records = JSON.parse(response.body)['hits']['hits']
              if records.present?
                records.group_by { |r| r['_source']['driver_id'] }.to_a.each do |record|
                  driver = ::Driver.where(id: record[0]).first
                  if driver.present?
                    location = record[1][0]['_source']['location']
                    app_platform = record[1][0]['_source']['app-platform']
                    app_version = record[1][0]['_source']['app-version']
                    result.push(driver_id: driver.id, driver_email: driver.profile.email, location: location, app_platform: app_platform, app_version: app_version)
                  end
                end
              end
            rescue StandardError => error
              Rails.logger.info "Error => #{error}"
              result = []
            end
            context[:response] = result
          end

          def elasticsearch_url
            "#{ENV.fetch('ELASTICSEARCH_BASE_URL', 'http://127.0.0.1:9200')}/#{ENV.fetch('ELASTICSEARCH_DRIVER_POSITION_INDEX', 'live_location')}/_search"
          end

          def options
            {
              query: {
                bool: {
                  filter: [
                    {
                      range: {
                        created_at: {
                          gte: Time.now.beginning_of_day.strftime('%Y-%m-%d %H:%M:%S')
                        }
                      }
                    }
                  ]
                }
              },
              sort: [
                {
                  created_at: {
                    order: 'desc'
                  }
                }
              ]
            }.to_json
          end

          def options_with_driver
            {
              query: {
                bool: {
                  must: [
                    {
                      term: {
                        driver_id: {
                          value: @driver.id
                        }
                      }
                    }
                  ],
                  filter: [
                    {
                      range: {
                        created_at: {
                          gte: Time.now.beginning_of_day.strftime('%Y-%m-%d %H:%M:%S')
                        }
                      }
                    }
                  ]
                }
              },
              sort: [
                {
                  created_at: {
                    order: 'desc'
                  }
                }
              ]
            }.to_json
          end
        end
      end
    end
  end
end
This structure works perfectly, but even when the user stops, Elasticsearch keeps saving his location. I need to filter the data so that if the user stays in one place for an hour, Elasticsearch understands that and stops saving. Is that possible?
I use Elasticsearch 7.1 and Ruby 2.5.
I know it's possible in Kibana, but I cannot use Kibana at this time.
I am not sure this can be done via a single ES query...
However, you could use 2 queries:
one to check whether the user's location during the last hour is the same;
second, if it is the same, then don't insert.
But I don't recommend that.
What you could do instead:
use Redis or any in-memory cache to maintain the user's last geo-location and how long they have been there;
based on that, update or skip the update to Elasticsearch.
PS: I am not familiar with the ES geo-location API.
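The cache idea above can be sketched in plain Ruby. This is an in-memory stand-in (a real deployment would keep the state in Redis as suggested, and every name here is made up for illustration): remember each driver's last position and since when, and skip the Elasticsearch write once they have been still for the window.

```ruby
# Decide per reading whether it should still be indexed.
class LocationGuard
  def initialize(window_seconds: 3600)
    @window = window_seconds
    @last = {} # driver_id => { location:, since: }
  end

  # Returns true when this reading should be written to Elasticsearch.
  def should_index?(driver_id, location, now: Time.now)
    entry = @last[driver_id]
    if entry.nil? || entry[:location] != location
      # New driver or the driver moved: remember the new spot and index it.
      @last[driver_id] = { location: location, since: now }
      return true
    end
    # Same spot as before: keep indexing only until the window elapses.
    now - entry[:since] < @window
  end
end
```

The first sighting of a driver is always indexed; after that, identical coordinates stop being written once the driver has been parked for more than `window_seconds`.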

Send response to Dialogflow webhook

I'm trying to implement a Dialogflow bot. I've set up all the intents, the responses defined for each intent, and the webhook receiver controller. It works well, but based on the Fulfillment docs, after Dialogflow sends a request to my webhook service I need to send a response back to Dialogflow. How do I do that?
Here is my webhook service:
class Api::V1::WebhooksController < ActionController::API
  include ActionController::HttpAuthentication::Basic::ControllerMethods
  include ActionController::HttpAuthentication::Token::ControllerMethods

  http_basic_authenticate_with name: Rails.application.credentials.webhook_auth_name, password: Rails.application.credentials.webhook_auth_password

  def create
    action_name = params[:webhook][:queryResult][:intent][:displayName]
    case action_name
    when 'get-calendar'
      render json: 'Show calendar', status: :ok
    when 'get-room'
      render json: 'Show available rooms', status: :ok
    end
  end
end
Which gives me this message in the raw interaction log:
{
  "queryText": "Calendar",
  "parameters": {},
  "fulfillmentText": "<happy>This is your calendar</happy>",
  "fulfillmentMessages": [
    {
      "text": {
        "text": [
          "<happy>This is your calendar</happy>"
        ]
      },
      "lang": "en"
    }
  ],
  "intent": {
    "id": "1234",
    "displayName": "get-calendar",
    "priority": 500000,
    "webhookState": "WEBHOOK_STATE_ENABLED",
    "messages": [
      {
        "text": {
          "text": [
            "<happy>This is your calendar</happy>"
          ]
        },
        "lang": "en"
      }
    ]
  },
  "intentDetectionConfidence": 1234,
  "diagnosticInfo": {
    "webhook_latency_ms": 1234
  },
  "languageCode": "en",
  "slotfillingMetadata": {
    "allRequiredParamsPresent": true
  },
  "id": "1234",
  "sessionId": "1234",
  "timestamp": "2020-09-25T14:38:31.92Z",
  "source": "agent",
  "webhookStatus": {
    "webhookEnabledForAgent": true,
    "webhookStatus": {
      "code": 3,
      "message": "Webhook call failed. Error: Failed to parse webhook JSON response: Expect message object but got: \"Show\"."
    }
  },
  "agentEnvironmentId": {
    "agentId": "1324",
    "cloudProjectId": "test-project"
  }
}
I don't think this part is what I expected:
"message": "Webhook call failed. Error: Failed to parse webhook JSON response: Expect message object but got: \"Show\"."
Dialogflow calls your webhook when processing an intent, and you need to ensure your code sends back a JSON object that Dialogflow understands and can process. It looks like you are sending back just a text message (Show calendar), or at least JSON starting with 'Show...'.
You are using the Dialogflow API v1 response format (this is deprecated; better to move to v2), so your JSON should follow the documented response shape. Normally there are SDKs (Java, Node.js, and I guess Ruby too?) that provide the models and objects to work with, so you don't need to create the JSON manually but can just use the API.
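As a sketch of the fix on the Rails side: instead of rendering a bare string, render an object with the v2 `fulfillmentText` field. The helper name is mine; the field name is the one the v2 WebhookResponse expects.

```ruby
require 'json'

# Build the minimal webhook response body Dialogflow v2 can parse.
def fulfillment_response(text)
  { fulfillmentText: text }
end

# In the controller action, instead of `render json: 'Show calendar'`:
#   render json: fulfillment_response('Show calendar'), status: :ok
```

This replaces the bare `"Show calendar"` string (which triggered the "Expect message object" error in the log) with a proper message object.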

Azure Logic App: Read telemetry data as dynamic content from IoT hub message

I'm routing telemetry messages via IoT Hub events and Event Grid to Logic Apps using a webhook. The Logic App lets you input a sample JSON message and then use dynamic content to add information to an email alert I'm sending (O365: Send an Email V2).
I can include system properties like "iothub-connection-device-id", but when I try to pick telemetry data I get the following error:
InvalidTemplate. Unable to process template language expressions in action 'Send_an_email_(V2)' inputs at line '1' and column '1680': 'The template language expression 'items('For_each')?['data']?['body']?['windingTemp1']' cannot be evaluated because property 'windingTemp1' cannot be selected. Property selection is not supported on values of type 'String'. Please see https://aka.ms/logicexpressions for usage details.'.
When I look at the raw output of the webhook connector, it shows the following message, but the telemetry points are clearly not there. I'd expect to see them in the "body" property, but instead there is just the string: "eyJ3aW5kaW5nVGVtcDEiOjg2LjYzOTYxNzk4MjYxODMzLCJ3aW5kaW5nVGVtcDIiOjc4LjQ1MDc4NTgwMjQyMTUyLCJ3aW5kaW5nVGVtcDMiOjg1LjUzMDYxMDY5OTQ1MzY1LCJMb2FkQSI6MjAyOS44NDgyMTg4ODYxMTEsIkxvYWRCIjoyMDQwLjgxMDk4OTg0MDMzMzgsIkxvYWRWIjoyMDA0LjYxMTkzMjMyNTQ2MTgsIk9pbFRlbXAiOjk5LjA2MjMyNjU2MTY4ODU4fQ=="
I'm looking for help determining what could be causing this and how to get the telemetry data passed through correctly so that I can include it dynamically in the email alert.
Thanks!
{
  "headers": {
    "Connection": "Keep-Alive",
    "Accept-Encoding": "gzip,deflate",
    "Host": "prod-24.northeurope.logic.azure.com",
    "aeg-subscription-name": "TEMPALERT",
    "aeg-delivery-count": "1",
    "aeg-data-version": "",
    "aeg-metadata-version": "1",
    "aeg-event-type": "Notification",
    "Content-Length": "1017",
    "Content-Type": "application/json; charset=utf-8"
  },
  "body": [
    {
      "id": "c767fb91-3806-324c-ec3c-XXXXXXXXXX",
      "topic": "/SUBSCRIPTIONS/XXXXXXXXXXXX",
      "subject": "devices/Device-001",
      "eventType": "Microsoft.Devices.DeviceTelemetry",
      "data": {
        "properties": {
          "TempAlarm": "true"
        },
        "systemProperties": {
          "iothub-connection-device-id": "Device-001",
          "iothub-connection-auth-method": "{\"scope\":\"device\",\"type\":\"sas\",\"issuer\":\"iothub\",\"acceptingIpFilterRule\":null}",
          "iothub-connection-auth-generation-id": "637264713410XXXX",
          "iothub-enqueuedtime": "2020-06-01T23:05:58.3130000Z",
          "iothub-message-source": "Telemetry"
        },
        "body": "eyJ3aW5kaW5nVGVtcDEiOjg2LjYzOTYxNzk4MjYxODMzLCJ3aW5kaW5nVGVtcDIiOjc4LjQ1MDc4NTgwMjQyMTUyLCJ3aW5kaW5nVGVtcDMiOjg1LjUzMDYxMDY5OTQ1MzY1LCJMb2FkQSI6MjAyOS44NDgyMTg4ODYxMTEsIkxvYWRCIjoyMDQwLjgxMDk4OTg0MDMzMzgsIkxvYWRWIjoyMDA0LjYxMTkzMjMyNTQ2MTgsIk9pbFRlbXAiOjk5LjA2MjMyNjU2MTY4ODU4fQ=="
      },
      "dataVersion": "",
      "metadataVersion": "1",
      "eventTime": "2020-06-01T23:05:58.313Z"
    }
  ]
}
Here is the sample input I am using with the trigger:
[{
  "id": "9af86784-8d40-fe2g-8b2a-bab65e106785",
  "topic": "/SUBSCRIPTIONS/<subscription ID>/RESOURCEGROUPS/<resource group name>/PROVIDERS/MICROSOFT.DEVICES/IOTHUBS/<hub name>",
  "subject": "devices/LogicAppTestDevice",
  "eventType": "Microsoft.Devices.DeviceTelemetry",
  "eventTime": "2019-01-07T20:58:30.48Z",
  "data": {
    "body": {
      "windingTemp1": 95.62818310718433
    },
    "properties": {
      "Status": "Active"
    },
    "systemProperties": {
      "iothub-content-type": "application/json",
      "iothub-content-encoding": "utf-8",
      "iothub-connection-device-id": "d1",
      "iothub-connection-auth-method": "{\"scope\":\"device\",\"type\":\"sas\",\"issuer\":\"iothub\",\"acceptingIpFilterRule\":null}",
      "iothub-connection-auth-generation-id": "123455432199234570",
      "iothub-enqueuedtime": "2019-01-07T20:58:30.48Z",
      "iothub-message-source": "Telemetry"
    }
  },
  "dataVersion": "",
  "metadataVersion": "1"
}]
Summary of the comments, to help others with the same problem:
The body you provided is Base64-encoded; in C# you can decode it with the Convert.FromBase64String(String) method.
byte[] newBytes = Convert.FromBase64String(body);
For more details, you could refer to this issue.
Update:
Adding the following to the device application (so IoT Hub knows the payload is JSON and does not Base64-encode it) solves the problem:
message.ContentEncoding = "utf-8";
message.ContentType = "application/json";
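The same round trip in Ruby, for anyone pre-processing these events in a Rails app (the payload here is a made-up stand-in for the encoded body above):

```ruby
require 'base64'
require 'json'

# Encode a sample telemetry payload the way IoT Hub delivers it when the
# message carries no content type, then decode it back into JSON.
encoded = Base64.strict_encode64('{"windingTemp1":95.6}')
telemetry = JSON.parse(Base64.decode64(encoded))
```

Inside the Logic App itself, the equivalent is wrapping the field in the `json(base64ToString(...))` expression functions rather than decoding in code.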

Accept Nested Attributes Without the '_attributes' Suffix

So I'm building an API using a has_many association, and since I want to create a child model when I create its parent, I decided to use accepts_nested_attributes_for.
As far as I know, if I don't use the "_attributes" suffix it raises an error:
ActiveRecord::AssociationTypeMismatch
In the API I do this for a POST request:
{
  "content": {
    "name": "Teste",
    "schedulings_attributes": [
      {
        "days_attributes": [
          {
            "start": "2011-10-28",
            "end": "2010-09-07"
          },
          {
            "start": "2012-08-30",
            "end": "2017-06-31"
          }
        ],
        "hours_attributes": [
          {
            "start": "2000-01-01T01:51:30.000Z",
            "end": "2000-01-01T15:03:11.000Z"
          },
          {
            "start": "2000-01-01T02:23:39.000Z",
            "end": "2000-01-01T00:37:51.000Z"
          }
        ],
        "week_attributes": {
          "monday": true,
          "thursday": true,
          "wednesday": true,
          "tuesday": true,
          "friday": true,
          "saturday": true,
          "sunday": true
        }
      }
    ]
  }
}
The thing is, I don't want the '_attributes' suffix.
Is there a way to get rid of it without ActiveRecord raising an error, perhaps with some treatment in the controller?
I've discovered a way to do this with a treatment:
def scheduling_treatment(treated_params)
  treated_params[:schedulings].map do |attributes|
    attributes[:days_attributes]  = attributes.delete(:days)
    attributes[:hours_attributes] = attributes.delete(:hours)
    attributes[:week_attributes]  = attributes.delete(:week)
  end
  treated_params
end
In the method above, I take the :schedulings key and, mapping over it, delete each old key and replace it with a new one carrying the '_attributes' suffix.
Inside the controller:
def content_params
  new_params = params.require(:content).permit(:id, :name, schedulings: [
    days: [:start, :end],
    hours: [:start, :end],
    week: [:monday, :thursday, :wednesday, :tuesday, :friday, :saturday, :sunday]
  ])
  new_params = scheduling_treatment(new_params)
  new_params[:schedulings_attributes] = new_params.delete(:schedulings)
  new_params.permit!
end

Docusign integration with rails 4

I am using the docusign_rest gem for the DocuSign REST API, and the following is my DocuSign configuration.
# config/initializers/docusign_rest.rb
require 'docusign_rest'

DocusignRest.configure do |config|
  config.username = 'myemail@email.com'
  config.password = 'MyPassword'
  config.integrator_key = 'My-key'
  config.account_id = 'account_id'
  config.endpoint = 'https://www.docusign.net/restapi'
  config.api_version = 'v1'
end
When I try to connect and get the account_id, I get nil as the response.
client = DocusignRest::Client.new
puts client.get_account_id # Returns nil.
I am using rails-4.1.4 and ruby-2.2.2.
What did I miss? Please advise.
Not sure if you figured this out yet. Here is another solution that isn't too difficult, using HTTParty. If you're trying to create an envelope from a template, for example, your request might look like this:
baseUrl = "https://demo.docusign.net/restapi/v2/accounts/acct_number/envelopes"
@lease = Lease.find(lease.id)
@unit = @lease.unit
@application = @lease.application
@manager = @lease.property_manager
@application.applicants.each do |renter|
  req = HTTParty.post(baseUrl,
    body: {
      "emailSubject": "DocuSign API call - Request Signature - Boom",
      "templateId": "id of your template",
      "templateRoles": [{
        "name": "#{renter.background.legal_name}",
        "email": "#{renter.email}",
        "recipientId": "1",
        "roleName": "Lessee",
        "tabs": {
          "textTabs": [{
            "tabLabel": "Rent",
            "value": "#{@lease.rent}"
          }, {
            "tabLabel": "Address",
            "value": "987 apple lane"
          }]
        }
      }, {
        "email": "#{@manager.email}",
        "name": "#{@manager.name}",
        "roleName": "Lessor",
        "tabs": {
          "textTabs": [{
            "tabLabel": "Any",
            "value": "#{@lease.labels}"
          }, {
            "tabLabel": "Address",
            "value": "987 hoser lane"
          }]
        }
      }],
      "status": "sent"
    }.to_json,
    headers: {
      "Content-Type" => "application/json",
      "Accept" => "application/json",
      "X-DocuSign-Authentication" => '{
        "Username" : "place your",
        "Password" : "credentials",
        "IntegratorKey" : "here"
      }'
    }, debug_output: $stdout)
end
The debug_output option on the final line lets you inspect the raw API request; it can be removed at any time.
This was a bug in docusign_rest 0.1.1: that method always returned nil. The bug has been fixed, and the latest gem version includes the fix.