I am using the Google AutoML Object Detection API for our custom model.
The dataset consists of business cards, so it is all text. We have deployed our model, and it works reasonably well when we test a business card through the visual interface.
However, to use it from our backend Node.js server we are calling the REST API. The request looks like this:
curl -X POST -H "Content-Type: application/json" -H "Authorization: Bearer $(gcloud auth application-default print-access-token)" https://automl.googleapis.com/v1beta1/projects/1023422831715/locations/us-central1/models/IOD9200669320764456960:predict -d @request.json
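Since the callers are on a Node.js backend, the same request can be made from the server roughly like this (a minimal sketch; it assumes a Node version with a global fetch, and the ACCESS_TOKEN environment variable is just a placeholder for however you obtain the application-default token that the curl gets from gcloud):

// Sketch: the same AutoML predict call made from Node.js (assumes Node 18+ global fetch).
const fs = require('fs');

const endpoint =
  'https://automl.googleapis.com/v1beta1/projects/1023422831715/locations/us-central1/models/IOD9200669320764456960:predict';

async function predict() {
  // Same payload that the curl command reads from request.json.
  const body = fs.readFileSync('request.json', 'utf8');
  const res = await fetch(endpoint, {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Bearer ${process.env.ACCESS_TOKEN}`, // placeholder for your real auth
    },
    body,
  });
  console.log(JSON.stringify(await res.json(), null, 2));
}

predict().catch(console.error);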
The data returned in the REST response looks like this:
{
"payload": [
{
"annotationSpecId": "5824016335306227712",
"imageObjectDetection": {
"boundingBox": {
"normalizedVertices": [
{
"x": 0.050029,
"y": 0.139873
},
{
"x": 0.230016,
"y": 0.251469
}
]
},
"score": 0.998103
},
"displayName": "first_name"
},
{
"annotationSpecId": "2224232858153648128",
"imageObjectDetection": {
"boundingBox": {
"normalizedVertices": [
{
"x": 0.0465549,
"y": 0.236178
},
{
"x": 0.462747,
"y": 0.30602
}
]
},
"score": 0.98034
},
"displayName": "job_title"
},
{
"annotationSpecId": "8910530192426926080",
"imageObjectDetection": {
"boundingBox": {
"normalizedVertices": [
{
"x": 0.053251,
"y": 0.410447
},
{
"x": 0.452525,
"y": 0.559461
}
]
},
"score": 0.904657
},
"displayName": "address"
}
]
}
The problem is that the REST response only gives the bounding boxes with the score and label; it does not give the text under each bounding box.
So, how do we get the text under the bounding boxes returned above?
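For reference, the normalizedVertices are fractions of the image width and height (the two corners of each box), so they can be mapped back to pixel regions of the card; what is missing is the text inside those regions. A rough sketch of that mapping (imageWidth/imageHeight are whatever dimensions you have for the uploaded card; the helper name is just illustrative):

// Sketch: convert a normalized bounding box from the predict response
// into pixel coordinates on the original image.
function toPixelBox(boundingBox, imageWidth, imageHeight) {
  const [topLeft, bottomRight] = boundingBox.normalizedVertices;
  return {
    left: Math.round(topLeft.x * imageWidth),
    top: Math.round(topLeft.y * imageHeight),
    width: Math.round((bottomRight.x - topLeft.x) * imageWidth),
    height: Math.round((bottomRight.y - topLeft.y) * imageHeight),
  };
}

// Example: the first_name box above on a 1000x600 card image
// => { left: 50, top: 84, width: 180, height: 67 }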
Hoping someone can point me in the right direction. I'd like to be able to do this via Twilio Studio. If not, I can learn TwiML. That's about as far as my brain will stretch.
I've made a simple flow in Twilio Studio that enables the caller to record a voicemail. I would like to add an option for the current caller to be able to play the previous caller's recorded voicemail. I think I need to use a Say/Play widget for this. What do I need to use for the "URL of audio file" so that the previous recorded voicemail is played? I assume this URL will change every time that a caller leaves a voicemail, so it'll need to auto-update. Can I use "RecordingURL" somehow? Is there a solution using TwiML? Any help appreciated
Thanks!
Without a way to keep state between calls, this is not possible. The best approach would be to have some type of DB where you can store the recording SID and reference it in a future flow. You can use a tool like Twilio Sync or Airtable for this, but it does require code.
I can't think of a way to do this without involving some coding.
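For example, a Twilio Function along these lines could stash the latest recording SID in a Sync document (a rough sketch only; the Sync service SID and document name are placeholders, and the document needs to exist or be created first):

// Sketch: persist the most recent recording SID in a Twilio Sync document.
exports.handler = function(context, event, callback) {
  const client = context.getTwilioClient();
  client.sync
    .services('ISXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX') // placeholder Sync service SID
    .documents('latest-voicemail')                  // placeholder document unique name
    .update({ data: { recordingSid: event.RecordingSid } })
    .then(() => callback(null, 'stored'))
    .catch(err => callback(err));
};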
An alternate way is to list the recording records and pull the most recent one off the list, but that is not ideal since you don't know when the last recording occurred.
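As a sketch of that approach, a Function could list recent recordings and pick the newest by dateCreated (again, not something I'd rely on):

// Sketch: pull the most recent recording off the account's recording list.
exports.handler = function(context, event, callback) {
  const client = context.getTwilioClient();
  client.recordings.list({ limit: 20 })
    .then(recordings => {
      recordings.sort((a, b) => b.dateCreated - a.dateCreated);
      const latest = recordings[0]; // undefined if there are no recordings yet
      callback(null, latest ? latest.sid : '');
    })
    .catch(err => callback(err));
};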
Another approach is to modify the webhook associated with your Twilio phone number for that Studio Flow so that it passes the last recording SID into your flow; you then use that SID to dynamically construct the recording URL to play back, as shown in the Studio Flow JSON below (which you can import when creating a new Studio Flow).
Don't forget to set the recording status callback to the Twilio Function below, which updates the phone number webhook to pass in the recording SID. Set the Recording Status Callback of your Record Voicemail widget to point to your unique Function domain and function path (currently set to https://magnolia-kitty-3234.twil.io/phUpdate).
Feel free to improve on the below and share.
Twilio Function Code
exports.handler = function(context, event, callback) {
  let client = context.getTwilioClient();
  const telephoneNumAccountSid = "PN..."; // set this to your Phone Number SID
  const accountSid = event.AccountSid;
  const studioFlowSid = "FW..."; // set this to your Studio Flow SID
  const webhookUrl = `https://webhooks.twilio.com/v1/Accounts/${accountSid}/Flows/${studioFlowSid}`;
  const recordingSid = event.RecordingSid;
  // Point the number's voice webhooks back at the Studio Flow,
  // carrying the latest recording SID as a query parameter.
  client
    .incomingPhoneNumbers(telephoneNumAccountSid)
    .update({ voiceUrl: `${webhookUrl}?recording=${recordingSid}`, voiceFallbackUrl: `${webhookUrl}?recording=${recordingSid}` })
    .then(result => {
      console.log(result.voiceUrl);
      callback(null, "success");
    })
    .catch(err => {
      console.log(err);
      callback("error");
    });
};
Studio Flow JSON
{
"description": "A New Flow",
"states": [
{
"name": "Trigger",
"type": "trigger",
"transitions": [
{
"event": "incomingMessage"
},
{
"next": "set_variables_1",
"event": "incomingCall"
},
{
"event": "incomingRequest"
}
],
"properties": {
"offset": {
"x": 0,
"y": 0
}
}
},
{
"name": "split_1",
"type": "split-based-on",
"transitions": [
{
"next": "say_play_2",
"event": "noMatch"
},
{
"next": "gather_1",
"event": "match",
"conditions": [
{
"friendly_name": "{{trigger.call.recording}}",
"arguments": [
"{{trigger.call.recording}}"
],
"type": "is_not_blank",
"value": "Is Not Blank"
}
]
}
],
"properties": {
"input": "{{trigger.call.recording}}",
"offset": {
"x": 110,
"y": 350
}
}
},
{
"name": "say_play_1",
"type": "say-play",
"transitions": [
{
"event": "audioComplete"
}
],
"properties": {
"play": "https://api.twilio.com/2010-04-01/Accounts/{{flow.variables.accountSid}}/Recordings/{{flow.variables.recording}}.mp3",
"offset": {
"x": 450,
"y": 1110
},
"loop": 1
}
},
{
"name": "set_variables_1",
"type": "set-variables",
"transitions": [
{
"next": "split_1",
"event": "next"
}
],
"properties": {
"variables": [
{
"value": "{{trigger.call.recording}}",
"key": "recording"
},
{
"value": "{{trigger.call.AccountSid}}",
"key": "accountSid"
}
],
"offset": {
"x": 60,
"y": 170
}
}
},
{
"name": "say_play_2",
"type": "say-play",
"transitions": [
{
"next": "record_voicemail_1",
"event": "audioComplete"
}
],
"properties": {
"voice": "Polly.Joanna-Neural",
"offset": {
"x": -120,
"y": 650
},
"loop": 1,
"say": "Please leave a message at the beep!",
"language": "en-US"
}
},
{
"name": "record_voicemail_1",
"type": "record-voicemail",
"transitions": [
{
"event": "recordingComplete"
},
{
"event": "noAudio"
},
{
"event": "hangup"
}
],
"properties": {
"transcribe": false,
"offset": {
"x": -120,
"y": 860
},
"trim": "trim-silence",
"play_beep": "true",
"recording_status_callback_url": "https://magnolia-kitty-3234.twil.io/phUpdate",
"timeout": 5,
"max_length": 3600
}
},
{
"name": "gather_1",
"type": "gather-input-on-call",
"transitions": [
{
"next": "split_2",
"event": "keypress"
},
{
"event": "speech"
},
{
"event": "timeout"
}
],
"properties": {
"number_of_digits": 1,
"speech_timeout": "auto",
"offset": {
"x": 300,
"y": 650
},
"loop": 1,
"finish_on_key": "#",
"say": "There is a previous recording, press 1 if you want to listen to it or 2 if you want to leave a new voicemail.",
"stop_gather": true,
"gather_language": "en",
"profanity_filter": "true",
"timeout": 5
}
},
{
"name": "split_2",
"type": "split-based-on",
"transitions": [
{
"event": "noMatch"
},
{
"next": "say_play_1",
"event": "match",
"conditions": [
{
"friendly_name": "If value equal_to 1",
"arguments": [
"{{widgets.gather_1.Digits}}"
],
"type": "equal_to",
"value": "1"
}
]
},
{
"next": "say_play_2",
"event": "match",
"conditions": [
{
"friendly_name": "If value equal_to 2",
"arguments": [
"{{widgets.gather_1.Digits}}"
],
"type": "equal_to",
"value": "2"
}
]
}
],
"properties": {
"input": "{{widgets.gather_1.Digits}}",
"offset": {
"x": 280,
"y": 870
}
}
}
],
"initial_state": "Trigger",
"flags": {
"allow_concurrent_calls": true
}
}
Alan
Can someone explain how to create an API with the APIC toolkit?
I would like to use this API with a Cloudant DB on IBM Bluemix, or a local CouchDB, to create, read, and update geoJSON data.
Below is a simple example of typical data, storing the name and coordinates of points of interest.
[{
"type": "Feature",
"properties": {
"name": "Nice Place 1"
},
"geometry": {
"type": "Point",
"coordinates": [16.45961, 48.23896]
}
}, {
"type": "Feature",
"properties": {
"name": "Nice Place 2"
},
"geometry": {
"type": "Point",
"coordinates": [16.34561, 49.89612]
}
}]
LoopBack supports the GeoPoint datatype (i.e. Point in GeoJSON).
Considering your example, let's say you have a model named Feature; to use GeoPoint, your Feature.json should look like this:
{
"name": "Feature",
"base": "PersistedModel",
"idInjection": true,
"options": {
"validateUpsert": true
},
"properties": {
"name": {
"type": "string"
},
"geometry": {
"type": "geopoint"
}
},
"validations": [],
"relations": {},
"acls": [],
"methods": {}
}
Now this Feature model, having PersistedModel as its base, will have the common CRUD methods exposed as REST endpoints, and you can store data, for example, using cURL:
curl -X POST --header "Content-Type: application/json" --header "Accept: application/json" -d "{
\"name\": \"Nice Place 1\",
\"geometry\": {
\"lat\": 16.20,
\"lng\": 48.23
}
}" "http://0.0.0.0:3000/api/Features"
Hope that helps with creating an API that supports GeoPoint.
Re: the Cloudant DB, I am not sure if it supports geo-spatial data out of the box, but there seems to be support for it: https://cloudant.com/product/cloudant-features/geospatial/
I tried the model above with a LoopBack app (using Cloudant as the datasource) and its API Explorer:
Create with sample data:
{
"name": "string",
"geometry": {
"lat": 12,
"lng": 13
}
}
And retrieved it successfully with GET /myGeoModels:
[
{
"name": "string",
"geometry": {
"lat": 12,
"lng": 13
},
"id": "f08301abe833ad427c9c61ffd30df8ef"
}
]
APIC should have the same behaviour as LoopBack.
I have an internal app that uses a webhook listener and some scripting to manipulate the input data. I'm posting this to it:
curl -X POST -d '{
"assignment_id": 12345,
"updated_custom_fields": [{
"name": "RNVIDAYEBB",
"value": "updated!"
},
{
"name": "QUFTXSIBYA",
"value": "and me too"
}
],
"custom_fields": [{
"id": 981,
"name": "RDEXDPVKRD",
"fields": [
{
"id": 4096,
"name": "RNVIDAYEBB",
"default": "EDJEAJICYW",
"required": true,
"value": "Blah"
},
{
"id": 4097,
"name": "QUFTXSIBYA",
"default": "",
"required": true,
"value": ""
}]
}]
}' "https://hooks.zapier.com/hooks/catch/......"
My script is as follows:
update_custom_fields_by_name_pre_write: function(bundle) {
  // Merge each custom field with the matching entry (by name) from updated_custom_fields.
  var updatedFields = _.map(bundle.request.data.custom_fields, function(group) {
    return _.map(group.fields, function(field) {
      return _.extend(field, _.findWhere(bundle.request.data.updated_custom_fields, { name: field.name }));
    });
  });
  bundle.request.data = updatedFields;
  return bundle.request;
}
I know that the merging logic is good, but it appears that the custom_fields and updated_custom_fields arrays are not present in the bundle.request.data object. Anyone know how to get access to them in the script?
It seems like you should be using update_custom_fields_by_name_catch_hook to capture the incoming static webhook data (instead of _pre_write). If you use that, you can capture the data within bundle.cleaned_request.custom_fields and bundle.cleaned_request.updated_custom_fields.
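In other words, the merge logic from the question should work largely unchanged inside the catch hook; something along these lines (a sketch adapted from the script above, reading from bundle.cleaned_request instead):

update_custom_fields_by_name_catch_hook: function(bundle) {
  // The parsed webhook payload is available on bundle.cleaned_request here.
  var data = bundle.cleaned_request;
  return _.map(data.custom_fields, function(group) {
    // Merge each field with the matching entry (by name) from updated_custom_fields.
    group.fields = _.map(group.fields, function(field) {
      return _.extend(field, _.findWhere(data.updated_custom_fields, { name: field.name }));
    });
    return group;
  });
}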
I have an index named books which has reviews as an object field that can hold an array of reviews.
While retrieving data, in a particular case I only want the review with the maximum rating.
"books" :{
"reviews": {
"properties": {
"rating": {
"type": "float"
},
"comments": {
"type": "string"
}
}
},
"author" : {
"type" : "string"
}
}
Many books can have many reviews, each with a rating. For a particular use case I only want the result set to include the review with the maximum rating for each book. I need to build a search query that returns that kind of result.
POST books/_search
{
"size": 51,
"sort": [
{
"reviews.rating": {
"order": "asc",
"mode" : "min"
}
}
],
"fields": [
"reviews","author"]
}
Using script_fields one can build dynamic fields, but, as far as I can tell, not objects. Otherwise I could have built a dynamic reviews object with one field for the rating and another for the comment.
script_fields can be used to build both dynamic fields and objects:
curl -XDELETE localhost:9200/test-idx
curl -XPUT localhost:9200/test-idx -d '{
"mappings": {
"books" :{
"reviews": {
"properties": {
"rating": {
"type": "float"
},
"comments": {
"type": "string"
}
}
},
"author" : {
"type" : "string"
}
}
}
}'
curl -XPOST "localhost:9200/test-idx/books?refresh=true" -d '{
"reviews": [{
"rating": 5.5,
"comments": "So-so"
}, {
"rating": 9.8,
"comments": "Awesome"
}, {
"rating": 1.2,
"comments": "Awful"
}],
"author": "Roversial, Cont"
}'
curl "localhost:9200/test-idx/books/_search?pretty" -d '{
"fields": ["author"],
"script_fields": {
"highest_review": {
"script": "max_rating = 0.0; max_review = null; for(review : _source[\"reviews\"]) { if (review.rating > max_rating) { max_review = review; max_rating = review.rating;}} max_review"
}
}
}'
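If you are reading that result from a client, the computed object comes back per hit under fields rather than _source; a quick sketch of pulling it out (assuming a Node version with global fetch, and noting that field values are typically wrapped in an array):

// Sketch: run the script_fields query above and pull out highest_review per hit.
const body = {
  fields: ['author'],
  script_fields: {
    highest_review: {
      script: 'max_rating = 0.0; max_review = null; for(review : _source["reviews"]) { if (review.rating > max_rating) { max_review = review; max_rating = review.rating;}} max_review',
    },
  },
};

fetch('http://localhost:9200/test-idx/books/_search', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(body),
})
  .then(res => res.json())
  .then(result => {
    result.hits.hits.forEach(hit => {
      const review = hit.fields.highest_review; // usually an array with one object
      console.log(hit.fields.author, review);
    });
  })
  .catch(console.error);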
I'm planning to use the Built-in Like for mobile users, and the standard Like Button for web users to "like" a webpage.
But does the Built-in Like action have a connection to the Like Button (Social Plugin)?
From my observation:
On the web version, after I click the standard Like Button, the Open Graph Object tracks that like instantly
Calling fql?q=SELECT share_count, like_count, comment_count, total_count, click_count FROM link_stat WHERE url="http://websitelinkhere.com";
returns
{
"data": [
{
"share_count": 0,
"like_count": 1,
"comment_count": 0,
"total_count": 1,
"click_count": 0
}
]
}
But using the Built-in Like, the Open Graph Object cannot track that like at all; the like_count and total_count are both 0.
And then here's the funny part:
By checking my og.likes using https://graph.facebook.com/userid/og.likes?access_token=myAccessToken
It returns TWO likes: one from the Like Button and one from the Built-in Like action
{
"data": [
{
"id": "10151050736776633",
"from": {
//skipped
},
"start_time": "2012-08-24T07:10:52+0000",
"end_time": "2012-08-24T07:10:52+0000",
"publish_time": "2012-08-24T07:10:52+0000",
"application": {
//skipped
},
"data": {
//skipped
},
"type": "og.likes",
"no_feed_story": false,
"likes": {
"count": 0,
"can_like": true,
"user_likes": false
},
"comments": {
"count": 0,
"can_comment": true
}
},
{
"id": "10151050736586633",
"from": {
//skipped
},
"start_time": "2012-08-24T07:10:42+0000",
"publish_time": "2012-08-24T07:10:42+0000",
"application": {
//skipped
},
"data": {
//skipped
},
"type": "og.likes",
"no_feed_story": false,
"likes": {
"count": 0,
"can_like": true,
"user_likes": false
},
"comments": {
"count": 0,
"can_comment": true
}
}
]
}
And then, using the action IDs returned by og.likes, I can delete both likes with
curl -X DELETE \
-F 'access_token=accessToken' \
https://graph.facebook.com/10151050736776633
and
curl -X DELETE \
-F 'access_token=accessToken' \
https://graph.facebook.com/10151050736586633
Is it because I haven't submitted my application to Facebook for review yet?
I expected the Built-in Like and the Like Button to work together as ONE action, not to generate og.likes actions independently.
Thank you for your time.
Adding og:url (the canonical URL meta tag on the page) and using the Open Graph Object ID directly in the og.likes call fixes the problem.
curl -X POST \
-F 'access_token=accessTokenHere' \
-F 'object=UsingOpenGraphObjectIDHereDirectly' \
https://graph.facebook.com/useridhere/og.likes
May be related to Impossibile to publish built-in Like action on Open Graph Pages with Likes