I'm working with the Google Assistant partner SDK (C++) to integrate the Google Assistant into our TV device.
I have a question about the Google Assistant UX guide. Are there default resources (sounds, images) that we have to use?
Amazon Alexa has a UX guide for partners, and the sounds and images are also provided to partner product developers.
For example, Alexa provides a file with the "ding" sound that plays when the microphone starts recording after the hotword ("Alexa") is spoken.
My question is: does such a default UX (sound) guide exist for the Google Assistant?
I work for a company that develops smart home devices.
The company is developing an Android application for the smart home devices it has developed and designed.
In addition, it wants those smart home devices to be controllable with the Google Assistant.
We currently cannot use "Custom intents" for the "Voice-enable your Android app" feature. I wish we could add the Google Assistant feature directly to our software, independently (without any Google Home or Nest affiliation)... (https://developers.google.com/assistant/app/custom-intents)
Custom intents are not specialized for smart home and are not available in local languages such as Turkish.
So we are focusing on the Google Assistant smart home feature. (https://developers.google.com/assistant/smarthome/overview)
Scenario 1: Google communicates with the developer's cloud via an access token. (https://developers.google.com/assistant/smarthome/concepts/fulfillment-authentication)
Scenario 2: There is also a second way, in which the Google Assistant communicates with a Google Home or Google Nest device rather than the developer's cloud. (https://developers.google.com/assistant/smarthome/concepts/local)
Something is unclear to us here. The documentation says the Google Assistant will contact "Google Home". Does that mean the Google Home app installed on the phone, or a Google Home device such as a speaker? At https://developers.google.com/assistant/smarthome/concepts/local#supported-devices a speaker is shown as Google Home. If it is not talking about the Google Home app on the phone, or if it cannot communicate with the Google Home app on the phone, this is not much use to us, because the user may not have a physical Google Home product/device.
Neither of these scenarios fits exactly what we want to do. What we want is this: the Google Assistant should neither communicate with our cloud (to avoid lag) nor with a Google Nest device (because the user may not have such a device). It should communicate directly with our app installed on the user's phone. The Google Assistant or Google Home should tell our application which intent needs to be fulfilled, and we would then forward that request to our device or our cloud (whichever is possible).
Is there such a feature in the Google Assistant Local Home SDK? Or is there another way for the Google Assistant to communicate directly with our app?
Frankly, it seems very strange that Google Home or the Assistant tries to communicate directly with a smart device when our app is already installed on the user's phone. It can communicate with our cloud server, so why can't it communicate with our application installed on the phone?
During initial account linking, the Google Assistant or Google Home app can already communicate directly with our application installed on the phone instead of going straight to our OAuth 2.0 server. With App Flip, Google hands off to our app instead of using the browser-based OAuth 2.0 flow; our app performs the OAuth 2.0 exchange and gives the resulting authorization code back to Google. In other words, the user can complete the linking with our local application on their phone. https://developers.google.com/assistant/smarthome/develop/implement-app-flip#implement-app-flip-in-your-native-apps
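As far as I understand, even with App Flip the cloud side stays the same: Google still exchanges the authorization code our app returns at our OAuth 2.0 token endpoint. A rough sketch of that endpoint, assuming Flask (all client IDs, codes, and tokens below are illustrative placeholders):

```python
# Sketch of the OAuth 2.0 token endpoint that Google calls after App Flip
# (or browser-based linking) hands back an authorization code.
# Client ID/secret, codes, and tokens are illustrative placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)

EXPECTED_CLIENT_ID = "google-client-id"          # configured in the Actions console
EXPECTED_CLIENT_SECRET = "google-client-secret"  # placeholder


@app.route("/oauth/token", methods=["POST"])
def token():
    form = request.form
    if (form.get("client_id") != EXPECTED_CLIENT_ID
            or form.get("client_secret") != EXPECTED_CLIENT_SECRET):
        return jsonify({"error": "invalid_client"}), 401

    if form.get("grant_type") == "authorization_code":
        # Validate the code our app returned to Google, then issue tokens.
        return jsonify({
            "token_type": "Bearer",
            "access_token": "ACCESS_TOKEN",
            "refresh_token": "REFRESH_TOKEN",
            "expires_in": 3600,
        })
    if form.get("grant_type") == "refresh_token":
        # Issue a fresh access token for an existing link.
        return jsonify({
            "token_type": "Bearer",
            "access_token": "NEW_ACCESS_TOKEN",
            "expires_in": 3600,
        })
    return jsonify({"error": "unsupported_grant_type"}), 400


if __name__ == "__main__":
    app.run(port=8081)
```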
But I also think that, in order to control the smart devices, the Assistant would have to communicate with our application installed on the user's phone.
Is there a way to do this?
The platform does not support directly controlling a smart home device through an app installed on a phone. Counter to your scenario, there are times when a person wants to control their device when their phone is not around.
Local Home SDK integrations are in addition to an existing cloud integration. This base cloud integration is needed for the cases where a person is not at home.
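For reference, that base cloud integration is the HTTPS fulfillment endpoint from your Scenario 1: Google sends smart home intents (SYNC, QUERY, EXECUTE) to your server along with the access token issued during account linking. A minimal sketch of such an endpoint, assuming Flask (the token lookup and device data are illustrative placeholders):

```python
# Minimal smart home fulfillment sketch (Flask).
# The token validation and device data are placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)


def user_id_for_token(token):
    """Map the account-linking access token to a user (placeholder)."""
    # In a real service you would validate the token issued by your
    # OAuth 2.0 server and look up the linked user account.
    return "user-123" if token else None


@app.route("/fulfillment", methods=["POST"])
def fulfillment():
    # Google sends the account-linking access token as a Bearer token.
    auth_header = request.headers.get("Authorization", "")
    token = auth_header.replace("Bearer ", "", 1).strip()
    user_id = user_id_for_token(token)
    if user_id is None:
        return jsonify({"error": "invalid_token"}), 401

    body = request.get_json()
    intent = body["inputs"][0]["intent"]

    if intent == "action.devices.SYNC":
        # Report the user's devices back to Google.
        payload = {
            "agentUserId": user_id,
            "devices": [{
                "id": "light-1",                      # illustrative device
                "type": "action.devices.types.LIGHT",
                "traits": ["action.devices.traits.OnOff"],
                "name": {"name": "Living room light"},
                "willReportState": False,
            }],
        }
    else:
        # QUERY and EXECUTE handling omitted in this sketch.
        payload = {}

    return jsonify({"requestId": body["requestId"], "payload": payload})


if __name__ == "__main__":
    app.run(port=8080)
```

The Local Home SDK then adds a local execution path routed through a Google Home or Nest device on the same network, but the cloud path above still has to exist.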
Is there any documentation on, or can anyone explain, the difference between "works with the Google Assistant" and "built-in"? If a new smart device is going to add the Google Assistant, what should we do for each of these two approaches?
Building IoT devices for the Google Assistant falls into two categories.
You can have the Google Assistant built into your device. This would let you talk directly to your device and invoke custom device actions. It would not really integrate with a consumer's existing devices.
Alternatively, you can work with the Google Assistant. This would be for a device that you control using an existing Google Assistant surface like a Google Home or a phone. This simplifies the integration work, as you don't need to worry about all the voice management, and may be better for integrating into other voice ecosystems.
We are exploring the use of Google Home devices and LG Google Assistant TVs for an enterprise application. This would require managing hundreds of devices in a single building. Amazon has "Alexa for Business" for doing this with Alexa devices.
Is there anything similar for Google Assistant devices? Is there an efficient way to manage hundreds of devices? Can device management be done remotely?
We would like to use Google Assistant devices because of superior AI technology but feel we might have to use Alexa because of the need to manage so many devices.
Please advise.
An action built for the Google Assistant runs entirely in the cloud, so you shouldn't need to worry about device management. The same action can be accessed by multiple devices at the same time, via HTTP requests.
For more information, check out Google's codelab here.
I would like to integrate the Google Assistant inside my app. The idea is that I have an app which provides various press services, such as the latest news. I would like to integrate the Google Assistant to handle some particular requests. For example, the user may ask, "What did the Lakers do yesterday?" If I search for this on Google or ask the Assistant, I get a card with the score of yesterday's game. I would like, from inside my app, to replicate this interaction, that is, send the request to the Google Assistant and show the user the answer Google returns (or at least open the Google Assistant with the answer).
Is such a thing possible?
I was looking at the Google Assistant Service SDK (https://developers.google.com/assistant/sdk/guides/service/python/) and it says:
The Google Assistant Service gives you full control over the integration with the Assistant by providing a streaming endpoint. Stream a user audio query to this endpoint to receive a Google Assistant audio response.
Is this possible only with audio interaction? I'm not quite certain this is the solution I should look into.
The Google Assistant Service SDK lets you send either audio or text to the Assistant, and you'll get back responses including audio, display text, and rich HTML visual content.
For mobile apps there's less support compared to Python, but it's still doable. For example, there's a version of the SDK for Android Things, which targets IoT devices like a Raspberry Pi. You could go through that project and remove all the IoT references, but that's something you'd need to do yourself.
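For the text case specifically, here is a minimal sketch using the google-assistant-grpc Python bindings, modeled on the SDK's textinput sample; the credentials file, device model ID, and device ID are placeholders you obtain from your own device registration:

```python
# Minimal text-query sketch against the Google Assistant Service
# (gRPC API via the google-assistant-grpc package).
import json

import google.auth.transport.grpc
import google.auth.transport.requests
import google.oauth2.credentials
from google.assistant.embedded.v1alpha2 import (
    embedded_assistant_pb2,
    embedded_assistant_pb2_grpc,
)

API_ENDPOINT = "embeddedassistant.googleapis.com"
CREDENTIALS_FILE = "credentials.json"   # produced by google-oauthlib-tool
DEVICE_MODEL_ID = "my-model-id"         # placeholder from device registration
DEVICE_ID = "my-device-id"              # placeholder


def ask_assistant(text_query):
    # Load and refresh the OAuth2 credentials.
    with open(CREDENTIALS_FILE) as f:
        credentials = google.oauth2.credentials.Credentials(
            token=None, **json.load(f))
    http_request = google.auth.transport.requests.Request()
    credentials.refresh(http_request)

    # Open an authorized gRPC channel to the Assistant API.
    channel = google.auth.transport.grpc.secure_authorized_channel(
        credentials, http_request, API_ENDPOINT)
    assistant = embedded_assistant_pb2_grpc.EmbeddedAssistantStub(channel)

    config = embedded_assistant_pb2.AssistConfig(
        audio_out_config=embedded_assistant_pb2.AudioOutConfig(
            encoding="LINEAR16", sample_rate_hertz=16000, volume_percentage=0),
        dialog_state_in=embedded_assistant_pb2.DialogStateIn(
            language_code="en-US", conversation_state=b""),
        device_config=embedded_assistant_pb2.DeviceConfig(
            device_id=DEVICE_ID, device_model_id=DEVICE_MODEL_ID),
        text_query=text_query,
    )
    request = embedded_assistant_pb2.AssistRequest(config=config)

    # Assist streams back responses; return the display text if present.
    for response in assistant.Assist(iter([request]), 60 * 5):
        if response.dialog_state_out.supplemental_display_text:
            return response.dialog_state_out.supplemental_display_text
    return None


if __name__ == "__main__":
    print(ask_assistant("What is the weather tomorrow?"))
```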
Today I downloaded the new Google Music app for iOS, and it immediately had me signed in. I assume this was accomplished using one of the other Google apps on my phone. To keep this appropriate for Stack Overflow, here's the question, specific and objective:
How (technically) can an app (like Google Music) authenticate a user using data from another app (like Google Search or Google Chrome)? I was under the impression that apps on iOS were entirely sandboxed.
I'm very curious to find a technical explanation for this, so if someone could recommend a way to rephrase or retag the question, please feel free to edit or migrate.
The iOS Keychain can be shared among your other apps via keychain access groups; see this for an example.