Using Google Mobile Vision API on Android Things

I'm trying to use the Google Mobile Vision API on Android Things, but apparently it depends on the Play Store, as this stack trace shows:
05-24 06:17:26.714 4993-4993/com.ee.foo W/GooglePlayServicesUtil: Google Play Store is missing.
Error creating remote native face detector
com.google.android.gms.internal.zzsb$zza: No acceptable module found. Local version is 0 and remote version is 0.
I know the Play Store isn't available on Android Things, but is there a way to make this work, or should I just use the Cloud Vision API instead?

Related

Google Speech API for iOS: out-of-vocabulary training set

I am working on an iOS project that uses the Google Speech API. The project involves voice input to recognize many terms that are essentially domain jargon, and the API consistently fails to recognize this voice input.
Is there a way to train the Google Speech API to learn this jargon so it recognizes these terms when users give voice input in the mobile iOS app?
I believe you're referring to the (recently rebranded) Google Cloud Speech-to-Text API. If so, there is no way to train it right now.

Stream audio file from Google Cloud Storage to iOS Firebase client app

I'm using Firebase with an iOS client app, and I need to stream an audio file from Google Cloud Storage. I know file-streaming support exists in the Android SDK via the StreamDownloadTask class, but I cannot find an equivalent in the iOS SDK.
The best scenario would be to achieve this without intermediary server functions, so directly between the iOS client and GCS. Is this possible?
If not, I have a Node.js server that I can use. Should I use the createReadStream GCS API function and pipe that to the client? Or is there a better way?
Any advice on the optimal way to create a stream from a GCS audio file to an iOS app would be hugely appreciated!
Firebase employee here
There is not, in fact, any Cloud Storage streaming API for iOS in the same way that there is for Android. If this is important to you, please file a feature request and explain why it's important for your particular case.
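For the Node.js fallback mentioned in the question (createReadStream piped to the client), a minimal sketch might look like the following. This is only an illustration, assuming an Express server and the @google-cloud/storage client; the bucket name, route, and content type are placeholders:

import express from "express";
import { Storage } from "@google-cloud/storage";

const app = express();
const bucket = new Storage().bucket("my-audio-bucket"); // placeholder bucket name

app.get("/audio/:name", (req, res) => {
  res.setHeader("Content-Type", "audio/mpeg"); // adjust to the stored format
  bucket
    .file(req.params.name)
    .createReadStream()                        // reads the object in chunks
    .on("error", (err) => res.status(500).end(err.message))
    .pipe(res);                                // forward chunks to the client as they arrive
});

app.listen(8080);

On the iOS side, the client can then stream from that route with AVPlayer or a plain URLSession download, so nothing Firebase-specific is needed for playback.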

Using the Watson Speech to Text service in an iOS app with Bluemix

We are creating an iOS application on Bluemix and we are trying to link the Speech to Text service. We've bound the service to the application, but now we don't know how to utilize the service within our app.
How do we use the Speech to Text API in our iOS app with our back end hosted on Bluemix?
You have two options:
You make the call to the Watson Speech to Text service directly from your iOS application. You can either invoke the REST API directly from your iOS app using something like RestKit, or you can use the Watson Speech iOS SDK to make that invocation easier.
You can send all the received audio to your app on Bluemix (serving as a mobile back end) and invoke the Speech to Text REST API from there. This offloads computation from the mobile device, but will most likely increase the latency of getting the transcription back to the phone; a rough sketch of this approach follows below.
Additionally, there is now a Watson iOS SDK which includes the Speech to Text service. This seems like an ideal solution over using the REST API directly if you plan to do a lot of work with Watson.
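For option 2 above (invoking the REST API from the Bluemix back end), a rough Node.js/TypeScript sketch is shown below. It is only an illustration: the stream.watsonplatform.net endpoint, the username/password credentials, and the WAV content type are assumptions you would replace with the values from your bound service's credentials:

import * as https from "https";
import * as fs from "fs";

// Assumed service credentials, normally read from VCAP_SERVICES on Bluemix.
const username = process.env.WATSON_STT_USERNAME ?? "";
const password = process.env.WATSON_STT_PASSWORD ?? "";

function transcribe(audioPath: string): Promise<string> {
  return new Promise((resolve, reject) => {
    const req = https.request(
      {
        method: "POST",
        hostname: "stream.watsonplatform.net",    // assumed Speech to Text endpoint
        path: "/speech-to-text/api/v1/recognize",
        auth: `${username}:${password}`,          // basic auth with the service credentials
        headers: { "Content-Type": "audio/wav" }, // match the audio format you send
      },
      (res) => {
        let body = "";
        res.on("data", (chunk) => (body += chunk));
        res.on("end", () => {
          try {
            const json = JSON.parse(body);
            // Join the best alternative of each recognized segment.
            resolve(json.results.map((r: any) => r.alternatives[0].transcript).join(" "));
          } catch (err) {
            reject(err);
          }
        });
      }
    );
    req.on("error", reject);
    fs.createReadStream(audioPath).pipe(req); // stream the received audio as the request body
  });
}

transcribe("utterance.wav").then(console.log).catch(console.error);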

How to push playable music data streamed from Spotify to a device that does not use the Spotify SDK

I apologise if the title of my question is confusing; I will explain my purpose in detail.
We are currently developing our own WiFi speaker, which is built on a MIPS architecture. The speaker comes with an app that will be used to manage it. One of the features we would like to include in the app is access to Spotify content, with playback on the speaker.
Unfortunately, after going through the iOS SDK documentation and running some tests in Spotify's official Web API Console, I noticed that Spotify does not allow developers to get the direct URL of a song, except for preview purposes. I also wasn't able to find any way to get the raw audio bytes of the music streamed from the server. Every piece of content comes with a corresponding URI that is used to request it.
For the device (WiFi speaker) part, we recently contacted Spotify to ask for an SDK that we could use for development. However, Spotify told us they only have SDKs for the x86 and ARM architectures; they don't have one for MIPS.
Now, here are my questions:
Is there any way for me to push music from the app to the WiFi speaker without having to use an SDK on the device side?
If Spotify can provide an SDK for our device, then how can we integrate the SDK with our platform?
To clarify my second question: Android and iOS are popular, widely used mobile platforms, so if Spotify provides SDKs for those two operating systems, the SDKs can use the default system frameworks to play the content (on iOS, that's the AVFoundation framework). However, if Spotify were able to provide the SDK that we need, how would we integrate it with our own platform?
I will answer your question no. 1:
You should be able to push music from an app by reading from a buffer with Core Audio and forwarding it to a device of your choice. I think what you are looking for can be found in CocoaLibSpotify.

Access DropBox or Google Drive API from a Web app

Apologies if this is a rubbish question, but it's something I've never had to give much thought to, and I'm short on time. I have a web app that is required to run offline and on iOS. This web app has a lot of content, consisting mainly of videos. One solution we are thinking of is to use Dropbox's or Google Drive's APIs to download and access the content.
The main issue with this is whether you can access a native app (Dropbox or Google Drive) from a browser or web app. Does anyone know if this is possible?
Ideas so far are:
Access the Dropbox or Google Drive native app from a web app (not sure if this is possible; this is the current question)
Wrap up the web app to make it a hybrid native app using something like Phone Gap (this is plan B but will have its own issues)
Convert to a Google Chrome App to get improved access to Google Drive API (not sure how this would function on iOS)
Start from scratch and build a native iOS app (a longer term solution)
Thanks
Chris
Late, but maybe helpful: www.cloudrail.com
It's a useful JavaScript library for accessing the Dropbox and Google Drive APIs.
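Whether you go through a library like CloudRail or call the service yourself, the mechanism underneath is a plain HTTP request from the browser. As a rough, non-CloudRail illustration, downloading a Dropbox file with the v2 HTTP API and fetch might look like this; the access token and file path are placeholders you would obtain via Dropbox's OAuth flow:

// Hypothetical helper: fetch a file from Dropbox in the browser.
async function downloadFromDropbox(path: string, accessToken: string): Promise<Blob> {
  const response = await fetch("https://content.dropboxapi.com/2/files/download", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      // Dropbox expects the file selector in this header, not in the request body.
      "Dropbox-API-Arg": JSON.stringify({ path }),
    },
  });
  if (!response.ok) {
    throw new Error(`Dropbox download failed: ${response.status}`);
  }
  return response.blob(); // e.g. play it via URL.createObjectURL and a <video> element
}

Offline use is a separate concern: the downloaded Blob would still need to be cached locally (for example in IndexedDB) for the app to work without a connection.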
We ended up making a hybrid app using PhoneGap for iOS.

Resources