Cordova | Get live stream from microphone on iOS

I am trying to build a voice calendar app that needs a live stream from the microphone for speech recognition.
The idea is to have a button that starts listening to the microphone and stops automatically when the user stops speaking.
I have already explored the Cordova Media API, which lets me record the audio to a WAV file. This works, but it makes the process very slow, since I have to wait for the recording to finish.
I used https://api.ai as a starting point to build the first version of the app, which works quite well; it took care of the whole "listening" part!
The next phase for me is to integrate with a few different speech recognition APIs.
My main obstacle has been a lack of native development skills, so are there any Cordova plugins that can help me do this?
Update 1 - 1st April 2016
Found this https://subvisual.co/blog/posts/39-tutorial-html-audio-capture-streaming-to-node-js-no-browser-extensions
Will try to implement this in Cordova through WebRTC.
Update 2 - 1st April 2016
Installed https://github.com/eface2face/cordova-plugin-iosrtc to get WebRTC support.
Update 3 - 2nd April 2016
Stuck at AudioContext.createMediaStreamSource is not a function on iOS!
AudioContext.createMediaStreamSource alternative for iOS?
Update 4 - 6th April 2016
Going Native - Time to learn iOS Development!

Sorry to hear that you gave up on Cordova, but if you are still interested: I've created a Cordova plugin for iOS and Android that lets you capture microphone data and forward it to the web layer of your application. You can either rely on the Web Audio API to handle the incoming sound, or use any other approach to encode and save the raw audio data:
https://github.com/edimuj/cordova-plugin-audioinput
Example usage:
function onAudioInput( evt ) {
    // 'evt.data' is an integer array containing raw audio data
    console.log( "Audio data received: " + evt.data.length + " samples" );
    // ... do something with the evt.data array ...
}

// Listen to audioinput events
window.addEventListener( "audioinput", onAudioInput, false );

// Start capturing audio from the microphone
audioinput.start();

Related

WebRTC running from WKWebView AVAudioSession development roadblock

Over the past few years I have steadily developed a complete WebRTC-based browser phone using the SIP protocol. The main SIP toolbox is SIPJS (https://sipjs.com/), and it provides all the tools one needs to make and receive calls to a SIP-based PBX of your own.
The Browser Phone project, https://github.com/InnovateAsterisk/Browser-Phone/, gives SIPJS its full functionality and UI. You can simply navigate to the phone in a browser and start using it. Everything works perfectly.
On Mobile
Apple finally allows WebRTC (getUserMedia()) in WKWebView, so it wasn't long before people started to ask how it would work on mobile. And while the UI is well suited to phones and tablets, the UI alone isn't enough nowadays to be a full solution.
The main consideration is that a mobile app typically has a short lifespan: you can't, or don't, leave it running in the background the way you would leave the browser running on a PC. This presents a few challenges to making the Browser Phone truly mobile friendly. iOS will want to shut the app down as soon as it's not the frontmost app - and rightly so. There are tools for handling that, like CallKit and push notifications. These allow the app to be woken up so that it can accept the call and notify the user.
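For reference, the CallKit wake-up path looks roughly like this on the native side (a sketch only; the class name and the hand-off to the SIPJS layer are illustrative assumptions):
// Swift sketch: report an incoming call to CallKit so iOS can wake the app
// and show the native call UI.
import CallKit

final class CallManager: NSObject, CXProviderDelegate {
    private let provider: CXProvider

    override init() {
        let config = CXProviderConfiguration(localizedName: "Browser Phone")
        config.supportsVideo = false
        provider = CXProvider(configuration: config)
        super.init()
        provider.setDelegate(self, queue: nil)
    }

    // Call this from your push-notification handling when a SIP INVITE arrives.
    func reportIncomingCall(uuid: UUID, caller: String) {
        let update = CXCallUpdate()
        update.remoteHandle = CXHandle(type: .generic, value: caller)
        provider.reportNewIncomingCall(with: uuid, update: update) { error in
            if let error = error { print("CallKit error: \(error)") }
        }
    }

    // MARK: CXProviderDelegate
    func providerDidReset(_ provider: CXProvider) { }

    func provider(_ provider: CXProvider, perform action: CXAnswerCallAction) {
        // Hand off to the WKWebView / SIPJS layer to actually answer the call,
        // then mark the CallKit action as fulfilled.
        action.fulfill()
    }
}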
Just remember, this app is created by opening a UIViewController, adding a WKWebView, and navigating to the phone page. There is full communication between the app and the HTML & JavaScript, so events can be passed back and forth.
WKWebView & AVAudioSession Issue:
After a LOT of reading unsolved forum posts, it's clear that AVAudioSession.sharedInstance() is simply not connected to the WKWebView, or there is some undocumented connection.
The result is that if the call starts from the app, and is sent to the background, the microphone is disabled. Clearly this isn't an option if you are on a call. Now, I can manage this limitation a little, by putting the call on hold when the app is sent to the background - although this would be confusing to the user and a poor user experience.
However, the real issue is that if the app was woken by CallKit, the microphone isn't activated in the first place, because the app never comes to the foreground (CallKit is in front of it), and even if you do switch to the app afterwards, the microphone still doesn't activate. This is simply an unacceptable user experience.
What I found interesting is that if you simply open Safari on iOS (15.x) and navigate to the phone page, https://www.innovateasterisk.com/phone/ (without making an app in Xcode and loading it into a WKWebView), the microphone continues to work when the app is sent to the background. So how does Safari manage to do this? Of course this doesn't and can't solve the CallKit issue, but it's still interesting that Safari can use the microphone in the background, since Safari is built on WKWebView.
(I was reading about entitlements, and that this may have to be specially granted... I'm not sure how this works?)
The next problem with AVAudioSession is that since you cannot access the session for WKWebView, you cannot change the output of the <audio> element, so you cannot switch from, say, speaker to earpiece, or make it use a Bluetooth device.
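For context, this is roughly what native code would normally do to pick the output route for a call (a sketch with an illustrative function name); whether the audio session that WKWebView drives internally honours any of these settings is exactly the open question:
// Swift sketch: the usual AVAudioSession setup for a call, including switching
// between the loudspeaker and the receiver (earpiece).
import AVFoundation

func routeAudio(toSpeaker useSpeaker: Bool) throws {
    let session = AVAudioSession.sharedInstance()

    // Call-style category/mode; .allowBluetooth lets headsets appear as routes.
    try session.setCategory(.playAndRecord, mode: .voiceChat, options: [.allowBluetooth])
    try session.setActive(true)

    // Toggle between the loudspeaker and the receiver (earpiece).
    try session.overrideOutputAudioPort(useSpeaker ? .speaker : .none)
}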
It simply wouldn't be feasible to redevelop the entire application using an outdated WebRTC SDK (Google no longer maintains the WebRTC iOS SDK), and then build my own Swift SIP stack like SIPJS and end up with two sets of code to maintain... so my main questions are:
1) How can I access the AVAudioSession of WKWebView so that I can set the output path/device?
2) How can I keep the microphone active when the app is sent to the background?
3) How can I activate the microphone when CallKit activates the application (while the application is in the background)?
For 1), maybe someone else is following this approach and can add some insight or correct wrong assumptions: the audio in a WebRTC site is represented as a MediaStream. Maybe it is possible to get that stream out of the WKWebView and play it back within the app somehow? This code should pass on some buffers, but they are empty when they arrive in Swift:
//javascript
...
someRecorder = new MediaRecorder(audioStream);
someRecorder.ondataavailable = async (e) =>
{
    window.webkit.messageHandlers.callBackMethod.postMessage(await e.data.arrayBuffer());
};
someRecorder.start(1000);
and then receive it in Swift like this:
//swift
import UIKit
import WebKit

class ViewController: UIViewController, WKScriptMessageHandler {
    ...
    override func viewDidLoad() {
        super.viewDidLoad()
        let config = WKWebViewConfiguration()
        config.userContentController = WKUserContentController()
        config.userContentController.add(self, name: "callBackMethod")
        let webView = WKWebView(frame: CGRect(x: 0, y: 0, width: 10, height: 10), configuration: config)
        ...
    }

    func userContentController(_ userContentController: WKUserContentController, didReceive message: WKScriptMessage) {
        addToPlayingAudioBuffer(message.body)
        // print(message.body) gives the output "{}" every 1000ms.
    }
}
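One likely explanation for the empty body: WKScriptMessage.body only bridges NSNumber, NSString, NSDate, NSArray, NSDictionary and NSNull, so an ArrayBuffer does not survive postMessage. A workaround worth trying (an assumption, not something verified here) is to base64-encode each chunk in JavaScript before posting it and decode it on the Swift side, roughly like this:
// Swift sketch: drop-in variant of the handler above, assuming the JavaScript
// side posts each MediaRecorder chunk as a base64 string instead of an ArrayBuffer.
func userContentController(_ userContentController: WKUserContentController, didReceive message: WKScriptMessage) {
    guard let base64 = message.body as? String,
          let chunk = Data(base64Encoded: base64) else {
        print("Unexpected message body: \(message.body)")
        return
    }
    // 'chunk' now holds the raw recorder output (typically WebM/Opus), which
    // still needs to be decoded before it can be scheduled for playback.
    addToPlayingAudioBuffer(chunk)   // hypothetical helper from the snippet above
}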

iOS OpenTok audio-video call has video disabled with an undocumented reason on iOS Simulator and routed sessions

While testing a video call with routed sessions on the iOS Simulator, the subscriber's video gets disabled with an undocumented reason.
In the latest version of OpenTok (2.15.3) we can see that the documented reasons for video being disabled are:
typedef NS_ENUM(int32_t, OTSubscriberVideoEventReason) {
    OTSubscriberVideoEventPublisherPropertyChanged = 1,
    OTSubscriberVideoEventSubscriberPropertyChanged = 2,
    OTSubscriberVideoEventQualityChanged = 3
};
On the iOS 11 simulator, right after trying to subscribe, the video is first enabled with OTSubscriberVideoEventReason == 2, then immediately disabled with OTSubscriberVideoEventReason == 4, and then I get the following error in subscriberDidDisconnectFromStream:
Internal error -- WebRTC subscriber error.
Failed to set remote offer sdp:
Session error code: ERROR_CONTENT.
Session error description: Failed to set remote video description send
parameters..
kPCFailureSetRemoteDescription
The docs suggest trying to resubscribe or reconnect... resubscribing didn't work. Furthermore, it only happens on the simulator, which makes me think there is nothing really wrong with the setup. The real question here is: what is OTSubscriberVideoEventReason == 4?
TokBox Developer Evangelist here.
Yes, unfortunately, we didn't publicly document this specific case (sorry about that). We plan on adding this in the 2.16.0 release.
OTSubscriberVideoEventReason == 4 is dispatched when the video in the subscriber stream was disabled because the stream uses a video codec (such as H.264) that isn't supported on the simulator.
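In practice that means the undocumented value can be treated as "codec not supported here" and handled with a fallback instead of a resubscribe attempt. A minimal sketch, assuming the OTSubscriberKitDelegate callback from the 2.x SDK (the placeholder helper is hypothetical):
// Swift sketch, inside your existing OTSubscriberKitDelegate implementation.
// The value 4 comes from the explanation above, not from the public header.
func subscriberVideoDisabled(_ subscriber: OTSubscriberKit, reason: OTSubscriberVideoEventReason) {
    if reason.rawValue == 4 {
        // The stream uses a codec (e.g. H.264) the iOS Simulator can't decode.
        // Resubscribing won't help; show an audio-only placeholder or test on a device.
        showAudioOnlyPlaceholder()   // hypothetical
    }
}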

How to add an iOS Observer on AVPlayer in nativescript plugin?

I am developing a NativeScript app that streams a web radio, and for that I have used the nativescript-audio plugin.
This plugin does the job on Android, but on the iOS side I have replaced AVAudioPlayer with AVPlayer, because AVAudioPlayer doesn't handle web streaming. Now I am trying to understand how to observe properties and notifications, so that I am notified when an event is triggered or a value changes.
I tried to be notified when the player has reached the end of the media:
NSNotificationCenter.defaultCenter.addObserverSelectorNameObject(
    _this,
    NSSelectorFromString("playerDidFinishPlaying"),
    AVPlayerItemDidPlayToEndTimeNotification,
    null);

TNSPlayer.playerDidFinishPlaying = function (args) {
    if (this._completeCallback) this._completeCallback();
};
But this code crashes at runtime because the playerDidFinishPlaying method cannot be found.
I also wanted to observe AVPlayer.currentItem.status, to know when the player transitions to AVPlayerStatus.readyToPlay.
I tried to adapt Swift examples to the NativeScript format, but without success.
Could you help me to resolve my problems?
Thanks
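For reference, here is what those two observations look like in plain Swift; this is the native pattern being translated to NativeScript (class name and URL are illustrative). The crash above typically happens because the JavaScript observer object does not actually expose playerDidFinishPlaying as an Objective-C selector, whereas natively the handler is an @objc method on an NSObject subclass:
// Swift sketch: end-of-media notification plus KVO on currentItem.status.
import AVFoundation

final class RadioPlayer: NSObject {
    private let player = AVPlayer(url: URL(string: "https://example.com/stream")!)
    private var statusObservation: NSKeyValueObservation?

    func play() {
        // 1) Be notified when playback reaches the end of the media.
        NotificationCenter.default.addObserver(self,
            selector: #selector(playerDidFinishPlaying(_:)),
            name: .AVPlayerItemDidPlayToEndTime,
            object: player.currentItem)

        // 2) Observe currentItem.status to detect .readyToPlay.
        statusObservation = player.currentItem?.observe(\.status, options: [.new]) { item, _ in
            if item.status == .readyToPlay {
                // Safe to start playback / update the UI here.
            }
        }

        player.play()
    }

    @objc private func playerDidFinishPlaying(_ notification: Notification) {
        // Equivalent of the _completeCallback in the plugin code above.
    }
}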

iOS background task detect screen is on or off

I am developing an app that logs screen on/off events. It's a kind of smartphone usage analysis app. All the app does is write down a log like this:
2015 July 25 at 03:54:12 PM - Screen on
2015 July 25 at 03:59:38 PM - Screen off
2015 July 25 at 04:20:52 PM - Screen on
2015 July 25 at 04:22:32 PM - Screen off
...
2015 July 26 at 10:20:32 AM - Screen on
2015 July 26 at 10:22:11 AM - Screen off
2015 July 26 at 11:30:38 AM - Screen on
2015 July 26 at 10:31:02 AM - Screen off
...
"Screen on": the user presses the home button (or power button) and enters the passcode/unlock password, if any.
"Screen off": the user presses the power button to turn off the screen.
I was able to find a way to do this on Android, using a broadcast receiver to capture the events sent by the system. But on iOS there seems to be a problem: since iOS only allows background services to run for several minutes, I'm not even sure I can detect screen on/off events at all.
I did some research on this and found some articles, but they didn't help much:
http://www.macworld.com/article/1164616/how_ios_multitasking_really_works.html
Lock Unlock events iphone
My question is: is it possible to make an app like this on iOS (latest version, 8.4)?
Thanks.
It may not be possible to meet all your requirements on a published (non-jailbroken) iOS device using a background service. I can see the notifications come across; I'm just not sure about the backgrounding.
Since others have been saying it's not possible, I'm digging a little deeper here to see just how much can be done.
Because iOS is currently restricted to a small number of background modes (situations where events are delivered in the background), or a mode where your app is granted a few minutes of time after the user navigates away from your app, the primary issue is going to be tricking the system into allowing your app to get time in the background when needed.
There are several background modes, described in the Programming Guide to Background Execution. If you can, for example, send push notifications periodically to awaken the app to "download content", you may be able to get some time periodically as the system sees fit to do so.
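A minimal sketch of that idea in current Swift syntax, assuming the remote-notification background mode is enabled and a server that sends content-available pushes (the logging helper is hypothetical):
// Swift sketch: UIApplicationDelegate callback for a silent (content-available)
// push; iOS grants a short window of background execution here.
func application(_ application: UIApplication,
                 didReceiveRemoteNotification userInfo: [AnyHashable: Any],
                 fetchCompletionHandler completionHandler: @escaping (UIBackgroundFetchResult) -> Void) {
    logPendingScreenEvents()   // hypothetical: flush whatever state you have observed
    completionHandler(.newData)
}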
A background daemon is a possible solution, but only for your own use, not via the App Store. The official take on this is in the App Store Review Guidelines - the relevant section is 2.8 (presumably you'd get your daemon onto the device by having the app install it "behind the scenes"):
2.8 Apps that install or launch other executable code will be rejected
There may be some system logs that iOS keeps for itself; if you can gain access to those, you'd have your data. I doubt, however, that this is available programmatically on a non-jailbroken phone.
I was able to test out some Swift (2.0) code that uses the Darwin Notifications mentioned in one of the Stack Overflow discussions that your original question led to: Lock / Unlock Events for iPhone. I didn't actually run in the background, but I did verify that the events are eventually delivered, even if the app isn't running when the actual lock event takes place. When my app is switched in, the notification gets called. So, if you were able to get time from the system, you'd get the (delayed) notification when Apple's algorithms decide to give you time.
The code snippet (I stuffed it into some random app) that allows the listening is as follows:
import UIKit
import CoreFoundation

class MainViewController: UIViewController, UIWebViewDelegate {

    override func viewDidLoad() {
        super.viewDidLoad()

        // CoreFoundation compatible types
        let cfstr: CFString = "com.apple.iokit.hid.displayStatus" as NSString
        let notificationCenter = CFNotificationCenterGetDarwinNotifyCenter()
        CFNotificationCenterAddObserver(notificationCenter, nil,
            { (noti: CFNotificationCenter!, aPtr: UnsafeMutablePointer<Void>, aStr: CFString!, bPtr: UnsafePointer<Void>, aDict: CFDictionary!) -> () in
                print("got notification")
            }, cfstr, nil, CFNotificationSuspensionBehavior.DeliverImmediately)
    }

    // [... more stuff ...]
}
If you can use C or Objective-C code, I guess this code might help you:
notify_register_dispatch("com.apple.iokit.hid.displayStatus", &notify_token, dispatch_get_main_queue(), ^(int token) {
    uint64_t state = UINT64_MAX;
    notify_get_state(token, &state);
    //notify_cancel(token);
    debug("com.apple.iokit.hid.displayStatus = %llu", state);
});
Provided that you are able to run your app in a background mode, state will give you the screen on/off status.

Phonegap / Cordova Stop audio after time when in background iOS

I have an audio app that loops sounds for playback, using Cordova 2.2 and its Media API.
At the moment I have set up a number of loops that stop when finished, based on a predetermined time (calculated from n seconds per loop over 3 hours). This method generally works.
playMainAudio = new Media(url,
    // success callback
    function() {
        console.log("playAudio():Audio Success");
    },
    // error callback
    function(err) {
        console.log("playAudio():Audio Error: " + err);
    });

// Play audio
playMainAudio.play({ numberOfLoops: 123, playAudioWhenScreenIsLocked: true });
But I'd prefer a native code addition where I could just set all audio to stop after 3 hours, rather than working it out based on loop timing - I'm just not sure where to look or even where to place the code. The catch is that it has to work when the device is locked or the app is in the background (I currently have the correct background mode set, so the audio plays in the background).
Is there a native timer that is background-compatible?
If you are keen on editing AppDelegate.m in Objective-C (not your language of choice), then inside
- (void)applicationDidEnterBackground:(UIApplication *)application
you can stop the audio, but delay the call using
- performSelector:withObject:afterDelay:
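A minimal sketch of that idea, written here in Swift rather than Objective-C (the stopAllAudio helper is hypothetical and would call into your Cordova/JS layer or stop the native players):
// Swift sketch: schedule the stop when the app goes to the background, and
// cancel it if the user returns before the deadline.
import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    func applicationDidEnterBackground(_ application: UIApplication) {
        // Fires only if the app keeps running in the background, e.g. because the
        // "audio" background mode is set and audio is still playing.
        perform(#selector(stopAllAudio), with: nil, afterDelay: 3 * 60 * 60)
    }

    func applicationWillEnterForeground(_ application: UIApplication) {
        NSObject.cancelPreviousPerformRequests(withTarget: self, selector: #selector(stopAllAudio), object: nil)
    }

    @objc func stopAllAudio() {
        // Hypothetical: stop playback here (e.g. call into the Cordova webview
        // or stop the Media instances).
    }
}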
See the documentation here:
http://developer.apple.com/library/ios/#documentation/uikit/reference/UIApplicationDelegate_Protocol/Reference/Reference.html
https://developer.apple.com/library/mac/#documentation/Cocoa/Reference/Foundation/Classes/NSObject_Class/Reference/Reference.html
For examples on how to play audio in Objective-C:
How to play a sound in objective C iphone coding
Play Audio iOS Objective-C
How can I Add Audio Player in iphone App
