Capturer (iPhone camera) not provided to TVIVideoCapturer for TwilioVideo iOS SDK - ios

I am getting an error where my TwilioVideo module, which expects a capturer (camera or microphone), is not receiving that input. The error started after we switched to CocoaPods to install the SDK and the PureLayout UI library; previously we had installed these dependencies into Xcode manually.
I am developing a React Native iOS app on React Native 0.40.0 with react-native-cli 1.0.0, using Xcode 8.2.1 (8C1002) with the iPhone 6 simulator running iOS 10.2, CocoaPods 1.2.0, and TwilioVideo SDK 1.0.0-beta5. There is also a 1.0.0-beta6 version, which I have tried as well, with the same result. Reverting to 1.0.0-beta4 does remove the error, which suggests a problem with the way I register the audio and video tracks.
Here is my Podfile:
source 'https://github.com/CocoaPods/Specs'
source 'https://github.com/twilio/cocoapod-specs'
target 'MyApp' do
# Uncomment the next line if you're using Swift or would like to use dynamic frameworks
# use_frameworks!
# Pods for MyApp
pod 'TwilioVideo', '1.0.0-beta5'
pod 'PureLayout', '~> 3.0'
target 'MapleNativeProviderTests' do
inherit! :search_paths
# Pods for testing
end
end
I have implemented a TwilioVideo module in Xcode based on this repository: react-native-twilio-video-webrtc. The author recently updated the repository to work with React Native 0.40.0, which changed the header import syntax in Xcode. I have tried both the old and the new import syntax, and I continue to get the capturer error described above when I try to mount my video component.
Here is the documentation for the TwilioVideo SDK. This is the TVIVideoCapturer.
I made a modification to react-native-twilio-video-webrtc, which is essentially just a thin wrapper around the TwilioVideo SDK that uses RCT_EXPORT_METHOD to expose key API methods. The library initializes the audio and video tracks in its init method, which causes some annoying behaviour: event listeners miss callbacks fired while the application is starting. So I moved the track setup into a custom, publicly exposed RCT_EXPORT_METHOD called initialize, which I call from the specific view that mounts the video and initializes the camera/microphone inputs.
My implementation of TWVideoModule.m is:
#import "TWVideoModule.h"
static NSString* roomDidConnect = @"roomDidConnect";
static NSString* roomDidDisconnect = @"roomDidDisconnect";
static NSString* roomDidFailToConnect = @"roomDidFailToConnect";
static NSString* roomParticipantDidConnect = @"roomParticipantDidConnect";
static NSString* roomParticipantDidDisconnect = @"roomParticipantDidDisconnect";
static NSString* participantAddedVideoTrack = @"participantAddedVideoTrack";
static NSString* participantRemovedVideoTrack = @"participantRemovedVideoTrack";
static NSString* participantAddedAudioTrack = @"participantAddedAudioTrack";
static NSString* participantRemovedAudioTrack = @"participantRemovedAudioTrack";
static NSString* participantEnabledTrack = @"participantEnabledTrack";
static NSString* participantDisabledTrack = @"participantDisabledTrack";
static NSString* cameraDidStart = @"cameraDidStart";
static NSString* cameraWasInterrupted = @"cameraWasInterrupted";
static NSString* cameraDidStopRunning = @"cameraDidStopRunning";
@interface TWVideoModule () <TVIParticipantDelegate, TVIRoomDelegate, TVIVideoTrackDelegate, TVICameraCapturerDelegate>
@end
@implementation TWVideoModule
@synthesize bridge = _bridge;
RCT_EXPORT_MODULE();
- (dispatch_queue_t)methodQueue
{
return dispatch_get_main_queue();
}
- (NSArray<NSString *> *)supportedEvents
{
return @[roomDidConnect,
roomDidDisconnect,
roomDidFailToConnect,
roomParticipantDidConnect,
roomParticipantDidDisconnect,
participantAddedVideoTrack,
participantRemovedVideoTrack,
participantAddedAudioTrack,
participantRemovedAudioTrack,
participantEnabledTrack,
participantDisabledTrack,
cameraDidStopRunning,
cameraDidStart,
cameraWasInterrupted];
}
- (instancetype)init
{
self = [super init];
if (self) {
UIView* remoteMediaView = [[UIView alloc] init];
//remoteMediaView.backgroundColor = [UIColor blueColor];
//remoteMediaView.translatesAutoresizingMaskIntoConstraints = NO;
self.remoteMediaView = remoteMediaView;
UIView* previewView = [[UIView alloc] init];
//previewView.backgroundColor = [UIColor yellowColor];
//previewView.translatesAutoresizingMaskIntoConstraints = NO;
self.previewView = previewView;
}
return self;
}
- (void)dealloc
{
[self.remoteMediaView removeFromSuperview];
self.remoteMediaView = nil;
[self.previewView removeFromSuperview];
self.previewView = nil;
self.participant = nil;
self.localMedia = nil;
self.camera = nil;
self.localVideoTrack = nil;
self.videoClient = nil;
self.room = nil;
}
RCT_EXPORT_METHOD(initialize) {
self.localMedia = [[TVILocalMedia alloc] init];
self.camera = [[TVICameraCapturer alloc] init];
NSLog(#"Camera %#", self.camera);
self.camera.delegate = self;
self.localVideoTrack = [self.localMedia addVideoTrack:YES
capturer:self.camera
constraints:[self videoConstraints]
error:nil];
self.localAudioTrack = [self.localMedia addAudioTrack:YES];
if (!self.localVideoTrack) {
NSLog(#"Failed to add video track");
} else {
// Attach view to video track for local preview
[self.localVideoTrack attach:self.previewView];
}
}
The rest of this file pertains to adding and removing tracks and joining/disconnecting from the Twilio channel, so I have not included it. I also have TWVideoPreviewManager and TWRemotePreviewManager, which simply provide UIViews for the local and remote video streams.
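For reference, here is roughly what the corresponding TWVideoModule.h declares. This is a sketch inferred from the properties the implementation file uses; the exact types and base class are assumptions based on the beta5 API and the RCTEventEmitter usage above:
#import <React/RCTBridgeModule.h>
#import <React/RCTEventEmitter.h>
@import TwilioVideo;

@interface TWVideoModule : RCTEventEmitter <RCTBridgeModule>
// Views handed to TWVideoPreviewManager / TWRemotePreviewManager
@property (nonatomic, strong) UIView* previewView;
@property (nonatomic, strong) UIView* remoteMediaView;
// Twilio media/session state referenced from the .m file
@property (nonatomic, strong) TVILocalMedia* localMedia;
@property (nonatomic, strong) TVICameraCapturer* camera;
@property (nonatomic, strong) TVILocalVideoTrack* localVideoTrack;
@property (nonatomic, strong) TVILocalAudioTrack* localAudioTrack;
@property (nonatomic, strong) TVIVideoClient* videoClient;
@property (nonatomic, strong) TVIRoom* room;
@property (nonatomic, strong) TVIParticipant* participant;
@end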
My TwilioVideoComponent.js component is:
import React, { Component, PropTypes } from 'react'
import {
NativeModules,
NativeEventEmitter
} from 'react-native';
import {
View,
} from 'native-base';
const {TWVideoModule} = NativeModules;
class TwilioVideoComponent extends Component {
state = {};
static propTypes = {
onRoomDidConnect: PropTypes.func,
onRoomDidDisconnect: PropTypes.func,
onRoomDidFailToConnect: PropTypes.func,
onRoomParticipantDidConnect: PropTypes.func,
onRoomParticipantDidDisconnect: PropTypes.func,
onParticipantAddedVideoTrack: PropTypes.func,
onParticipantRemovedVideoTrack: PropTypes.func,
onParticipantAddedAudioTrack: PropTypes.func,
onParticipantRemovedAudioTrack: PropTypes.func,
onParticipantEnabledTrack: PropTypes.func,
onParticipantDisabledTrack: PropTypes.func,
onCameraDidStart: PropTypes.func,
onCameraWasInterrupted: PropTypes.func,
onCameraDidStopRunning: PropTypes.func,
...View.propTypes,
};
_subscriptions = [];
constructor(props) {
super(props);
this.flipCamera = this.flipCamera.bind(this);
this.startCall = this.startCall.bind(this);
this.endCall = this.endCall.bind(this);
this._eventEmitter = new NativeEventEmitter(TWVideoModule)
}
//
// Methods
/**
* Initializes camera and microphone tracks
*/
initializeVideo() {
TWVideoModule.initialize();
}
flipCamera() {
TWVideoModule.flipCamera();
}
startCall({roomName, accessToken}) {
TWVideoModule.startCallWithAccessToken(accessToken, roomName);
}
endCall() {
TWVideoModule.disconnect();
}
toggleVideo() {
TWVideoModule.toggleVideo();
}
toggleAudio() {
TWVideoModule.toggleAudio();
}
_unregisterEvents() {
this._subscriptions.forEach(e => e.remove());
this._subscriptions = []
}
_registerEvents() {
this._subscriptions = [
this._eventEmitter.addListener('roomDidConnect', (data) => {
if (this.props.onRoomDidConnect) {
this.props.onRoomDidConnect(data)
}
}),
this._eventEmitter.addListener('roomDidDisconnect', (data) => {
if (this.props.onRoomDidDisconnect) {
this.props.onRoomDidDisconnect(data)
}
}),
this._eventEmitter.addListener('roomDidFailToConnect', (data) => {
if (this.props.onRoomDidFailToConnect) {
this.props.onRoomDidFailToConnect(data)
}
}),
this._eventEmitter.addListener('roomParticipantDidConnect', (data) => {
if (this.props.onRoomParticipantDidConnect) {
this.props.onRoomParticipantDidConnect(data)
}
}),
this._eventEmitter.addListener('roomParticipantDidDisconnect', (data) => {
if (this.props.onRoomParticipantDidDisconnect) {
this.props.onRoomParticipantDidDisconnect(data)
}
}),
this._eventEmitter.addListener('participantAddedVideoTrack', (data) => {
if (this.props.onParticipantAddedVideoTrack) {
this.props.onParticipantAddedVideoTrack(data)
}
}),
this._eventEmitter.addListener('participantRemovedVideoTrack', (data) => {
if (this.props.onParticipantRemovedVideoTrack) {
this.props.onParticipantRemovedVideoTrack(data)
}
}),
this._eventEmitter.addListener('participantAddedAudioTrack', (data) => {
if (this.props.onParticipantAddedAudioTrack) {
this.props.onParticipantAddedAudioTrack(data)
}
}),
this._eventEmitter.addListener('participantRemovedAudioTrack', (data) => {
if (this.props.onParticipantRemovedAudioTrack) {
this.props.onParticipantRemovedAudioTrack(data)
}
}),
this._eventEmitter.addListener('participantEnabledTrack', (data) => {
if (this.props.onParticipantEnabledTrack) {
this.props.onParticipantEnabledTrack(data)
}
}),
this._eventEmitter.addListener('participantDisabledTrack', (data) => {
if (this.props.onParticipantDisabledTrack) {
this.props.onParticipantDisabledTrack(data)
}
}),
this._eventEmitter.addListener('cameraDidStart', (data) => {
if (this.props.onCameraDidStart) {
this.props.onCameraDidStart(data)
}
}),
this._eventEmitter.addListener('cameraWasInterrupted', (data) => {
if (this.props.onCameraWasInterrupted) {
this.props.onCameraWasInterrupted(data)
}
}),
this._eventEmitter.addListener('cameraDidStopRunning', (data) => {
if (this.props.onCameraDidStopRunning) {
this.props.onCameraDidStopRunning(data)
}
})
]
}
componentWillMount() {
this._registerEvents()
}
componentWillUnmount() {
this._unregisterEvents()
}
render() {
return this.props.children || null
}
}
export default TwilioVideoComponent;
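For context, the screen that mounts the video drives this component roughly as follows (the screen name, room name, and token are illustrative placeholders, not my real values):
import React, { Component } from 'react';
import TwilioVideoComponent from './TwilioVideoComponent';

class VideoCallScreen extends Component {
  componentDidMount() {
    // Initialize camera/microphone tracks only once this screen mounts,
    // then join the room.
    this._video.initializeVideo();
    this._video.startCall({ roomName: 'my-room', accessToken: 'my-token' });
  }
  render() {
    return (
      <TwilioVideoComponent
        ref={(ref) => { this._video = ref; }}
        onRoomDidConnect={() => console.log('room connected')}
        onCameraDidStart={() => console.log('camera started')}
      />
    );
  }
}
export default VideoCallScreen;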
I'm not sure how to modify the Xcode project to be compatible with the TwilioVideo beta5 API. Any help would be appreciated.

In your Podfile, look for # use_frameworks! and remove the #, then run pod install again.
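This is presumably needed because the TwilioVideo pod is distributed as a dynamic framework. After the change, the relevant part of the Podfile from the question reads:
target 'MyApp' do
  use_frameworks!
  pod 'TwilioVideo', '1.0.0-beta5'
  pod 'PureLayout', '~> 3.0'
end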

Related

React Native Module | ASWebAuthenticationSession error on swift but not in Objective C

I'm working on a React Native library that includes auth flows, so I chose ASWebAuthenticationSession to do it.
My first step for this RN library was to develop it natively first (in Swift). When I started the new library it came with both an Objective-C bridge and a Swift file, and I assumed both could do the same thing.
But I can't run ASWebAuthenticationSession properly from the Swift file, while Objective-C runs it perfectly, and I would prefer to do it from Swift (if I'm wrong, tell me).
The problem is that when I run the code from Swift, the ASWebAuthenticationSession popup closes before any user input, but not from Objective-C.
Here is my code for both versions; if you have an idea, thank you in advance.
Swift Version
//MyRnModule.m
@interface RCT_EXTERN_MODULE(MyRNModule, NSObject)
- (dispatch_queue_t)methodQueue
{
return dispatch_get_main_queue();
}
RCT_EXTERN_METHOD(startSecuredView:(NSURL *)uri)
//MyRnModule.swift
@objc(MyRNModule)
class MyRNModule: NSObject {
@objc func startSecuredView(_ url: URL?) {
if let url = url {
if #available(iOS 12.0, *) {
let session = ASWebAuthenticationSession(url: url, callbackURLScheme: "", completionHandler: { (callbackURL, error) in
print("completed")
if let error = error {
print("erorr \(error)")
return
}
if let callbackURL = callbackURL {
print("should handle callback \(callbackURL)")
}
})
if #available(iOS 13.0, *) {
session.presentationContextProvider = self
}
session.start()
}
} else {
print("you must specify url")
}
}
}
extension MyRNModule: ASWebAuthenticationPresentationContextProviding {
@available(iOS 13, *)
func presentationAnchor(for session: ASWebAuthenticationSession) -> ASPresentationAnchor{
if let keyWindow = UIApplication.shared.windows.filter({ $0.isKeyWindow }).first {
return keyWindow
} else {
return ASPresentationAnchor()
}
}
}
Objective-C
@interface RCT_EXTERN_MODULE(MyRNModule, NSObject)
- (dispatch_queue_t)methodQueue
{
return dispatch_get_main_queue();
}
RCT_EXPORT_METHOD(startSecuredView:(NSURL *)url)
{
if (!url) {
RCTLogError(#"You must specify a url.");
return;
}
if (@available(iOS 12.0, *)) {
ASWebAuthenticationSession* session =
[[ASWebAuthenticationSession alloc] initWithURL:url
callbackURLScheme:@""
completionHandler:^(NSURL * _Nullable callbackURL,
NSError * _Nullable error) {
_authenticationVCC = nil;
if (callbackURL) {
[RCTSharedApplication() openURL:callbackURL];
}
}];
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 130000
if (@available(iOS 13.0, *)) {
session.presentationContextProvider = self;
}
#endif
_authenticationVCC = session;
[session start];
}
}
#if __IPHONE_OS_VERSION_MAX_ALLOWED >= 130000
#pragma mark - ASWebAuthenticationPresentationContextProviding
- (ASPresentationAnchor)presentationAnchorForWebAuthenticationSession:(ASWebAuthenticationSession *)session API_AVAILABLE(ios(13.0)){
return UIApplication.sharedApplication.keyWindow;
}
#endif
Both versions seem to implement the same process, just translated, so I don't know what I'm missing; calls to MyRNModule.startSecuredView("https://some.url") do not behave the same.
In your Objective-C Code, you hold a "strong" reference to the session.
_authenticationVCC = session;
In your Swift Code, you do not.
The documentation states that a strong reference is mandatory for iOS < 13.0; otherwise the session is deallocated as soon as your method returns, because there is no longer an active reference to it. This results in the window getting closed.
To fix this, add a property to your class MyRNModule and assign the session to this property, instead of a local constant, before starting it.
class MyRNModule: NSObject {
private var session: ASWebAuthenticationSession?
}
And later in your method:
self.session = ASWebAuthenticationSession(...)
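Putting it together (and keeping your existing ASWebAuthenticationPresentationContextProviding extension), a minimal sketch of the fixed module looks like this:
import AuthenticationServices

@objc(MyRNModule)
class MyRNModule: NSObject {
    // Strong reference so the session outlives startSecuredView(_:)
    private var session: ASWebAuthenticationSession?

    @objc func startSecuredView(_ url: URL?) {
        guard let url = url else {
            print("you must specify url")
            return
        }
        if #available(iOS 12.0, *) {
            let session = ASWebAuthenticationSession(url: url, callbackURLScheme: "") { callbackURL, error in
                if let error = error { print("error \(error)"); return }
                if let callbackURL = callbackURL { print("should handle callback \(callbackURL)") }
            }
            if #available(iOS 13.0, *) {
                session.presentationContextProvider = self
            }
            self.session = session // retain before starting
            session.start()
        }
    }
}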
Quoting the docs:
For iOS apps with a deployment target earlier than iOS 13, your app
must keep a strong reference to the session to prevent the system from
deallocating the session while waiting for authentication to complete.

ios image picker delegates not triggering in nativescript

I'm writing a NativeScript plugin for an image picker. I'm finished with the Android part; now I'm writing the iOS code. It shows the image picker dialog, but the assigned delegates never get triggered. Please check my code below.
import * as application from "tns-core-modules/application";
import * as frame from "tns-core-modules/ui/frame"
export class Nativemediapicker extends NSObject implements UIImagePickerControllerDelegate {
public static ObjCProtocols = [UIImagePickerControllerDelegate];
get() {
let version = NSBundle.mainBundle.objectForInfoDictionaryKey("CFBundleShortVersionString");
return version;
}
static new(): Nativemediapicker {
return <Nativemediapicker>super.new();
}
private _callback: (result?) => void;
private _errorCallback: (result?) => void;
public initWithCallbackAndOptions(callback: (result?) => void, errorCallback: (result?) => void, options?): Nativemediapicker {
this._callback = callback;
this._errorCallback = errorCallback;
if (options) {
// collect options
}
console.log('initWithCallbackAndOptions')
return this;
}
static registerFileProvider(provider) { }
static pickFiles(mimeType, onResult, onError) {
onError("ERROR: For iOS this feature is coming soon.");
}
static takePicture(onResult, onError) {
// if (!UIImagePickerController.isSourceTypeAvailable(UIImagePickerControllerSourceType.Camera)) {
// onError("ERROR: For ios simulator this feature is not supported.");
// return
// }
let imagePicker = UIImagePickerController.new()
imagePicker.delegate = Nativemediapicker.new().initWithCallbackAndOptions(onResult, onError, null)
imagePicker.sourceType = UIImagePickerControllerSourceType.PhotoLibrary
imagePicker.allowsEditing = false
// imagePicker.showsCameraControls = true
let topMostFrame = frame.topmost();
if (topMostFrame) {
let viewController: UIViewController = topMostFrame.currentPage && topMostFrame.currentPage.ios;
if (viewController) {
while (viewController.parentViewController) {
// find top-most view controler
viewController = viewController.parentViewController;
}
while (viewController.presentedViewController) {
// find last presented modal
viewController = viewController.presentedViewController;
}
viewController.presentViewControllerAnimatedCompletion(imagePicker, true, null);
}
}
}
static recordVideo(onResult, onError) {
onError("ERROR: For iOS this feature is coming soon.");
}
static recordAudio(onResult, onError) {
onError("ERROR: For iOS this feature is coming soon.");
}
imagePickerControllerDidCancel(picker): void {
console.log("imagePickerControllerDidCancel")
this._errorCallback("ERROR: Image capturing cancelled.");
}
imagePickerControllerDidFinishPickingMediaWithInfo(picker, info): void {
console.log("imagePickerControllerDidFinishPickingMediaWithInfo")
this._errorCallback("ERROR: Image capturing done.");
}
}
I don't understand what I'm doing wrong, or where.
Please help me, guys...
I suspect the reason is that your delegate is being cleaned up by the garbage collector. One important rule with iOS is that you must always keep a reference to a native object in a JS variable to keep it alive.
Try,
private _delegate;
....
this._delegate = Nativemediapicker.new().initWithCallbackAndOptions(onResult, onError, null);
imagePicker.delegate = this._delegate;
After adding this line in the takePicture function, it worked:
imagePicker.modalPresentationStyle = UIModalPresentationStyle.CurrentContext;

Showing "Import with Instagram" in UIActivityViewController

I am trying to add Instagram to the "Share To" functionality in my app. I have seen Instagram's iPhone hooks documentation. I have created a custom UIActivity which works fine, but my question is: is there a way to add the "Import with Instagram" functionality as seen in iOS's Photos app?
In my app, for some reason, it does not show "Import with Instagram".
I do not want to share only with Instagram, so no ".igo".
EDIT: All of this is specifically for iOS versions < 10. For some reason the Instagram share extension works fine (for my app) on devices with iOS >= 10.
EDIT: I am trying to share an image and a video in ".jpeg" and ".mov" formats respectively.
I have read that Instagram added a share extension in release 8.2, so technically all apps should show "Instagram" in the share tray; it can be seen in the Google Photos app, for example.
public void NativeShareImage(UIView sourceView, CGRect sourceRect,
UIImage image, string shareCaption, string emailSubject)
{
string filename = Path.Combine(FileSystemUtils.GetTemporaryDataPath(), "Image.jpg");
NSError err = null;
using(var imgData = image.AsJPEG(JpgImageQuality))
{
if(imgData.Save(filename, false, out err))
{
Logger.Information("Image saved before native share as {FileName}", filename);
}
else
{
Logger.Error("Image NOT saved before native share as to path {FileName}. {Error}", filename, err.Description);
return;
}
}
// this are the items that needs to be shared
// Instagram ignores the caption, that is known
var activityItems = new List<NSObject>
{
new NSString(shareCaption),
new NSUrl(new Uri(filename).AbsoluteUri)
};
// Here i add the custom UIActivity for Instagram
UIActivity[] applicationActivities =
{
new InstagramActivity(image, sourceRect, sourceView),
};
var activityViewController = new UIActivityViewController(activityItems.ToArray(), applicationActivities);
activityViewController.SetValueForKey(new NSString(emailSubject), new NSString("subject"));
activityViewController.CompletionWithItemsHandler = (activityType, completed, returnedItems, error) =>
{
UserSharedTo(activityType, completed);
};
// Hide some of the less used activity types so that Instagram shows up in the list. Otherwise it's pushed off the activity view
// and the user has to scroll to see it.
activityViewController.ExcludedActivityTypes = new[] { UIActivityType.AssignToContact, UIActivityType.CopyToPasteboard, UIActivityType.Print };
if(UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone)
{
PresentViewController(activityViewController, true, null);
}
else
{
activityViewController.ModalPresentationStyle = UIModalPresentationStyle.Popover;
PresentViewController(activityViewController, true, null);
// Get the popover presentation controller and configure it.
UIPopoverPresentationController presentationController = activityViewController.PopoverPresentationController;
presentationController.PermittedArrowDirections = UIPopoverArrowDirection.Down;
presentationController.SourceRect = sourceRect;
presentationController.SourceView = sourceView;
}
}
// when opening custom activity use ".igo" to only show instagram
public class InstagramActivity : UIActivity
{
public InstagramActivity(UIImage imageToShare, CGRect frame, UIView view, string shareCaption = "")
{
_ImageToShare = imageToShare;
_Frame = frame;
_View = view;
Caption = shareCaption;
}
public override UIImage Image { get { return UIImage.FromBundle("Instagram"); } }
public override string Title { get { return "Instagram"; } }
public override NSString Type { get { return new NSString("PostToInstagram"); } }
public string Caption { get; set; }
public override bool CanPerform(NSObject[] activityItems)
{
return UIApplication.SharedApplication.CanOpenUrl(NSUrl.FromString("instagram://app"));
}
public override void Prepare(NSObject[] activityItems)
{
}
public override void Perform()
{
string filename = Path.Combine(FileSystemUtils.GetTemporaryDataPath(), "Image.igo");
NSError err = null;
using(var imgData = _ImageToShare.AsJPEG(JpgImageQuality))
{
if(imgData.Save(filename, false, out err))
{
Logger.Information("Instagram image saved as {FileName}", filename);
}
else
{
Logger.Error("Instagram image NOT saved as to path {FileName}. {Error}", filename, err.Description);
Finished(false);
return;
}
}
var url = NSUrl.FromFilename(filename);
_DocumentController = UIDocumentInteractionController.FromUrl(url);
_DocumentController.DidEndSendingToApplication += (o, e) => Finished(true);
_DocumentController.Uti = "com.instagram.exclusivegram";
if(!string.IsNullOrEmpty(Caption))
{
_DocumentController.Annotation = NSDictionary.FromObjectAndKey(new NSString(Caption), new NSString("InstagramCaption"));
}
_DocumentController.PresentOpenInMenu(_Frame, _View, true);
}
UIImage _ImageToShare;
CGRect _Frame;
UIView _View;
UIDocumentInteractionController _DocumentController;
}

WebViewJavascriptBridge not call again when dismissViewController

When the view is shown for the first time, the JS events run normally. Then I use presentViewController to show the image I clicked. After dismissViewController, back in the webView, I click the image again, but the JS callHandler does not run. Here is my code.
// oc file code
- (void)viewWillAppear:(BOOL)animated {
[super viewWillAppear:animated];
if (_bridge) { return; }
_bridge = [WebViewJavascriptBridge bridgeForWebView:self.webView];
[_bridge registerHandler:#"jsImagesList" handler:^(id data, WVJBResponseCallback responseCallback) {
self.imageList = data;
}];
[_bridge registerHandler:#"openImageOnIndex" handler:^(id data, WVJBResponseCallback responseCallback) {
[self openImageOnIndex:data];// use MWPhotoBrowser to show the image
}];
}
// js file code
function setupWebViewJavascriptBridge(callback) {
// just copy from WebViewJavascriptBridge
}
setupWebViewJavascriptBridge(function(bridge) {
var objs = document.getElementsByTagName("img");
var images = [];
for(var i=0;i<objs.length;i++) {
images.push(objs[i].src);
objs[i].setAttribute('data-index',i);
objs[i].onclick = function(){
// just for test, let me know I can click the img
document.body.style.backgroundColor = '#'+Math.floor(Math.random()*0xFFFFFF).toString(16);
var di = this.getAttribute('data-index');
// ----------- question -----------
// first time it run, but when dismissViewController, it not run
bridge.callHandler('openImageOnIndex', {'i': di}, function(response) {})
}
}
bridge.callHandler('jsImagesList', images, function(response) {})
})
How can I modify it? Thank you very much!

Cordova Media Plugin doesn't implement audioPlayerEndInterruption

I have created an iOS app that plays audio while the app is running in the background. If the audio is ever interrupted (e.g., by a phone call), the audio stops and never resumes.
I think this is because the Cordova media plugin doesn't implement audioPlayerEndInterruption.
I'm good with JavaScript but know almost nothing about Objective-C. Does anyone have advice on how to add this functionality?
Is there a different media plugin that implements audioPlayerEndInterruption, or is there a simple way to incorporate audioPlayerEndInterruption into the Cordova plugin?
In CDVSound.h, add MEDIA_END_INTERRUPT = 5 to the states enum:
enum CDVMediaStates {
MEDIA_NONE = 0,
MEDIA_STARTING = 1,
MEDIA_RUNNING = 2,
MEDIA_PAUSED = 3,
MEDIA_STOPPED = 4,
MEDIA_END_INTERRUPT = 5
};
In CDVSound.m, add this delegate method:
- (void)audioPlayerEndInterruption:(AVAudioPlayer*)player successfully:(BOOL)flag
{
CDVAudioPlayer* aPlayer = (CDVAudioPlayer*)player;
NSString* mediaId = aPlayer.mediaId;
CDVAudioFile* audioFile = [[self soundCache] objectForKey:mediaId];
NSString* jsString = nil;
if (audioFile != nil) {
NSLog(#"Ended Interruption of playing audio sample '%#'", audioFile.resourcePath);
}
if (flag) {
jsString = [NSString stringWithFormat:#"%#(\"%#\",%d,%d);", #"cordova.require('org.apache.cordova.media.Media').onStatus", mediaId, MEDIA_STATE, MEDIA_END_INTERRUPT];
} else {
jsString = [NSString stringWithFormat:#"%#(\"%#\",%d,%#);", #"cordova.require('org.apache.cordova.media.Media').onStatus", mediaId, MEDIA_ERROR, [self createMediaErrorWithCode:MEDIA_ERR_DECODE message:nil]];
}
[self.commandDelegate evalJs:jsString];
}
In Media.js, add Media.MEDIA_END_INTERRUPT = 5; and append "EndInterrupt" to the media state messages:
// Media states
Media.MEDIA_NONE = 0;
Media.MEDIA_STARTING = 1;
Media.MEDIA_RUNNING = 2;
Media.MEDIA_PAUSED = 3;
Media.MEDIA_STOPPED = 4;
Media.MEDIA_END_INTERRUPT = 5;
Media.MEDIA_MSG = ["None", "Starting", "Running", "Paused", "Stopped", "EndInterrupt"];
Then find Media.onStatus = function(id, msgType, value) { and add a conditional in the case Media.MEDIA_STATE: block:
if(value == Media.MEDIA_END_INTERRUPT) {
//do whatever you want
}
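In context, the MEDIA_STATE branch of Media.onStatus would then look roughly like this. This is a sketch: the exact surrounding code depends on your plugin version, mediaObjects is the plugin's existing cache of Media instances, and resuming with media.play() is just one example of what you might do:
Media.onStatus = function(id, msgType, value) {
    var media = mediaObjects[id];
    if (media) {
        switch (msgType) {
            case Media.MEDIA_STATE:
                if (value == Media.MEDIA_END_INTERRUPT) {
                    // interruption (e.g. phone call) ended: resume playback
                    media.play();
                }
                if (media.statusCallback) {
                    media.statusCallback(value);
                }
                break;
            // ... other cases unchanged ...
        }
    }
};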
Just following up here. I ended up writing a plugin for this that overrides some of the default Cordova media plugin (v0.2.12) methods. I'm posting a link here, but the code is a mess; I was just trying to get something that works and wasn't really planning on sharing. But here it is (no documentation provided, sorry):
https://github.com/stevethorson/Interruptible-Cordova-Media.git
I fixed the problem from within the media plugin itself: using the getCurrentPosition method, I was able to detect when the stream gets interrupted by a phone call or voicemail, and then stop and replay the audio. I hope this helps someone; it took me forever to get this working with the media plugin and SHOUTcast.
var mediaplayer = new Media(streamingUrl, mediaSuccess, mediaFail, mediaState);
mediaplayer.play({playAudioWhenScreenIsLocked :true});
var prevPos = 0;
var mediaTimer = setInterval(function () {
// get media position
mediaplayer.getCurrentPosition(
// success callback
function (currentposition) {
if (currentposition > prevPos) {
prevPos = currentposition;
} else if (currentposition === prevPos) {
// position stalled: the stream was likely interrupted, so restart it
// (stop() and play() are the app's own wrappers around the Media object)
stop();
play();
} else {
console.log("can't get pos");
}
},
// error callback
function (e) {
alert+("Error getting pos=" + e);
}
);
}, 5000);