I am developing a NativeScript app that streams a web radio, and for that I have used the nativescript-audio plugin.
This plugin does the job on Android, but on the iOS side I have replaced AVAudioPlayer with AVPlayer, because AVAudioPlayer doesn't support web streaming. Now I am trying to understand how to manage variables and event observers so that I am notified when an event is triggered or a value changes.
I tried to be notified when the player has reached the end of the media:
NSNotificationCenter.defaultCenter.addObserverSelectorNameObject(
    _this,
    NSSelectorFromString("playerDidFinishPlaying"),
    AVPlayerItemDidPlayToEndTimeNotification,
    null);

TNSPlayer.playerDidFinishPlaying = function (args) {
    if (this._completeCallback) this._completeCallback();
};
But this code crashes at runtime because it can't find the method playerDidFinishPlaying.
I also wanted to observe AVPlayer.currentItem.status, to know when the player moves to AVPlayerStatus.readyToPlay.
I tried to adapt Swift examples to the NativeScript format, but without success.
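For reference, the Swift pattern I am trying to translate looks roughly like this (only a sketch, in today's Swift syntax; the StreamObserver class and its comments are my own illustration, not plugin code):

import AVFoundation

class StreamObserver: NSObject {
    let player: AVPlayer
    private var statusObservation: NSKeyValueObservation?

    init(url: URL) {
        player = AVPlayer(url: url)
        super.init()

        // End of media: the player item posts AVPlayerItemDidPlayToEndTime.
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(playerDidFinishPlaying(_:)),
            name: .AVPlayerItemDidPlayToEndTime,
            object: player.currentItem)

        // Readiness: KVO on the item's status catches .readyToPlay.
        statusObservation = player.currentItem?.observe(\.status) { item, _ in
            if item.status == .readyToPlay {
                // safe to start playback here
            }
        }
    }

    @objc func playerDidFinishPlaying(_ notification: Notification) {
        // the equivalent of this._completeCallback() in the plugin
    }
}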
Could you help me resolve these problems?
Thanks
Related
I am using an instance of MPMusicPlayerController.systemMusicPlayer to enqueue an array of store IDs. This has worked for months now. Earlier today I updated to iOS 14.3, and the player is now failing to play songs.
The code below is the minimal amount needed to replicate the bug:
// note: repro using any play method you want
let player = MPMusicPlayerController.systemMusicPlayer
var descriptor: MPMusicPlayerStoreQueueDescriptor?

func setup() {
    let storeIDs: [String] = ["lorem", "ipsum"] // fetch real IDs from the API
    descriptor = MPMusicPlayerStoreQueueDescriptor(storeIDs: storeIDs)
}

func play() {
    self.player.setQueue(with: descriptor!)
    self.player.play()
}
// Expected: plays song with store ID "lorem"
// Actual: app freezes and I see error logs
When I play a song, instead of playing it, the app completely freezes (meaning it doesn't respond to user interaction), and I see the following logs:
[SDKPlayback] ASYNC-WATCHDOG-1: Attempting to wake up the remote process
[SDKPlayback] SYNC-WATCHDOG-1: Attempting to wake up the remote process
[SDKPlayback] ASYNC-WATCHDOG-2: Tearing down connection
[SDKPlayback] SYNC-WATCHDOG-2: Tearing down connection
The MPMusicPlayerController plays music just fine on iOS 14.2.
Can anybody confirm or shed some light on what's going on here?
I filed a TSI/bug report with Apple in the meantime.
I can confirm the issue is still present. After some testing I found that the call is actually blocking the main thread. A workaround that at least worked for me is to execute the play function on a background thread, like this:
DispatchQueue.global(qos: .background).async {
    player.prepareToPlay()
    player.play()
}
The issue may still occur occasionally, but I found that moving the call to a background thread makes it far less frequent. Calling prepareToPlay() first also seems to make it work 99% of the time.
I am having an intermittent (aargh!) problem playing text-to-speech in the background, triggered from the Apple Watch. I have properly set up the background mode, the AVAudioSession category, and the WatchKitExtensionRequest handler. (See below.) I had this working before, and can't figure out what changed. (Could it be iOS 9 has issues? "Before" means, among other things, iOS 8.)
The problem is this: when the app gets the request from the Watch and the app is either in the background or the phone is sleeping (locked), the speech sometimes plays right away, and other times doesn't play until the app is brought to the foreground. The OS seems to be sometimes queuing the audio, and sometimes not. I can't find any common thread between success and failure cases. I can verify with logging that the call to speakUtterance() is being made in all situations. But its behavior varies, apparently randomly. The only clue is that it might be the case that the longer the app is in the background, the less likely it is to speak right away.
This is making me pull my hair out. Suggestions welcome.
In info.plist:
Required background modes: App plays audio or streams audio/video using AirPlay
In application(_:didFinishLaunchingWithOptions:) in the AppDelegate:
do {
    try AVAudioSession.sharedInstance().setCategory(
        AVAudioSessionCategoryPlayback,
        withOptions: .DuckOthers
    )
    try AVAudioSession.sharedInstance().setActive(true)
} catch let error as NSError {
    // etc...
}
In application(_:handleWatchKitExtensionRequest:reply:) in the AppDelegate:
var bgTaskId: UIBackgroundTaskIdentifier = 0
bgTaskId = application.beginBackgroundTaskWithName(
    "Prose WKE handler",
    expirationHandler: {
        application.endBackgroundTask(bgTaskId)
    }
)
// ... Post notification to call Text-to-Speech
application.endBackgroundTask(bgTaskId)
Here's a workaround: play a second snippet of sound (I used a half-second of silence), using AVAudioPlayer, right after the call to speakUtterance(). This seems to "jog the pipeline".
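A minimal sketch of that workaround, in the same Swift style as the code above (the file name "silence.wav" and the names silencePlayer/jogAudioPipeline are my own):

var silencePlayer: AVAudioPlayer? // keep a strong reference, or playback stops immediately

func jogAudioPipeline() {
    guard let url = NSBundle.mainBundle().URLForResource("silence", withExtension: "wav") else { return }
    silencePlayer = try? AVAudioPlayer(contentsOfURL: url)
    silencePlayer?.prepareToPlay()
    silencePlayer?.play()
}

// Usage: right after the text-to-speech call
// synthesizer.speakUtterance(utterance)
// jogAudioPipeline()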
I am currently working on a PhoneGap application that, upon pressing a button, is supposed to play an 8-second sound clip while at the same time streaming sound from the microphone over RTMP to a Wowza server, through a Cordova plugin that uses the iOS library VideoCore.
My problem is that (on iOS exclusively) when the sound clip stops playing, the microphone - for some reason - also stops recording sound. However, the stream is still active, resulting in a sound clip on the server side consisting of 8 seconds of microphone input, then complete silence.
Commenting out the line that plays the sound results in the microphone recording sound without a problem; however, we need to be able to play the sound.
Defining the media variable:
_alarmSound = new Media("Audio/alarm.mp3")
Playing the sound and starting the stream:
if (_streamAudio) {
    startAudioStream(_alarmId);
}
if (localStorage.getItem("alarmSound") == "true") {
    _alarmSound.play();
}
It seems to me like there is some kind of internal resource-usage conflict occurring when PhoneGap stops playing the sound clip, but I have no idea what I can do to fix it.
I've encountered the same problem on iOS, and I solved it by doing two things:
AVAudioSession Category
In iOS, apps should declare their requirements on the sound resource using the AVAudioSession singleton. When using Cordova there is a plugin that enables you to do this: https://github.com/eworx/av-audio-session-adapter
So for example, when you want to just play sounds you set the category to PLAYBACK:
audioSession.setCategoryWithOptions(
    AVAudioSessionAdapter.Categories.PLAYBACK,
    AVAudioSessionAdapter.CategoryOptions.MIX_WITH_OTHERS,
    successCallback,
    errorCallback
);
And when you want to record sound using the microphone and still be able to play back sounds, you set the category to PLAY_AND_RECORD:
audioSession.setCategoryWithOptions(
    AVAudioSessionAdapter.Categories.PLAY_AND_RECORD,
    AVAudioSessionAdapter.CategoryOptions.MIX_WITH_OTHERS,
    successCallback,
    errorCallback
);
cordova-plugin-media kills the AVAudioSession
The Media plugin that you're using for playback handles both recording and playback in a way that makes it impossible to combine with other sound plugins and the Web Audio API. It deactivates the AVAudioSession each time it has finished playback or recording. Since there is only one such session for your app, this effectively deactivates all sound in your app.
There is a bug registered for this: https://issues.apache.org/jira/browse/CB-11026
Since the bug is still not fixed, if you still want to use the Media plugin the only way to fix this is to download the plugin code and comment out/remove the lines where the AVAudioSession is deactivated:
[self.avSession setActive:NO error:nil];
I found it took some effort to get the existing answers on this topic working with my code so I'm hoping this helps someone who needs a more explicit solution.
As of version 5.0.2 of the Cordova Media plugin this issue still persists. If you want to use this plugin, the most straightforward solution that I have found is to fork the plugin repository and make the following changes to src/ios/CDVSound.m. After quite some hours looking into this, I was unable to find a working solution without modifying the plugin source, nor was I able to find suitable alternative plugins.
Force session category
The author of the plugin mentioned in Edin's answer explains the issue with the AVAudioSession categories on a similar question. Since changes to CDVSound.m are necessary regardless, I found the following change easier than getting the eworx/av-audio-session-adapter plugin working.
Locate the following lines of code:
NSString* sessionCategory = bPlayAudioWhenScreenIsLocked ? AVAudioSessionCategoryPlayback : AVAudioSessionCategorySoloAmbient;
[self.avSession setCategory:sessionCategory error:&err];
Replace them with:
[self.avSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err];
Prevent session deactivation
Furthermore, the changes suggested in Edin's answer are necessary to prevent deactivation of additional audio sessions; however, not all instances need to be removed.
Remove the following line, and the conditional statements surrounding it, from the functions audioRecorderDidFinishRecording(), audioPlayerDidFinishPlaying(), and itemDidFinishPlaying():
[self.avSession setActive:NO error:nil];
The other instances of this line are used for error handling and resource release; in my experience, they do not need to be removed and should not be.
Update: Playback Volume
After applying these changes we experienced issues with low audio volume during playback (while recording was active). Adding the following lines after the respective setCategory lines from above allowed playback at full volume. These changes are explained in the answer from which this fix was sourced.
[self.avSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
and
[weakSelf.avSession overrideOutputAudioPort:AVAudioSessionPortOverrideSpeaker error:nil];
There's not much to my question, I guess. I'm just curious about how CocoaLibSpotify works with AVFoundation and whether it's compatible with how Apple needs me to register for remote control events and to set the now-playing info in MPNowPlayingInfoCenter.
Apple says that to receive remote control events my app needs to "Begin playing audio. Your app must be the “Now Playing” app. Restated, even if your app is the first responder and you have turned on event delivery, your app does not receive remote control events until it begins playing audio." However, that's all the documentation I can find... Does playing a track with SPPlaybackManager meet this requirement? What is the requirement, anyway?
Thanks for your help again.
Remote control events work fine with CocoaLibSpotify without any modifications to the library at all, but only on the device and not in the Simulator (including iOS7's Control Center).
Taking the Simple Player example, I made the following changes:
Changed Simple_PlayerAppDelegate to be a subclass of UIResponder.
Overrode canBecomeFirstResponder: to return YES.
Implemented remoteControlReceivedWithEvent:.
In the callback to the playTrack: call to CocoaLibSpotify, added:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
These changes allowed Simple Player to receive remote control events when running on a device.
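A rough Swift translation of those four changes (a sketch only; the actual Simple Player example is Objective-C, and the playback-toggling body is left as a stub):

import UIKit

class AppDelegate: UIResponder, UIApplicationDelegate {

    override var canBecomeFirstResponder: Bool {
        return true
    }

    override func remoteControlReceived(with event: UIEvent?) {
        guard let event = event, event.type == .remoteControl else { return }
        switch event.subtype {
        case .remoteControlPlay, .remoteControlPause, .remoteControlTogglePlayPause:
            break // toggle playback here
        default:
            break
        }
    }

    // Call this once playback has started, e.g. in the playTrack: callback:
    func startReceivingRemoteEvents() {
        UIApplication.shared.beginReceivingRemoteControlEvents()
        becomeFirstResponder()
    }
}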
I'm setting up an AVAudioSession when the app launches and setting the delegate to the appDelegate. Everything seems to be working (playback, etc) except that beginInterruption on the delegate is not being called when the phone receives a call. When the call ends endInterruption is being called though.
The only thought I have is that the audio player code I'm using used to be based on AVAudioPlayer, but is now using AVPlayer. The callbacks for the AVAudioPlayer delegate for handling interrupts are still in there, but it seems odd that they would conflict in any way.
Looking at the header, it looks like AVAudioSessionDelegate is now deprecated in iOS 6.
Use AVAudioSessionInterruptionNotification instead on iOS 6.
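Subscribing looks roughly like this (a sketch in current Swift syntax; the key names come from AVFoundation):

import AVFoundation

NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: nil,
    queue: .main
) { note in
    guard let raw = note.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: raw) else { return }
    switch type {
    case .began:
        break // pause playback and remember state
    case .ended:
        break // check AVAudioSessionInterruptionOptionKey for .shouldResume
    @unknown default:
        break
    }
}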
Update: That didn't work. I think there's a bug in the framework.
Yes: in my experience, neither beginInterruption nor the newly documented AVAudioSessionInterruptionNotification works properly. What I had to do was track the status of the player using a local flag, then handle the endInterruption:withFlags: method in order to recover from interruptions.
With iOS 6, resuming from an interruption will at least keep your audio player in the right place, so there was no need for me to store the last known play time of my AVAudioPlayer; I simply had to hit play.
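A sketch of that flag-based recovery, using the old AVAudioPlayerDelegate interruption callbacks (deprecated since iOS 8; wasPlaying is an illustrative name):

import AVFoundation

class PlaybackController: NSObject, AVAudioPlayerDelegate {
    var player: AVAudioPlayer?
    private var wasPlaying = false // the local flag tracking playback state

    func audioPlayerBeginInterruption(_ player: AVAudioPlayer) {
        // Unreliable in my experience (see above), so also update the flag
        // wherever playback starts and stops.
        wasPlaying = true
    }

    func audioPlayerEndInterruption(_ player: AVAudioPlayer, withOptions flags: Int) {
        if wasPlaying {
            player.play() // iOS 6 keeps the position, so just hit play
            wasPlaying = false
        }
    }
}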
Here's the solution that I came up with. It seems that iOS 6 kills your audio with a media reset if an AVPlayer stays resident too long. What ends up happening is that the AVPlayer plays, but no sound comes out. The rate on the AVPlayer is 1, but there's absolutely no sound. To add pain to the situation, there's no error from either the AVAudioSession setActive call or the AVPlayer itself that indicates there's a problem.
Add to that the fact that you can't depend on applicationWillResignActive, because your app may already be in the background if you're depending on remote-control gestures at all.
The final solution I implemented was to add a periodic observer on the AVPlayer, and record the last known time. When I receive the event that I've been given back control, I create a new AVPlayer, load it with the AVPlayerItem, and seekToTime to the proper time.
It's quite an annoying workaround, but at least it works, and avoids the periodic crashes that were happening.
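Roughly, in current Swift (a sketch; the class and property names are illustrative):

import AVFoundation

final class ResettablePlayer {
    private(set) var player: AVPlayer
    private var lastKnownTime = CMTime.zero
    private var timeObserver: Any?

    init(url: URL) {
        player = AVPlayer(url: url)
        addTimeObserver()
    }

    private func addTimeObserver() {
        let interval = CMTime(seconds: 1, preferredTimescale: 600)
        timeObserver = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
            self?.lastKnownTime = time // record the last known time
        }
    }

    // Call this when the app is given back control after a media reset.
    func rebuild(with url: URL) {
        if let observer = timeObserver { player.removeTimeObserver(observer) }
        player = AVPlayer(playerItem: AVPlayerItem(url: url))
        addTimeObserver()
        player.seek(to: lastKnownTime)
        player.play()
    }
}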
I can confirm that, using the C API, the interruption callback is likewise not called when the interruption begins, only when it ends:
AudioSessionInitialize(nil, nil, interruptionListenerCallback, (__bridge void *)(self));
I've also filed a bug report with Apple for the issue.
Edit: This is fixed in iOS 6.1 (but not iOS 6.0.1)
Just call:
[[AVAudioSession sharedInstance] setDelegate: self];
I just checked on my iPhone 5 (running iOS 6.0) by setting a breakpoint in the AudioSessionInterruptionListener callback function that was declared in AudioSessionInitialize(), and this interrupt callback does, in fact, get called when the app has an active audio session and audio unit and is interrupted by an incoming phone call (Xcode shows the app stopped at the breakpoint at the beginning of the interruption, which I then continue from).
I have the app then stop its audio unit and de-activate its audio session. Then, on the end interruption callback, the app re-activates the audio session and restarts the audio unit without problems (the app is recording audio properly afterwards).
I built a brand new audio streaming (AVPlayer) application atop iOS 6.0.x and found the same problem.
Delegates are now deprecated and we have to use notifications. That's great; however, here are my findings:
During an incoming phone call I get only AVAudioSessionInterruptionTypeEnded in my handler, along with AVAudioSessionInterruptionOptionShouldResume. Audio session gets suspended automatically (audio fades) and I just need to resume playback of AVPlayer.
However, when attempting to launch a game such as CSR Racing, I oddly get the dreaded AVAudioSessionInterruptionTypeBegan but no sign of when my application can resume playback, not even after killing the game.
Now, this may depend on other factors, such as my audio category (in my case AVAudioSessionCategoryPlayback) and the mixing settings of both applications (kAudioSessionProperty_OverrideCategoryMixWithOthers). I'm not sure, but something is definitely out of place.
Others have reported that this is fixed on the 6.1 beta; I have yet to upgrade, so we'll see.