Unable to get ReplayKit (w/RPBroadcastActivityViewController) to stream to YouTube live - get "The user declined application recording" error - ios

I'm trying to use ReplayKit to live stream from within an iOS app on iOS 11 and Swift 4. My code successfully live streams to MobCrush, but when I select YouTube and the broadcast is supposed to start, it fails.
Relevant code:
func broadcastActivityViewController(_ broadcastActivityViewController: RPBroadcastActivityViewController,
                                     didFinishWith broadcastController: RPBroadcastController?,
                                     error: Error?) {
    //1
    guard error == nil else {
        print("Broadcast Activity Controller is not available.")
        print("ERROR BROADCASTING: " + error!.localizedDescription)
        return
    }
    //2
    broadcastActivityViewController.dismiss(animated: true) {
        //3
        broadcastController?.startBroadcast { error in
            //4
            //TODO: Broadcast might take a few seconds to load up. I recommend that you add an activity indicator or something similar to show the user that it is loading.
            //5
            if error == nil {
                print("Broadcast started successfully!")
                self.broadcastStarted()
            }
        }
    }
}
It prints:
Broadcast Activity Controller is not available.
ERROR BROADCASTING: The user declined application recording
I'm trying to figure out whether this is an issue on YouTube's side or a permissions/implementation problem on mine.
It's worth noting that ReplayKit streaming clearly does not work for some of the advertised platforms (e.g. Periscope), but I have successfully gotten YouTube ReplayKit to work with some other apps I tested, so it should be possible.
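For reference, a hedged sketch (not the asker's full code) of how the broadcast picker that eventually calls the delegate method above is typically loaded and presented; the class name and startBroadcastFlow() are illustrative:
import ReplayKit
import UIKit

// Hedged sketch: load and present the system broadcast picker, then hand
// the result to the delegate method shown in the question.
class BroadcastStarterViewController: UIViewController, RPBroadcastActivityViewControllerDelegate {

    func startBroadcastFlow() {
        RPBroadcastActivityViewController.load { broadcastAVC, error in
            guard error == nil, let broadcastAVC = broadcastAVC else {
                print("Unable to load broadcast picker: \(error?.localizedDescription ?? "unknown error")")
                return
            }
            broadcastAVC.delegate = self
            DispatchQueue.main.async {
                self.present(broadcastAVC, animated: true)
            }
        }
    }

    func broadcastActivityViewController(_ broadcastActivityViewController: RPBroadcastActivityViewController,
                                         didFinishWith broadcastController: RPBroadcastController?,
                                         error: Error?) {
        // See the implementation in the question above.
    }
}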

I'm seeing a similar thing.
MobCrush - Works beautifully
Periscope - Stream starts, connects and the entry shows up in Periscope, but the video is blank/inaccessible when you try to view it, either live or saved.
YouTube - An error occurs that stops the stream from starting, yet a Scheduled Livestream entry appears for the live stream you attempted. For me it is scheduled about 8 hours in the past (but I'm sure this value depends on your system clock relative to the US West Coast).
So it appears only MobCrush has upheld its end of the bargain.

Related

Spotify AppRemote is not reconnecting once disconnected in iOS

I am using Spotify in my iOS application in order to list and play playlists. Initially I connect appRemote and fetch the essential playlist contents with scopes such as .appRemote, .playlist, .playlistReadPrivate, etc.
The app works fine, but if I receive a phone call, appRemote gets disconnected with the log below:
AppRemote: Disconnected with error: Error Domain=com.spotify.app-remote.transport Code=-2001 "End of stream." UserInfo={NSLocalizedRecoverySuggestion=Reconnect to the Spotify app., NSLocalizedDescription=End of stream., NSLocalizedFailureReason=One of the streams has reached the end.}
I tried disconnecting appRemote in applicationWillResignActive and reconnecting (with appRemote.connect()) when the app returns to the foreground. Neither helps. Is there any way to fix this issue, or do we have to reauthorize once again? Any help will be much appreciated. Thanks.
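For context, a hedged sketch of the resign-active / become-active handling described above; appRemote and accessToken are assumed properties of the app delegate, not the asker's exact code:
// Sketch only: disconnect on resign-active, reconnect on become-active
// using the stored access token rather than reauthorizing.
func applicationWillResignActive(_ application: UIApplication) {
    if appRemote.isConnected {
        appRemote.disconnect()
    }
}

func applicationDidBecomeActive(_ application: UIApplication) {
    if !appRemote.isConnected {
        appRemote.connectionParameters.accessToken = accessToken
        appRemote.connect()
    }
}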
I've tried handling reconnect attempts in:
- (void)appRemote:(SPTAppRemote *)appRemote didFailConnectionAttemptWithError:(NSError *)error
with a counter variable and an increasing delay:
[self performSelector:@selector(connect2Spotify) withObject:self afterDelay:3.0];
Not perfect... but it works in most cases:
- (void)connect2Spotify
{
    if (self.appRemote != nil)
    {
        if (!self.appRemote.connected) {
            self.appRemote.connectionParameters.accessToken = self.token;
            [self.appRemote connect];
        }
    }
}
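The same retry-with-increasing-delay idea in Swift, as a hedged sketch; appRemote, token and retryCount are assumed properties, and retryCount would be reset in appRemoteDidEstablishConnection(_:):
// Hedged Swift sketch of the retry approach above, not the original answer's code.
func appRemote(_ appRemote: SPTAppRemote, didFailConnectionAttemptWithError error: Error?) {
    retryCount += 1
    let delay = min(Double(retryCount) * 3.0, 30.0)   // 3s, 6s, 9s ... capped at 30s
    DispatchQueue.main.asyncAfter(deadline: .now() + delay) { [weak self] in
        guard let self = self, !self.appRemote.isConnected else { return }
        self.appRemote.connectionParameters.accessToken = self.token
        self.appRemote.connect()
    }
}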

Programmatically trigger the action that a headphone pause button would do

I am trying to find a way to pause any playing media on the device, so I was thinking of triggering the same logic that is fired when a user presses the headphone "middle button".
I managed to prevent music from resuming (after I pause it within my app, which basically starts an AVAudioSession for recording) by NOT setting the AVAudioSession's active property back to false and leaving it hanging, but I'm pretty sure that's a bad way to do it; if I deactivate the session, the music resumes. The other option I'm considering is playing some kind of silent loop that would "imitate" the silence I need. But if what I'm seeking is doable, it would be the best approach, since, as I understood from this question, it cannot be done by normal means.
func stopAudioSession() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if audioSession.secondaryAudioShouldBeSilencedHint {
            print("someone is playing....")
        }
        try audioSession.setActive(false, options: .notifyOthersOnDeactivation)
        isSessionActive = false
    } catch let error as NSError {
        print("Unable to deactivate audio session: \(error.localizedDescription)")
        print("retrying.......")
    }
}
In this snippet, as the function name implies, I set active to false. I tried to find other options, but I could not find another way of stopping my recording session while preventing the other app that was already playing from resuming.
Can someone guide me to which framework I should look into: whether, for example, I can tap into the hardware side and trigger the event, or find out which framework is listening to this button press and handling the pause/play functionality?
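For what it's worth, the framework that receives the headphone play/pause presses is MediaPlayer's MPRemoteCommandCenter, but it only delivers those events for your own app's playback; it cannot send a pause to another app, which is why the answer below goes through a music player instead. A minimal sketch:
import MediaPlayer

// Hedged sketch: registering for the remote (headphone/lock screen) pause and
// play commands. This only affects your own app's playback; it cannot pause other apps.
func registerRemoteCommands() {
    let center = MPRemoteCommandCenter.shared()
    _ = center.pauseCommand.addTarget { _ in
        // pause your own player here
        return .success
    }
    _ = center.playCommand.addTarget { _ in
        // resume your own player here
        return .success
    }
}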
A friend of mine who is more experienced in iOS development suggested the following workaround, and it worked. I am posting it here as it might help someone trying to achieve similar behaviour.
In order to stop/pause whatever is currently playing on the user's device, you need to add a music player to your app. Then, at the point where you need to pause/stop the current media, you simply initiate the player, play, and then pause/stop it - simple :)
like so:
let musicPlayer = MPMusicPlayerApplicationController.applicationQueuePlayer

func stopMedia() {
    MPMediaLibrary.requestAuthorization({ (newPermissionStatus: MPMediaLibraryAuthorizationStatus) in
        self.musicPlayer.setQueue(with: .songs())
        self.musicPlayer.play()
        print("Stopping music player")
        self.musicPlayer.pause()
        print("Stopped music player")
    })
}
The MPMediaLibrary.requestAuthorization part is needed to avoid an authorisation error when accessing the user's media library. And of course you will need to add the Privacy - Media Library Usage Description key to your Info.plist file.
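The raw Info.plist key behind "Privacy - Media Library Usage Description" is NSAppleMusicUsageDescription. A hedged usage sketch around the stopMedia() above, checking the library authorization status first (pauseOtherAudioIfPossible is an illustrative name):
import MediaPlayer

// Sketch only: run the play/pause trick when access is granted or can still be requested.
func pauseOtherAudioIfPossible() {
    switch MPMediaLibrary.authorizationStatus() {
    case .authorized, .notDetermined:
        stopMedia()   // stopMedia() itself requests authorization if needed
    default:
        print("Media library access denied; the play/pause trick cannot run")
    }
}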

Connecting bluetooth headphones while app is recording in the background causes the recording to stop

I am facing the following issue and hoping someone else encountered it and can offer a solution:
I am using AVAudioEngine to access the microphone. Until iOS 12.4, every time the audio route changed I was able to restart the AVAudioEngine graph to reconfigure it and ensure the input/output audio formats fit the new input/output route. Due to changes introduced in iOS 12.4 it is no longer possible to start (or restart for that matter) an AVAudioEngine graph while the app is backgrounded (unless it's after an interruption).
The error Apple now throws when I attempt this is:
2019-10-03 18:34:25.702143+0200 [1703:129720] [aurioc] 1590: AUIOClient_StartIO failed (561145187)
2019-10-03 18:34:25.702528+0200 [1703:129720] [avae] AVAEInternal.h:109 [AVAudioEngineGraph.mm:1544:Start: (err = PerformCommand(*ioNode, kAUStartIO, NULL, 0)): error 561145187
2019-10-03 18:34:25.711668+0200 [1703:129720] [Error] Unable to start audio engine The operation couldn’t be completed. (com.apple.coreaudio.avfaudio error 561145187.)
I'm guessing Apple closed a security vulnerability there. So now I removed the code that restarts the graph when the audio route changes (e.g. Bluetooth headphones are connected).
It seems that when an I/O audio format changes (as happens when the user connects a Bluetooth device), an .AVAudioEngineConfigurationChange notification is fired to allow the integrating app to react to the change in format. This is really what I should have used from the beginning to handle changes in I/O formats, instead of brute-force restarting the graph. According to the Apple documentation - “When the audio engine’s I/O unit observes a change to the audio input or output hardware’s channel count or sample rate, the audio engine stops, uninitializes itself, and issues this notification.” (see the docs here). When this happens while the app is backgrounded, I am unable to start the audio engine with the correct audio I/O formats, because of point #1.
So bottom line, it looks like by closing a security vulnerability, Apple introduced a bug in reacting to audio I/O format changes while the app is backgrounded. Or am I missing something?
I'm attaching a code snippet to better describe the issue. For a plug-and-play AppDelegate see here - https://gist.github.com/nevosegal/5669ae8fb6f3fba44505543e43b5d54b.
class RCAudioEngine {

    private let audioEngine = AVAudioEngine()

    init() {
        setup()
        start()
        NotificationCenter.default.addObserver(self, selector: #selector(handleConfigurationChange), name: .AVAudioEngineConfigurationChange, object: nil)
    }

    @objc func handleConfigurationChange() {
        //attempt to call start()
        //or to audioEngine.reset(), setup() and start()
        //or any other combination that involves starting the audioEngine
        //results in an error 561145187.
        //Not calling start() doesn't return this error, but also doesn't restart
        //the recording.
    }

    public func setup() {
        //Setup nodes
        let inputNode = audioEngine.inputNode
        let inputFormat = inputNode.inputFormat(forBus: 0)
        let mainMixerNode = audioEngine.mainMixerNode

        //Mute output to avoid feedback
        mainMixerNode.outputVolume = 0.0

        inputNode.installTap(onBus: 0, bufferSize: 4096, format: inputFormat) { (buffer, _) -> Void in
            //Do audio conversion and use buffers
        }
    }

    public func start() {
        RCLog.debug("Starting audio engine")
        guard !audioEngine.isRunning else {
            RCLog.debug("Audio Engine is already running")
            return
        }

        do {
            audioEngine.prepare()
            try audioEngine.start()
        } catch {
            RCLog.error("Unable to start audio engine \(error.localizedDescription)")
        }
    }
}
I only see one fix that went into iOS 12.4, and I am not sure whether it causes this issue.
From the release notes (https://developer.apple.com/documentation/ios_ipados_release_notes/ios_12_4_release_notes):
"Resolved an issue where running an app in iOS 12.2 or later under the Leaks instrument resulted in random numbers of false-positive leaks for every leak check after the first one in a given run. You might still encounter this issue in Simulator, or in macOS apps when using Instruments from Xcode 10.2 and later. (48549361)"
You can raise an issue with Apple if you are a registered developer; they might help you if the defect is on their part.
You can also test with the upcoming iOS release (via the Apple beta program) to check whether your code works in the future release.
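One possible workaround sketch, under the assumption that the restriction only applies while the app is backgrounded: defer the restart until the app returns to the foreground instead of restarting inside the notification handler. needsRestart and appDidBecomeActive are illustrative names, not part of the original code, and UIApplication.didBecomeActiveNotification would be observed in init() alongside the existing observer.
import UIKit

// Sketch of additions inside RCAudioEngine; not a confirmed fix.
private var needsRestart = false

@objc func handleConfigurationChange() {
    DispatchQueue.main.async {
        if UIApplication.shared.applicationState == .active {
            self.start()                // safe to restart while the app is active
        } else {
            self.needsRestart = true    // restart later, once foregrounded
        }
    }
}

@objc func appDidBecomeActive() {
    if needsRestart {
        needsRestart = false
        start()
    }
}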

Chromecast Sleep/Background issue in iOS app

I am facing a big issue while using Chromecast in my application. I am using the standard GCKUICastButton to connect to the Chromecast, and the video plays well.
I am using the default Cast receiver for my application. When the device goes to sleep, sometimes the Chromecast stops, and sometimes it disconnects after some time in sleep mode. After going through a lot of forums and Stack Overflow questions, I implemented the code below:
extension GCKSessionManager {
    static func ignoreAppBackgroundModeChange() {
        guard
            let oldMethod = class_getInstanceMethod(GCKSessionManager.self, #selector(GCKSessionManager.suspendSession(with:))),
            let newMethod = class_getInstanceMethod(GCKSessionManager.self, #selector(GCKSessionManager.suspendSessionIgnoringAppBackgrounded(with:)))
        else { return }
        method_exchangeImplementations(oldMethod, newMethod)
    }

    @objc func suspendSessionIgnoringAppBackgrounded(with reason: GCKConnectionSuspendReason) -> Bool {
        guard reason != .appBackgrounded else { return false }
        return suspendSession(with: reason)
    }
}
Then in my code I added the line below:
GCKSessionManager.ignoreAppBackgroundModeChange()
Now the Chromecast no longer disconnects right away; however, after a few minutes of sleep it disconnects anyway, and the app gets killed as well. How can I retain the Chromecast play session even if the device goes to sleep or the app goes to the background?
Since I am using GCKUICastButton, I am not using GCKDeviceManager, so I am unable to use ignoreAppStateNotification on GCKDeviceManager. Can you advise whether I can use that as well?
I have also added the GCKCastOptions setup in my AppDelegate.
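One avenue that may be worth checking, depending on your Cast SDK version: GCKCastOptions exposes a suspendSessionsWhenBackgrounded flag that is meant to control this behaviour without swizzling. A hedged sketch of the AppDelegate setup (verify the property against your SDK's headers before relying on it):
import GoogleCast

// Sketch only: configure the Cast context so sessions are not suspended on
// backgrounding. kGCKDefaultMediaReceiverApplicationID is the stock receiver.
func setUpCastContext() {
    let criteria = GCKDiscoveryCriteria(applicationID: kGCKDefaultMediaReceiverApplicationID)
    let options = GCKCastOptions(discoveryCriteria: criteria)
    options.suspendSessionsWhenBackgrounded = false   // assumption: available in recent SDK versions
    GCKCastContext.setSharedInstanceWith(options)
}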

Is it possible to get audio from an ICY stream with percentage and seek function

I'm trying to play audio from an ICY stream. I'm able to play it with AVPlayer and some good open source libraries, but I'm not able to control the stream. I have no idea how to get the percentage played or how to seek to a specific time in the stream. Is that possible? Is there a good library that can help me?
Currently I'm using AFSoundManager, but I'm always receiving negative numbers for the percentage, and I get an invalid time when trying to seek the stream to a specific time.
This is the code I'm using:
AFSoundManager.sharedManager().startStreamingRemoteAudioFromURL("http://www.abstractpath.com/files/audiosamples/sample.mp3") { (percentage, elapsedTime, timeRemaining, error, poppi) in
    if error == nil {
        //This block will be fired when the audio progress increases in 1%
        if elapsedTime > 0 {
            println(elapsedTime)
            self.slider.value = Float(elapsedTime*1000)
        }
    } else {
        //Handle the error
        println(error)
    }
}
Of course I'm able to get the elapsedTime, but not the percentage or the remainingTime; I always get negative numbers.
This code works perfectly with remote or local audio files, but not with the stream.
This isn't possible.
These streams are live. There is nothing to seek to, because what you haven't heard hasn't happened yet. Even streams that play back music end-to-end are still "live" in the sense that the audio you haven't received hasn't been encoded yet. (Small codec and transit buffers aside, of course.)
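What you can still do with a live stream is observe the elapsed playback time. A minimal hedged sketch with AVPlayer (the URL and class name are placeholders; live items report an indefinite duration, so no percentage or remaining time can be derived):
import AVFoundation

// Sketch only: play an ICY/HTTP audio stream and log elapsed time once per second.
final class StreamPlayer {
    private let player = AVPlayer(url: URL(string: "http://example.com/live-stream")!)
    private var timeObserverToken: Any?

    func start() {
        player.play()
        let interval = CMTime(seconds: 1, preferredTimescale: 1)
        timeObserverToken = player.addPeriodicTimeObserver(forInterval: interval, queue: .main) { time in
            print("Elapsed: \(time.seconds) s")
            // currentItem?.duration is indefinite for live streams, so there is
            // nothing meaningful to compute a percentage against.
        }
    }

    func stop() {
        if let token = timeObserverToken {
            player.removeTimeObserver(token)
            timeObserverToken = nil
        }
        player.pause()
    }
}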
