iPhone Screen recording failed to save due to -5825 - ios

I have video streaming in my app, and I'm running into some odd screen-recording behavior on iPhone. The most obvious issue is that when the screen is recorded during a live stream, the iPhone fails to save the recording with error -5825.
Another issue is that audio in my app stops as soon as screen recording starts. I can screen record other apps, such as YouTube, without any problems.
It has been suggested elsewhere that this error indicates a lack of storage, but I have plenty of free space on my devices. I've tried several different iPhones and see the same issue on all of them.
So I was wondering: does my app need any special permissions or handling to allow screen recording?
Is this really an issue with my app (code), or is it purely a device issue?

Have you tried setting the AVAudioSession category to .multiRoute or .playAndRecord?
do {
    let session = AVAudioSession.sharedInstance()
    // .multiRoute (or .playAndRecord) lets the app's audio coexist with other routes
    try session.setCategory(.multiRoute)
    try session.setMode(.default) // or .moviePlayback
    try session.setActive(true)
} catch {
    print("Activating the audio session failed: \(error)")
}
I have the same issue but can't reliably reproduce it, so my next attempt is to add these lines of code.
Another thing I found is this thread - https://discussions.apple.com/thread/251342404
which says:
That usually appears when there is a space issue. Do you have ample free space on the iPhone? How long of a screen recording was it? Did you attempt to record any copyrighted materials?
UPDATE
It seems this could also be a bug in iOS 14+.
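If the audio stopping is interruption-related, it may also help to listen for AVAudioSession.interruptionNotification and restore the session when the interruption ends. A minimal, untested sketch (the observer class and the resumePlayback closure are my own illustrative names, not from the question):

```swift
import AVFoundation

// Sketch: observe audio session interruptions so playback can be resumed
// after screen recording (or another app) interrupts the session.
final class InterruptionObserver {
    /// Called when the system says it is safe to resume playback.
    var resumePlayback: (() -> Void)?

    init() {
        NotificationCenter.default.addObserver(
            self,
            selector: #selector(handleInterruption(_:)),
            name: AVAudioSession.interruptionNotification,
            object: AVAudioSession.sharedInstance()
        )
    }

    @objc private func handleInterruption(_ note: Notification) {
        guard let info = note.userInfo,
              let typeValue = info[AVAudioSessionInterruptionTypeKey] as? UInt,
              let type = AVAudioSession.InterruptionType(rawValue: typeValue)
        else { return }

        if type == .ended {
            let optionsValue = info[AVAudioSessionInterruptionOptionKey] as? UInt ?? 0
            let options = AVAudioSession.InterruptionOptions(rawValue: optionsValue)
            if options.contains(.shouldResume) {
                // Reactivate the session before resuming playback.
                try? AVAudioSession.sharedInstance().setActive(true)
                resumePlayback?()
            }
        }
    }
}
```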

Related

Turning off audio background mode triggers kMIDINotPermitted even though background audio is not needed

I've got a MIDISampler that is triggered by a MIDICallbackInstrument. I don't want my app to work in the background since it is an interactive ear training game. Unfortunately if I disable "Audio, AirPlay, and Picture in Picture" I get the following error:
CheckError.swift:CheckError(_:):176:kMIDINotPermitted: Have you enabled the audio background mode in your ios app?
Also the sampler plays fixed pitch sine waves instead of samples.
The obvious workaround would be to turn on the audio background mode even though it isn't needed; with it enabled everything works and there are no errors. Unfortunately, Apple rejects the app in that case because it doesn't actually use background audio.
Does anyone know a workaround for this?
You need background audio enabled for MIDISampler. As long as your app stops the audio engine when it moves into the background, Apple should allow it. I would explain your use case and appeal the rejection.
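A minimal sketch of that pattern, assuming you own an AVAudioEngine somewhere (the class name and engine property here are illustrative, not from the question):

```swift
import AVFoundation
import UIKit

// Sketch: stop the audio engine whenever the app leaves the foreground,
// and restart it on return, so background audio is never actually used.
final class EngineLifecycle {
    private let engine: AVAudioEngine

    init(engine: AVAudioEngine) {
        self.engine = engine
        let center = NotificationCenter.default
        center.addObserver(self,
                           selector: #selector(appDidEnterBackground),
                           name: UIApplication.didEnterBackgroundNotification,
                           object: nil)
        center.addObserver(self,
                           selector: #selector(appWillEnterForeground),
                           name: UIApplication.willEnterForegroundNotification,
                           object: nil)
    }

    @objc private func appDidEnterBackground() {
        engine.pause() // nothing plays while the app is backgrounded
    }

    @objc private func appWillEnterForeground() {
        try? engine.start()
    }
}
```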

Is there any way to keep playing haptics when the screen is locked?

This is not for an App Store app, so "creative" solutions welcome.
I'd like to keep haptics playing when the screen is locked.
I always get this though:
Stop Handler: The engine stopped for reason: 1
Audio session interrupt
I've added audio as a background mode so it's surprising to me that I'm getting the "audio session interrupt" as a stop reason, rather than, say, applicationSuspended.
Has anyone solved this problem?
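Not a confirmed fix for the lock-screen case, but one thing worth trying is restarting the haptic engine from its stopped and reset handlers, which at least surfaces the stop reason and retries automatically. An untested sketch:

```swift
import CoreHaptics

// Sketch: create a haptic engine that logs why it stopped and attempts to
// restart itself. This may still fail while the device remains locked.
func makeHapticEngine() throws -> CHHapticEngine {
    let engine = try CHHapticEngine()

    engine.stoppedHandler = { reason in
        print("Haptic engine stopped, reason: \(reason.rawValue)")
        // Retry; the restart can fail again during an audio session interruption.
        try? engine.start()
    }
    engine.resetHandler = {
        // The haptic server was reset; the engine must be started from scratch.
        try? engine.start()
    }

    try engine.start()
    return engine
}
```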

WebRTC audio is not working in lock screen using CallKit

I have tried many solutions from here but none of them work. WebRTC itself works fine: I get a connected status when accepting a call while the device is locked, and after unlocking, the audio opens and video starts. How can I get audio while the screen remains locked?
I enable RTCAudioSession when the call starts and disable it when the call stops.
It works fine once the device has been unlocked the first time; when I lock it again, I still get audio. But the first time, when I answer the call from CallKit, it doesn't work. It only starts working after the device is unlocked.
I don't know the cause of your specific problem, but what I learned from a similar issue while integrating CallKit with WebRTC is that you must acquire camera and microphone access only inside the provider(_:didActivate:) method of your CXProviderDelegate implementation. Otherwise, you will get weird issues.
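A sketch of that pattern, deferring audio setup until CallKit hands the app its audio session (the CallManager class and the startWebRTCAudio/stopWebRTCAudio hooks are placeholders for your RTCAudioSession handling, not real library calls):

```swift
import AVFoundation
import CallKit

// Sketch: only touch WebRTC audio / capture devices once CallKit has
// activated the audio session.
final class CallManager: NSObject, CXProviderDelegate {
    // Placeholder hooks for enabling/disabling RTCAudioSession.
    func startWebRTCAudio(with session: AVAudioSession) { /* enable RTCAudioSession */ }
    func stopWebRTCAudio() { /* disable RTCAudioSession */ }

    func provider(_ provider: CXProvider, didActivate audioSession: AVAudioSession) {
        // Only now is it safe to start WebRTC audio and acquire the microphone.
        startWebRTCAudio(with: audioSession)
    }

    func provider(_ provider: CXProvider, didDeactivate audioSession: AVAudioSession) {
        stopWebRTCAudio()
    }

    func providerDidReset(_ provider: CXProvider) {
        stopWebRTCAudio()
    }
}
```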

iOS Volume only adjustable on Playback

I'm developing an iOS app which plays back audio messages in segments (not continuous playback). When the app is opened, I initialize the audio session with the following options.
func _initAudioSesh() {
    let audioSession = AVAudioSession.sharedInstance()
    do {
        if #available(iOS 10.0, *) {
            try audioSession.setCategory(
                .playAndRecord,
                mode: .voiceChat,
                options: [
                    .defaultToSpeaker,
                    .allowBluetooth,
                    .allowBluetoothA2DP
                ]
            )
        } else {
            try audioSession.setCategory(.playAndRecord)
        }
    } catch {
        print("Setting category to playAndRecord failed: \(error)")
    }
}
Then, when ready to play audio, I grab focus using setActive(true) and release it using setActive(false).
The issue I'm encountering is that, in the app, the hardware volume buttons only work while audio is playing; otherwise, the buttons do nothing. I was able to hack around this by ALWAYS holding setActive(true), but that hack is ugly and causes other issues. Has anyone else seen the volume buttons not adjusting in-app, and only working while audio is actively playing?
As soon as I leave the app, volume adjustment works; as soon as I bring it back into focus, it stops working unless I begin to play audio.
I've tried changing how and when I create the audio session, with no success.
This ended up being caused by a specific library we're using (react-native-system-settings), which has now been patched. If other users encounter this, the fix seems to be allowing the UI to show in the volume view; otherwise, it doesn't let the hardware buttons affect volume.
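For reference, the usual native-side pattern is to keep an MPVolumeView present in the view hierarchy so the hardware buttons control the app's volume. A sketch (the offscreen placement is an illustrative trick, not part of the original answer):

```swift
import MediaPlayer
import UIKit

// Sketch: embed an MPVolumeView so the hardware volume buttons adjust the
// app's audio volume. It must exist in the hierarchy; placing it offscreen
// keeps it out of sight, whereas isHidden = true would defeat the purpose.
func addVolumeControl(to view: UIView) {
    let volumeView = MPVolumeView(frame: CGRect(x: -1000, y: -1000, width: 1, height: 1))
    view.addSubview(volumeView)
}
```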

Cordova - 'AVAudioSessionDelegateMediaPlayerOnly end interruption' error after pausing and resuming an app

This issue arose while using Cordova to play videos within an iOS application. I had thought to use the pause/resume events to interact with the HTML5 video. However, even when the video has stopped playing and the element has been set to display:none, faded out, etc., this error still appears in the console after the app is resumed, which then renders all video playback useless afterwards:
MP AVAudioSessionDelegateMediaPlayerOnly end interruption. Interruptor
<________-1874> category <(null)> resumable <0>, _state = 1
I have found related issues, but they are answered in Objective-C for native app building, and because I am building with Cordova they unfortunately don't apply.
Has anyone else playing videos within PhoneGap/Cordova/Chrome Apps come across this and can offer a solution? Or could anyone coding native iOS apps offer some advice as to why it is happening?
So to fix this (for anyone who may come across it in the future!), I had to resort to a slightly hacky method.
When the Cordova iOS app is pushed to the background and resumed, it looks as though any running video tags are unable to continue loading and playing video: the error above pops up in the Xcode console and the video element is black no matter what. (I was fading still images on top of a video tag, then loading a new video using data attributes after a hotspot on the image was pressed.) So when the app is paused I use JS/jQuery to remove the video element, and on resume I re-create it, using the last selected data attributes to pick up from where it left off.
Seems to work, and is surprisingly seamless! :)