AVAudioRecorder and AirPlay Mirroring - iOS

When I have an AVAudioRecorder session active (i.e. while I'm recording audio), I can't activate AirPlay mirroring on the device. AirPlay mirroring simply deactivates while the app is running and switches back on when the app exits. This post seems to suggest there is no way around this.
My thoughts are to try:
using a lower-level recording framework, or
outputting a separate window to the external display rather than mirroring (I've tried this; it doesn't work).
Is there another way around this, or do you know whether either of these methods is known to work?

Using an AudioQueue to record (as in Apple's SpeakHere sample code) rather than AVAudioRecorder works. It's a bit more work to implement, but recording continues whether AirPlay mirroring is on or off.
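For reference, a minimal sketch of the AudioQueue input path, assuming 16-bit mono linear PCM; the function names are illustrative, and a real recorder would write inBuffer->mAudioData to a file (and configure the AVAudioSession for recording) instead of just re-enqueueing the buffer:

#import <AudioToolbox/AudioToolbox.h>

// Capture callback: this sketch just hands the buffer back to the queue.
static void HandleInputBuffer(void *inUserData,
                              AudioQueueRef inAQ,
                              AudioQueueBufferRef inBuffer,
                              const AudioTimeStamp *inStartTime,
                              UInt32 inNumPackets,
                              const AudioStreamPacketDescription *inPacketDesc)
{
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

static AudioQueueRef StartRecordingQueue(void)
{
    // 16-bit, mono, 44.1 kHz linear PCM.
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger | kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 1;
    format.mBitsPerChannel   = 16;
    format.mFramesPerPacket  = 1;
    format.mBytesPerFrame    = 2;
    format.mBytesPerPacket   = 2;

    AudioQueueRef queue = NULL;
    AudioQueueNewInput(&format, HandleInputBuffer, NULL, NULL, NULL, 0, &queue);

    // Prime the queue with a few buffers before starting.
    for (int i = 0; i < 3; i++) {
        AudioQueueBufferRef buffer = NULL;
        AudioQueueAllocateBuffer(queue, 16384, &buffer);
        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
    }

    AudioQueueStart(queue, NULL);
    return queue;
}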

Related

Possible to access/monitor tvOS system audio output?

I'd like to develop a music visualizer app for tvOS which has the ability to listen to whatever background audio is playing (like Spotify, Pandora, etc.). Does anyone know if this is even possible on iOS/tvOS? In other words, there would have to be some functionality that allows the system audio output to be treated like an audio input.
I imagine that it would be the same functionality as doing a screen recording capture, at least the audio part.
My goal is to be able to do this programmatically (Objective-C) so that the user doesn't have to do anything; it just "works" out of the box, so to speak.
Thanks

Is it possible to avoid microphone voices when screen recording?

I want to record screen audio (with permission) for audio processing. Code of the "Input.installTap" type definitely works for me, but I want to eliminate outside sounds like human voices or background noise. I think it might work if I could stop the microphone, but I haven't found an answer yet.
Is this possible with anything?
Have you considered using ReplayKit and disabling the microphone? (It's available on iOS already, and I believe on macOS from 11.0+.)
Also check out the WWDC Big Sur sample code.
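In case it helps, here's a rough Objective-C sketch of that approach; the function name is made up, and it assumes iOS 11+ where startCaptureWithHandler:completionHandler: is available:

#import <ReplayKit/ReplayKit.h>

// Start an in-app capture with the microphone disabled, so only app/screen
// audio is delivered; human voices and room noise never enter the stream.
static void StartMicFreeCapture(void)
{
    RPScreenRecorder *recorder = [RPScreenRecorder sharedRecorder];
    recorder.microphoneEnabled = NO;

    [recorder startCaptureWithHandler:^(CMSampleBufferRef sampleBuffer,
                                        RPSampleBufferType bufferType,
                                        NSError *error) {
        if (bufferType == RPSampleBufferTypeAudioApp) {
            // Process app audio here; no RPSampleBufferTypeAudioMic buffers
            // arrive while microphoneEnabled is NO.
        }
    } completionHandler:^(NSError *error) {
        if (error) NSLog(@"startCapture failed: %@", error);
    }];
}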

AVPlayer not working with AirPlay

I have a sequence of AVMutableCompositions that I am playing with AVPlayer. Everything works great when I am playing via speaker, headphones, or bluetooth. However, as soon as I connect to AirPlay, everything falls apart.
I can play one asset, but when I try to switch to a new asset I get an AVAudioSessionMediaServicesWereResetNotification and the device disconnects from AirPlay. I tried exporting the AVMutableCompositions to files and then using AVURLAssets, but this didn't seem to change anything.
Does anyone know what is going on here? It looks like I won't be able to support AirPlay...
I get the feeling that I could switch to AVAudioPlayer and this would fix the problem, but I am generating dozens of AVMutableCompositions and don't want to be reading and writing them to disk.
This is counter-intuitive, but try setting
myPlayer.allowsExternalPlayback = NO;
The documentation is not completely clear on this, but it seems this disallows video playback only. When I do this in my app (which uses AVPlayer), AirPlay with an AVComposition works. Even the artwork shows up nicely (via MPNowPlayingInfoCenter).
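Put together, a small sketch of that workaround (the function name and title string are placeholders; `player` stands in for your AVPlayer):

#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>

static void ConfigurePlayerForAirPlayAudio(AVPlayer *player)
{
    // Disallows external (video) playback routing; audio from an
    // AVComposition keeps streaming over AirPlay.
    player.allowsExternalPlayback = NO;

    // Optional: surface metadata/artwork on the AirPlay device and lock screen.
    [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo =
        @{ MPMediaItemPropertyTitle : @"Now Playing" };
}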

AVCaptureSession startRunning is unmuting device

My application has various sound effects for buttons and other actions; if the device is muted/silenced they don't make sounds, as expected. However, one of the screens does video recording, and navigating to that screen enables all of the sound effects everywhere in the app. By commenting out some things I determined that it's the startRunning function that does this. I'm not sure whether this is just normal behavior (starting the camera enables related things, like audio) or whether something weird is going on that I can change.
If you're doing video recording, you're most likely using the AVAudioSessionCategoryPlayAndRecord category. This category always ignores the mute switch on the side of the device, by design. See here for definitions of all the AVAudioSession categories. In short, there's no way to respect the mute switch while using this audio category, so when you switch away from that screen, consider setting the audio session category to something else, such as AVAudioSessionCategoryAmbient, if that won't affect your app.
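For example, something along these lines when leaving the recording screen (the function name is illustrative):

#import <AVFoundation/AVFoundation.h>

// Switch back to a category that respects the ring/silent switch once the
// recording screen is no longer active.
static void RestoreSilentSwitchBehavior(void)
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    BOOL ok = [session setCategory:AVAudioSessionCategoryAmbient error:&error];
    if (ok) ok = [session setActive:YES error:&error];
    if (!ok) NSLog(@"Could not switch audio session category: %@", error);
}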

Independent volume control of AVAudioPlayer and MPMusicPlayerController in an iOS app

Within my application, I am playing downloaded audio using an AVAudioPlayer, while simultaneously playing audio from the user's iPod music library with an MPMusicPlayerController.
I need to be able to adjust the volume of the AVAudioPlayer instance so that it's louder than the audio coming from the MPMusicPlayerController. The problem is that when I adjust the volume property of the AVAudioPlayer, it also adjusts the volume of the MPMusicPlayerController.
Is there any solution that would allow me to independently control the volume of these two players? If not, is there another technique I should use to do this? Any help is appreciated.
Take a look at the documentation for AVAudioSession. For example, the AVAudioSession Programming Guide says the following:
"Finally, you can enhance a category to automatically lower the volume of other audio when your audio is playing. This could be used, for example, in an exercise application. Say the user is exercising along to their iPod when your application wants to overlay a verbal message—for instance, “You’ve been rowing for 10 minutes.” To ensure that the message from your application is intelligible, apply the kAudioSessionProperty_OtherMixableAudioShouldDuck property to your audio session. When ducking takes place, all other audio on the device—apart from phone audio—lowers in volume."
I think it might solve your problem. The documentation on initializing an AVAudioSession and setting its categories and properties is pretty clear and easy to follow; you should have no trouble.
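If you're on a recent SDK, the modern AVAudioSession equivalent of the property quoted above is the DuckOthers category option; a rough sketch (the function name is made up):

#import <AVFoundation/AVFoundation.h>

// Duck other audio (e.g. the iPod/MPMusicPlayerController stream) while this
// app's AVAudioPlayer is playing.
static void EnableDucking(void)
{
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    BOOL ok = [session setCategory:AVAudioSessionCategoryPlayback
                       withOptions:AVAudioSessionCategoryOptionDuckOthers
                             error:&error];
    if (ok) ok = [session setActive:YES error:&error];
    if (!ok) NSLog(@"Audio session setup failed: %@", error);
}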
