I am using a ReplayKit Broadcast Extension to record the system screen. I am able to get the sample buffers for video, audioApp, and audioMic and process them.
Is there any way to get the microphone status, i.e. whether it is enabled or disabled, during the screen recording?
If you have declared the recorder as
var recorder = RPScreenRecorder.shared()
then you can check it like this:
if recorder.isMicrophoneEnabled {
    // microphone is currently enabled
}
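For context, here is a minimal sketch of reading that flag during an in-process capture (iOS 11+). Note that this uses RPScreenRecorder's startCapture path, not the broadcast extension's sample handler, so treat it purely as an illustration:
let recorder = RPScreenRecorder.shared()
recorder.isMicrophoneEnabled = true   // request the mic before starting capture
recorder.startCapture(handler: { sampleBuffer, bufferType, error in
    // .audioMic buffers only arrive while the microphone is enabled;
    // the flag can also be polled at any time:
    if !recorder.isMicrophoneEnabled {
        // microphone is currently off
    }
}, completionHandler: nil)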
I am using AVCaptureSession to develop a camera app.
I need to trigger some vibration feedback when the user touches certain UI elements, using code like the following:
// Singleton wrapper around the generators (the class name here is just illustrative)
final class FeedbackManager {
    static let shared = FeedbackManager()

    private var _impact = UIImpactFeedbackGenerator()
    private var _select = UISelectionFeedbackGenerator()

    private init() {
        _impact.prepare()
        _select.prepare()
    }

    func impact() {
        _impact.impactOccurred()
    }

    func select() {
        _select.selectionChanged()
    }
}
so I can invoke select() or impact() to trigger feedback.
But if I add an audio device to the AVCaptureSession, all feedback stops working. So, to keep the feedback effect, I have to remove the audio device from the AVCaptureSession first and only add it back when I need to record a video; that operation lags captureOutput and makes the camera preview freeze for a moment.
So I tried another approach: always keep the audio device added to the AVCaptureSession, fetch all audio-related AVCaptureConnection objects from AVCaptureSession.connections, and set isEnabled to false or true. But this didn't work either: even with AVCaptureConnection.isEnabled set to false, the feedback is still broken.
I think the only way to make the feedback work may be to not add the audio device to the AVCaptureSession at all.
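For reference, a rough sketch of the connection-disabling attempt described above, in Swift 4 style (the captureSession name is illustrative); as noted, even with the connections disabled the haptics stayed broken:
// Find every connection that carries audio and disable it
let audioConnections = captureSession.connections.filter { connection in
    connection.inputPorts.contains { $0.mediaType == .audio }
}
audioConnections.forEach { $0.isEnabled = false }   // set back to true before recording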
I'm trying to mimic the behaviour of the Phone app during a call, where you can easily switch the output between the speaker and headphones.
I know I can force the speaker as the output when headphones are connected by calling:
try! audioSession.setCategory(AVAudioSessionCategoryPlayAndRecord)
try! audioSession.overrideOutputAudioPort(.speaker)
However, when I do that, I don't see any way to detect whether headphones are still connected to the device.
I initially thought outputDataSources on AVAudioSession would return all possible outputs, but it always returns nil.
Is there something I'm missing?
You need to change the output route back: once you have overridden it, the route contains only the .speaker option.
The documentation describes the behaviour:
If your app uses the playAndRecord category, calling this method with the AVAudioSession.PortOverride.speaker option causes audio to be routed to the built-in speaker and microphone regardless of other settings. This change remains in effect only until the current route changes or you call this method again with the AVAudioSession.PortOverride.none option.
So the audio is routed to the built-in speaker, and this change remains in effect only until the current route changes or you call the method again with the .none option.
It's not possible to forcibly direct sound to the headphones unless an accessory is plugged into the headphone jack (which activates a physical switch that routes audio to the headphones).
So when you want to switch back to the headphones, the following should work.
If no headphones are connected, it will switch the output to the small earpiece speaker at the top of the device instead of the loud speaker.
let session: AVAudioSession = AVAudioSession.sharedInstance()
do {
    // .none removes the speaker override, so the system routes audio
    // back to the headphones if they are connected
    try session.setCategory(AVAudioSessionCategoryPlayAndRecord)
    try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
    try session.setActive(true)
} catch {
    print("Couldn't override output audio port")
}
Read more about AVAudioSession.overrideOutputAudioPort(_:) here.
You can check whether a headset is connected by adding this extension:
extension AVAudioSession {
static var isHeadphonesConnected: Bool {
return sharedInstance().isHeadphonesConnected
}
var isHeadphonesConnected: Bool {
return !currentRoute.outputs.filter { $0.isHeadphones }.isEmpty
}
}
extension AVAudioSessionPortDescription {
var isHeadphones: Bool {
return portType == AVAudioSessionPortHeadphones
}
}
Then you can simply check:
session.isHeadphonesConnected
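If you also need to react the moment headphones are plugged in or pulled out (an addition beyond the original question), you can observe the route-change notification; a minimal sketch using the same era of API:
NotificationCenter.default.addObserver(
    forName: .AVAudioSessionRouteChange,   // AVAudioSession.routeChangeNotification in newer SDKs
    object: nil,
    queue: .main) { _ in
    // Re-check the current route whenever it changes
    print("Headphones connected: \(AVAudioSession.isHeadphonesConnected)")
}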
I'm using MPRemoteCommandCenter and MPMusicPlayerController.applicationMusicPlayer on the iPhone.
I'm trying to receive remote control events when the user is playing music and double taps on the headphone button.
If I use AVAudioPlayer, the remote commands are received perfectly.
However, if I use MPMusicPlayerController with any of its players (systemMusicPlayer, applicationMusicPlayer, or applicationQueuePlayer), the commands are not received; they appear to get gobbled up. For example, when I double tap the remote, the music toggles between play and stop. Instead, I need the remote events sent to my app.
Below is a sample app with my code. In the Info.plist I've specified the required background mode for an app that plays audio (although it's not necessary).
import UIKit
import MediaPlayer
class ViewController: UIViewController {
var mpPlayer:MPMusicPlayerController!
@objc func remoteHandler() {
    print("success")
}
override func viewDidLoad() {
super.viewDidLoad()
mpPlayer = MPMusicPlayerController.applicationMusicPlayer()
//mpPlayer = MPMusicPlayerController.systemMusicPlayer()
assert(mpPlayer != nil)
let cc = MPRemoteCommandCenter.shared()
print("cc = \(cc)")
cc.nextTrackCommand.isEnabled = true
cc.nextTrackCommand.addTarget(self, action: #selector(ViewController.remoteHandler))
cc.previousTrackCommand.isEnabled = true
cc.previousTrackCommand.addTarget(self, action: #selector(ViewController.remoteHandler))
cc.playCommand.isEnabled = true
cc.playCommand.addTarget(self, action: #selector(ViewController.remoteHandler))
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
print("AVAudioSession successfully set AVAudioSessionCategoryPlayback")
} catch let error as NSError {
print("AVAudioSession setCategory error: \(error.localizedDescription)")
}
mpPlayer.setQueue(with: ["270139033"]) // requires iOS 10.3
mpPlayer.play()
}
}
Output is:
cc = 0x123e086c0
AVAudioSession successfully set AVAudioSessionCategoryPlayback
remoteHandler is never called.
From the Apple Developer web site:
When you use either the system or application player, you do not get
event notifications. Those players automatically handle events.
So there is no way to receive remote control events if you use MPMusicPlayerController. Looking forward to seeing this feature! Right now MPMusicPlayerController is the only way to play Apple Music songs.
I am testing this using iOS 10.2 on my actual iPhone 6s device.
I am playing streamed audio and am able to play/pause audio, skip tracks, etc. I also have enabled background modes and the audio plays in the background and continues through a playlist properly. The only issue I am having is getting the lock screen controls to show up. Nothing displays at all...
In viewDidLoad() of my MainViewController, right when my app launches, I call this...
func setupAudioSession(){
UIApplication.shared.beginReceivingRemoteControlEvents()
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.mixWithOthers)
self.becomeFirstResponder()
do {
try AVAudioSession.sharedInstance().setActive(true)
print("AVAudioSession is Active")
} catch let error as NSError {
print(error.localizedDescription)
}
} catch let error as NSError {
print(error.localizedDescription)
}
}
and then in my AudioPlayer class after I begin playing audio I call ...
func setupLockScreen(){
let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.nextTrackCommand.isEnabled = true
commandCenter.nextTrackCommand.addTarget(self, action:#selector(skipTrack))
MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: "TESTING"]
}
When I lock my iPhone and then tap the power button again to go to the lock screen, the audio controls are not displayed at all. It is as if no audio is playing; I just see my normal background photo. No controls are displayed in the control panel either (swiping up on the home screen and then swiping left to where the music controls should be).
Is the issue because I am not using AVAudioPlayer or AVPlayer? But then how does, for example, Spotify get the lock screen controls to display using their own custom audio player? Thanks for any advice / help.
The issue turned out to be this line...
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: AVAudioSessionCategoryOptions.duckOthers)
Once I changed it to
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: [])
everything worked fine. So it seems that passing in any option for the AVAudioSessionCategoryOptions argument causes the lock screen controls not to display. I also tried passing in .mixWithOthers, and that too caused the lock screen controls not to be displayed.
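Putting the pieces together, the session setup that ended up working in this thread boils down to something like the following (a condensed sketch of what is already shown above):
do {
    UIApplication.shared.beginReceivingRemoteControlEvents()
    // Passing an empty options set is what allows the lock screen controls to appear
    try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, with: [])
    try AVAudioSession.sharedInstance().setActive(true)
} catch {
    print(error.localizedDescription)
}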
This example is in Swift 4, works with iOS 11, and only shows how to display the player on the lock screen. To see how to play audio on the device, you can follow this thread: https://stackoverflow.com/a/47710809/1283517
import MediaPlayer
import AVFoundation
Declare player
var player : AVPlayer?
Now create a function
func setupLockScreen(){
let commandCenter = MPRemoteCommandCenter.shared()
commandCenter.nextTrackCommand.isEnabled = true
commandCenter.togglePlayPauseCommand.addTarget(self, action: #selector(controlPause))
MPNowPlayingInfoCenter.default().nowPlayingInfo = [MPMediaItemPropertyTitle: currentStation]
}
Now create a function to handle the play and pause events. I have created a Bool, isPlaying, to track the status of the player.
@objc func controlPause() {
if isPlaying == true {
player?.pause()
isPlaying = false
} else {
player?.play()
isPlaying = true
}
}
And that's it. The player will now be displayed on the lock screen.
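If you also want the lock screen scrubber and timing to reflect playback (not required just to make the controls appear), you can populate a few more now-playing keys; the values below are purely illustrative:
var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
info[MPMediaItemPropertyPlaybackDuration] = 180.0        // illustrative track length in seconds
info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = 0.0
info[MPNowPlayingInfoPropertyPlaybackRate] = 1.0         // 1.0 while playing, 0.0 when paused
MPNowPlayingInfoCenter.default().nowPlayingInfo = info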
Yes, for the lock screen controls to work you need to play audio through the iOS APIs. I'm not sure how Spotify does it, but they may be running a second audio session in parallel for this purpose and using the controls to drive both. Your background handler (the singleton in my case) could start playing the second audio at zero volume when the app goes into the background and stop it when it returns to the foreground. I haven't tested it myself, but it's an option to try.
I have an app that needs to have:
Background music playing while using the app (eg. spotify)
Background music playing while watching movie from AVPlayer
Stop the music when recording a video
Like Snapchat, the camera-viewcontroller is part of a "swipeview" and therefore always on.
However, when opening and closing the app, the music makes a short "crack" noise/sound that ruins the music.
I recorded it here:
https://soundcloud.com/morten-stulen/hacky-sound-ios
(3 occurrences)
I use these settings to configure the AVAudioSession in the app delegate's didFinishLaunchingWithOptions:
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayAndRecord,withOptions:
[AVAudioSessionCategoryOptions.MixWithOthers,
AVAudioSessionCategoryOptions.DefaultToSpeaker])
try AVAudioSession.sharedInstance().setActive(true)
} catch {
print("error")
}
I use the LLSimpleCamera control for video recording and I've set the session there to:
_session.automaticallyConfiguresApplicationAudioSession = NO;
It seems others have the same problem with other camera libraries as well:
https://github.com/rFlex/SCRecorder/issues/127
https://github.com/rFlex/SCRecorder/issues/224
This guy removed the audioDeviceInput, but I kinda need that for recording video.
https://github.com/omergul123/LLSimpleCamera/issues/48
I also tried Apple's "AVCam" sample code, and I still have the same issue. How does Snapchat do this?!
Any help would be greatly appreciated, and I'll gladly provide more info or code!
I do something similar to what you want, though without the camera aspect, and I think this will do what you need. My app allows background audio that mixes with non-fullscreen video/audio. When the user plays an audio file or a fullscreen video file, I stop the background audio completely.
The reason I set SoloAmbient and then Playback is that I allow my audio to play in the background when the device is locked. Switching to SoloAmbient stops all background music, and then switching to Playback lets my audio play in the app as well as in the background.
That is also why you see a call to a method that sets the lock screen information in the unload method; in this case it nulls it out so that there is no lock screen info.
In AppDelegate.swift
//MARK: Audio Session Mixing
func allowBackgroundAudio()
{
do {
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback, withOptions: .MixWithOthers)
} catch {
NSLog("AVAudioSession SetCategory - Playback:MixWithOthers failed")
}
}
func preventBackgroundAudio()
{
do {
//Ask for Solo Ambient to prevent any background audio playing, then change to normal Playback so we can play while locked
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategorySoloAmbient)
try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
} catch {
NSLog("AVAudioSession SetCategory - SoloAmbient failed")
}
}
When I want to stop background audio, for example when playing an audio track that should be alone, I do the following:
In MyAudioPlayer.swift
func playUrl(url: NSURL?, backgroundImageUrl: NSURL?, title: String, subtitle: String)
{
ForgeHelper.appDelegate().preventBackgroundAudio()
if _mediaPlayer == nil {
self._mediaPlayer = MediaPlayer()
_mediaPlayer!.delegate = self
}
//... Code removed for brevity
}
And when I'm done with my media playing, I do this:
private func unloadMediaPlayer()
{
if _mediaPlayer != nil {
_mediaPlayer!.unload()
self._mediaPlayer = nil
}
_controlView.updateForProgress(0, duration: 0, animate: false)
ForgeHelper.appDelegate().allowBackgroundAudio()
setLockScreenInfo()
}
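Applied to the camera scenario in the question, the same pattern could wrap the recording calls; a rough sketch where the camera object and its start/stop methods are purely illustrative:
func startVideoRecording() {
    // Silence Spotify and other background audio before capture begins
    ForgeHelper.appDelegate().preventBackgroundAudio()
    camera.startRecording()            // illustrative camera API
}

func stopVideoRecording() {
    camera.stopRecording()             // illustrative camera API
    // Let background music resume mixing with the app
    ForgeHelper.appDelegate().allowBackgroundAudio()
}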
Hope this helps you out!