I've added an MPVolumeView to my app so that I can pick an output route for my audio via AirPlay, but the button for this is not showing.
I don't get any compiler errors, and after reading "AirPlay Button is not showing in Player Controls with AVPlayer" I can't work out what I'm doing differently.
The code:
override func viewDidLoad() {
    super.viewDidLoad()
    let volumeView = MPVolumeView(frame: myView.bounds)
    self.view.addSubview(volumeView)
    volumeView.showsRouteButton = true
    self.view.backgroundColor = UIColor.red
    volumeView.center = CGPoint(x: 380, y: 150)
}
Any advice is much appreciated.
I'm going to guess that this isn't supposed to work any more. On my device, if you change to an AirPlay device using the control center, the MPVolumeView disappears completely.
So I would advise against use of MPVolumeView, except perhaps for the simplest cases. It is extremely old technology and hasn't worked as advertised for years. Properties like showsRouteButton and areWirelessRoutesAvailable were deprecated years ago.
The user can change routes using the control center, or you can present an AVRoutePickerView; and you can detect route changes with AVRouteDetector. So you really don't need this feature of MPVolumeView anyway.
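For completeness, here's a minimal sketch of the AVRoutePickerView approach (iOS 11+; PlayerViewController is an illustrative name, not the asker's class):
import UIKit
import AVKit
import AVFoundation

class PlayerViewController: UIViewController {
    // Keep a strong reference: route detection stops if the detector is deallocated.
    let routeDetector = AVRouteDetector()

    override func viewDidLoad() {
        super.viewDidLoad()
        // AVRoutePickerView draws the AirPlay button itself and presents
        // the system route picker when tapped.
        let routePicker = AVRoutePickerView(frame: CGRect(x: 20, y: 100, width: 44, height: 44))
        routePicker.activeTintColor = .systemBlue // tint while an AirPlay route is active
        view.addSubview(routePicker)

        // AVRouteDetector reports whether any wireless routes exist,
        // e.g. so you can hide the button when there is nothing to pick.
        routeDetector.isRouteDetectionEnabled = true
    }
}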
I have developed an iOS radio audio app using Swift / UIKit and everything works well.
I want to integrate with CarPlay and have obtained the required entitlements.
I believe I have set everything up right for the most part as I can see the CPListTemplate with my CPListItems and on tapping one of them, it goes to the CPNowPlayingTemplate and the audio starts playing in the simulator.
While everything seems to be working well, there are two issues:
I can't seem to interact with the CPNowPlayingTemplate play/pause button; I just keep seeing the play button, and clicking it does nothing.
I am able to play and pause on the device's lock screen and through Control Center after adding this code:
func setupNowPlayingInfoCenter() {
    UIApplication.shared.beginReceivingRemoteControlEvents()

    MPRemoteCommandCenter.shared().playCommand.isEnabled = true
    MPRemoteCommandCenter.shared().playCommand.addTarget { [weak self] event in
        self?.reliableRadioPlayer?.play()
        return .success
    }

    MPRemoteCommandCenter.shared().pauseCommand.isEnabled = true
    MPRemoteCommandCenter.shared().pauseCommand.addTarget { [weak self] event in
        self?.reliableRadioPlayer?.pause()
        return .success
    }
}
The second issue is again on the same screen: I cannot see any of the metadata, such as the artwork, song name and artist name. Again, these show up on the device's lock screen and in Control Center with the help of these lines of code:
MPNowPlayingInfoCenter.default().nowPlayingInfo = [
    MPMediaItemPropertyTitle: currentlyPlaying.getSongName(),
    MPMediaItemPropertyArtist: currentlyPlaying.getSongArtist(),
    MPMediaItemPropertyArtwork: artwork
]
Do I need to set anything else up, or are these simply limitations of the CarPlay simulator?
Thanks
While I don't believe this is going to be the best answer and someone might come up with something better, here are some things that I believe could improve how you test CarPlay, along with some issues you might face:
Add these lines of code before launching your CPNowPlayingTemplate:
#if targetEnvironment(simulator)
UIApplication.shared.endReceivingRemoteControlEvents()
UIApplication.shared.beginReceivingRemoteControlEvents()
#endif
I immediately saw some improvements in the metadata displayed by the simulator.
The player status is not reflected accurately on the simulator
When you launch the app on the CarPlay simulator and the now-playing template is showing, your audio will most likely be playing, but the player's status will show as paused, and the isPlaying status will not be reflected accurately in your CPListTemplate if you have one.
There is nothing to worry about here, as it works fine in the car; however, I suggest just clicking the play button so you can see the active status, with the animated bars, in the CPNowPlayingTemplate and CPListTemplate screens.
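In my understanding, CPNowPlayingTemplate drives its play/pause state from MPNowPlayingInfoCenter, so it can also help to keep the reported playback rate in sync with your player. A minimal sketch, assuming a live radio stream (updatePlaybackState is an illustrative name):
func updatePlaybackState(isPlaying: Bool) {
    var info = MPNowPlayingInfoCenter.default().nowPlayingInfo ?? [:]
    // 1.0 tells the system UI (lock screen, Control Center, CarPlay)
    // that playback is active; 0.0 marks it as paused.
    info[MPNowPlayingInfoPropertyPlaybackRate] = isPlaying ? 1.0 : 0.0
    // A live stream has no fixed duration; declaring it as live avoids
    // a bogus progress bar (iOS 10+).
    info[MPNowPlayingInfoPropertyIsLiveStream] = true
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}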
Testing on a real device
While I don't think most of us can buy a car just to test CarPlay, you could look into buying a car stereo which supports CarPlay, like a Sony XAV-AX5500 or Sony XAV-AX1005DB, or something lower end.
You cannot power up a car stereo normally with the plugs at home, so I suggest searching YouTube for videos on how to do it; the easiest approach I found uses a laptop charger, since I believe you need a 12V or greater power supply.
Good luck
I'm working on an app, and I need it to take input from two physical buttons connected through the earphone jack. The app should count how much time it takes to press a button 10 times. I have the timer working, but I can't figure out how to take the input. Also, once I have this figured out, is there a way to simulate it? Can you give me a hand, please?
remoteControlReceived(with:) is used to get the audio-control events (play/pause); your audio-jack buttons most likely generate the same events. You need to enable the Background Audio capability for your application and add code to start an audio session, so that your app gets callbacks to that function when any button on the headphones is pressed.
try this:
override func remoteControlReceived(with event: UIEvent?) {
    guard let subtype = event?.subtype else { return }
    print("does this work? \(subtype.rawValue)")
}
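To flesh out the "start audio session" part, here's a minimal sketch of the setup that makes those callbacks arrive (startAudioSession is an illustrative name; you also need the Audio background mode enabled in Capabilities):
import UIKit
import AVFoundation

func startAudioSession() {
    do {
        // A playback-category session keeps audio and remote-control
        // events alive once the Audio background mode is enabled.
        try AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
        try AVAudioSession.sharedInstance().setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
    // Ask for remote-control events; the responder that overrides
    // remoteControlReceived(with:) must also become first responder.
    UIApplication.shared.beginReceivingRemoteControlEvents()
}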
The multitasking features were updated in iOS 11; one of those updates was Slide Over.
With these changes, it's no longer possible to use the iOS 9 techniques that check the frame size to detect whether another app is in Slide Over on top of my app.
Is there any new method to detect if another app is running as slide over?
I was able to get this working fairly easily on an iPad Pro (which supports side-by-side apps, not just slide-overs). Here's the code:
class ViewController: UIViewController {

    override func viewWillLayoutSubviews() {
        isThisAppFullScreen()
    }

    @discardableResult func isThisAppFullScreen() -> Bool {
        let isFullScreen = UIApplication.shared.keyWindow?.frame == UIScreen.main.bounds
        print("\(#function) - \(isFullScreen)")
        return isFullScreen
    }
}
The end result is that it will print "true" if the view is full screen and "false" if it's sharing the screen with another app, and this is run every time anything is shown, hidden, or resized.
The problem then is older devices that only support slide-over. With these, your app is not being resized anymore. Instead, it's just resigning active use and the other app is becoming active.
In this case, all you can do is put logic in the AppDelegate to look for applicationWillResignActive and applicationDidBecomeActive. When you slide-over, you get applicationWillResignActive but not applicationDidEnterBackground.
You could look for this as a possibility, but you cannot distinguish between a slide-over and a look at Notification Center after sliding down from the top of the screen. It's not ideal for that reason, but monitoring the application lifecycle is probably the best you can do, as the sketch below shows.
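A sketch of what that AppDelegate bookkeeping could look like (wasCoveredWithoutBackgrounding is an illustrative name):
var wasCoveredWithoutBackgrounding = false

func applicationWillResignActive(_ application: UIApplication) {
    // Fires for a slide-over, but also for Notification Center,
    // Control Center, incoming calls, etc. It is only a hint.
    wasCoveredWithoutBackgrounding = true
}

func applicationDidEnterBackground(_ application: UIApplication) {
    // A real backgrounding follows resign-active, so this was not a slide-over.
    wasCoveredWithoutBackgrounding = false
}

func applicationDidBecomeActive(_ application: UIApplication) {
    if wasCoveredWithoutBackgrounding {
        // The app resigned active without entering the background:
        // possibly a slide-over (or the notification shade).
        wasCoveredWithoutBackgrounding = false
    }
}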
I'm using different kinds of players in my app. I have some kind of control over all of them except one, over which I have none at all: the player used when I play an embedded video from a UIWebView.
My app requires audio to be played in the background, which works fine. Note that the video's audio does not need to play in the background, but the background requirement means I need to use the remote controls (command center and lock-screen commands).
My issue is the following:
After I've played a video from an embed, the remote controls still control that video, even though it's been dismissed. And when I press "play", it plays both my radio AND the embedded video.
How can I "clear the memory" of the command center?
I have enough information to know when to apply a solution, I don't know what the solution is though.
The solution could be completely disabling the command center while the embedded video is playing, or until my radio plays again; it could also be some kind of "hard reset" of the command center.
Actually, any solution is welcome.
This question is a little bit old, but I just fixed this issue and feel like this could help someone out. The thing that was preventing me from resetting the command center was that I had added target actions that persisted past my view controller's lifecycle.
Here's what my code looked like.
On the view controller:
var commandPlay: Any?
var commandPause: Any?
When adding the targets:
commandPlay = commandCenter.playCommand.addTarget { event in
    self.startButtonTapped(self.startButton)
    MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyPlaybackRate] = 1.0
    return .success
}

commandPause = commandCenter.pauseCommand.addTarget { event in
    self.startButtonTapped(self.startButton)
    MPNowPlayingInfoCenter.default().nowPlayingInfo?[MPNowPlayingInfoPropertyPlaybackRate] = 0.0
    return .success
}
On dismiss, to remove the targets (the important part):
commandCenter.playCommand.removeTarget(commandPlay)
commandCenter.pauseCommand.removeTarget(commandPause)
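If keeping the tokens around isn't practical, passing nil to removeTarget(_:) removes every handler registered for a command, which amounts to the "hard reset" asked about. A sketch (resetCommandCenter is an illustrative name):
func resetCommandCenter() {
    let commandCenter = MPRemoteCommandCenter.shared()
    // Passing nil removes all targets registered for the command,
    // including any added elsewhere in the app.
    commandCenter.playCommand.removeTarget(nil)
    commandCenter.pauseCommand.removeTarget(nil)
    commandCenter.togglePlayPauseCommand.removeTarget(nil)
    // Clearing the now-playing info empties the lock screen / Control Center UI.
    MPNowPlayingInfoCenter.default().nowPlayingInfo = nil
}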
My team is developing a set of SDKs for barcode scanning, ID scanning and OCR. We use the device's camera, specifically AVCaptureSession, to obtain video frames on which we perform our processing.
We're exploring new iOS 9 multitasking features Slide Over and Split View.
Apple suggests opting out of these features for camera-centric apps, where using the entire screen for preview and capturing a moment quickly is a primary feature (reference). This is the approach they use in their sample app AVCam.
However, our customers might have apps which don't fall into this category (e.g. mobile banking apps), so we cannot force them to opt out; instead, we need to handle the new features in the SDK. We're exploring what the best approach to do that would be, since the docs at the moment aren't really telling us what to do.
We used our simple Camera sample app to analyse the use case. The sample app is available on Github and it's developed as of iOS 9 Beta 5.
From the sample app, it can be clearly seen which system events happen when Slide Over is used, and when Split View is used.
When our app is primary, and Slide Over is used, we get UIApplicationWillResignActiveNotification and AVCaptureSessionDidStopRunningNotification
When Slide Over is used, and our app is secondary, we get UIApplicationWillEnterForegroundNotification and AVCaptureSessionDidStopRunningNotification immediately after that
When Split View is used, on each divider drag, our app gets UIApplicationWillResignActiveNotification.
However, if the Camera is launched when in Split View, it immediately gets AVCaptureSessionDidStopRunningNotification
So, empirically, it looks like AVCaptureSession is immediately stopped when Slide Over or Split View are used.
What's confusing is that UIImagePickerController, which our sample app also supports, exhibits completely different behaviour.
UIImagePickerController isn't stopped when the app goes into Slide Over/ Split View, instead, it functions completely normally. One can normally take a photo in Split View. In fact, two apps, both of which present UIImagePickerController, can work side by side, with UIImagePickerController of the active app being active. (You can try that by running our sample app, and Contacts app -> New Contact -> Add photo)
With all this in mind, our questions are the following:
If AVCaptureSession is immediately paused when Slide Over and Split View are used, is it a good idea to monitor AVCaptureSessionDidStopRunningNotification and present a "Camera Paused" message to the user, so that they clearly know the app isn't performing scanning?
Why is behaviour of UIImagePickerController different than AVCaptureSession?
Can we expect that in future beta versions Apple will change the behaviour of AVCaptureSession to match UIImagePickerController?
In case you haven't found out yet: after some more investigation, I can now answer your first question.
"If AVCaptureSession is immediately paused when Slide Over and Split View are used, is it a good idea to monitor AVCaptureSessionDidStopRunningNotification and present a "Camera Paused" message to the user, so that they clearly know the app isn't performing scanning?"
The notification you actually want to observe is this one: AVCaptureSessionWasInterruptedNotification
And you want to check for the interruption reason newly introduced in iOS 9: AVCaptureSession.InterruptionReason.videoDeviceNotAvailableWithMultipleForegroundApps
private var interruptionObserver: NSObjectProtocol?

override func viewWillAppear(_ animated: Bool) {
    super.viewWillAppear(animated)
    addObserverForAVCaptureSessionWasInterrupted()
}

func addObserverForAVCaptureSessionWasInterrupted() {
    interruptionObserver = NotificationCenter.default.addObserver(
        forName: .AVCaptureSessionWasInterrupted,
        object: nil,
        queue: .main
    ) { notification in
        // The interruption-reason key was introduced in iOS 9 together with
        // Split View / Slide Over; on iOS 8 and below there is nothing to handle.
        guard #available(iOS 9.0, *),
              let reasonValue = notification.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
              let reason = AVCaptureSession.InterruptionReason(rawValue: reasonValue)
        else { return }

        if reason == .videoDeviceNotAvailableWithMultipleForegroundApps {
            // Warn the user that they need to get back to full-screen mode
        }
    }
}

override func viewWillDisappear(_ animated: Bool) {
    super.viewWillDisappear(animated)
    // Block-based observers must be removed via their token;
    // removeObserver(self) does not remove them.
    if let observer = interruptionObserver {
        NotificationCenter.default.removeObserver(observer)
    }
}
You can also find out when the interruption has ended by observing:
AVCaptureSessionInterruptionEndedNotification
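A minimal sketch of observing it, matching the code above (e.g. to hide the "Camera Paused" overlay again):
NotificationCenter.default.addObserver(forName: .AVCaptureSessionInterruptionEnded,
                                       object: nil,
                                       queue: .main) { _ in
    // The capture session can run again, e.g. hide the "Camera Paused" overlay.
}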
Answer based on these two links:
http://asciiwwdc.com/2015/sessions/211
https://developer.apple.com/library/ios/samplecode/AVCam/Introduction/Intro.html
Since iOS 16.0, it's possible to use the AVCaptureSession.isMultitaskingCameraAccessEnabled flag.
Reference
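A sketch of opting in during session configuration (the property can only be enabled when the device and configuration support it; the session here stands in for your own configured session):
let session = AVCaptureSession()
session.beginConfiguration()
// When enabled, the session is no longer interrupted with
// .videoDeviceNotAvailableWithMultipleForegroundApps (iOS 16+).
if session.isMultitaskingCameraAccessSupported {
    session.isMultitaskingCameraAccessEnabled = true
}
session.commitConfiguration()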
Since iOS 13.5 and iPadOS 13.5, it's possible to use the entitlement com.apple.developer.avfoundation.multitasking-camera-access, which allows the app to continue using the camera while running alongside another foreground app.
Reference
More information about accessing the camera while multitasking is available here.