Music info showing in Command Center but not Lock Screen - iOS

Using Swift 4+, iOS 11+, Xcode 10+
I've built a music player using MPMediaPlayer, and I can interact with it from the Command Center; however, I would also like to see it on the Lock Screen.
To be honest, I'm a bit confused as to why it's showing/working in the Command Center as I have not written any code to do this.
Nevertheless, I would also like it to show in the Lock Screen.
This is what I have done so far:
1) I'm using the applicationMusicPlayer and made certain something is playing during my tests:
MPMusicPlayerController.applicationMusicPlayer
2) Set the Background Modes capability to include Audio, Fetch, and Remote Notifications
3) Added AVAudioSession code (which doesn't seem to do anything; I have tried it both in place and commented out and seen no difference):
let session = AVAudioSession.sharedInstance()
do {
    // Configure the audio session for playback
    try session.setCategory(AVAudioSessionCategoryPlayback,
                            mode: AVAudioSessionModeDefault,
                            options: [])
    try session.setActive(true)
} catch let error as NSError {
    print("Failed to set the audio session category and mode: \(error.localizedDescription)")
}
4) Used this basic code to see if I can get it to show on the lock screen with just some basic hard-coded content:
let nowPlayingInfo: [String: Any] = [
    MPMediaItemPropertyArtist: "Pink Floyd",
    MPMediaItemPropertyTitle: "Wish You Were Here",
    //MPMediaItemPropertyArtwork: mediaArtwork,
]
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo
UIApplication.shared.beginReceivingRemoteControlEvents()
let commandCenter = MPRemoteCommandCenter.shared()
5) I know I have not implemented anything to actively update the info or respond to any commands as I'm just trying to get something to show on the lock screen at this point.
Why is the now playing info showing in the Command Center if I have done nothing to put it there?
What do I need to do to get the info to show on the Lock Screen like it does in the Command Center?
EDIT:
Link to a simple project on GitLab that has the same issue: https://gitlab.com/whoit/lockscreentest
EDIT: I have submitted this as a bug to Apple, however they have yet to confirm or resolve this.

I had to fill in at least these four keys (even with empty strings) to see something correct on the lock screen:
MPMediaItemPropertyTitle
MPMediaItemPropertyArtist
MPMediaItemPropertyAlbumTitle
MPNowPlayingInfoPropertyPlaybackRate
You can check the NowPlayingSource sample code as a reference.
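A minimal sketch of filling those four keys (assuming MediaPlayer is imported and audio is already playing; the helper name and the artist/title strings are placeholders):

```swift
import MediaPlayer

// Hypothetical helper: publishes the minimum set of keys that made the
// Lock Screen render correctly. Empty strings are accepted for text keys.
func publishNowPlayingInfo(title: String, artist: String) {
    let info: [String: Any] = [
        MPMediaItemPropertyTitle: title,
        MPMediaItemPropertyArtist: artist,
        MPMediaItemPropertyAlbumTitle: "",
        MPNowPlayingInfoPropertyPlaybackRate: 1.0  // 1.0 means "playing"
    ]
    MPNowPlayingInfoCenter.default().nowPlayingInfo = info
}

publishNowPlayingInfo(title: "Wish You Were Here", artist: "Pink Floyd")
```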

Using .systemMusicPlayer instead of .applicationMusicPlayer will solve your problem.
From Apple's documentation on applicationMusicPlayer:
The music player does not affect the Music app’s state. When your app moves to the background, the music player stops playing the current media.
I think that's related to the info not showing on the Lock Screen. Also, systemMusicPlayer automatically handles showing all of the song information.
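Applied to the original question, the change is essentially a one-line swap; a sketch (the queue query below is just an example):

```swift
import MediaPlayer

// systemMusicPlayer shares state with the Music app, keeps playing when the
// app moves to the background, and publishes Now Playing info to the
// Lock Screen on its own.
let player = MPMusicPlayerController.systemMusicPlayer
player.setQueue(with: MPMediaQuery.songs())  // example queue; use your own
player.play()
```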

Related

Audio recording is not working during screen share with Zoom or other apps

I am trying to record voice with AVAudioRecorder. It works fine when screen sharing is not enabled, but when I share my device screen with Zoom or any other app, the AVAudioSession is not active.
Here is the code I added for audio recording:
UIApplication.shared.beginReceivingRemoteControlEvents()
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: .defaultToSpeaker)
    try session.setActive(true)

    let settings = [
        AVFormatIDKey: Int(kAudioFormatMPEG4AAC),
        AVSampleRateKey: 44100,
        AVNumberOfChannelsKey: 2,
        AVEncoderAudioQualityKey: AVAudioQuality.high.rawValue
    ]
    audioRecorder = try AVAudioRecorder(url: getFileUrl(), settings: settings)
    audioRecorder.delegate = self
    audioRecorder.isMeteringEnabled = true
    audioRecorder.prepareToRecord()
    self.nextBtn.isHidden = true
} catch let error {
    print("Error \(error)")
}
When I hit the record button, it shows the error NSOSStatusErrorDomain Code=561017449 "Session activation failed".
Here is a video:
https://share.icloud.com/photos/0a09o5DCNip6Rx_GnTpht7K3A
I don't have the reputation to comment or I would. (Almost there lol!) Have you tried AVAudioSession.CategoryOptions.overrideMutedMicrophoneInterruption?
Edit
The more I looked into this, the more it seems that if Zoom is using the microphone hardware, then the iPhone won't be able to record that stream. I think that's the idea behind AVAudioSession.sharedInstance() being a singleton.
From the docs:
Type Property
overrideMutedMicrophoneInterruption: An option that indicates whether the system interrupts the audio session when it mutes the built-in microphone.
Declaration
static var overrideMutedMicrophoneInterruption: AVAudioSession.CategoryOptions { get }
Discussion
Some devices include a privacy feature that mutes the built-in microphone at the hardware level in certain conditions, such as when you close the Smart Folio cover of an iPad. When this occurs, the system interrupts the audio session that’s capturing input from the microphone. Attempting to start audio input after the system mutes the microphone results in an error. If your app uses an audio session category that supports input and output, such as playAndRecord, you can set this option to disable the default behavior and continue using the session. Disabling the default behavior may be useful to allow your app to continue playback when recording or monitoring a muted microphone doesn’t lead to a poor user experience. When you set this option, playback continues as normal, and the microphone hardware produces sample buffers, but with values of 0.
Important
Attempting to use this option with a session category that doesn’t support audio input results in an error.
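Based on that excerpt, the option would be applied while configuring the session. A sketch, assuming iOS 14.5+ and an input-capable category (I can't verify that it fixes the Zoom screen-share case specifically):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    // The option is only valid with input-capable categories such as
    // .playAndRecord; combining it with an output-only category throws.
    try session.setCategory(.playAndRecord,
                            options: [.defaultToSpeaker,
                                      .overrideMutedMicrophoneInterruption])
    try session.setActive(true)
} catch {
    print("Session configuration failed: \(error)")
}
```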

AVPlayer Audio Output

Through AVCaptureSession I record a video and then immediately play it back via an AVPlayer once recording has stopped.
My problem is that the audio from the video sometimes plays out of the ear speaker at a really low volume and other times plays out of the bottom speaker.
How can I default the audio to output to the bottom speaker?
I've looked at other related posts with instances of the below code, which I tried, but to no avail. Any guidance would be appreciated.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
    try session.setActive(true)
} catch {
    print("error")
}
You're explicitly turning that off here:
try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
If you want to prefer the speaker, you'd use:
try session.overrideOutputAudioPort(.speaker)
AVAudioSession is very complicated, and many parts of it are not intuitive. Do not copy code you find on the internet without reading the docs on each command. The docs are pretty good, but you have to read them.
That said, rather than doing this, I'd probably switch your category and options when you switch to playback. You can do that at any time:
try session.setCategory(.playback, options: [.defaultToSpeaker])
It is generally best to keep your category aligned with what you're doing. If you set .playback here as the category, you may not even need .defaultToSpeaker, depending on what precisely you're trying to achieve.
Be certain to read all the relevant docs on .defaultToSpeaker, setCategory, overrideOutputAudioPort, etc. Don't just copy my suggestions. These settings have many subtle (and documented) interactions, so you need to configure them based on your actual use case, not just copy something that "seems to work." You may be very surprised at what happens when the user switches to Bluetooth, plugs in headphones, or switches to CarPlay.
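A sketch of that category switch, made when recording ends and playback is about to begin (the helper name is hypothetical; note that .defaultToSpeaker is documented as valid only with .playAndRecord, so this sketch uses .playback on its own):

```swift
import AVFoundation

// Hypothetical helper: call after AVCaptureSession stops and before AVPlayer
// starts. .playback routes to the main (bottom) speaker by default.
func switchSessionToPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback)
        try session.setActive(true)
    } catch {
        print("Could not switch to playback: \(error)")
    }
}
```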
You can change the audio output device for a given AVPlayer instance by setting the instance property 'audioOutputDeviceUniqueID' to the UniqueID of the desired device.
I can confirm that this works as expected in macOS 10.11.6, using Key-Value Coding (setValue:forKey:).
Apple's doc on this:
Instance Property
audioOutputDeviceUniqueID
Specifies the unique ID of the Core Audio output device used to play audio.
Declaration
@property(nonatomic, copy) NSString *audioOutputDeviceUniqueID;
Discussion
The default value of this property is nil, indicating that the default audio output device is used. Otherwise the value of this property is a string containing the unique ID of the Core Audio output device to be used for audio output.
Core Audio's kAudioDevicePropertyDeviceUID is a suitable source of audio output device unique IDs.
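A macOS sketch of that approach; the media path and the device UID below are placeholders, and a real UID would come from Core Audio's kAudioDevicePropertyDeviceUID:

```swift
import AVFoundation

let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/media.mp4"))  // placeholder path
// nil (the default) means "use the system default output device".
// Direct property access works on macOS 10.9+; Key-Value Coding also works.
player.audioOutputDeviceUniqueID = "AppleHDAEngineOutput:1B,0,1,0:0"  // hypothetical UID
```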

I want to make sound effects without stopping the music playing in the background in another app

I am currently developing an application with SwiftUI.
There is a function that plays a sound effect when you tap on it.
When I tested it on the actual device, the Spotify music playing in the background stopped. Is it possible to use AVFoundation to play sound effects without stopping the music? Also, if there is a better way to implement this, please help me.
import Foundation
import AVFoundation

var audioPlayer: AVAudioPlayer?

func playSound(sound: String, type: String) {
    if let path = Bundle.main.path(forResource: sound, ofType: type) {
        do {
            audioPlayer = try AVAudioPlayer(contentsOf: URL(fileURLWithPath: path))
            audioPlayer?.play()
        } catch {
            print("ERROR: Could not find and play the sound file.")
        }
    }
}
Set your AVAudioSession category options to .mixWithOthers.
let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.ambient, mode: .default, options: [.mixWithOthers])
} catch {
    print("Failed to set audio session category.")
}
try? audioSession.setActive(true) // When you're ready to play something
ambient indicates that sound is not critical to this application. The default is soloAmbient, which is non-mixable. (It's possible you can just set the category to ambient here and you'll get mixWithOthers for free. I don't often use that category, so I don't remember how much you get by default.) If sound is critical, see the .playback category.
As a rule, you set the category once in the app, and then set the session active or inactive as you need it. If I remember correctly, AVAudioPlayer will automatically activate the session, so you may not need to do that (I typically work at much lower levels, and don't always remember what the high-level tools do automatically). It is legal to change your category, however, if different features of your app have different needs.
For more details, see AVAudioSession.Category and AVAudioSession. There are many options in sound playback, and many of them are not obvious (and a few have very confusing names, I'm looking at you .allowBluetooth), so it's worth reading through the docs.
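Putting the pieces together with the asker's playSound, a sketch in which the category is set once (say, at app launch) and the effect then mixes over Spotify (the function names and resource name are placeholders):

```swift
import AVFoundation

var effectPlayer: AVAudioPlayer?

// Call once, e.g. at app launch. .ambient + .mixWithOthers lets short
// effects play without pausing another app's audio.
func configureAudioSession() {
    try? AVAudioSession.sharedInstance()
        .setCategory(.ambient, mode: .default, options: [.mixWithOthers])
}

func playEffect() {
    // Placeholder resource; substitute your own sound/type parameters.
    guard let url = Bundle.main.url(forResource: "tap", withExtension: "wav") else { return }
    effectPlayer = try? AVAudioPlayer(contentsOf: url)
    effectPlayer?.play()
}
```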

AVAudioEngine stops running when changing input to AirPods

I have trouble understanding AVAudioEngine's behaviour when switching audio input sources.
Expected Behaviour
When switching input sources, AVAudioEngine's inputNode should adopt the new input source seamlessly.
Actual Behaviour
When switching from AirPods to the iPhone speaker, AVAudioEngine stops working. No audio is routed through anymore. Querying engine.isRunning still returns true.
When subsequently switching back to AirPods, it still isn't working, but now engine.isRunning returns false.
Stopping and starting the engine on a route change does not help. Neither does calling reset(). Disconnecting and reconnecting the input node does not help, either. The only thing that reliably helps is discarding the whole engine and creating a new one.
OS
This is on iOS 14, beta 5. I can't test this on previous versions I'm afraid; I only have one device around.
Code to Reproduce
Here is a minimum code example. Create a simple app project in Xcode (doesn't matter whether you choose SwiftUI or Storyboard), and give it permissions to access the microphone in Info.plist. Create the following file Conductor.swift:
import AVFoundation

class Conductor {
    static let shared: Conductor = Conductor()
    private let _engine = AVAudioEngine()

    init() {
        // Session
        let session = AVAudioSession.sharedInstance()
        try? session.setActive(false)
        try! session.setCategory(.playAndRecord, options: [.defaultToSpeaker,
                                                           .allowBluetooth,
                                                           .allowAirPlay])
        try! session.setActive(true)

        _engine.connect(_engine.inputNode, to: _engine.mainMixerNode, format: nil)
        _engine.prepare()
    }

    func start() { try? _engine.start() }
}
And in AppDelegate, call:
Conductor.shared.start()
This example will route the input straight to the output. If you don't have headphones, it will trigger a feedback loop.
Question
What am I missing here? Is this expected behaviour? If so, it does not seem to be documented anywhere.
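For what it's worth, the one workaround the question itself reports as reliable (discarding the engine and creating a new one) can be driven from the route-change notification. A sketch, with EngineHost as a hypothetical wrapper rather than a documented fix:

```swift
import AVFoundation

final class EngineHost {
    private(set) var engine = AVAudioEngine()

    func startObservingRouteChanges() {
        NotificationCenter.default.addObserver(
            forName: AVAudioSession.routeChangeNotification,
            object: nil,
            queue: .main
        ) { [weak self] _ in
            self?.rebuildEngine()
        }
    }

    // Per the question, reset()/stop()/start() are not enough after a route
    // change; only a brand-new AVAudioEngine reliably recovers.
    func rebuildEngine() {
        engine.stop()
        engine = AVAudioEngine()
        engine.connect(engine.inputNode, to: engine.mainMixerNode, format: nil)
        engine.prepare()
        try? engine.start()
    }
}
```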

How can you extract the overall stream title from AVPlayer?

I'm using AVPlayer and AVPlayerItem in Swift to play internet radio streams.
Observing the "timedMetadata" gives me the track currently playing; however, I never seem to get a handle on the radio title.
See this example using VLC, I can obtain the "Now playing" part easily with timedMetadata, however I never receive the overall title of the radio "Title".
What am I missing, should I be observing something else to access the stream's/shoutcast/icecast information?
Try this:
let title = AVMetadataItem.metadataItems(from: urlAsset.commonMetadata,
                                         withKey: AVMetadataKey.commonKeyTitle,
                                         keySpace: AVMetadataKeySpace.common).first?.value as? String
print(title)
This works for media I have with metadata in the title (set using the Apple Music app). If this doesn't work for your media, please post it somewhere online along with your current code.
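For context, a sketch of where that call sits, assuming urlAsset is an AVURLAsset built from the stream URL (the URL is a placeholder, and the metadata should be loaded before it is read):

```swift
import AVFoundation

let url = URL(string: "https://example.com/stream")!  // placeholder stream URL
let urlAsset = AVURLAsset(url: url)
urlAsset.loadValuesAsynchronously(forKeys: ["commonMetadata"]) {
    let title = AVMetadataItem.metadataItems(from: urlAsset.commonMetadata,
                                             withKey: AVMetadataKey.commonKeyTitle,
                                             keySpace: AVMetadataKeySpace.common)
        .first?.value as? String
    print(title ?? "no commonKeyTitle in commonMetadata")
}
```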
