MPNowPlayingInfoCenter info being overwritten by AVPlayer - iOS

I'm working on an iOS application that uses an AVPlayer to play songs represented by AVPlayerItems.
I've tried setting the nowPlayingInfo property of MPNowPlayingInfoCenter.defaultCenter() but it appears that the info gets immediately overwritten by the AVPlayer since it needs to update other info like elapsedTime and playbackDuration.
Even immediately after the program executes
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo?[MPMediaItemPropertyTitle] = "Title"
The printout of MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo is
Info: ["MPNowPlayingInfoPropertyElapsedPlaybackTime": 34.89555007300078, "AVNowPlayingInfoControllerIdentifierKey": <__NSConcreteUUID 0x13778e9b0> 9141DD1C-FD09-4210-B3C7-B948522E3AF6, "playbackDuration": 198.2516666666667, "MPNowPlayingInfoPropertyPlaybackRate": 1]
What am I doing wrong? Is it possible to store additional metadata like title/album name in an AVPlayerItem so it gets displayed on the info center?
Also, what is AVNowPlayingInfoControllerIdentifierKey? There do not seem to be any classes named AVNowPlayingInfoController.

The problem is that this line does nothing:
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo?[MPMediaItemPropertyTitle] = "Title"
Even if that line actually did anything, you would be fetching a copy of the nowPlayingInfo as a separate dictionary and then writing into that separate dictionary. You would not be writing into the actual now playing info, which is what you want to do.
The correct approach is to form a dictionary and then set MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo to that dictionary.
For example:
MPNowPlayingInfoCenter.defaultCenter().nowPlayingInfo = [MPMediaItemPropertyTitle : "Title"]
(Of course the dictionary can have more entries.)
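For instance, a fuller dictionary might look like this (a sketch: the keys are standard MPMediaItem/MPNowPlayingInfo constants, the values are placeholders, and the call is written in modern Swift, where defaultCenter() is now default()):
import MediaPlayer

// Build the complete dictionary up front, then assign it in one step.
var info = [String: Any]()
info[MPMediaItemPropertyTitle] = "Title"
info[MPMediaItemPropertyArtist] = "Artist"
info[MPMediaItemPropertyAlbumTitle] = "Album"
info[MPMediaItemPropertyPlaybackDuration] = 198.25          // placeholder, in seconds
info[MPNowPlayingInfoPropertyElapsedPlaybackTime] = 0.0     // starting position
info[MPNowPlayingInfoPropertyPlaybackRate] = 1.0
MPNowPlayingInfoCenter.default().nowPlayingInfo = info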

Related

AVPlayer Audio Output

Through AVCaptureSession I record a video and then immediately play it back via an AVPlayer once recording has stopped.
My problem is that the audio from the video sometimes plays out of the ear speaker at a really low volume and other times plays out of the bottom speaker.
How can I default the audio to output to the bottom speaker?
I've looked at other related posts with instances of the below code, which I tried, but to no avail. Any guidance would be appreciated.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
    try session.setActive(true)
} catch {
    print("error")
}
You're explicitly turning that off here:
try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
If you want to prefer the speaker, you'd use:
try session.overrideOutputAudioPort(.speaker)
AVAudioSession is very complicated, and many parts of it are not intuitive. Do not copy code you find on the internet without reading the docs on each command. The docs are pretty good, but you have to read them.
That said, rather than overriding the port, I'd probably switch your category when you switch to playback. You can do that at any time:
try session.setCategory(.playback)
It is generally best to keep your category aligned with what you're doing. With .playback as the category you don't need .defaultToSpeaker at all; audio routes to the main speaker rather than the receiver by default (and in fact .defaultToSpeaker is only valid with the .playAndRecord category).
Be certain to read all the relevant docs on .defaultToSpeaker, setCategory, overrideOutputAudioPort, etc. Don't just copy my suggestions. These settings have many subtle (and documented) interactions, and you need to configure them based on your actual use case, not just copy something that "seems to work." You may be very surprised at what happens when the user switches to Bluetooth, plugs in headphones, or switches to CarPlay.
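A minimal sketch of that category switch, assuming you call it at the moment recording ends and playback begins:
import AVFoundation

/// Call when recording finishes and you're about to play the captured video.
/// Assumes the session was configured with .playAndRecord for capture.
func switchToPlayback() {
    let session = AVAudioSession.sharedInstance()
    do {
        // .playback routes to the main (bottom) speaker by default,
        // never the receiver, so no port override is needed here.
        try session.setCategory(.playback)
        try session.setActive(true)
    } catch {
        print("Failed to reconfigure audio session: \(error)")
    }
}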
You can change the audio output device for a given AVPlayer instance by setting the instance property audioOutputDeviceUniqueID to the unique ID of the desired device.
I can confirm that this works as expected on macOS 10.11.6, using key-value coding (setValue:forKey:).
Apple's doc on this:
Instance Property
audioOutputDeviceUniqueID
Specifies the unique ID of the Core Audio output device used to play audio.
Declaration
@property(nonatomic, copy) NSString *audioOutputDeviceUniqueID;
Discussion
The default value of this property is nil, indicating that the default audio output device is used. Otherwise the value of this property is a string containing the unique ID of the Core Audio output device to be used for audio output.
Core Audio's kAudioDevicePropertyDeviceUID is a suitable source of audio output device unique IDs.
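For example, a sketch (macOS only; the device UID string below is a hypothetical placeholder, and real UIDs come from enumerating Core Audio devices and reading kAudioDevicePropertyDeviceUID):
import AVFoundation

let player = AVPlayer(url: URL(fileURLWithPath: "/path/to/movie.mov"))

// Hypothetical UID; obtain real ones from Core Audio device enumeration.
let deviceUID = "AppleHDAEngineOutput:1B,0,1,0:0"

// On macOS, AVPlayer exposes the property directly:
player.audioOutputDeviceUniqueID = deviceUID

// Or, as noted above, via key-value coding:
player.setValue(deviceUID, forKey: "audioOutputDeviceUniqueID")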

How can I save the last paused time in an AVPlayer, and seek to it in another storyboard?

I'm new to Xcode/Swift so please forgive my inexperience. I have a ViewController where my viewers can listen to some audio playback. The playback is accessed like this when the user clicks a play button:
@IBAction func buttonClicked(_ sender: RoundButton) {
    self.clickedButton = sender
    guard let url = sender.url else {
        return
    }
    let player = AVPlayer(url: url)
    let controller = AVPlayerViewController()
    controller.player = player
    present(controller, animated: true) {
        player.play()
    }
}
I got this code from another StackOverflow question, so I don't completely understand it. My goal is to be able to save the URL and the last played time so that the user can minimize the app, or navigate to a different screen, and then be able to click a "continue listening" button which will pull up another AVPlayer with the last used URL. This "continue listening" AVPlayer will then seek to the last played time.
I know that I need to observe the first AVPlayer somehow, so that when it is paused, stopped, or put in the background, I save the currentTime to UserDefaults (I think?). I also need to save the URL, because there are many different URLs that the user could click on.
I tried doing this, and besides not being able to figure out the observation, I also couldn't work out the type inconsistencies with UserDefaults. I tried to retrieve the stored URL value as a String after setting it, but when I went to cast the String to a URL using URL(string: lastPlayedURL), Xcode complained: "Cannot convert type Data? to expected type String".
My issue with using other StackOverflow questions to solve my problem is that I don't understand where to put the code blocks. Where do I create the observer? Xcode did not seem happy when I created it inside the body of buttonClicked.
Thank you for listening to my rambling.
Yes, UserDefaults is appropriate for storing the URL; it has a dedicated set(_:forKey:) overload for URL values and a matching url(forKey:) getter.
To observe the player you work with it and its AVPlayerItem, for example saving the current time when the user pauses or dismisses the player.
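Here's a sketch of both halves, assuming the UserDefaults key names are your own choice:
import AVFoundation

// Save the URL and position whenever playback pauses or the player is dismissed:
func savePlaybackState(player: AVPlayer, url: URL) {
    let defaults = UserDefaults.standard
    defaults.set(url, forKey: "lastPlayedURL")   // URL-specific overload of set(_:forKey:)
    defaults.set(player.currentTime().seconds, forKey: "lastPlayedTime")
}

// Later, from any other view controller, restore and resume:
func makeResumedPlayer() -> AVPlayer? {
    let defaults = UserDefaults.standard
    guard let url = defaults.url(forKey: "lastPlayedURL") else { return nil }
    let player = AVPlayer(url: url)
    let seconds = defaults.double(forKey: "lastPlayedTime")
    player.seek(to: CMTime(seconds: seconds, preferredTimescale: 600))
    return player
}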
About your general issues with Xcode (note the capitalisation) and Swift language, I'm afraid these are things you need to work through yourself by reading/watching tutorials/documentation. Then when you find detailed issues, post your code here and ask.
Good luck and have fun!

How can you extract the overall stream title from AVPlayer?

I'm using AVPlayer and AVPlayerItem in Swift to play internet radio streams.
Observing the "timedMetadata" gives me the track currently playing; however, I never seem to get a handle on the radio station's title.
See this example using VLC: I can obtain the "Now playing" part easily with timedMetadata, but I never receive the overall title of the radio ("Title").
What am I missing? Should I be observing something else to access the stream's Shoutcast/Icecast information?
Try this:
let title = AVMetadataItem.metadataItems(
    from: urlAsset.commonMetadata,
    withKey: AVMetadataKey.commonKeyTitle,
    keySpace: AVMetadataKeySpace.common
).first?.value as? String
print(title)
This works for media I have with metadata in the title (set using the Apple Music app). If this doesn't work for your media, please post it somewhere online along with your current code.
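Note that commonMetadata is only populated once the asset's metadata has loaded, so for a remote stream you may need to load it first. A sketch (the stream URL is a placeholder):
import AVFoundation

let urlAsset = AVURLAsset(url: URL(string: "https://example.com/stream")!)
urlAsset.loadValuesAsynchronously(forKeys: ["commonMetadata"]) {
    let title = AVMetadataItem.metadataItems(
        from: urlAsset.commonMetadata,
        withKey: AVMetadataKey.commonKeyTitle,
        keySpace: AVMetadataKeySpace.common
    ).first?.value as? String
    print(title ?? "no title found")
}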

Is it possible to access system classes in Swift?

I was trying to access the player inside WKWebView, but after some digging it turned out it doesn't use AVPlayerViewController; it uses a system class called WebFullScreenVideoRootVideoController.
I used code like this (the function in the picture is fired after a UIWindow appears).
After that I started digging more and searched for notifications fired by WebFullScreenVideoRootVideoController and some class called AVSystemController or something like that... it turned out there are multiple notifications, two of which logically do what I want:
NowPlayingAppIsPlayingDidChange // first one
SomeClientPlayingDidChange // second one
But the object they return is called FigBaseObject.
Is there any way to access these objects, "some hacky way :P"?
This link should help you figure out which notification to listen for:
http://paulofierro.com/blog/2015/10/12/listening-for-video-playback-within-a-wkwebview
Inside the notification, you can find the URL.
Extract the video URL from the NSNotification's asset:
// In Swift, notification.object is Any?, so cast it first
// (the object appears to be an AVPlayerItem, since it carries an asset):
if let item = notification.object as? AVPlayerItem,
   let asset = item.asset as? AVURLAsset {
    let videoURL = asset.url
}
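For context, here's a sketch of registering for the notification the linked post describes. The name is private and undocumented (an assumption taken from that post), so it may change between iOS versions:
import AVFoundation

NotificationCenter.default.addObserver(
    forName: NSNotification.Name("AVPlayerItemBecameCurrentNotification"),
    object: nil,
    queue: .main
) { notification in
    // Private notification; the object is reportedly the AVPlayerItem.
    if let item = notification.object as? AVPlayerItem,
       let asset = item.asset as? AVURLAsset {
        print(asset.url)
    }
}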
I tested this on Xamarin iOS, where notification.object?.asset is not available, but I'm not sure about Swift.
Thanks

get mediaSessionID for the current media playing on Chromecast in Swift on iOS

I'm referring to https://developers.google.com/cast/docs/reference/ios/interface_g_c_k_media_status.html#a45e3eb39e674f5d0dbfd78deef77a1e6
that helps me with the API, but the initializer for the GCKMediaStatus class says:
- (instancetype)initWithSessionID:(NSInteger)mediaSessionID
                 mediaInformation:(GCKMediaInformation *)mediaInformation;
note: this is Objective-C syntax, but the Swift initializer works the same way...
Nonetheless I can't seem to figure out how to retrieve the mediaSessionID to be able to initialize an instance of this class to a new variable.
I'm trying to do the following to eventually get to the method within this class called streamPosition, which would go like this:
var mediaStatus = GCKMediaStatus(sessionID: Int, mediaInformation: GCKMediaInformation!)
var currentStreamPosition = mediaStatus.streamPosition()
where Int would be the mediaSessionID, NOT the sessionID of the Chromecast (read the additional section below!!), and GCKMediaInformation! would be an instance of the GCKMediaInformation class. (I think) correct me if I'm wrong on either of those parameters.
Then I could use this data. But when I do this, currentStreamPosition seems to default to 0, and that's what I get when I print the variable.
Note: I've already connected to the currently playing media and I am able to pause, play, and seek to an arbitrary position within the stream. This all works, so I know I'm connected and everything else works.
use case: I want to be able to skip ahead 15 seconds or rewind 15 seconds etc. with this method, but I haven't found anything to help.
also - don't get sessionID confused with mediaSessionID!! I CAN get the sessionID successfully and print it out. My issue is with the mediaSessionID.
additional info: Xcode's autocomplete shows these parameter labels:
GCKMediaStatus(sessionID: Int, mediaInformation: GCKMediaInformation!)
note the first parameter says sessionID and it is of type Int. But on https://developers.google.com/cast/docs/ios_sender, sessionID is of type String! (an implicitly unwrapped optional String).
I think this label was mis-named in Xcode for the autocomplete. I think it should be named mediaSessionID and NOT sessionID since this is what the documentation shows on the first link I provided.
Any help would be much appreciated.
Thanks!
To get the stream position, use the method approximateStreamPosition on GCKMediaControlChannel.
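A sketch of the 15-second skip, assuming mediaControlChannel is the GCKMediaControlChannel you already use for play/pause/seek (it must exist, since those commands work; method names follow the SDK generation that has GCKMediaControlChannel, and newer Cast SDKs expose the equivalents on GCKRemoteMediaClient):
// Current position; no mediaSessionID needed at all:
let position = mediaControlChannel.approximateStreamPosition()

// Skip ahead 15 seconds:
mediaControlChannel.seekToTimeInterval(position + 15)

// ...or rewind 15 seconds, clamped to the start of the stream:
mediaControlChannel.seekToTimeInterval(max(0, position - 15))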
