I'm using AVPlayer and AVPlayerItem in Swift to play internet radio streams.
Observing the "timedMetadata" gives me the track currently playing, however I never seem to get a handle of the radio title
For example, when I open the same stream in VLC, I can obtain the "Now playing" part easily with timedMetadata, but I never receive the overall "Title" of the station.
What am I missing? Should I be observing something else to access the stream's SHOUTcast/Icecast information?
Try this:
let title = AVMetadataItem.metadataItems(from: urlAsset.commonMetadata, withKey: AVMetadataKey.commonKeyTitle, keySpace: AVMetadataKeySpace.common).first?.value as? String
print(title)
This works for media I have with metadata in the title (set using the Apple Music app). If this doesn't work for your media, please post it somewhere online along with your current code.
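In context, for a stream that might look roughly like the sketch below. The URL is a placeholder, and loading "commonMetadata" asynchronously before reading it is my assumption about how you'd wire it up, not something from your code:
import AVFoundation

let streamURL = URL(string: "https://example.com/stream")!   // placeholder URL
let urlAsset = AVURLAsset(url: streamURL)

urlAsset.loadValuesAsynchronously(forKeys: ["commonMetadata"]) {
    let titleItems = AVMetadataItem.metadataItems(from: urlAsset.commonMetadata,
                                                  withKey: AVMetadataKey.commonKeyTitle,
                                                  keySpace: AVMetadataKeySpace.common)
    let stationTitle = titleItems.first?.value as? String
    print(stationTitle ?? "no title found in commonMetadata")
}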
Through AVCaptureSession I record a video and then immediately play it back via an AVPlayer once recording has stopped.
My problem is that the audio from the video sometimes plays out of the ear speaker at a really low volume and other times plays out of the bottom speaker.
How can I default the audio to output to the bottom speaker?
I've looked at other related posts with instances of the below code, which I tried, but to no avail. Any guidance would be appreciated.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
    try session.setActive(true)
} catch {
    print("error")
}
You're explicitly turning that off here:
try session.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
If you want to prefer the speaker, you'd use:
try session.overrideOutputAudioPort(.speaker)
AVAudioSession is very complicated, and many parts of it are not intuitive. Do not copy code you find on the internet without reading the docs on each command. The docs are pretty good, but you have to read them.
That said, rather than doing this, I'd probably switch your category and options when you switch to playback. You can do that at any time:
try session.setCategory(.playback, options: [.defaultToSpeaker])
It is generally best to keep your category aligned with what you're doing. If you set .playback here as the category, you may not even need .defaultToSpeaker, depending on what precisely you're trying to achieve.
Be certain to read all the relevant docs on .defaultToSpeaker, setCategory, overrideOutputAudioPort, etc. Don't just copy my suggestions. These settings have many subtle (and documented) interactions; you need to configure them for your actual use case, not just copy something that "seems to work." You may be very surprised at what happens when the user switches to Bluetooth, plugs in headphones, or switches to CarPlay.
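As a rough sketch of what I mean by switching categories, assuming your flow is record-then-play (the exact modes and options here are my assumptions and depend on your use case):
let session = AVAudioSession.sharedInstance()
do {
    // While recording with AVCaptureSession:
    try session.setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker])
    try session.setActive(true)

    // ... later, when recording has stopped and you hand the file to AVPlayer:
    try session.setCategory(.playback, mode: .moviePlayback, options: [])
} catch {
    print("Audio session configuration failed: \(error)")
}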
You can change the audio output device for a given AVPlayer instance by setting the instance property 'audioOutputDeviceUniqueID' to the UniqueID of the desired device.
I can confirm that this works as expected on macOS 10.11.6, using Key-Value Coding (setValue:forKey:).
Apple's doc on this:
Instance Property
audioOutputDeviceUniqueID
Specifies the unique ID of the Core Audio output device used to play audio.
Declaration
@property(nonatomic, copy) NSString *audioOutputDeviceUniqueID;
Discussion
The default value of this property is nil, indicating that the default audio output device is used. Otherwise the value of this property is a string containing the unique ID of the Core Audio output device to be used for audio output.
Core Audio's kAudioDevicePropertyDeviceUID is a suitable source of audio output device unique IDs.
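For example, on macOS something like the following should route a player to a specific device. The media URL and device UID strings are placeholders; in practice you would query Core Audio for real UIDs via kAudioDevicePropertyDeviceUID:
import AVFoundation

// macOS-only sketch; the media URL and device UID below are placeholders.
let mediaURL = URL(fileURLWithPath: "/path/to/some/audio.m4a")
let player = AVPlayer(url: mediaURL)
let deviceUID = "AppleUSBAudioEngine:Placeholder:0"

// Via Key-Value Coding, as confirmed above:
player.setValue(deviceUID, forKey: "audioOutputDeviceUniqueID")

// Or directly, since the property is public on macOS:
player.audioOutputDeviceUniqueID = deviceUID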
I'm new to xCode/swift so please forgive my inexperience. I have a ViewController where my viewers can listen to some audio playback. The playback is accessed like this when the player clicks a play button:
@IBAction func buttonClicked(_ sender: RoundButton) {
    self.clickedButton = sender
    guard let url = sender.url else {
        return
    }
    let player = AVPlayer(url: url)
    let controller = AVPlayerViewController()
    controller.player = player
    present(controller, animated: true) {
        player.play()
    }
}
I got this code from another StackOverflow question, so I don't completely understand it. My goal is to be able to save the URL and the last played time so that the user can minimize the app, or navigate to a different screen, and then be able to click a "continue listening" button which will pull up another AVPlayer with the last used URL. This "continue listening" AVPlayer will then seek to the last played time.
I know that I need to observe the first AVPlayer somehow, so that when it is paused, stopped, or put in the background, I save the currentTime to a NSUserDefault (I think?). I also need to save the URL, because there are many different URLs that the user could click on.
I tried doing this, and besides not being able to figure out the observation, I also couldn't figure out the type inconsistencies with NSUserDefaults. I tried to retrieve the saved URL value as a String after setting it, but when I went to convert the String to a URL using URL(string: lastPlayedURL), xCode complained: "Cannot convert type Data? to expected type String".
My issue with using other StackOverflow questions to solve my problem is that I don't understand where to put the code blocks. Where do I create the observer? Because xCode did not seem happy when I created it inside the body of "buttonClicked".
Thank you for listening to my rambling.
Yes, UserDefaults is appropriate for storing the URL: it has a set(_:forKey:) overload that takes a URL directly, and a url(forKey:) method to read it back. To observe playback you work with the AVPlayer and its AVPlayerItem.
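To give you a rough idea of where the pieces could go, here is a minimal sketch. The key names and helper functions are my own inventions, and instead of observing the AVPlayerItem it simply reads currentTime() at the moment you save (for example from viewWillDisappear, or when the app resigns active):
import AVFoundation

// Saving: call this when the user leaves the player screen or the app goes to the background.
func savePlaybackState(player: AVPlayer, url: URL) {
    let defaults = UserDefaults.standard
    defaults.set(url, forKey: "lastPlayedURL")                        // set(_:forKey:) has a URL overload
    defaults.set(player.currentTime().seconds, forKey: "lastPlayedTime")
}

// Restoring: call this from your "continue listening" button.
func makeResumedPlayer() -> AVPlayer? {
    let defaults = UserDefaults.standard
    guard let url = defaults.url(forKey: "lastPlayedURL") else { return nil }
    let player = AVPlayer(url: url)
    let seconds = defaults.double(forKey: "lastPlayedTime")
    player.seek(to: CMTime(seconds: seconds, preferredTimescale: 600))
    return player
}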
About your general issues with Xcode (note the capitalisation) and Swift language, I'm afraid these are things you need to work through yourself by reading/watching tutorials/documentation. Then when you find detailed issues, post your code here and ask.
Good luck and have fun!
I'm attempting to write an iOS app that will play my iPhone music library content, showing artwork and "announcing" title, artist, etc.
I have it working nicely using Apple's Media Player framework. I can display Playlist names and queue the songs in a selected Playlist.
I use the "MPMusicPlayerControllerNowPlayingItemDidChange" observer notification to pause playback, retrieve metadata, and do the announcements via AVSpeechSynthesizer.
I was a happy camper until I ran into the dreaded "Media Player framework doesn't respond to observer notifications in the background" issue.
So I started looking at the AVFoundation framework. I found a sample that plays local song files via URLs in the background.
I'm failing miserably in attempting to retrieve Music Library content via the AVFoundation.
I have also failed in supplying content retrieved via the Media Player framework to the AVFoundation player.
(Note: The URLs retrieved from MPMediaItem are of a bogus "ipod-library://item/item.m4a?id=#########################" format. Creating AVPlayerItem with this "URL" doesn't work.)
Has anyone managed to accomplish this? I'm developing for my own usage. I have no intention of posting the app in Apple's App Store, so I'm willing to use hidden APIs or un-Apple approved methodology.
A Swift code example would be great. (Objective-C not so much)
Having fetched an MPMediaItem from the user's library, obtain its assetURL. Creating an AVPlayer from the resulting URL does work.
Actual code from one of my example apps:
func oneSong() -> (URL?, String?) {
    let query = MPMediaQuery.songs()
    // always need to filter out songs that aren't present
    let isPresent = MPMediaPropertyPredicate(value: false,
                                             forProperty: MPMediaItemPropertyIsCloudItem,
                                             comparisonType: .equalTo)
    query.addFilterPredicate(isPresent)
    let item = query.items?.first   // .first avoids a crash if the query returns an empty array
    return (item?.assetURL, item?.title)
}

@IBAction func doPlayOneSongAVPlayer(_ sender: Any) {
    let (url, title) = self.oneSong()
    if let url = url, let title = title {
        self.avplayer = AVPlayer(url: url)
        self.avplayer.play()
        MPNowPlayingInfoCenter.default().nowPlayingInfo = [
            MPMediaItemPropertyTitle: title
        ]
    }
}
Using Swift 4+, iOS 11+, Xcode 10+
I've built a music player using the MediaPlayer framework and I can interact with it from Control Center; however, I would also like to see it on the Lock Screen.
To be honest, I'm a bit confused as to why it's showing and working in Control Center, as I have not written any code to do this.
Nevertheless, I would also like it to show in the Lock Screen.
This is what I have done so far:
1) I'm using the applicationMusicPlayer and made certain something is playing during my tests:
MPMusicPlayerController.applicationMusicPlayer
2) Set the BackgroundModes to include Audio, Fetch, and Remote Notifications
3) Added AVAudioSession code (which doesn't seem to do anything as I have tried it and tried commenting it out and seen no difference):
let session = AVAudioSession.sharedInstance()
do {
    // Configure the audio session for playback
    try session.setCategory(AVAudioSessionCategoryPlayback,
                            mode: AVAudioSessionModeDefault,
                            options: [])
    try session.setActive(true)
} catch let error as NSError {
    print("Failed to set the audio session category and mode: \(error.localizedDescription)")
}
4) Used this basic code to see if I can get it to show on the lock screen with just some basic hard-coded content:
let nowPlayingInfo: [String: Any] = [
    MPMediaItemPropertyArtist: "Pink Floyd",
    MPMediaItemPropertyTitle: "Wish You Were Here",
    //MPMediaItemPropertyArtwork: mediaArtwork,
]
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo

UIApplication.shared.beginReceivingRemoteControlEvents()
let commandCenter = MPRemoteCommandCenter.shared()
5) I know I have not implemented anything to actively update the info or respond to any commands as I'm just trying to get something to show on the lock screen at this point.
Why is the now playing info showing in Control Center if I have done nothing to put it there?
What do I need to do to get the info to show on the Lock Screen like it does in Control Center?
EDIT:
Link to a simple project on GitLab that has the same issue: https://gitlab.com/whoit/lockscreentest
EDIT: I have submitted this as a bug to Apple; however, they have yet to confirm or resolve it.
I had to fill in at least four keys (even with empty strings) to see something correct on the lock screen:
MPMediaItemPropertyTitle
MPMediaItemPropertyArtist
MPMediaItemPropertyAlbumTitle
MPNowPlayingInfoPropertyPlaybackRate
You can check the NowPlayingSource sample code for reference.
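For reference, here is a minimal sketch of filling those four keys; the title and artist values are just placeholders:
import MediaPlayer

let nowPlayingInfo: [String: Any] = [
    MPMediaItemPropertyTitle: "Wish You Were Here",   // placeholder
    MPMediaItemPropertyArtist: "Pink Floyd",          // placeholder
    MPMediaItemPropertyAlbumTitle: "",                // an empty string was enough for me
    MPNowPlayingInfoPropertyPlaybackRate: 1.0         // use 0.0 when paused
]
MPNowPlayingInfoCenter.default().nowPlayingInfo = nowPlayingInfo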
Using .systemMusicPlayer instead of .applicationMusicPlayer will solve your problem.
As the Apple documentation says:
The music player does not affect the Music app’s state. When your app moves to the background, the music player stops playing the current media.
I think this is related to the info not showing on the lock screen.
Also, systemMusicPlayer automatically handles showing all the song information.
I have created a UWP app that uses HTML5 web radio streams.
Everything works fine, but now I want to add track and artist information to the MediaPlayer element.
This information is shown on the start screen when the user locks the device.
The first track is shown correctly when the user selects a stream, but I can't update this information without restarting the stream.
MediaItemDisplayProperties mdp = _mediaPlaybackItem.GetDisplayProperties();
mdp.Type = Windows.Media.MediaPlaybackType.Music;
mdp.MusicProperties.Artist = "TBA Artist";
mdp.MusicProperties.Title = "TBA Title";
mdp.Thumbnail = Windows.Storage.Streams.RandomAccessStreamReference.CreateFromUri(MainPage.Current.CurrentStream.PreviewImageUri);
_mediaPlaybackItem.ApplyDisplayProperties(mdp);
_mediaPlayer.Source = mpItem;
_mediaPlayer.Play();
If I put these lines into my refresh method for artist/title, I also have to set the Source of _mediaPlayer again, which pauses the music.
Does anyone have an idea how to fix this problem, or any advice on where to look further?
Thanks Chris
If you want to update the artist/title, you should be able to use the SystemMediaTransportControlsDisplayUpdater class; it provides functionality to update the music information that is displayed on the SystemMediaTransportControls.
We can set the artist/title via the SystemMediaTransportControlsDisplayUpdater.MusicProperties property, then call the SystemMediaTransportControlsDisplayUpdater.Update method to update the metadata for the currently playing media.
Use the SystemMediaTransportControlsDisplayUpdater class to update the media info that is displayed by the transport controls, such as the song title or the album art for the currently playing media item. Get an instance of this class with the SystemMediaTransportControls.DisplayUpdater property. If your scenario requires it, you can update the metadata displayed by the system media transport controls manually by setting the values of the MusicProperties, ImageProperties, or VideoProperties objects exposed by the DisplayUpdater class.
For example:
SystemMediaTransportControlsDisplayUpdater updater = _systemMediaTransportControls.DisplayUpdater;
updater.MusicProperties.Artist = "artist";
updater.MusicProperties.AlbumArtist = "album artist";
updater.MusicProperties.Title = "song title";
updater.Thumbnail = RandomAccessStreamReference.CreateFromUri(new Uri("ms-appx:///Music/music1_AlbumArt.jpg"));
updater.Update();