Swift MediaPlayer/MusicKit: How to play lossless audio?

Apple announced lossless audio for Apple Music users in 2021. With MediaPlayer and MusicKit you can also play any song from the user's Apple Music library. Playback works through an MPMediaItem, which you can query like this:
@State private var librarySongs = [MPMediaItem]()
@State private var libraryPlaylists = [MPMediaItemCollection]()

let songsQuery = MPMediaQuery.songs()
if let songs = songsQuery.items {
    let desc = NSSortDescriptor(key: MPMediaItemPropertyDateAdded, ascending: false)
    let sortedSongs = NSArray(array: songs).sortedArray(using: [desc])
    librarySongs = sortedSongs as? [MPMediaItem] ?? []
}

let playlistQuery = MPMediaQuery.playlists()
if let playlists = playlistQuery.collections {
    libraryPlaylists = playlists
}
MPMediaItem documentation
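For playback itself, the queried items can be handed to one of the system music players. A minimal sketch (assuming the librarySongs array populated above):

import MediaPlayer

// Hand the queried library items to the system player.
let player = MPMusicPlayerController.systemMusicPlayer
player.setQueue(with: MPMediaItemCollection(items: librarySongs))
player.play()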
Now to my question: is it possible to play a user's music library as lossless audio?

Related

Media Player filtering and prepending album plays album, a random song, and then the album again

I am trying to make a music player, and currently I am stuck on filtering and prepending an album to the playlist.
What I am trying to do is play music on shuffle and then, when a user taps a button, continue playing songs only from the album currently playing. When it is finished I want to know it's finished, so I can change UI elements and then go on to play any and all music from the library.
However, what happens is that it plays the rest of the songs from the album, THEN, when it has exhausted all the songs from that album, plays a random song, and after that one random song goes back to that album, goes through its entirety, and then plays random songs. Once in a blue moon, after it finishes the album, it will just play random songs.
In a singleton I have
func getAllSongs(completion: @escaping (_ songs: [MPMediaItem]?) -> Void) {
    MPMediaLibrary.requestAuthorization { (status) in
        if status == .authorized {
            let query = MPMediaQuery()
            let mediaTypeMusic = MPMediaType.music
            let audioFilter = MPMediaPropertyPredicate(value: mediaTypeMusic.rawValue, forProperty: MPMediaItemPropertyMediaType, comparisonType: MPMediaPredicateComparison.equalTo)
            query.addFilterPredicate(audioFilter)
            let songs = query.items
            completion(songs)
        } else {
            completion(nil)
        }
    }
}
func getSongsWithCurrentAlbumFor(item: MPMediaItem) -> MPMediaQuery {
    let albumFilter = MPMediaPropertyPredicate(value: item.albumTitle, forProperty: MPMediaItemPropertyAlbumTitle, comparisonType: MPMediaPredicateComparison.equalTo)
    let predicates: Set<MPMediaPropertyPredicate> = [albumFilter]
    // The initializer already installs the predicate set, so there is no need
    // to call addFilterPredicate(albumFilter) a second time.
    let query = MPMediaQuery(filterPredicates: predicates)
    return query
}
In my VC to set up the audio I use
let mediaPlayer = MPMusicPlayerApplicationController.applicationMusicPlayer

func setUpAudioPlayerAndGetSongsShuffled() {
    try? AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategorySoloAmbient)
    try? AVAudioSession.sharedInstance().setActive(true)
    MBProgressHUD.showAdded(to: view, animated: true)
    MediaManager.shared.getAllSongs { (songs) in
        guard let theSongs = songs else {
            return
        }
        self.newSongs = theSongs.filter({ (item) -> Bool in
            return !self.playedSongs.contains(item)
        })
        self.mediaPlayer.setQueue(with: MPMediaItemCollection(items: self.newSongs))
        self.mediaPlayer.shuffleMode = .songs
        self.mediaPlayer.repeatMode = .none
        self.mediaPlayer.prepareToPlay(completionHandler: { (error) in
            DispatchQueue.main.async {
                MBProgressHUD.hide(for: self.view, animated: true)
            }
        })
    }
}
When the user taps the button to continue playing songs only from that album I use
if let nowPlaying = mediaPlayer.nowPlayingItem {
    let albumQuery = MediaManager.shared.getSongsWithCurrentAlbumFor(item: nowPlaying)
    print("\(albumQuery)")
    let descriptor = MPMusicPlayerMediaItemQueueDescriptor(query: albumQuery)
    mediaPlayer.prepend(descriptor)
}
Upon rereading the documentation, I noticed it says that I should change to
let mediaPlayer = MPMusicPlayerApplicationController.applicationQueuePlayer
I cannot figure out how to know when an album has been exhausted so I can then continue playing the rest of the music library.
It would also be great to know whether the album has no other items, so that the user could not press the button to play more from the album.
I discovered my mistakes: I was putting my logic in the wrong place.
In my singleton I added
func hasPlayedAllSongsFromAlbumFor(song: MPMediaItem) -> Bool {
    if let allSongsInAlbum = getSongsWithCurrentAlbumFor(item: song).items {
        return lockedSongsContains(songs: allSongsInAlbum)
    }
    return true
}

// `lockedSongs` is the singleton's running list of songs already played
// while the album lock is on.
func lockedSongsContains(songs: [MPMediaItem]) -> Bool {
    for aSong in songs {
        if !lockedSongs.contains(aSong) {
            return false
        }
    }
    return true
}
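As an aside (my own sketch, not part of the original answer): lockedSongsContains is just a subset check, so it could equally be written with Set:

func lockedSongsContains(songs: [MPMediaItem]) -> Bool {
    // Every song in the album must already be in lockedSongs.
    return Set(songs).isSubset(of: Set(lockedSongs))
}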
I needed to use a Notification. In viewDidLoad I registered for a notification that calls a func named songChanged.
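That registration could look like this (a minimal sketch of what is described above; beginGeneratingPlaybackNotifications() is required before the player posts any playback notifications):

override func viewDidLoad() {
    super.viewDidLoad()
    mediaPlayer.beginGeneratingPlaybackNotifications()
    NotificationCenter.default.addObserver(self,
                                           selector: #selector(songChanged),
                                           name: .MPMusicPlayerControllerNowPlayingItemDidChange,
                                           object: mediaPlayer)
}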
In the songChanged func I have
if albumIsLocked && MediaManager.shared.hasPlayedAllSongsFromAlbumFor(song: nowPlaying) {
    albumLockButtonTapped(albumLockIconButton)
    MediaManager.shared.lockedSongs.removeAll()
}
On the album lock button I now use
if let nowPlaying = mediaPlayer.nowPlayingItem {
    if sender.isSelected {
        sender.isSelected = false
        albumIsLocked = false
        let lockRemovedQuery = MediaManager.shared.removeAlbumLockFor(item: nowPlaying)
        let lockRemovedDescriptor = MPMusicPlayerMediaItemQueueDescriptor(query: lockRemovedQuery)
        mediaPlayer.prepend(lockRemovedDescriptor)
        mediaPlayer.prepareToPlay()
    } else {
        sender.isSelected = true
        albumIsLocked = true
        let albumQuery = MediaManager.shared.getSongsWithCurrentAlbumFor(item: nowPlaying)
        let albumDescriptor = MPMusicPlayerMediaItemQueueDescriptor(query: albumQuery)
        mediaPlayer.prepend(albumDescriptor)
        mediaPlayer.prepareToPlay()
    }
}

How to loop AVPlayer from 4 seconds to 8 seconds in Swift 3?

I have an AVPlayer in Swift 3 that plays video. The problem is that I want to loop from second A to second B (for example, from 4 to 8 seconds). Here is my code for the loop, but it didn't work:
NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime, object: self.Player.currentItem, queue: nil, using: { (_) in
    DispatchQueue.main.async {
        self.Player.seek(to: kCMTimeZero)
        self.Player.play()
        DispatchQueue.main.asyncAfter(deadline: DispatchTime.now() + 4.0) {
            // check if player is still playing
            if self.Player.rate != 0 {
                print("OK")
                print("Player reached 4.0 seconds")
                let timeScale = self.Player.currentItem?.asset.duration.timescale
                // let seconds = kCMTimeZero
                let time = CMTimeMakeWithSeconds(8.0, timeScale!)
                self.Player.seek(to: time, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
                self.Player.play()
            }
        }
    }
})
The problem is that this loop doesn't work: because it is driven by AVPlayerItemDidPlayToEndTime, the print("OK") won't fire until the player has finished the whole movie.
There are a few options:
If you want gapless playback, you can start off by using:
Pre-iOS 10: https://developer.apple.com/library/content/samplecode/avloopplayer/Introduction/Intro.html
iOS 10+: https://developer.apple.com/documentation/avfoundation/avplayerlooper
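On iOS 10+, AVPlayerLooper takes the loop range directly, so a 4-to-8-second loop can be as simple as this (a minimal sketch; videoURL is assumed to point at your asset, and the looper must be kept alive for looping to continue):

let queuePlayer = AVQueuePlayer()
let loopItem = AVPlayerItem(url: videoURL)
let loopRange = CMTimeRange(start: CMTime(seconds: 4, preferredTimescale: 600),
                            end: CMTime(seconds: 8, preferredTimescale: 600))
// Keep a strong reference to the looper, or looping stops.
let looper = AVPlayerLooper(player: queuePlayer, templateItem: loopItem, timeRange: loopRange)
queuePlayer.play()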
The pre-iOS 10 "solution" from Apple does work, and it is the only way I have gotten gapless looping, since I target iOS 9.
If you are using that solution, you also need to either feed it an AVPlayerItem of the right length, or extend the solution to cut the asset up as you send it to the player.
For that, you can do something like how I changed Apple's code (sorry if it's a bit sparse; I'm just trying to show the main changes). Basically I added passing in the track and the chunk of time to use, then making that an AVMutableCompositionTrack (I got rid of all the stuff for video; you will want to keep that in):
class myClass: someClass {
    var loopPlayer: QueuePlayerLooper!
    var avAssetLength: Int64!
    var avAssetTimescale: CMTimeScale!
    var avAssetTimeRange: CMTimeRange!
    let composition = AVMutableComposition()
    var playerItem: AVPlayerItem!
    var avAssetTrack: AVAssetTrack!
    var compAudioTrack: AVMutableCompositionTrack!
    var uurl: URL!
    var avAsset: AVURLAsset!

    func createCMTimeRange(start: TimeInterval, end: TimeInterval) -> CMTimeRange {
        avAssetTimescale = avAssetTrack.naturalTimeScale
        let a: CMTime = CMTime(seconds: start, preferredTimescale: avAssetTimescale)
        let b: CMTime = CMTime(seconds: end, preferredTimescale: avAssetTimescale)
        return CMTimeRange(start: a, end: b)
    }

    func startLoopingSection() {
        // a_playbackPosition and b_playbackPosition are the loop endpoints in seconds
        loopPlayer = QueuePlayerLooper(videoURL: uurl, loopCount: -1, timeRange: createCMTimeRange(start: a_playbackPosition, end: b_playbackPosition))
        loopPlayer.start()
    }
}
//--==--==--==--==--==--==--==--==--
/*
Copyright (C) 2016 Apple Inc. All Rights Reserved.
See LICENSE.txt for this sample’s licensing information
Abstract:
An object that uses AVQueuePlayer to loop a video.
*/
// Marked changed code with ++
class QueuePlayerLooper: NSObject, Looper {
    // MARK: Types
    private struct ObserverContexts {
        static var playerStatus = 0
        static var playerStatusKey = "status"
        static var currentItem = 0
        static var currentItemKey = "currentItem"
        static var currentItemStatus = 0
        static var currentItemStatusKey = "currentItem.status"
        static var urlAssetDurationKey = "duration"
        static var urlAssetPlayableKey = "playable"
    }

    // MARK: Properties
    private var player: AVQueuePlayer?
    private var playerLayer: AVPlayerLayer?
    private var isObserving = false
    private var numberOfTimesPlayed = 0
    private let numberOfTimesToPlay: Int
    private let videoURL: URL
    ++var assetTimeRange: CMTimeRange!
    ++let composition = AVMutableComposition()
    ++var currentTrack: AVAssetTrack!
    ++var compositionTrack: AVMutableCompositionTrack!

    // MARK: Looper
    required init(videoURL: URL, loopCount: Int, ++timeRange: CMTimeRange) {
        self.videoURL = videoURL
        self.numberOfTimesToPlay = loopCount
        ++self.assetTimeRange = timeRange
        super.init()
    }
    func start(in parentLayer: CALayer) {
        stop()
        player = AVQueuePlayer()
        playerLayer = AVPlayerLayer(player: player)
        guard let playerLayer = playerLayer else { fatalError("Error creating player layer") }
        playerLayer.frame = parentLayer.bounds
        parentLayer.addSublayer(playerLayer)
        let videoAsset = AVURLAsset(url: videoURL)
        // ++ Insert only the requested chunk of the asset into the composition.
        ++compositionTrack = composition.addMutableTrack(withMediaType: AVMediaTypeVideo, preferredTrackID: CMPersistentTrackID())
        ++currentTrack = videoAsset.tracks(withMediaType: AVMediaTypeVideo).first
        ++try! compositionTrack.insertTimeRange(assetTimeRange, of: currentTrack, at: CMTimeMake(0, 1))
        videoAsset.loadValuesAsynchronously(forKeys: [ObserverContexts.urlAssetDurationKey, ObserverContexts.urlAssetPlayableKey]) {
            /*
                The asset invokes its completion handler on an arbitrary queue
                when loading is complete. Because we want to access our AVQueuePlayer
                in our ensuing set-up, we must dispatch our handler to the main
                queue.
            */
            DispatchQueue.main.async(execute: {
                var durationError: NSError?
                let durationStatus = videoAsset.statusOfValue(forKey: ObserverContexts.urlAssetDurationKey, error: &durationError)
                guard durationStatus == .loaded else { fatalError("Failed to load duration property with error: \(durationError)") }
                var playableError: NSError?
                let playableStatus = videoAsset.statusOfValue(forKey: ObserverContexts.urlAssetPlayableKey, error: &playableError)
                guard playableStatus == .loaded else { fatalError("Failed to read playable duration property with error: \(playableError)") }
                guard videoAsset.isPlayable else {
                    print("Can't loop since asset is not playable")
                    return
                }
                guard CMTimeCompare(videoAsset.duration, CMTime(value: 1, timescale: 100)) >= 0 else {
                    print("Can't loop since asset duration too short. Duration is \(CMTimeGetSeconds(videoAsset.duration)) seconds")
                    return
                }
                /*
                    Based on the duration of the asset, we decide the number of player
                    items to add to demonstrate gapless playback of the same asset.
                */
                let numberOfPlayerItems = (Int)(1.0 / CMTimeGetSeconds(videoAsset.duration)) + 2
                for _ in 1...numberOfPlayerItems {
                    let loopItem = AVPlayerItem(asset: self.composition) // ++ plays the trimmed composition
                    self.player?.insert(loopItem, after: nil)
                }
                self.startObserving()
                self.numberOfTimesPlayed = 0
                self.player?.play()
            })
        }
    }
}
You can add a periodic time observer to monitor the current time:
let timeObserverToken = player.addPeriodicTimeObserver(forInterval: someInterval, queue: DispatchQueue.main) { [unowned self] time in
    let seconds = CMTimeGetSeconds(time)
    if seconds >= 8.0 {
        // jump back to 4 seconds
        // do stuff
    }
}
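Alternatively (my own addition, not part of the answer above), a boundary time observer fires exactly when playback crosses the 8-second mark, so there is no polling interval to tune:

let boundary = NSValue(time: CMTimeMakeWithSeconds(8.0, 600))
let boundaryToken = player.addBoundaryTimeObserver(forTimes: [boundary], queue: DispatchQueue.main) { [unowned self] in
    // Jump back to 4 seconds every time playback reaches 8 seconds.
    self.player.seek(to: CMTimeMakeWithSeconds(4.0, 600),
                     toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
}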

iOS Adjust Pitch Whilst Playing via AVAudioUnitTimePitch

I’m trying to get some audio to be able to have the pitch adjusted whilst playing. I’m very new to Swift and iOS, but my initial attempt was to just change timePitchNode.pitch whilst it was playing; however, it wouldn’t update whilst playing. My current attempt is to reset audioEngine, and have it just resume from where it was playing (below). How do I determine where the audio currently is, and how do I get it to resume from there?
var audioFile: AVAudioFile?
var audioEngine: AVAudioEngine?
var audioPlayerNode: AVAudioPlayerNode?
var pitch: Int = 1 {
    didSet {
        playResumeAudio()
    }
}
…
func playResumeAudio() {
    var currentTime: AVAudioTime? = nil
    if audioPlayerNode != nil {
        let nodeTime = audioPlayerNode!.lastRenderTime!
        currentTime = audioPlayerNode!.playerTimeForNodeTime(nodeTime)
    }
    if audioEngine != nil {
        audioEngine!.stop()
        audioEngine!.reset()
    }
    audioEngine = AVAudioEngine()
    audioPlayerNode = AVAudioPlayerNode()
    audioEngine!.attachNode(audioPlayerNode!)
    let timePitchNode = AVAudioUnitTimePitch()
    timePitchNode.pitch = Float(pitch * 100)
    timePitchNode.rate = rate // `rate` is another property on this class, elided above
    audioEngine!.attachNode(timePitchNode)
    audioEngine!.connect(audioPlayerNode!, to: timePitchNode, format: nil)
    audioEngine!.connect(timePitchNode, to: audioEngine!.outputNode, format: nil)
    audioPlayerNode!.scheduleFile(audioFile!, atTime: nil, completionHandler: nil)
    let _ = try? audioEngine?.start()
    audioPlayerNode!.playAtTime(currentTime)
}
I was being dumb, apparently. You can modify the pitch during playback, and it does update. No need to reset any audio; just mutate the node as it’s playing, and it’ll work.
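A minimal sketch of that, in the same Swift 2-era API style as the question (audioFile is assumed to be a loaded AVAudioFile):

let audioEngine = AVAudioEngine()
let audioPlayerNode = AVAudioPlayerNode()
let timePitchNode = AVAudioUnitTimePitch()

audioEngine.attachNode(audioPlayerNode)
audioEngine.attachNode(timePitchNode)
audioEngine.connect(audioPlayerNode, to: timePitchNode, format: nil)
audioEngine.connect(timePitchNode, to: audioEngine.outputNode, format: nil)
audioPlayerNode.scheduleFile(audioFile, atTime: nil, completionHandler: nil)
try? audioEngine.start()
audioPlayerNode.play()

// Later, while audio is playing; takes effect immediately, no restart:
timePitchNode.pitch = 300 // cents; +300 is three semitones up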

Last heard songs (Swift)

I want to fill a tableView with the last heard songs. With .nowPlayingItem I can get the very last song, but how can I get the songs heard before that?
I think this question already has an answer at Retrieve list of songs ordered by last play time in iOS, but it is in Objective-C and I'm not able to translate it. Or is there an even better way to do it in Swift instead?
You could do something like this:
let startTime: NSTimeInterval = NSDate().timeIntervalSince1970
let songsQuery: MPMediaQuery = MPMediaQuery.songsQuery()
let songsArray: [MPMediaItem] = songsQuery.items!
let songsNSArray: NSArray = NSArray(array: songsArray)
let descriptor: NSSortDescriptor = NSSortDescriptor(key: MPMediaItemPropertyLastPlayedDate, ascending: false)
let sortedResults: NSArray = songsNSArray.sortedArrayUsingDescriptors([descriptor])
let finishTime: NSTimeInterval = NSDate().timeIntervalSince1970
NSLog("Execution took %f seconds to return %i results.", finishTime - startTime, sortedResults.count)
The results would be stored in the sortedResults array
This is how you can do it in Swift:
let start = NSDate().timeIntervalSince1970
let songsQuery = MPMediaQuery.songsQuery()
if let songsArray = songsQuery.items {
    let sortedArray = songsArray.sort { item1, item2 in
        if let lastPlayedDateForItem1 = item1.lastPlayedDate,
           let lastPlayedDateForItem2 = item2.lastPlayedDate {
            return lastPlayedDateForItem1.compare(lastPlayedDateForItem2) == .OrderedDescending
        }
        return false
    }
}
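Either way, the sorted result can back the table view directly. A sketch of the data source method in the same Swift 2-era style, assuming the result is kept in a sortedSongs property and the cell identifier "SongCell" is registered:

func tableView(tableView: UITableView, cellForRowAtIndexPath indexPath: NSIndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCellWithIdentifier("SongCell", forIndexPath: indexPath)
    let song = sortedSongs[indexPath.row] // most recently played first
    cell.textLabel?.text = song.title
    cell.detailTextLabel?.text = song.artist
    return cell
}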

Loading playlist artwork like in the Music app

I'm looking for a solution to display my artwork like in the Apple Music application. I'm able to load one artwork for the playlist, but I want to be able to show 4 of the artworks as a playlist representative.
Currently I'm using this code for my playlist view
let objects = items.objectAtIndex(indexPath.item) as! MPMediaItemCollection
let repObjects = objects.representativeItem
cell.lbl_Name.text = objects.valueForProperty(MPMediaPlaylistPropertyName) as? String
let artwork = repObjects?.valueForProperty(MPMediaItemPropertyArtwork)
let artworkImage = artwork?.imageWithSize(CGSize(width: 130, height: 130))
if artworkImage != nil {
    cell.img_artwork.image = artworkImage
} else {
    cell.img_artwork.image = UIImage(named: "no_Artwork.png")
}
What would be the best way to go at this?
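One possible approach (my own sketch, nothing official): draw the artwork of up to four items in the collection into a 2x2 grid and use the combined image for the cell. In the same Swift 2-era style as the question:

func gridArtworkFor(collection: MPMediaItemCollection, side: CGFloat) -> UIImage? {
    let tile = side / 2
    UIGraphicsBeginImageContextWithOptions(CGSize(width: side, height: side), true, 0)
    for (index, item) in collection.items.prefix(4).enumerate() {
        // Fall back to the placeholder used above when an item has no artwork.
        let image = item.artwork?.imageWithSize(CGSize(width: tile, height: tile))
            ?? UIImage(named: "no_Artwork.png")
        let origin = CGPoint(x: CGFloat(index % 2) * tile, y: CGFloat(index / 2) * tile)
        image?.drawInRect(CGRect(origin: origin, size: CGSize(width: tile, height: tile)))
    }
    let grid = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return grid
}

cell.img_artwork.image could then be set from gridArtworkFor(objects, side: 130) instead of the single representative artwork.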
