Updating a timer label whilst an animation is playing? - ios

I run this code
DispatchQueue.main.async { [weak self] in
    self?.sliderView.elapsedLabel.text = self?.player.currentTime.secondsToString()
}
every time this observer, which watches the audio file's progression, fires (once per second), to update a label of remaining seconds. It dispatches to the main thread because it's UI-related:
player.event.secondElapse.addListener(self, handleAudioPlayerSecondElapsed)
Meanwhile I have an animation loop playing in the background using AVKit
I am finding that when the label updates, it stalls the animation. Is that because both are real-time updates on the main thread?
How is this normally handled, so that two things can update the UI at the same time without clashing with each other?
Updated code for the animating video:
let playerItem = AVPlayerItem(url: file)
videoPlayer = AVQueuePlayer(items: [playerItem])
playerLooper = AVPlayerLooper(player: videoPlayer!, templateItem: playerItem)
videoPlayerLayer = AVPlayerLayer(player: videoPlayer)
videoPlayerLayer!.frame = inView.bounds
videoPlayerLayer!.videoGravity = AVLayerVideoGravity.resizeAspectFill
Then I call the standard
videoPlayer?.play()
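(The snippet above never shows the layer being attached to the view; presumably there is also a line along these lines somewhere, which is assumed here rather than taken from the question:)
// Assumed, not shown in the question: the layer has to be in the layer tree to render.
inView.layer.addSublayer(videoPlayerLayer!)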

Related

How to play audio from a URL on button click and display progress on a slider or progress bar in Swift?

I am trying to build a music app in Swift. I have a remote URL for a song, and I want to play the song on a button click. I also want a slider or progress bar that displays the playback progress of the song.
This is a sample audio file I want to stream.
https://audio-ssl.itunes.apple.com/itunes-assets/AudioPreview113/v4/99/c4/84/99c48467-71dd-0a95-8388-3c5d4d433ee2/mzaf_6642611679343132363.std.aac.p.m4a
Have a look at Apple's AVFoundation framework. There are several classes you can use, such as AVPlayer, AVAudioPlayer, and AVQueuePlayer; it all depends on your requirements.
Here's an example to get you started. (You need to enable "Allow Outgoing Connections" in Xcode under Signing & Capabilities for the audio to play.)
For the slider, you find out the duration of the audio and set the slider's maxValue to that duration. Using addPeriodicTimeObserver, the slider is updated in real time with slider.doubleValue = progressTime.seconds.
You can also be notified when the audio has ended via the AVPlayerItemDidPlayToEndTime notification (see the sketch after the code below).
import AVFoundation

var player: AVPlayer! = nil // (needs to be outside of the class)

@IBOutlet weak var slider: NSSliderCell!

let urlToPlay = URL(string: "https://audio-ssl.itunes.apple.com/itunes-assets/AudioPreview113/v4/99/c4/84/99c48467-71dd-0a95-8388-3c5d4d433ee2/mzaf_6642611679343132363.std.aac.p.m4a")
let asset = AVAsset(url: urlToPlay!)
let playerItem = AVPlayerItem(asset: asset)

if player == nil {
    player = AVPlayer(playerItem: playerItem)
    player.play()
}

// Time observer to update the slider: monitors the current play time every half second.
player.addPeriodicTimeObserver(forInterval: CMTime(value: 1, timescale: 2),
                               queue: DispatchQueue.global(), using: { [weak self] progressTime in
    DispatchQueue.main.async {
        // UI updates belong on the main thread.
        self?.slider.maxValue = player.currentItem?.asset.duration.seconds ?? 0
        self?.slider.doubleValue = progressTime.seconds
    }
})
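For the end-of-playback notification mentioned above, a minimal sketch (reusing the playerItem from the snippet) could look like this:
// Sketch: get notified when the item has played to its end.
// Keep the returned token so the observer can be removed later.
let endObserver = NotificationCenter.default.addObserver(forName: .AVPlayerItemDidPlayToEndTime,
                                                         object: playerItem,
                                                         queue: .main) { _ in
    // Playback finished; reset the slider or update any other UI here.
    print("Audio finished playing")
}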

How to control AVPlayer buffering

I'm using an AVPlayer to play a remote progressive-download (i.e. non-HLS) video, but I can't figure out how to control its buffering behavior.
I would like to pre-fetch 2 seconds of the video before it's ready to play, and also to stop buffering when the video is paused.
Here's my setup:
let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
I tried the following, without success:
// doesn't start buffering
playerItem.preferredForwardBufferDuration = 2.0
// doesn't stop buffering
playerItem.preferredForwardBufferDuration = 2.0
player.replaceCurrentItem(with: playerItem)
I tried player.automaticallyWaitsToMinimizeStalling = true in both cases, and in combination with various player.pause() or player.rate = 0 calls, without success.
A potential approach that comes to mind is to observe loadedTimeRanges until the first 2 seconds have loaded and then set the player's current item to nil.
let c = playerItem.publisher(for: \.loadedTimeRanges, options: .new)
    .compactMap { $0.first as? CMTimeRange }
    .sink {
        if $0.duration.seconds - $0.start.seconds > 2 {
            player.replaceCurrentItem(with: nil)
        }
    }
This would work for pre-buffer, but it doesn't work for pausing, because it makes the video blank instead of paused. (And at this point, I feel I'm attempting to reimplement/interfere with some core buffering functionality)

iOS: AVPlayer plays multiple instances of audio when using controls on lock screen

I have set up a function that creates an AVPlayerViewController instance in order to play an audio track (stored in our server).
func playAudio(_ url: URL) {
    let avAsset = AVAsset(url: url)
    let playerItem = AVPlayerItem(asset: avAsset)
    audioPlayer = AVPlayer(playerItem: playerItem)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = audioPlayer
    try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
    self.present(playerViewController, animated: true, completion: {
        self.audioPlayer.play()
    })
}
The player works fine while in the foreground. If I go to the lock screen, the audio continues to play, and if I press the pause button on the lock screen, the audio pauses. The problem is that if I then press play on the lock screen, the audio I was listening to resumes, but a second instance of the audio also starts playing from the beginning, so now I have two instances of the audio playing simultaneously. If I pause and press play again, yet another instance of the audio starts playing.
How can I fix this so the lock screen buttons only affect the original audio?
I'm working on Xcode 11, with Swift 4.2.
The problem cannot be reproduced based on the code you showed. Therefore it sounds like you have an extra reference somewhere to self.audioPlayer, and the problem is caused by code you did not reveal to us.
To test that theory, let's start by not storing the player in a property; there's no need for that within the scope of the question.
Modify your code to look like this (note the changes to make this a local):
let avAsset = AVAsset(url: url)
let playerItem = AVPlayerItem(asset: avAsset)
let audioPlayer = AVPlayer(playerItem: playerItem) // *
let playerViewController = AVPlayerViewController()
playerViewController.player = audioPlayer
try? AVAudioSession.sharedInstance().setCategory(.playback, mode: .default, options: [])
self.present(playerViewController, animated: true, completion: {
    audioPlayer.play() // *
})
Now delete the audioPlayer property and, if your code fails to compile, comment out everything that refers to it until the code does compile. Test the app; all should be well. Then go back and figure out where, in the code you did not show us, the problem is coming from.

Recommended way of updating timeRange property on AVPlayerLooper

I am building on the AVPlayerLooper example code Apple provides, specifically the setup in PlayerLooper.swift, LooperViewController.swift, and the Looper.swift protocol.
What I would like to do is update the timeRange property on the AVPlayerLooper that is instantiated inside the PlayerLooper.swift file.
To this end, I have slightly modified the following function, which instantiates and starts the player looper:
func start(in parentLayer: CALayer, loopTimeRange: CMTimeRange) {
    player = AVQueuePlayer()
    playerLayer = AVPlayerLayer(player: player)
    guard let playerLayer = playerLayer else { fatalError("Error creating player layer") }
    playerLayer.frame = parentLayer.bounds
    parentLayer.addSublayer(playerLayer)

    let playerItem = AVPlayerItem(url: videoURL)
    playerItem.asset.loadValuesAsynchronously(forKeys: [ObserverContexts.playerItemDurationKey], completionHandler: { () -> Void in
        /*
         The asset invokes its completion handler on an arbitrary queue when
         loading is complete. Because we want to access our AVPlayerLooper
         in our ensuing set-up, we must dispatch our handler to the main queue.
         */
        DispatchQueue.main.async(execute: {
            guard let player = self.player else { return }

            var durationError: NSError? = nil
            let durationStatus = playerItem.asset.statusOfValue(forKey: ObserverContexts.playerItemDurationKey, error: &durationError)
            guard durationStatus == .loaded else { fatalError("Failed to load duration property with error: \(String(describing: durationError))") }

            //self.playerLooper = AVPlayerLooper(player: player, templateItem: playerItem)
            self.playerLooper = AVPlayerLooper(player: player, templateItem: playerItem, timeRange: loopTimeRange)
            self.startObserving()
            player.play()
        })
    })
}
For demonstration purposes, in LooperViewController, I created a simple button that triggers looper?.start() with a new CMTimeRange like so:
looper?.start(in: view.layer, loopTimeRange: CMTimeRange(start: CMTime(value: 0, timescale: 600), duration: CMTime(value: 3000, timescale: 600)))
Before that function is called, however, I call looper?.stop(), which does the following:
func stop() {
    player?.pause()
    stopObserving()
    playerLooper?.disableLooping()
    playerLooper = nil
    playerLayer?.removeFromSuperlayer()
    playerLayer = nil
    player = nil
}
I'm essentially re-instantiating the AVPlayerLooper in order to set a new timeRange, because I don't see a way to access and reset that property once it has been set up the first time.
The problem is that, while this initially seems to work and the looping player adjusts and starts looping the new time range, it eventually just stops playing after a few loops. No errors are thrown anywhere, and none of the observers already set up in the code report that the loop is stopping or that there was an error with the loop.
Is my approach here entirely wrong? Is AVPlayerLooper meant to be adjusted in this way or should I look for another approach to having an adjustable looping player?
You can actually update the AVPlayerLooper without tearing down the whole thing. What you need to do is to remove all items from the AVQueuePlayer first and then reinstantiate the looper with the new time range. Something like this:
if self.avQueuePlayer.rate == 0 {
    self.avQueuePlayer.removeAllItems()
    let range = CMTimeRange(start: self.startTime, end: self.endTime)
    self.avPlayerLooper = AVPlayerLooper(player: self.avQueuePlayer, templateItem: self.avPlayerItem, timeRange: range)
    self.avQueuePlayer.play()
}
You must be sure to call removeAllItems() first, or it will crash. Done this way, the time range changes while you keep using the current layer setup to view the player.
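As a usage sketch, that swap could be wrapped in a small helper on the owning class (the updateLoop(from:to:) name and the stored avQueuePlayer / avPlayerItem / avPlayerLooper properties are assumptions matching the snippet above):
// Hypothetical helper around the swap shown above; property names are assumed.
func updateLoop(from startTime: CMTime, to endTime: CMTime) {
    // Stop playback and looping before touching the queue.
    avQueuePlayer.pause()
    avPlayerLooper?.disableLooping()

    // The queue must be emptied before attaching a new looper, or it will crash.
    avQueuePlayer.removeAllItems()

    let range = CMTimeRange(start: startTime, end: endTime)
    avPlayerLooper = AVPlayerLooper(player: avQueuePlayer, templateItem: avPlayerItem, timeRange: range)
    avQueuePlayer.play()
}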

How to show paused video instead of black screen when starting AVPlayer

I am able to successfully create a player but was annoyed with the initial black screen. I decided to overlay a UIImageView and hide it once the player started. This worked, but I didn't want the hassle of creating and maintaining images for all my videos.
I was able to achieve the exact results I wanted by playing and immediately pausing the player after instantiating it. The only issue was that sometimes the state of the player was getting recorded incorrectly, so when I went to start the player again, the status was listed as already "playing" even though the player was paused.
I started looking into using AVPlayerItem's seek(to:) but haven't found any feasible solution. Is there a "non-hacky" way of achieving this?
If you're using an AVPlayerViewController, this is a perfect use of the player controller's contentOverlayView property. This is a UIView between the player layer and the controls exposed exactly for this purpose:
First, create the screenshot:
let asset = AVAsset(url: URL(string: "")!) // link to some video
let imageGenerator = AVAssetImageGenerator(asset: asset)
let screenshotTime = CMTime(seconds: 1, preferredTimescale: 1)
if let imageRef = try? imageGenerator.copyCGImage(at: screenshotTime, actualTime: nil) {
    let image = UIImage(cgImage: imageRef)
    // see part 2 below
}
Now, add the image as a subview of the contentOverlayView in the player controller:
// in the same try? block as above
let imageView = UIImageView(image: image)
let playerVC = AVPlayerViewController()
let playerItem = AVPlayerItem(asset: asset)
playerVC.player = AVPlayer(playerItem: playerItem)
self.present(playerVC, animated: true) {
    playerVC.contentOverlayView?.addSubview(imageView)
    // adjust the frame of your imageView to fit the player controller's contentOverlayView
}
Then, remove the imageView subview when the player starts playing the asset, or when buffering completes.
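One way to detect that moment is to KVO-observe the player's timeControlStatus (a sketch, assuming iOS 10 or later and an assumed statusObservation property to keep the observation token alive):
// Sketch: remove the placeholder once playback actually starts.
// `statusObservation` is an assumed stored property that keeps the observation alive.
statusObservation = playerVC.player?.observe(\.timeControlStatus, options: [.new]) { player, _ in
    if player.timeControlStatus == .playing {
        DispatchQueue.main.async {
            imageView.removeFromSuperview()
        }
    }
}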
The AVPlayerLayer associated with your player has an isReadyForDisplay property (readyForDisplay in Objective-C).
You can make your host view an observer of this value and refresh as soon as it becomes true.
It is set to true when the first frame is ready to be rendered, i.e. before the player has enough data to play and sets its status to readyToPlay.
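A minimal sketch of that observation, assuming you keep a reference to the AVPlayerLayer, store the KVO token in an assumed readyObservation property, and use an assumed placeholderImageView to cover the initial black frame:
// Sketch: reveal the video (hide the placeholder) once the first frame can be rendered.
readyObservation = playerLayer.observe(\.isReadyForDisplay, options: [.new]) { layer, _ in
    if layer.isReadyForDisplay {
        DispatchQueue.main.async {
            placeholderImageView.isHidden = true
        }
    }
}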
