Access variable inside if statement of function - ios

I am trying to make a video play when a certain view is tapped once. The .play() function works fine if I call it directly inside the if statement that checks the URL in the setupPlayerView() function. The first two functions shown below (setupPlayerView and defInteractions) are called in an override init that sets up the properties, subviews and sublayers for the video player. The last function is triggered by the second one. Pay attention to the declaration of player and my comment at the bottom...
Code: func #1
func setupPlayerView() {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
//check URL if can be converted to NSURL
if let videoURL = NSURL(string: urlString){
//player's video
let player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
//if I call player.play() here the video plays directly
}
}
As I use a UITapGestureRecognizer to detect a single tap, I have the following function, which is also called in the override init (shortly after the previous function):
Code: func #2
//set interactions
func defInteractions (){
//enable interaction
controlsContainerView.isUserInteractionEnabled = true
//singletap
let singleTap = UITapGestureRecognizer(target: self, action: #selector(singleTapDetected(_:)))
singleTap.numberOfTapsRequired = 1
//controlsContainerView
controlsContainerView.addGestureRecognizer(singleTap)
}
Now, I would like to call player.play() inside the function singleTapDetected which currently looks like this:
Code: func #3
func singleTapDetected(_ sender: UITapGestureRecognizer) {
player.play()
}
However, this of course does not work, since this function is outside the override init, unlike the others, and I get the error use of unresolved identifier 'player'. How can I call player.play() and get the same result as if I called it in the first function? Can I access it inside the if? I could use some help...

Make player an instance variable:
var player: AVPlayer?
Then initialize it like this:
if self.player == nil {
player = AVPlayer(url: videoURL as URL)
}
Then you can easily access it from anywhere in your class.
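Putting the pieces together, a minimal sketch of that pattern might look like this (the class name VideoPlayerView is just a placeholder for the question's custom view; the URL and selector are taken from the question):
import UIKit
import AVFoundation

class VideoPlayerView: UIView {

    // instance property instead of a local constant inside setupPlayerView()
    var player: AVPlayer?

    func setupPlayerView() {
        let urlString = "https://blurtime.com/images/testvideo.mov"
        guard let videoURL = URL(string: urlString) else { return }

        if player == nil {
            player = AVPlayer(url: videoURL)
        }

        let playerLayer = AVPlayerLayer(player: player)
        layer.addSublayer(playerLayer)
        playerLayer.frame = frame
    }

    @objc func singleTapDetected(_ sender: UITapGestureRecognizer) {
        // player is an optional property now, so use optional chaining
        player?.play()
    }
}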

Make player an instance property of the class, like this:
var player: AVPlayer?
Initialize it in setupPlayerView() :
func setupPlayerView() {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
//check URL if can be converted to NSURL
if let videoURL = NSURL(string: urlString){
//player's video
self.player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
self.player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
//if I call player.play() here the video plays directly
}
}
and use it like this:
func singleTapDetected(_ sender: UITapGestureRecognizer) {
// with self and optional chaining
self.player?.play()
}
Hope it helps

Declare player at the class level as an optional:
var player: AVPlayer?

Instead of declaring player as a local variable in the setupPlayerView function, you should declare it as an instance variable so it is accessible in the scope of the whole class/struct.
For your case, instead of implementing a setupPlayerView function, I would suggest declaring player as a lazy property, as follows:
lazy var player: AVPlayer? = {
//insert url
let urlString = "https://blurtime.com/images/testvideo.mov"
guard let videoURL = NSURL(string: urlString) else {
return nil
}
let player = AVPlayer(url: videoURL as URL)
//add sub-layer
let playerLayer = AVPlayerLayer(player: player)
self.layer.addSublayer(playerLayer)
playerLayer.frame = self.frame
//when are frames actually rendered (when is video loaded)
player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
return player
}()
Usage:
// player is non-nil only if the urlString was valid
if let player = player {
player.play()
}
This way, all the setup the player needs is performed only when it is actually used. You might want to check this Q&A.
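With the lazy property in place, the tap handler from the question reduces to this (a sketch, assuming the same selector name as in the question):
@objc func singleTapDetected(_ sender: UITapGestureRecognizer) {
    // first access triggers the lazy closure; player stays nil if the URL string was invalid
    player?.play()
}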

Related

AVPlayerViewController took too much time to play video/audio with automaticallyWaitsToMinimizeStalling

I am using AVPlayer to play video from the server, as in the code below. The problem I am facing is that I set
automaticallyWaitsToMinimizeStalling = true
According to the documentation:
A Boolean value that indicates whether the player should automatically delay playback in order to minimize stalling.
but it takes too much time to load the video/audio; for an 8-minute video it takes almost 2 to 3 minutes before playback starts. If during this wait the user pauses the video and plays it again, the video plays without any delay. This unnecessary wait should not happen.
Can anyone guide me on how to decrease this stalling wait? I cannot use this answer
player.automaticallyWaitsToMinimizeStalling = false
because setting this value to false makes my player stop again and again, and the user has to resume playback manually, which is very bad.
// MARK: - Outlets
@IBOutlet weak var audioView: UIView!
// MARK: - Variables
var player: AVPlayer = AVPlayer()
let playerController = AVPlayerViewController()
var obs = Set<NSKeyValueObservation>()
// MARK: - Helper Method
private func settingForAudioPlayer() {
guard let lesson = viewModal.lesson else {
print("lesson not found")
return
}
var path = "\(Server.audioVideoBasurl + (lesson.videoPath ?? ""))"
path = path.replacingOccurrences(of: " ", with: "%20")
print("path:\(path)")
guard let url = URL(string: path) else {
print("Path not converted to url")
return
}
self.player = AVPlayer(url: url)
self.player.automaticallyWaitsToMinimizeStalling = true
self.player.playImmediately(atRate: 1.0)
self.playerController.player = self.player
DispatchQueue.main.async {
self.playerController.view.clipsToBounds = true
self.playerController.view.removeFromSuperview()
self.playerController.delegate = self
self.showSpinner(onView: self.audioView, identifier: "audioView", title: "")
self.audioView.addSubview(self.playerController.view)
self.audioView.layoutIfNeeded() // Previously we played only audio; later we added video as well, which is why the view is still named audioView
self.playerController.view.frame.size.width = self.audioView.frame.width
self.playerController.view.frame.size.height = self.audioView.frame.height
self.playerController.view.backgroundColor = .clear
self.playerController.videoGravity = AVLayerVideoGravity.resizeAspectFill
var ob : NSKeyValueObservation!
ob = self.playerController.observe(\.isReadyForDisplay, options: [.initial, .new]) { vc, ch in
guard let ok = ch.newValue, ok else {return}
self.obs.remove(ob)
DispatchQueue.main.async {
print("finishing")
self.removeSpinner(identifier: "audioView") // This is my Custom Method
vc.view.isHidden = false // The idea of adding a KVO observer came from the Internet
}
}
self.obs.insert(ob)
let iv = self.audioBackgroundImageView ?? UIImageView()
let v = self.playerController.contentOverlayView!
iv.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
iv.bottomAnchor.constraint(equalTo:v.bottomAnchor),
iv.topAnchor.constraint(equalTo:v.topAnchor),
iv.leadingAnchor.constraint(equalTo:v.leadingAnchor),
iv.trailingAnchor.constraint(equalTo:v.trailingAnchor),
])
NSLayoutConstraint.activate([
v.bottomAnchor.constraint(equalTo:self.playerController.view.bottomAnchor),
v.topAnchor.constraint(equalTo:self.playerController.view.topAnchor),
v.leadingAnchor.constraint(equalTo:self.playerController.view.leadingAnchor),
v.trailingAnchor.constraint(equalTo:self.playerController.view.trailingAnchor),
])
self.view.layoutIfNeeded()
}
}
In the above piece of code I have the video URL (path), which I pass to the AVPlayer;
the player is passed to the AVPlayerViewController;
an observer is added to the playerController to check that the AVPlayerViewController is ready for display, and then the observer is removed.
After that I am only setting the constraints.
Kindly guide me on how to decrease the wait in Swift; on the Android side the video/audio plays within seconds.
It may be a duplicate of this, but my scenario is different and the solution there did not work for me. Let me know in case you need more information to help me.
If you don't want to rely on automaticallyWaitsToMinimizeStalling = true, you can also observe the player's buffer and decide yourself when to start playing the video. Here are the steps:
Create observer variables in the class where you handle the player:
var playbackBufferEmptyObserver: NSKeyValueObservation?
var playbackBufferFullObserver: NSKeyValueObservation?
var playbackLikelyToKeepUpObserver: NSKeyValueObservation?
var playerItem: AVPlayerItem?   // the observers below reference self.playerItem, so keep it here
Instantiate the AVPlayer with an AVPlayerItem instead of a URL, like so:
let playerItem = AVPlayerItem(url: url)
let player = AVPlayer(playerItem: playerItem)
self.playerItem = playerItem   // store the item so the observers can be attached to it
Create observers and assign them to variables:
playbackBufferEmptyObserver = self.playerItem?.observe(\.isPlaybackBufferEmpty, options: [.new, .initial], changeHandler: { [weak self] (_, bufferEmpty) in
if let self = self {
DispatchQueue.main.async {
if bufferEmpty.newValue == true {
// handle showing loading, player not playing
}
}
}
})
playbackBufferFullObserver = self.playerItem?.observe(\.isPlaybackBufferFull, options: [.new, .initial], changeHandler: { [weak self] (_, bufferFull) in
if let self = self {
DispatchQueue.main.async {
if bufferFull.newValue == true {
//handle when player buffer is full (e.g. hide loading) start player
}
}
}
})
playbackLikelyToKeepUpObserver = self.playerItem?.observe(\.isPlaybackLikelyToKeepUp, options: [.new, .initial], changeHandler: { [weak self] (_, _) in
if let self = self {
if (self.playerItem?.status ?? .unknown) == .readyToPlay {
// handle that player is ready to play (e.g. hide loading indicator, start player)
} else {
// player is not ready to play yet
}
}
})
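Not part of the answer above, but if the main goal is to shorten the initial wait, AVPlayerItem's preferredForwardBufferDuration and AVPlayer's timeControlStatus / reasonForWaitingToPlay can also be worth experimenting with. A rough sketch (the buffer duration value is just an example, not a recommendation):
import AVFoundation

func makePlayer(for url: URL) -> AVPlayer {
    let item = AVPlayerItem(url: url)
    // ask for a smaller forward buffer before playback starts;
    // 0 lets AVFoundation decide, a few seconds is a common experiment
    item.preferredForwardBufferDuration = 5
    let player = AVPlayer(playerItem: item)
    player.automaticallyWaitsToMinimizeStalling = true
    return player
}

var waitObservation: NSKeyValueObservation?

func observeWaiting(on player: AVPlayer) {
    // logs why the player is waiting, which helps diagnose long start-up times
    waitObservation = player.observe(\.timeControlStatus, options: [.new]) { player, _ in
        switch player.timeControlStatus {
        case .waitingToPlayAtSpecifiedRate:
            print("waiting:", player.reasonForWaitingToPlay?.rawValue ?? "unknown")
        case .playing:
            print("playing")
        case .paused:
            print("paused")
        @unknown default:
            break
        }
    }
}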

How to tell when AVPlayer has been played for three seconds Swift

I have an application that contains videos that play automatically in a UIImageView in a UITableView when the cell is visible, and all I am trying to do is let the application know when a video has been played for three seconds. I wrote this code:
class PostCell: UITableViewCell {
var player: AVPlayer?
var playerLayer: AVPlayerLayer?
var post: Post? {
didSet {
updateView()
}
}
func updateView() {
self.viewcount()
if let videoUrlString = post?.videoUrl, let videoUrl = URL(string: videoUrlString) {
player = AVPlayer(url: videoUrl)
playerLayer = AVPlayerLayer(player: player)
playerLayer?.frame = postImageView.frame
playerLayer?.frame.size.width = postImageView.frame.size.width
playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
self.contentView.layer.addSublayer(playerLayer!)
player?.play()
}
}
func viewcount() {
if let currentitem = player?.currentItem {
if currentitem.currentTime() == CMTimeMake(3, 1) {
print ("VIDEO PLAYED FOR THREE SECONDS")
}
}
}
}
but it is not printing my message once the video starts playing. I have searched the web for help but couldn't find anything on this subject. Could anyone please help and tell me what I am doing wrong?
You are looking for a player observer. Here is how you can check and track the current position of an AVPlayer.
Here is a function that adds the observers to a cell:
private func addObserversForVideoPlayer(cell:CustomCell) {
let observer = cell.player?.addPeriodicTimeObserver(forInterval: CMTime.init(seconds: 1, preferredTimescale: 1), queue: .main, using: {[weak self,weak cell] (time) in
guard let cell = cell else {return}
if cell.player?.currentItem?.status == .readyToPlay {
// print("Inside Will DISPLAY\(cell.video.currentTime)")
let timeDuration : Float64 = CMTimeGetSeconds((cell.player?.currentItem?.asset.duration)!)
cell.lblDuration.text = self?.getDurationFromTime(time: timeDuration)
let currentTime : Float64 = CMTimeGetSeconds((cell.player?.currentTime())!)
cell.lblStart.text = self?.getDurationFromTime(time: currentTime)
cell.slider.maximumValue = Float(timeDuration.rounded())
cell.slider.value = Float(currentTime.rounded())
}
})
NotificationCenter.default.addObserver(forName: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: cell.player?.currentItem, queue: .main, using: {[weak cell,weak self] (notification) in
if cell?.player != nil {
cell?.player?.seek(to: kCMTimeZero)
cell?.player?.play()
}
})
}
The addPeriodicTimeObserver will notify you as the player plays, so you can track the current time.
And NSNotification.Name.AVPlayerItemDidPlayToEndTime will notify you when your AVPlayer finishes.
Note 1: If cell.player?.currentItem is nil while you are adding the AVPlayerItemDidPlayToEndTime observer, it will cause a bug; see One AVPlayer's AVPlayerItemDidPlayToEndTime action executed for all currently playing videos. If you don't need it, don't add it :)
Note 2: Keep a reference to the observer so you can remove it later and avoid putting extra load on memory.
Hope it is helpful.
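An alternative not mentioned in this thread: AVPlayer's addBoundaryTimeObserver(forTimes:queue:using:) fires a block when playback crosses a given time, which maps fairly directly onto "played for three seconds". A sketch (assuming you keep the returned token so you can remove it on cell reuse):
import AVFoundation

var boundaryObserver: Any?

func observeThreeSeconds(on player: AVPlayer) {
    let threeSeconds = NSValue(time: CMTime(seconds: 3, preferredTimescale: 600))
    boundaryObserver = player.addBoundaryTimeObserver(forTimes: [threeSeconds], queue: .main) {
        print("VIDEO PLAYED FOR THREE SECONDS")
    }
}

func removeThreeSecondObserver(from player: AVPlayer) {
    // always remove time observers from the same player they were added to
    if let token = boundaryObserver {
        player.removeTimeObserver(token)
        boundaryObserver = nil
    }
}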
Try calling the view count after the player has started playing:
func updateView() {
/// Not here, because at this point the player's current item is not initialized yet;
/// if you set a breakpoint in viewcount() you will see it never enters the if condition.
self.viewcount() /// Comment this line out
if let videoUrlString = post?.videoUrl, let videoUrl = URL(string: videoUrlString) {
player = AVPlayer(url: videoUrl)
playerLayer = AVPlayerLayer(player: player)
playerLayer?.frame = postImageView.frame
playerLayer?.frame.size.width = postImageView.frame.size.width
playerLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
self.contentView.layer.addSublayer(playerLayer!)
/// The player is now initialized with an item to play
player?.play()
/// Call viewcount here; now it will enter the if condition.
/// Also try adding an else branch so you can tell whether control reaches the if or the else.
self.viewcount()
}
}
func viewcount()
{
if let currentitem = player?.currentItem
{
/// Yes, the player has an item whose time can be checked
if currentitem.currentTime() == CMTimeMake(3, 1)
{
print ("VIDEO PLAYED FOR THREE SECONDS")
}
}
else
{
/// Check whether control reaches here when viewcount is called before player.play()
}
}

AVPlayer Fairplay HLS won't stop audio playback when video is paused

I am using Fairplay implementation as per Apple's Fairplay Streaming sample code at https://developer.apple.com/streaming/fps/, although I tried to choose only parts that are related to Online Fairplay Streaming, not the persistence/offline playback. In the below code a video without Fairplay plays/pauses/seeks normally, but when I play a Fairplay protected video, only the video track behaves correctly.
Pausing playback won't stop the audio, and changing the audio track won't stop the previous audio track, so both play together; seeking does not seem to work either.
Besides this helper class below, I have AssetLoaderDelegate and AssetPlaybackManager from Apple's client sample code of FairPlay Streaming Server SDK https://developer.apple.com/streaming/fps/ and I have updated the code to handle SPC/CKC for our DRM keys provider.
Did I miss implementing some important part of the code to handle audio for FPS streaming? Can you please point me in the right direction? Many thanks.
class PlayHelper {
static let shared = PlayHelper()
fileprivate var playerViewController: PlayerViewController?
init() {
AssetPlaybackManager.sharedManager.delegate = self
}
// Play video without DRM
func playVideo(from urlString: String, at context: UIViewController) {
guard let videoURL = URL(string: urlString) else {
Log.error("Video URL can't be created from string: \(urlString)")
return }
let player = AVPlayer(url: videoURL)
let playerViewController = PlayerViewController()
playerViewController.player = player
context.present(playerViewController, animated: true) {
playerViewController.player?.play()
}
}
// Play FPS video
func playFpsVideo(with asset: AVURLAsset, at context: UIViewController) {
// Cleanup, should be done when playerViewController is actually dismissed
if self.playerViewController != nil {
// The view reappeared as a results of dismissing an AVPlayerViewController.
// Perform cleanup.
AssetPlaybackManager.sharedManager.setAssetForPlayback(nil)
self.playerViewController?.player = nil
self.playerViewController = nil
}
let item = AVPlayerItem(asset: asset)
let player = AVPlayer(playerItem: item)
// Customize player
player.appliesMediaSelectionCriteriaAutomatically = true
let playerViewController = PlayerViewController()
playerViewController.player = player
self.playerViewController = playerViewController
context.present(playerViewController, animated: true) {
playerViewController.player?.play()
}
}
// Stop video
func stop() {
// Cleanup, should be done when playerViewController is dismissed
if self.playerViewController != nil {
// Results of dismissing an AVPlayerViewController, perform cleanup
AssetPlaybackManager.sharedManager.setAssetForPlayback(nil)
self.playerViewController?.player = nil
self.playerViewController = nil
}
}
}
// MARK: - Extend `PlayHelper` to conform to the `AssetPlaybackDelegate` protocol
extension PlayHelper: AssetPlaybackDelegate {
func streamPlaybackManager(_ streamPlaybackManager: AssetPlaybackManager, playerReadyToPlay player: AVPlayer) {
player.play()
}
func streamPlaybackManager(_ streamPlaybackManager: AssetPlaybackManager, playerCurrentItemDidChange player: AVPlayer) {
guard let playerViewController = playerViewController, player.currentItem != nil else { return }
playerViewController.player = player
}
}
I can also provide the code in AssetLoaderDelegate and AssetPlaybackManager if needed.
My bad. I called play() twice in the code above... Grrr... Once when the presentation of the PlayerViewController finished, and a second time in the callback from AssetPlaybackDelegate that is triggered by KVO in AssetPlaybackManager. This way the player controls stopped the video, but most probably a second (audio) stream was still playing. I removed the play() in the playerReadyToPlay callback and now all the controls in the player work as expected: I can pause, resume, seek and change audio tracks.
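A small guard that might have caught this earlier (just a sketch, not part of Apple's sample code): check the player's timeControlStatus before issuing another play().
import AVFoundation

extension AVPlayer {
    /// Hypothetical helper: only start playback if the player is not already playing.
    func playIfNeeded() {
        if timeControlStatus != .playing {
            play()
        }
    }
}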

Swift: AVPlayer release memory / resources

I am writing an app that needs to display different videos according to the user's selection. When the user selects a video, the function playVideo is called, and after the video finishes playing, the videoView is hidden again.
My code is as follows:
var player: AVPlayer?
func playVideo(videoFile: String) {
self.videoView.isHidden = false
let videoURL: NSURL = Bundle.main.url(forResource: videoFile, withExtension: "mp4")! as NSURL
self.player = AVPlayer(url: videoURL as URL)
let playerLayer = AVPlayerLayer(player: player)
playerLayer.frame = self.videoView.frame
self.videoView.layer.addSublayer(playerLayer)
let duration : Int64 = 0
let preferredTimeScale : Int32 = 1
let seekTime : CMTime = CMTimeMake(duration, preferredTimeScale)
self.player?.seek(to: seekTime)
self.player?.play()
NotificationCenter.default.addObserver(self, selector: #selector(self.playerItemDidReachEnd), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: player?.currentItem)
}
@objc func playerItemDidReachEnd()
{
self.player?.pause()
self.videoView.isHidden = true
NotificationCenter.default.removeObserver(self)
}
However, with the code above, I have several questions:
How do I delete/deallocate the player gracefully? With my current code, will it consume a lot of memory?
Every time the user presses a button, playVideo is called and a new player is created and played. Is this the right way to do it, or is there a more efficient or elegant way?
I did try to replace the player-creation code with the following, but it fails to play the video.
let playerItem: AVPlayerItem = AVPlayerItem(url: videoURL as URL)
self.player? = AVPlayer(playerItem: playerItem)
Thank you
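No answer was posted for this one. As a rough sketch of a typical teardown (my assumption, not from the thread), keeping a reference to the AVPlayerLayer lets you release the layer and the player together when playback ends; the player and videoView properties are the ones from the question:
var playerLayer: AVPlayerLayer?   // assign this in playVideo instead of a local constant

@objc func playerItemDidReachEnd() {
    player?.pause()
    videoView.isHidden = true
    NotificationCenter.default.removeObserver(self)
    // detach the rendering layer and drop the item so the player can be released
    playerLayer?.removeFromSuperlayer()
    playerLayer = nil
    player?.replaceCurrentItem(with: nil)
    player = nil
}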

AVPlayer stuck video, but audio works

I create an AVPlayerItem through an AVURLAsset; my code:
let asset = AVURLAsset(URL: safeURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
asset.loadValuesAsynchronouslyForKeys([assetKeyPlayable, self.assetKeyTracks, self.assetKeyHasProtectedContent]) {
() -> Void in
dispatch_async(dispatch_get_main_queue(), {
() -> Void in
// Use the AVAsset playable property to detect whether the asset can be played
if !asset.playable {
let localizedDescription = "Item cannot be played,Item cannot be played description"
let localizedFailureReason = "The assets tracks were loaded, but could not be made playable,Item cannot be played failure reason"
let userInfo = [NSLocalizedDescriptionKey: localizedDescription, NSLocalizedFailureReasonErrorKey: localizedFailureReason]
let error = NSError(domain: "domain", code: 0, userInfo: userInfo)
self.videoPlayerDelegate?.videoPlayer?(self, playerItemStatusDidFail: error)
self.cleanPlayer()
return
}
// At this point we're ready to set up for playback of the asset. Stop observing
if let _ = self.player?.currentItem {
self.cleanPlayer()
}
if asset.URL.absoluteString != safeURL.absoluteString {
return
}
var error: NSError?
let status = asset.statusOfValueForKey(self.assetKeyTracks, error: &error)
var playerItem = AVPlayerItem(URL: safeURL)
if status == .Loaded {
playerItem = AVPlayerItem(asset: asset)
} else {
// You should deal with the error appropriately.If Loaded fails, create an AVPlayerItem directly from the URL
playerItem = AVPlayerItem(URL: safeURL)
}
self.player = self.playerWithPlayerItem(playerItem)
self.registerMonitoring()
self.registerNotification()
self.addTimeObserver()
completionBlock?(loadURLString: playerURL.absoluteString)
})
}
Add AVPlayerLayer display video in my View, my code:
// MARK: - Property
var player: AVPlayer? {
get {
return playerLayer.player
}
set {
playerLayer.player = newValue
}
}
var playerLayer: AVPlayerLayer {
return layer as! AVPlayerLayer
}
Displaying the video after loading completes:
self.videoPlayer?.loadPlayer({
[weak self](loadURLString) in
if let strongSelf = self {
strongSelf.player = strongSelf.videoPlayer?.player
strongSelf.startPlay()
}
})
Calling the seekToTime method to jump to a specific playback position:
self.player?.currentItem?.seekToTime(CMTimeMakeWithSeconds(time, Int32(NSEC_PER_SEC)), toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero) {
[weak self] finished in
if let weakSelf = self {
if weakSelf.isPlaying {
weakSelf.videoPlayerDelegate?.videoPlayerDidplayerItemSeekToTime?(weakSelf)
}
}
}
Some pictures of the stuck interface:
In the first picture the sound is audible, but the interface is stuck.
In the second picture, the video works, but I get no sound.
My question:
When I call the seekToTime method upon completion, sometimes the video has sound but the picture is stuck; occasionally the video works. I tried calling CALayer's setNeedsDisplay method to update the AVPlayerLayer picture, but that didn't help. I don't know what to do anymore; I would be grateful for any help.
Since this is happening to many of us in different ways and is not answered, I am answering it here. For me this happened when I was playing the video in a UIView using AVPlayer and AVPlayerLayer, and then used the same AVPlayer to load the video in an AVPlayerViewController: the video got stuck while the audio kept playing.
The trick is to destroy the AVPlayerLayer before using the same AVPlayer somewhere else. Can't believe I found this enlightenment over here:
https://twitter.com/skumancer/status/294605708073263104
How did I implement this?
I created 2 view controllers in my main storyboard.
1) View Controller Scene
i) This has a UIView in it, referenced by the IBOutlet 'videoView'
ii) A button with a 'show' segue to the AV Player View Controller Scene, referenced by the IBOutlet 'fullscreen'
2) AV Player View Controller Scene
// Code in your controller class.
// Make sure to reference your IBOutlet in your StoryBoard.
@IBOutlet weak var videoView : UIView!
@IBOutlet weak var fullscreen : UIButton!
var player:AVPlayer? = nil;
var playerLayer:AVPlayerLayer? = nil;
override func viewDidLoad() {
super.viewDidLoad()
// Your Url to play, could be hls type video
let url = URL(string: "https://your_url.m3u8");
let asset = AVAsset(url : url!)
let playerItem = AVPlayerItem(asset: asset)
player = AVPlayer(playerItem: playerItem)
}
override func viewDidAppear(_ animated: Bool) {
playInUIView()
pauseVideo()
}
// Creates An AVPlayerLayer and adds it as a sublayer in UIView.
func playInUIView(){
// Creating player layer to render video.
playerLayer = AVPlayerLayer(player: player)
playerLayer?.frame = self.videoView.bounds
playerLayer?.videoGravity = .resizeAspect
// Adding a sub layer to display it in UIView
self.videoView.layer.addSublayer(playerLayer!)
}
// Destroys the AVPlayerLayer and the sublayers of your UIView
func destroyPlayInView(){
self.videoView.layer.sublayers=nil
playerLayer = nil
}
// On button click open AVPlayerViewController.
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
let destination = segue.destination as! AVPlayerViewController
// Destroy AVPlayerLayer before rendering video in AVPlayerViewController.
destroyPlayInView()
// Set same AVPlayer in AVPlayerViewController.
destination.player = self.player
self.present(destination, animated: true) {
destination.player!.play()
}
}
PS: I have not implemented any controls yet, and there may be bad coding practices since I only started with Swift and iOS development a couple of days ago, so please let me know wherever I am wrong.
Try this approach; I encountered the same issue and solved it this way.
player.pause()
player.currentItem?.seek(to: player.currentTime(), completionHandler: { (_) in
player.play()
})
The same code as user nferocious76's answer, but in Objective-C:
[self.player pause];
AVPlayerItem *playerItem = [self.player currentItem];
__weak typeof(self) weakSelf = self;
[playerItem seekToTime:playerItem.currentTime completionHandler:^(BOOL finished){
if (finished){
[weakSelf.player play];
}
}];
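A variant of the same nudge with explicit zero tolerances, in case frame-accurate seeking matters (a sketch using AVPlayer's seek(to:toleranceBefore:toleranceAfter:completionHandler:)):
player.pause()
player.seek(to: player.currentTime(), toleranceBefore: .zero, toleranceAfter: .zero) { _ in
    player.play()
}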
