I am using a FairPlay implementation based on Apple's FairPlay Streaming sample code at https://developer.apple.com/streaming/fps/, although I tried to pick out only the parts related to online FairPlay streaming, not persistence/offline playback. In the code below, a video without FairPlay plays/pauses/seeks normally, but when I play a FairPlay-protected video, only the video track behaves correctly.
Pausing playback won't stop the audio, and changing the audio track won't stop the previous audio track, so both play together; seeking also does not seem to work.
Besides the helper class below, I have an AssetLoaderDelegate and an AssetPlaybackManager from the client sample code of Apple's FairPlay Streaming Server SDK (https://developer.apple.com/streaming/fps/), and I have updated the code to handle SPC/CKC for our DRM key provider.
Did I miss implementing some important part of the code that handles audio for FPS streaming? Can you please point me in the right direction? Many thanks.
class PlayHelper {

    static let shared = PlayHelper()

    fileprivate var playerViewController: PlayerViewController?

    init() {
        AssetPlaybackManager.sharedManager.delegate = self
    }

    // Play video without DRM
    func playVideo(from urlString: String, at context: UIViewController) {
        guard let videoURL = URL(string: urlString) else {
            Log.error("Video URL can't be created from string: \(urlString)")
            return
        }
        let player = AVPlayer(url: videoURL)
        let playerViewController = PlayerViewController()
        playerViewController.player = player
        context.present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }

    // Play FPS video
    func playFpsVideo(with asset: AVURLAsset, at context: UIViewController) {
        // Cleanup; should be done when playerViewController is actually dismissed
        if self.playerViewController != nil {
            // The view reappeared as a result of dismissing an AVPlayerViewController.
            // Perform cleanup.
            AssetPlaybackManager.sharedManager.setAssetForPlayback(nil)
            self.playerViewController?.player = nil
            self.playerViewController = nil
        }
        let item = AVPlayerItem(asset: asset)
        let player = AVPlayer(playerItem: item)
        // Customize player
        player.appliesMediaSelectionCriteriaAutomatically = true
        let playerViewController = PlayerViewController()
        playerViewController.player = player
        self.playerViewController = playerViewController
        context.present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }

    // Stop video
    func stop() {
        // Cleanup; should be done when playerViewController is dismissed
        if self.playerViewController != nil {
            // Result of dismissing an AVPlayerViewController; perform cleanup
            AssetPlaybackManager.sharedManager.setAssetForPlayback(nil)
            self.playerViewController?.player = nil
            self.playerViewController = nil
        }
    }
}
// MARK: - Extend `PlayHelper` to conform to the `AssetPlaybackDelegate` protocol
extension PlayHelper: AssetPlaybackDelegate {

    func streamPlaybackManager(_ streamPlaybackManager: AssetPlaybackManager, playerReadyToPlay player: AVPlayer) {
        player.play()
    }

    func streamPlaybackManager(_ streamPlaybackManager: AssetPlaybackManager, playerCurrentItemDidChange player: AVPlayer) {
        guard let playerViewController = playerViewController, player.currentItem != nil else { return }
        playerViewController.player = player
    }
}
I can also provide the code in AssetLoaderDelegate and AssetPlaybackManager if needed.
My bad. I called play() twice in the code above... Grrr... Once when the presentation of the PlayerViewController finished, and a second time in the callback from AssetPlaybackDelegate that is triggered by KVO in AssetPlaybackManager. This way the player controls stopped playing the video, but most probably a second (audio) stream was still playing. I removed the play() in the playerReadyToPlay callback, and now all the controls in the player work as expected. I can pause, resume, seek, and change audio tracks.
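For reference, the fixed callback would look like this (a sketch of the change described above; the only difference from the extension shown earlier is that the duplicate play() call is gone):

```swift
// playerReadyToPlay must NOT call play() again: playback is already started
// once, in the completion handler of the PlayerViewController presentation.
func streamPlaybackManager(_ streamPlaybackManager: AssetPlaybackManager, playerReadyToPlay player: AVPlayer) {
    // Intentionally empty. Calling player.play() here as well effectively
    // started a second (audio) stream that kept playing through pause, seek,
    // and audio-track changes.
}
```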
Related
I need to record a video and show a video with an AVPlayer at the same time. The result needs to be synchronized: if the video I'm showing is a metronome, or someone clapping, the recorded video and the player should click or clap at the same time.
I play the AVPlayer using preroll and setRate, with a delegate in the camera controller.
Using AVCaptureMovieFileOutput, I've tried calling setRate on the player once fileOutput(output:didStartRecordingTo) is called, but the videos end up desynchronized, with the recorded video lagging behind the player.
Using an AssetWriter, I've tried calling setRate on the player once captureOutput(captureOutput:sampleBuffer) is called, and the first buffer is appended, but the result is the same.
Is there any other way to do this?
EDIT: Adding some code to show what I'm doing:
Camera with AVCaptureMovieFileOutput:
func startRecordingVideo() {
    guard let movieFileOutput = self.movieFileOutput else {
        return
    }
    sessionQueue.async {
        if !movieFileOutput.isRecording {
            let movieFileOutputConnection = movieFileOutput.connection(with: .video)
            movieFileOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecH264], for: movieFileOutputConnection!)
            let outputFileName = NSUUID().uuidString
            let outputFilePath = (NSTemporaryDirectory() as NSString).appendingPathComponent((outputFileName as NSString).appendingPathExtension("mov")!)
            movieFileOutput.startRecording(to: URL(fileURLWithPath: outputFilePath), recordingDelegate: self)
        } else {
            movieFileOutput.stopRecording()
        }
    }
}

func fileOutput(_ output: AVCaptureFileOutput, didStartRecordingTo fileURL: URL, from connections: [AVCaptureConnection]) {
    cameraDelegate?.startedRecording()
}
Implementing the delegate's startedRecording in the view controller:
private func setRateForAll() {
    let hostTime = CMClockGetTime(CMClockGetHostTimeClock())
    activePlayers.forEach {
        $0.setRate(atHostTime: hostTime)
    }
}
Once the active player or players end, I call stopRecording on the camera, and show the resulting recorded video in another player, along with the player or players I've been showing.
I ended up using the recording session master clock as the hostTime for the setRate method. The desynchronization is almost imperceptible now!
In fileOutput(output:didStartRecordingTo):
cameraDelegate?.startedRecording(clock: CMClockGetTime(self.session.masterClock!))
In the view with the players:
private func setRateForAll(clock: CMTime) {
    activePlayers.forEach {
        $0.setRate(atHostTime: clock)
    }
}
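For context, the underlying AVPlayer API is setRate(_:time:atHostTime:); the setRate(atHostTime:) calls above are presumably thin wrappers around it. A minimal sketch of the synchronized start, assuming `session` is the running AVCaptureSession and the players have already been prerolled:

```swift
import AVFoundation

// Start all players anchored to the capture session's clock, so on-screen
// playback and the recorded file share a timebase. `players` and `session`
// stand in for the objects used in the question.
func startSynchronized(players: [AVPlayer], session: AVCaptureSession) {
    guard let clock = session.masterClock else { return }
    let hostTime = CMClockGetTime(clock)
    for player in players {
        // Precise atHostTime starts require this to be false, per Apple's docs.
        player.automaticallyWaitsToMinimizeStalling = false
        // Passing .invalid for `time` means "continue from the item's current time".
        player.setRate(1.0, time: .invalid, atHostTime: hostTime)
    }
}
```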
We load an MP4 video from a URL into an AVPlayer. The AVPlayer is behind a skeleton image which we hide when the AVPlayer gets to status "Ready To Play".
We expect to see the first frame of the video as soon as we hide the skeleton image. However, that first frame of the video appears after a slight delay. What is the status that indicates that the video in the AVPlayer is loaded and ready?
func prepareVideo(videoUrl: URL?, owner: UIView, autoStart: Bool = false) {
    self.player = AVPlayer(url: videoUrl!)
    let playerController = AVPlayerViewController()
    playerController.player = player
    playerController.view.layer.shouldRasterize = true
    playerController.view.frame = owner.frame
    playerController.view.isOpaque = true
    playerController.view.backgroundColor = UIColor.clear
    playerController.view.layer.borderColor = UIColor.clear.cgColor
    playerController.view.layer.borderWidth = 0
    playerController.showsPlaybackControls = false
    playerController.updatesNowPlayingInfoCenter = false
    owner.addSubview(playerController.view)
    owner.sendSubview(toBack: playerController.view)
    timerStart() // starts a timer that checks the AVPlayer status
    if autoStart {
        playVideo()
    }
}

@objc func timerStatusCheck() {
    // called by a Timer; checks the status of the AVPlayer
    if player!.status == AVPlayerStatus.readyToPlay {
        print("ready to play")
        timerStop()
        if readyToPlayHandler != nil {
            self.readyToPlayHandler!() // hide the skeleton image shown while the video loads
        }
    } else if player!.status == AVPlayerStatus.failed {
        timerStop()
        MessageBox.showError("Video Failed to start")
    }
}
When the AVPlayer reports rate = 1, it's playing the video. However, that doesn't mean the video is visible in the AVPlayerViewController. For that, you need the AVPlayerViewController's property "isReadyForDisplay" to be true.
https://developer.apple.com/documentation/avkit/avplayerviewcontroller/1615830-isreadyfordisplay
Note that both AVPlayerViewController.isReadyForDisplay and AVPlayer.status are KVO observable, which will be more responsive than using a Timer.
Also note if you use an AVPlayerLayer to display the video (instead of AVPlayerViewController), you need to observe the AVPlayerLayer's "isReadyForDisplay" for the same reason.
@objc func timerStatusCheck() {
    // called by a Timer; checks the status of the AVPlayer
    if player!.status == AVPlayerStatus.readyToPlay {
        print("ready to play")
        // check that the controller's view is showing video frames from the player
        if playerController.isReadyForDisplay == false {
            print("view not yet showing video")
            return
        }
        timerStop()
        if readyToPlayHandler != nil {
            self.readyToPlayHandler!() // hide the skeleton image shown while the video loads
        }
    } else if player!.status == AVPlayerStatus.failed {
        timerStop()
        MessageBox.showError("Video Failed to start")
    }
}
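As noted above, KVO is more responsive than polling with a Timer. A sketch of the same check using Swift's block-based KVO, assuming `playerController` and `readyToPlayHandler` are the same properties used in the question:

```swift
// Keep a strong reference so the observation stays alive.
var readyObservation: NSKeyValueObservation?

func observeReadyForDisplay() {
    readyObservation = playerController.observe(\.isReadyForDisplay, options: [.initial, .new]) { [weak self] controller, _ in
        guard controller.isReadyForDisplay else { return }
        // The first video frame is now visible; it is safe to hide the skeleton image.
        self?.readyToPlayHandler?()
        self?.readyObservation = nil
    }
}
```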
I am trying to make a video play when a certain view is tapped once. The play() function works well if I call it directly inside the if statement that checks the URL in the setupPlayerView() function. The first two functions below (setupPlayerView and defInteractions) are called in an override init, which sets up the properties, subviews, sublayers, etc. for the video player. The last function is triggered by the second one. Pay attention to the declaration of player and my comment at the bottom...
Code: func #1
func setupPlayerView() {
    // insert url
    let urlString = "https://blurtime.com/images/testvideo.mov"
    // check whether the URL can be converted to NSURL
    if let videoURL = NSURL(string: urlString) {
        // player's video
        let player = AVPlayer(url: videoURL as URL)
        // add sub-layer
        let playerLayer = AVPlayerLayer(player: player)
        self.layer.addSublayer(playerLayer)
        playerLayer.frame = self.frame
        // when are frames actually rendered (when is the video loaded)
        player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
        // if I call player.play() here the video plays directly
    }
}
As I use a UITapGestureRecognizer to detect a single tap, I have the following function, which is also called in the override init (shortly after the previous function):
Code: func #2
// set interactions
func defInteractions() {
    // enable interaction
    controlsContainerView.isUserInteractionEnabled = true
    // single tap
    let singleTap = UITapGestureRecognizer(target: self, action: #selector(singleTapDetected(_:)))
    singleTap.numberOfTapsRequired = 1
    // controlsContainerView
    controlsContainerView.addGestureRecognizer(singleTap)
}
Now, I would like to call player.play() inside the function singleTapDetected which currently looks like this:
Code: func #3
func singleTapDetected(_ sender: UITapGestureRecognizer) {
    player.play()
}
However, it does not work, of course, as this function is outside the override init (as opposed to the others), and I get the error "use of unresolved identifier 'player'". How can I call player.play() and get the same result as if I called it in the first function? Can I access the player within the if? I could use some help...
Make player an instance variable:
var player: AVPlayer?
Then initialize it like this:
if self.player == nil {
    player = AVPlayer(url: videoURL as URL)
}
Then you can easily access it from anywhere in your class.
Declare the player as an instance property, like this (it must be a var, since it is assigned later):
var player: AVPlayer?
Initialize it in setupPlayerView() :
func setupPlayerView() {
    // insert url
    let urlString = "https://blurtime.com/images/testvideo.mov"
    // check whether the URL can be converted to NSURL
    if let videoURL = NSURL(string: urlString) {
        // player's video
        self.player = AVPlayer(url: videoURL as URL)
        // add sub-layer
        let playerLayer = AVPlayerLayer(player: player)
        self.layer.addSublayer(playerLayer)
        playerLayer.frame = self.frame
        // when are frames actually rendered (when is the video loaded)
        player?.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
        // if I call player.play() here the video plays directly
    }
}
and use it like this:
func singleTapDetected(_ sender: UITapGestureRecognizer) {
    // with self; player is optional, so use optional chaining
    self.player?.play()
}
Hope it helps
Declare player at the class level with optional.
var player: AVPlayer?
Instead of declaring player as a local variable in setupPlayerView function, you should declare it a an instance variable to be accessible in the scope of the whole class/struct.
For your case, I would suggest that instead of implementing a setupPlayerView function, it would be good practice to declare it as a lazy property, as follows:
lazy var player: AVPlayer? = {
    // insert url
    let urlString = "https://blurtime.com/images/testvideo.mov"
    guard let videoURL = NSURL(string: urlString) else {
        return nil
    }
    let player = AVPlayer(url: videoURL as URL)
    // add sub-layer
    let playerLayer = AVPlayerLayer(player: player)
    self.layer.addSublayer(playerLayer)
    playerLayer.frame = self.frame
    // when are frames actually rendered (when is the video loaded)
    player.addObserver(self, forKeyPath: "currentItem.loadedTimeRanges", options: .new, context: nil)
    return player
}()
Usage:
// this means the urlString is valid
if let player = player {
    player.play()
}
This way, all the needed setup for the player runs lazily, only when it is first needed. You might want to check this Q&A.
I just switched from AVAudioPlayer to AVPlayer, and I'm going through my old functions and making the appropriate adjustments. The AVPlayer is being used to play remote audio files from URLs. When I select a file to play and pause it, everything works perfectly. However, when I want to resume the paused file, the player won't play even though I can see the play function being called. I set up an observer to know when the player is done playing so the play/pause button can toggle, and what I noticed is that the observer is getting called after I hit pause. This shouldn't happen, since the player isn't done playing, right? Anyway, I set breakpoints and everything is getting called correctly. Any idea why the AVPlayer won't resume playing after being paused?
var playerItem: AVPlayerItem?
var newPlayer: AVPlayer?
var trackIDplaying: Int?

func playPausePressed(_ sender: UIButton) {
    if let selectedTrackID = trackIDplaying {
        do {
            try AVAudioSession.sharedInstance().setCategory(AVAudioSessionCategoryPlayback)
            try AVAudioSession.sharedInstance().setActive(true)
            if selectedTrackID == track.trackId {
                // if playing, then pause; if not playing, then play
                if self.newPlayer!.rate != 0 {
                    self.newPlayer!.pause()
                    self.isPaused = true
                } else {
                    if self.newPlayer!.currentItem != nil {
                        self.newPlayer!.play()
                        print(self.newPlayer!.currentItem)
                    }
                }
            } else {
                // if a song is playing, switch to the new song
                let trackURL = URL(string: track.preSignedURL!)
                trackIDplaying = track.trackId
                self.playerItem = AVPlayerItem(url: trackURL!)
                NotificationCenter.default.addObserver(self, selector: #selector(HomeController.playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.playerItem)
                self.newPlayer = AVPlayer(playerItem: self.playerItem!)
                self.newPlayer!.play()
            }
        } catch let error1 as NSError {
            error = error1
            self.newPlayer = nil
        }
    } else {
        do {
            // play the selected song if no other songs are playing
            let trackURL = URL(string: track.preSignedURL!)
            print(trackURL!)
            self.playerItem = AVPlayerItem(url: trackURL!)
            NotificationCenter.default.addObserver(self, selector: #selector(HomeController.playerDidFinishPlaying), name: NSNotification.Name.AVPlayerItemDidPlayToEndTime, object: self.playerItem)
            self.newPlayer = AVPlayer(playerItem: self.playerItem!)
            self.newPlayer!.play()
            self.trackIDplaying = track.trackId
        }
    }
    if let err = error {
        print("audio player error \(err.localizedDescription)", terminator: "")
    }
}

func playerDidFinishPlaying(note: NSNotification) {
    guard let indexPath = self.playingAudioIndexPath, let cell = self.audioTable.cellForRow(at: indexPath) as? AudioCell else {
        return
    }
    cell.playButton.isSelected = false
    self.playingAudioIndexPath = nil
}
Try adding the observer only once, in viewDidLoad() or viewWillAppear(). For this kind of problem, adding the observer should be done only one time.
If you still have any problem, feel free to ask me.
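One way to follow this advice in the question's code (a sketch): whenever the player item changes, remove any earlier registration before adding a new one, so notifications never stack up and fire on pause:

```swift
// Drop any previous AVPlayerItemDidPlayToEndTime registration for this target
// before observing the new player item.
NotificationCenter.default.removeObserver(self,
                                          name: .AVPlayerItemDidPlayToEndTime,
                                          object: nil)
NotificationCenter.default.addObserver(self,
                                       selector: #selector(playerDidFinishPlaying),
                                       name: .AVPlayerItemDidPlayToEndTime,
                                       object: playerItem)
```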
I create an AVPlayerItem through an AVURLAsset; my code:
let asset = AVURLAsset(URL: safeURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
asset.loadValuesAsynchronouslyForKeys([assetKeyPlayable, self.assetKeyTracks, self.assetKeyHasProtectedContent]) { () -> Void in
    dispatch_async(dispatch_get_main_queue(), { () -> Void in
        // Use the AVAsset playable property to detect whether the asset can be played
        if !asset.playable {
            let localizedDescription = "Item cannot be played, Item cannot be played description"
            let localizedFailureReason = "The asset's tracks were loaded, but could not be made playable, Item cannot be played failure reason"
            let userInfo = [NSLocalizedDescriptionKey: localizedDescription, NSLocalizedFailureReasonErrorKey: localizedFailureReason]
            let error = NSError(domain: "domain", code: 0, userInfo: userInfo)
            self.videoPlayerDelegate?.videoPlayer?(self, playerItemStatusDidFail: error)
            self.cleanPlayer()
            return
        }
        // At this point we're ready to set up for playback of the asset. Stop observing.
        if let _ = self.player?.currentItem {
            self.cleanPlayer()
        }
        if asset.URL.absoluteString != safeURL.absoluteString {
            return
        }
        var error: NSError?
        let status = asset.statusOfValueForKey(self.assetKeyTracks, error: &error)
        var playerItem = AVPlayerItem(URL: safeURL)
        if status == .Loaded {
            playerItem = AVPlayerItem(asset: asset)
        } else {
            // Deal with the error appropriately. If loading fails, create an AVPlayerItem directly from the URL.
            playerItem = AVPlayerItem(URL: safeURL)
        }
        self.player = self.playerWithPlayerItem(playerItem)
        self.registerMonitoring()
        self.registerNotification()
        self.addTimeObserver()
        completionBlock?(loadURLString: playerURL.absoluteString)
    })
}
I add an AVPlayerLayer to display the video in my view; my code:
// MARK: - Property
var player: AVPlayer? {
    get {
        return playerLayer.player
    }
    set {
        playerLayer.player = newValue
    }
}

var playerLayer: AVPlayerLayer {
    return layer as! AVPlayerLayer
}
Displaying the video after loading completes:
self.videoPlayer?.loadPlayer({ [weak self] (loadURLString) in
    if let strongSelf = self {
        strongSelf.player = strongSelf.videoPlayer?.player
        strongSelf.startPlay()
    }
})
I call the seekToTime method to seek to a specific time:
self.player?.currentItem?.seekToTime(CMTimeMakeWithSeconds(time, Int32(NSEC_PER_SEC)), toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero) { [weak self] finished in
    if let weakSelf = self {
        if weakSelf.isPlaying {
            weakSelf.videoPlayerDelegate?.videoPlayerDidplayerItemSeekToTime?(weakSelf)
        }
    }
}
Some pictures of the stuck interface:
In the first picture the sound is audible, but the interface is stuck.
In the second picture, the video works, but I get no sound.
My question:
When I call the seekToTime method upon completion, sometimes the video has sound but the interface is stuck; occasionally the video works. I tried calling CALayer's setNeedsDisplay method to update the AVPlayerLayer picture, but that didn't help. I don't know what else to do; I would be grateful for any help.
Since this is happening for many of us in different ways and is not answered, I am answering it here. For me, this happened when I tried to play the video in a UIView using AVPlayer and AVPlayerLayer; when I then used the same AVPlayer to load the video in an AVPlayerViewController, the video got stuck while the audio kept playing.
The trick is to destroy the AVPlayerLayer before using the same AVPlayer somewhere else. Can't believe I found this enlightenment here:
https://twitter.com/skumancer/status/294605708073263104
How I implemented this?
I created 2 view controllers in my main storyboard.
1) View Controller Scene
i) This has a UIView in it, referenced by the IBOutlet 'videoView'.
ii) A button with a 'show' segue to the AV Player View Controller Scene, referenced by the IBOutlet 'fullscreen'.
2) AV Player View Controller Scene
// Code in your controller class.
// Make sure to reference your IBOutlets in your storyboard.
@IBOutlet weak var videoView: UIView!
@IBOutlet weak var fullscreen: UIButton!

var player: AVPlayer? = nil
var playerLayer: AVPlayerLayer? = nil

override func viewDidLoad() {
    super.viewDidLoad()
    // Your URL to play; could be an HLS video
    let url = URL(string: "https://your_url.m3u8")
    let asset = AVAsset(url: url!)
    let playerItem = AVPlayerItem(asset: asset)
    player = AVPlayer(playerItem: playerItem)
}

override func viewDidAppear(_ animated: Bool) {
    playInUIView()
    pauseVideo()
}

// Creates an AVPlayerLayer and adds it as a sublayer of the UIView.
func playInUIView() {
    // Create a player layer to render the video.
    playerLayer = AVPlayerLayer(player: player)
    playerLayer?.frame = self.videoView.bounds
    playerLayer?.videoGravity = .resizeAspect
    // Add a sublayer to display it in the UIView.
    self.videoView.layer.addSublayer(playerLayer!)
}

// Destroys the AVPlayerLayer sublayer of your UIView.
func destroyPlayInView() {
    self.videoView.layer.sublayers = nil
    playerLayer = nil
}

// On button click, open AVPlayerViewController.
override func prepare(for segue: UIStoryboardSegue, sender: Any?) {
    let destination = segue.destination as! AVPlayerViewController
    // Destroy the AVPlayerLayer before rendering the video in AVPlayerViewController.
    destroyPlayInView()
    // Set the same AVPlayer in AVPlayerViewController.
    destination.player = self.player
    self.present(destination, animated: true) {
        destination.player!.play()
    }
}
PS: I have not implemented any controls yet, and there may be coding malpractices, as I just started with Swift and iOS development a couple of days ago, so please let me know wherever I am wrong.
Try this approach, as I encountered the same issue and solved it this way.
player.pause()
player.currentItem?.seek(to: CMTime(), completionHandler: { (_) in
    player.play()
})
The same code as user nferocious76's, but in Objective-C:
[self.player pause];

AVPlayerItem *playerItem = [self.player currentItem];
__weak typeof(self) weakSelf = self;
[playerItem seekToTime:playerItem.currentTime completionHandler:^(BOOL finished) {
    if (finished) {
        [weakSelf.player play];
    }
}];