After several transitions between two view controllers in the navigation stack (from a custom camera view controller to a custom video player view controller) I get a blank player layer with no video or sound. The number of transitions depends on the device: 2–3 on a first-generation iPhone SE, more than 20 on an iPhone 11. I checked the memory graph and found no leaks. play(), seek() and the other methods don't work. I also checked the player layer's frame directly in this state, but it has the right values. Something unpredictable is going wrong inside the custom player. Any ideas?
I am currently displaying a video embedded in a view. On tap of this video I need to present it full screen, just like the App Store does.
Right now I am using an AVPlayer embedded in the view via an AVPlayerLayer. When I detect a tap on this embedded player, I create an AVPlayerViewController and set its player property to the player that is currently embedded, but I still notice a delay of 3–4 seconds before the full-screen player starts playing. How can I make this transition smooth?
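For context, roughly what I am doing is the following (a minimal sketch; the class name and the placeholder URL are illustrative, not my actual code). The same AVPlayer instance drives the inline AVPlayerLayer and is handed to an AVPlayerViewController on tap:

import AVFoundation
import AVKit
import UIKit

final class EmbeddedPlayerViewController: UIViewController {
    // One player shared between the inline layer and the full-screen controller.
    let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
    private let playerLayer = AVPlayerLayer()

    override func viewDidLoad() {
        super.viewDidLoad()
        playerLayer.player = player
        playerLayer.videoGravity = .resizeAspect
        view.layer.addSublayer(playerLayer)
        player.play()

        // Present full screen when the embedded video is tapped.
        let tap = UITapGestureRecognizer(target: self, action: #selector(goFullScreen))
        view.addGestureRecognizer(tap)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        playerLayer.frame = view.bounds
    }

    @objc private func goFullScreen() {
        // Hand the already-playing AVPlayer to an AVPlayerViewController.
        let fullScreenController = AVPlayerViewController()
        fullScreenController.player = player
        present(fullScreenController, animated: true) {
            fullScreenController.player?.play()
        }
    }
}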
I have created an instance of AVPlayer and am playing content in it. I need to provide an option to show this movie in full screen. Is there any option other than creating an instance of AVPlayerViewController and using its native playback controls?
AVPlayer provides only the video view. Buttons, and basically all UI over the video, are the developer's responsibility.
So you need a view controller with a view in which the AVPlayerLayer is embedded, and other views (buttons, labels) layered over it, all rigged together with constraints. When the user taps the "full screen" button, animate the constraints so the video resizes. That is it; a rough sketch follows after the update below.
Much simpler (considering the lines of code that must be written) is to use AVPlayerViewController, but you lose the possibility of a custom UI. On the other hand, most of the logic is already there (except for support of HLS EVENT-type playlists that are not closed; there is a bug that will be fixed in iOS 11).
Update:
The bug in AVPlayerViewController regarding the HLS EVENT type is fixed in iOS 11.
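A minimal sketch of the custom-UI approach described above, in Swift. The class names, the placeholder URL, and the 16:9 starting size are illustrative assumptions, not requirements; the point is hosting the AVPlayerLayer in a plain view and animating a constraint for "full screen":

import AVFoundation
import UIKit

// A view whose backing layer is an AVPlayerLayer, so the video always fills it.
final class PlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

final class CustomPlayerViewController: UIViewController {
    private let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)
    private let playerView = PlayerView()
    private var heightConstraint: NSLayoutConstraint!
    private var isFullScreen = false

    override func viewDidLoad() {
        super.viewDidLoad()
        playerView.playerLayer.player = player
        playerView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(playerView)

        // Start as a 16:9 strip pinned to the top; the height constraint is
        // the one that gets animated to fill the screen.
        heightConstraint = playerView.heightAnchor.constraint(equalTo: view.widthAnchor, multiplier: 9.0 / 16.0)
        NSLayoutConstraint.activate([
            playerView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor),
            playerView.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            playerView.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            heightConstraint
        ])

        // Custom UI (here just a full-screen toggle button) laid over the video.
        let fullScreenButton = UIButton(type: .system)
        fullScreenButton.setTitle("Full screen", for: .normal)
        fullScreenButton.translatesAutoresizingMaskIntoConstraints = false
        fullScreenButton.addTarget(self, action: #selector(toggleFullScreen), for: .touchUpInside)
        view.addSubview(fullScreenButton)
        NSLayoutConstraint.activate([
            fullScreenButton.trailingAnchor.constraint(equalTo: playerView.trailingAnchor, constant: -16),
            fullScreenButton.bottomAnchor.constraint(equalTo: playerView.bottomAnchor, constant: -16)
        ])

        player.play()
    }

    @objc private func toggleFullScreen() {
        isFullScreen.toggle()
        heightConstraint.isActive = false
        heightConstraint = isFullScreen
            ? playerView.heightAnchor.constraint(equalTo: view.heightAnchor)
            : playerView.heightAnchor.constraint(equalTo: view.widthAnchor, multiplier: 9.0 / 16.0)
        heightConstraint.isActive = true
        // Animating the constraint change is what produces the resize effect.
        UIView.animate(withDuration: 0.3) { self.view.layoutIfNeeded() }
    }
}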
While playing remote HLS videos, I am re-initializing an AVQueuePlayer that was already initialized with items, using
- (instancetype)initWithItems:(NSArray<AVPlayerItem *> *)items;
However, when I do this the sound plays in the background but the AVPlayerLayer stays stuck on the last frame of the previous video; the picture does not update. To make the video update, I have to remove the previous layer from the player's UIView, re-create the AVPlayerLayer, and attach it to that view, using the following:
[oldAVPlayerLayer removeFromSuperlayer];
AVPlayerLayer *newAVPlayerLayer = [AVPlayerLayer playerLayerWithPlayer:myAVQueuePlayer];
[myViewForPlayerLayer.layer addSublayer:newAVPlayerLayer];
This causes a flicker on the screen, which would be tolerable if the device were just an iPhone/iPad, but the real problem is the abrupt AirPlay behaviour it triggers, causing the volume UISlider to appear in the remote controls.
Is there a way to re-initialize the AVQueuePlayer without recreating or reassigning the AVPlayerLayer?
I ended up using AVPlayer instead of AVQueuePlayer, together with the method
replaceCurrentItemWithPlayerItem:
This does not cause the AirPlay glitch, and it also saves memory, since the items are instantiated only as and when they are required.
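A minimal sketch of that approach in Swift, with an illustrative wrapper class and placeholder URLs. The AVPlayer, and therefore the AVPlayerLayer attached to it, is created once, and each new item is swapped in with replaceCurrentItem(with:):

import AVFoundation
import Foundation

final class SequentialPlayer {
    // Attach your AVPlayerLayer to this player once; it is never recreated.
    let player = AVPlayer()
    private var pendingURLs: [URL]

    init(urls: [URL]) {
        self.pendingURLs = urls
    }

    // Instantiates the next AVPlayerItem only when it is actually needed,
    // and swaps it in without touching the layer.
    func playNext() {
        guard !pendingURLs.isEmpty else { return }
        let item = AVPlayerItem(url: pendingURLs.removeFirst())
        player.replaceCurrentItem(with: item)
        player.play()
    }
}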
I have a UICollectionViewController that autoplays video in its cells (similar to the Vine App). I'm finding that frames drop when scrolling, and I've isolated it to allocating the AVPlayer object in each cell. It doesn't matter if it's done on a background thread or on the main one. Are there any ways of working around this?
In my app the user can play a video, leave the screen, and the audio will continue to play in the background. They can then return to continue watching the video. This means that the view the video is in is destroyed and recreated at a later point. Whenever the view is recreated and the player is assigned to its AVPlayerLayer, there is a noticeable lag in the video and, more importantly, in the audio.
Does anyone know how to eliminate this lag?
The key to making this work without any lag or delay in the audio or video is to keep the view that hosts the AVPlayerLayer alive outside of the view controller. When the controller is recreated, instead of creating a new view and assigning its layer's player to the same player, simply attach the old view to the new view controller's view.
The view stays in memory as long as the video is still playing, so a new AVPlayerLayer is never created and never has the player reassigned to it. It is the reassignment that causes the lag.
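A minimal sketch of that idea, assuming a shared holder object keeps the player view alive between screens; the names here (PlaybackHolder, SharedPlayerView, VideoViewController) are illustrative:

import AVFoundation
import UIKit

// A view backed by an AVPlayerLayer; it is created once and reused.
final class SharedPlayerView: UIView {
    override class var layerClass: AnyClass { AVPlayerLayer.self }
    var playerLayer: AVPlayerLayer { layer as! AVPlayerLayer }
}

enum PlaybackHolder {
    // Lives outside any view controller, so it survives the screen being destroyed
    // while audio keeps playing in the background.
    static let player = AVPlayer()
    static let playerView = SharedPlayerView()
}

final class VideoViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()
        let playerView = PlaybackHolder.playerView
        // Reattach the existing view instead of creating a new AVPlayerLayer
        // and reassigning its player, which is what causes the lag.
        if playerView.playerLayer.player == nil {
            playerView.playerLayer.player = PlaybackHolder.player
        }
        playerView.frame = view.bounds
        playerView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(playerView)
    }
}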