set AVPlayer AVPlayerItem buffer size? - ios

I am playing video with AVPlayer and listening to the loadedTimeRanges property. When playing a video about 10 minutes long, AVPlayer always preloads far ahead, which feels very costly. Is there a way to limit the size of the preloaded area, for example to preload only half of the video's duration?

I think you're looking for AVPlayerItem's preferredForwardBufferDuration property.
Per Apple:
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
See https://developer.apple.com/reference/avfoundation/avplayeritem/1643630-preferredforwardbufferduration?language=objc
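For illustration, a minimal sketch of how this is set, assuming a plain progressive-download file (the URL is a placeholder):

import AVFoundation

// Hypothetical URL, for illustration only.
let url = URL(string: "https://example.com/video.mp4")!
let item = AVPlayerItem(url: url)

// Ask the player to buffer no more than ~300 seconds (about half of a
// 10-minute video) ahead of the playhead. This is a preference, not a
// hard limit; the player may still deviate from it.
item.preferredForwardBufferDuration = 300

let player = AVPlayer(playerItem: item)
player.play()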

If you use an AVAssetResourceLoaderDelegate you can control the exact amount of content that gets preloaded/downloaded prior to playback.
The code example here is a good start - AVPlayer stalling on large video files using resource loader delegate
Basically, you would have to maintain an array of pendingRequests and process them one by one, firing up URLSession data tasks until you have downloaded as much content as you would like to preload, as in the sketch below.
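A rough sketch of that approach, assuming a single MP4 file served over HTTPS. All names and URLs are illustrative, and thread safety, error handling, and byte-range requests to the server are omitted:

import AVFoundation

final class PreloadingLoader: NSObject, AVAssetResourceLoaderDelegate, URLSessionDataDelegate {

    private let remoteURL: URL   // the real https:// URL
    private var pendingRequests: [AVAssetResourceLoadingRequest] = []
    private var buffer = Data()
    private var expectedLength: Int64 = 0
    private var downloadStarted = false
    private lazy var session = URLSession(configuration: .default,
                                          delegate: self, delegateQueue: nil)

    init(remoteURL: URL) {
        self.remoteURL = remoteURL
        super.init()
    }

    // AVFoundation asks for every chunk of data it wants via this callback.
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        pendingRequests.append(loadingRequest)
        if !downloadStarted {
            downloadStarted = true
            session.dataTask(with: remoteURL).resume()   // kick off the real download once
        }
        processPendingRequests()
        return true
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask,
                    didReceive response: URLResponse,
                    completionHandler: @escaping (URLSession.ResponseDisposition) -> Void) {
        expectedLength = response.expectedContentLength
        completionHandler(.allow)
    }

    func urlSession(_ session: URLSession, dataTask: URLSessionDataTask, didReceive data: Data) {
        buffer.append(data)
        processPendingRequests()
        // To cap how much gets preloaded, cancel once enough has arrived, e.g.:
        // if buffer.count > maxPreloadBytes { dataTask.cancel() }
    }

    private func processPendingRequests() {
        pendingRequests.removeAll { request in
            if let info = request.contentInformationRequest {
                guard expectedLength > 0 else { return false }   // wait for response headers
                info.contentType = AVFileType.mp4.rawValue       // assumption: plain MP4
                info.contentLength = expectedLength
                info.isByteRangeAccessSupported = true
            }
            guard let dataRequest = request.dataRequest else {
                request.finishLoading()
                return true
            }
            let offset = Int(dataRequest.currentOffset)
            guard buffer.count > offset else { return false }    // nothing new for this request yet
            let length = min(buffer.count - offset, Int(dataRequest.requestedLength))
            dataRequest.respond(with: buffer.subdata(in: offset ..< offset + length))
            let end = dataRequest.requestedOffset + Int64(dataRequest.requestedLength)
            if dataRequest.currentOffset >= end {
                request.finishLoading()
                return true                                      // request fully satisfied
            }
            return false
        }
    }
}

Note that the delegate only fires for URLs with a custom (non-http) scheme, and setDelegate holds the delegate weakly, so keep a strong reference to the loader:

let loader = PreloadingLoader(remoteURL: URL(string: "https://example.com/video.mp4")!)
let asset = AVURLAsset(url: URL(string: "preload://example.com/video.mp4")!)  // fake scheme
asset.resourceLoader.setDelegate(loader, queue: DispatchQueue(label: "loader"))
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))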
Cheers.

Related

Streaming video from https with AVPlayer causes initial delay

I am using AVPlayer to play a video from an https URL with a setup like this:
player = AVPlayer(url: URL(string: urlString)!) // note: URL(string:) returns an optional
player?.automaticallyWaitsToMinimizeStalling = false
But since the video is a little long, there is a short blank screen delay before the video actually starts to play. I think this is because it is being loaded from https.
Is there any way to remove that delay by making AVPlayer play the video right away without loading the whole thing?
I added .automaticallyWaitsToMinimizeStalling = false but that does not seem to make a difference.
If anyone has any other suggestions please let me know.
I don't think this has anything to do with loading from https. What is your video file format? You may be thinking of adaptive bitrate streaming behavior.
https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Apple_HTTP_Live_Streaming
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking down streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple. The segmenter is also responsible for producing a set of index files in the M3U8 format which act as a playlist file for the video chunks. Each playlist pertains to a given bitrate level, and contains the relative or absolute URLs to the chunks with the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
For more information about HTTP Live Streaming
https://developer.apple.com/documentation/http_live_streaming
This tutorial includes some experiments with both an HTTP Live Streaming version and a non-HLS version:
https://www.raywenderlich.com/5191-video-streaming-tutorial-for-ios-getting-started
Have you tried using AVPlayerItem's preferredForwardBufferDuration? You can control how far ahead AVPlayer buffers with this property.
player.currentItem?.preferredForwardBufferDuration = 1
From Apple's own documentation:
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption.
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
Suggestion:
Since the video is streamed, we are also relying on the network connection, so on a poor connection there is always a chance of showing a blank screen.
What we can do is fetch a thumbnail of the streaming video from the server on the previous screen, or generate one on the application side from the streaming URL. When the streaming screen opens, display the thumbnail to the user with a loading indicator, and hide the thumbnail once the video starts streaming.
In that particular case you can place a UIImageView above the AVPlayerLayer view. That image works as a cover image for your video, matching its first frame, with a UIActivityIndicator as a subview. Hide the image when the video is about to play.
This helps to hide the black frame of your video, since the initial buffering state of the video is unavoidable.
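A minimal sketch of that overlay, assuming the cover image is already available; the URL is a placeholder, and the hide is driven by observing the player's timeControlStatus:

import AVFoundation
import UIKit

final class PlayerCoverController: UIViewController {
    // Hypothetical URL, for illustration only.
    let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)
    let coverImageView = UIImageView()                     // first-frame/cover image
    let spinner = UIActivityIndicatorView(style: .medium)  // streaming indicator
    private var statusObservation: NSKeyValueObservation?

    override func viewDidLoad() {
        super.viewDidLoad()
        // ... add the player layer, then coverImageView and spinner on top ...
        spinner.startAnimating()

        // Hide the cover as soon as the player actually starts playing.
        statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
            guard player.timeControlStatus == .playing else { return }
            DispatchQueue.main.async {
                self?.coverImageView.isHidden = true
                self?.spinner.stopAnimating()
            }
        }
        player.play()
    }
}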

Does AVPlayer auto adjust when playing an m3u8 playlist?

After spending some time setting up a transcoding process on AWS, I am finding that the loading times for videos have not been lowered as expected with HLS (m3u8).
It seems that if I am using AVPlayer directly, without AVPlayerViewController, I may need to manage the video stream quality myself? My understanding was that with an m3u8, things would be handled automatically and the best quality would be used depending on network conditions, device, etc.
So far it seems that the loading times are the same if not slightly worse than without the m3u8 if AVPlayer is used as is.
To better understand what's going on I've been trying out a few things.
1) While doing this has worked to reduce loading times, I would prefer to do a bit more than just lower it all the way when not on wifi:
self.player?.currentItem?.preferredPeakBitRate = 1
This seems to give me a pretty low quality video, but it loads pretty quickly. I have yet to figure out how to detect the actual bitrate being used, though (since setting this value has improved loading times dramatically, I am going to assume AVPlayer does not handle the adjustments on its own?).
2) Also, I haven't had any luck with this (it causes an infinite spinner, even with preferredPeakBitRate set to 1):
self.player.automaticallyWaitsToMinimizeStalling = false
3) I am open to using a third party library that might handle this, found something called VKVideoPlayer that might do some of this?
Thanks
It's possible in iOS 8 and onwards.
The following is copied from Apple's documentation:
preferredPeakBitRate: The desired limit, in bits per second, of network bandwidth consumption for this item.
Swift: var preferredPeakBitRate: Double
Objective-C: @property(nonatomic) double preferredPeakBitRate
Set preferredPeakBitRate to non-zero to indicate that the player should attempt to limit item playback to that bit rate, expressed in bits per second.
If network bandwidth consumption cannot be lowered to meet the preferredPeakBitRate, it will be reduced as much as possible while continuing to play the item.
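A sketch that caps the variant bit rate and then checks what the player actually selected, via the item's access log (the URL and the 2 Mbps cap are illustrative values):

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
item.preferredPeakBitRate = 2_000_000   // bits per second

// Each new access-log entry reports, among other things, the bit rate of
// the variant the player chose (indicatedBitrate), which answers the
// "how do I detect the actual bitrate" question above.
NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewAccessLogEntry, object: item, queue: .main) { note in
    guard let item = note.object as? AVPlayerItem,
          let event = item.accessLog()?.events.last else { return }
    print("Playing variant at \(event.indicatedBitrate) bps")
}

let player = AVPlayer(playerItem: item)
player.play()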

In AVFoundation, how to synchronize recording and playback

I am interested in recording media using an AVCaptureSession in iOS while playing media back using an AVPlayer (specifically, I am playing back audio and recording video, but I'm not sure it matters).
The problem is, when I play the resulting media back together later, they are out of sync. Is it possible to synchronize them, either by ensuring that playback and recording start simultaneously, or by discovering what the offset is between them? I probably need the sync to be on the order of 10 ms. It is unreasonable to assume that I can always capture audio (since the user may use headphones), so syncing via analysis of original and recorded audio is not an option.
This question suggests that it's possible to end playback and recording simultaneously and determine the initial offset from the resulting lengths, but I'm unclear how to get them to end simultaneously. I have two cases: 1) the audio playback runs out, and 2) the user hits the "stop recording" button.
This question suggests priming and then applying a fixed, but possibly device-dependent delay, which is obviously a hack, but if it's good enough for audio it's obviously worth considering for video.
Is there another media layer I can use to perform the required synchronization?
Related: this question is unanswered.
If you are specifically using AVPlayer to play back audio, I would suggest using Audio Queue Services instead. It is seamless and fast, since it reads buffer by buffer, and play/pause is faster than with AVPlayer.
There is also the possibility that you are missing an initial [avPlayer prepareToPlay] call, which might be causing a lot of overhead for it to sync before playing the audio.
Hope it helps.
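For the "start simultaneously" half of the question, one approach worth sketching is to start playback at an explicit host-clock time and compare that against the capture buffers' timestamps. This is only a sketch under stated assumptions: the player is already set up, an AVCaptureSession is running, and its timestamps are on (or converted to) the host clock:

import AVFoundation
import CoreMedia

func startPlaybackAtKnownHostTime(player: AVPlayer) -> CMTime {
    // Required, or setRate(_:time:atHostTime:) raises an exception.
    player.automaticallyWaitsToMinimizeStalling = false

    // Pick a start point slightly in the future so both sides can prepare.
    let hostClock = CMClockGetHostTimeClock()
    let startTime = CMTimeAdd(CMClockGetTime(hostClock),
                              CMTime(value: 1, timescale: 2))   // now + 0.5 s

    // Begin playback exactly at startTime on the host clock. Capture sample
    // buffers carry presentation timestamps too; if the session's clock is
    // not the host clock, convert with CMSyncConvertTime before comparing.
    player.setRate(1.0, time: .invalid, atHostTime: startTime)
    return startTime   // compare this with the first captured buffer's timestamp
}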

Playing an AVMutableComposition with AVPlayer audio gets out of sync

I have an AVMutableComposition with 2 audio tracks and one video track. I'm using the composition to string about 40 different video clips from .mov files, putting the video content of each clip in the video track of my composition and the audio in the audio track. The second audio track I use for music.
I also have a synchronized layer for titles graphics.
When I play this composition using an AVPlayer, the audio slowly gets out of sync. It takes about 4 minutes to start becoming noticeable. It seems like if I only string together a handful of longer clips the problem is not as apparent; it is when there are many shorter clips (~40 in my test) that it gets really bad.
Pausing and Playing doesn't re-sync the audio, however seeking does. In other words, if I let the video play to the end, towards the end the lip sync gets noticeably off even if I pause and play throughout, however, if I seek to a time towards the end the audio gets back in sync.
My hacky solution for now is to seek to the currentTime + 1 frame every minute or so. This creates an unpleasant jump in the video caused by a lag in the seek operation, so not a good solution.
Exporting with an ExportSession doesn't present this problem, audio remains in sync in the output movie.
I'm wondering if the new masterClock property in the AVPlayer is the answer to this, and if it is, how is it used?
I had the same issue and fixed it, among many other audio and video things, by specifying timescales in the following manner:
CMTime(seconds: my_seconds, preferredTimescale: CMTimeScale(600))
Before, my timescale was CMTimeScale(NSEC_PER_SEC). That caused jitter when composing clips at different frame rates, plus the audio drifting out of sync that Eddy mentions here.
Despite looking like a magic number, 600 is a common multiple of 24, 30, 60 and 120, which are the usual frame rates for different purposes. Using a common multiple avoids dragging rounding problems along when composing multiple clips.
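A sketch of what that looks like when stringing clips together: every insertion time stays in the shared 600-unit timescale, so rounding error does not accumulate across the ~40 clips. The helper below is illustrative, not taken from the question:

import AVFoundation

func appendClip(_ asset: AVAsset,
                videoTrack: AVMutableCompositionTrack,
                audioTrack: AVMutableCompositionTrack,
                at cursor: inout CMTime) throws {
    let range = CMTimeRange(start: .zero, duration: asset.duration)
    if let v = asset.tracks(withMediaType: .video).first {
        try videoTrack.insertTimeRange(range, of: v, at: cursor)
    }
    if let a = asset.tracks(withMediaType: .audio).first {
        try audioTrack.insertTimeRange(range, of: a, at: cursor)
    }
    // Advance the cursor in the 600 timescale rather than NSEC_PER_SEC.
    cursor = CMTimeAdd(cursor, CMTimeConvertScale(asset.duration, timescale: 600,
                                                  method: .default))
}

The caller creates the AVMutableComposition, adds the two tracks with addMutableTrack(withMediaType:preferredTrackID:), initializes the cursor to .zero, and loops over the clips.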

MPMoviePlayer Buffer size/Adjustment

I have been using MPMoviePlayer and its playableDuration property to check the available duration of a movie.
The playable duration always seems to be only about one second ahead of my current position, and basically I would like to increase this.
I have tried prepareToPlay, but this seems to do nothing noticeable to playableDuration.
I have tried to set as many parameters as possible to try to change the value, setting defaults pre-emptively such as MPMovieSourceType, MediaType and the like, but all to no avail.
Just to clear a few things up first: I am using both MPMoviePlayer and AVPlayer, which play different streams simultaneously, since the video and audio I am using are split.
EDIT
It seems I overlooked the file size affecting the stream, and should have read more in the Apple resources than elsewhere. As far as I can tell the issue is that the file size is too large, and therefore a server-side media segmenter has to be implemented.
Apple Resource on Media Segmenting
