My main concern comes from the m3u8 manifests I use to play video in my app.
They contain some variants with both audio and video, and some that contain only audio.
The problem occurs when I load this type of manifest in AVPlayerViewController/AVPlayer on a bad network or with low bandwidth.
In every case AVPlayer switches to the audio-only tracks to free up bandwidth, resulting in no image appearing in the player.
I would like to know if there is a way to force AVPlayer or AVPlayerViewController to consider only the manifest variants containing both audio and video, and to prevent the audio-only tracks from being selected, loaded, and played.
Thank you in advance for any tips or pointers you can give me on this subject.
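One approach sometimes used for this (a sketch only; the custom "filtered-https" scheme and the heuristic that audio-only variants lack a RESOLUTION attribute are assumptions, not a documented AVFoundation feature) is to intercept the master playlist with an AVAssetResourceLoaderDelegate and strip the audio-only variants before AVPlayer ever sees them:

import AVFoundation

// Sketch: filter audio-only variants out of an HLS master playlist.
final class VariantFilteringLoader: NSObject, AVAssetResourceLoaderDelegate {
    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        guard let customURL = loadingRequest.request.url,
              var components = URLComponents(url: customURL, resolvingAgainstBaseURL: false),
              components.scheme == "filtered-https" else { return false }

        components.scheme = "https"              // restore the real scheme
        guard let realURL = components.url else { return false }

        URLSession.shared.dataTask(with: realURL) { data, _, error in
            guard let data = data, let playlist = String(data: data, encoding: .utf8) else {
                loadingRequest.finishLoading(with: error)
                return
            }

            // Heuristic: audio-only variants usually have no RESOLUTION attribute.
            var filtered: [String] = []
            var skipNextURI = false
            for line in playlist.components(separatedBy: .newlines) {
                if line.hasPrefix("#EXT-X-STREAM-INF"), !line.contains("RESOLUTION=") {
                    skipNextURI = true           // drop this variant tag...
                    continue
                }
                if skipNextURI, !line.hasPrefix("#"), !line.isEmpty {
                    skipNextURI = false          // ...and the URI line that follows it
                    continue
                }
                filtered.append(line)
            }

            loadingRequest.dataRequest?.respond(with: Data(filtered.joined(separator: "\n").utf8))
            loadingRequest.finishLoading()
        }.resume()

        return true
    }
}

// Usage: swap the scheme so the delegate is consulted, then play as usual.
let loaderDelegate = VariantFilteringLoader()
let asset = AVURLAsset(url: URL(string: "filtered-https://example.com/master.m3u8")!)
asset.resourceLoader.setDelegate(loaderDelegate, queue: .main)
let player = AVPlayer(playerItem: AVPlayerItem(asset: asset))

This assumes the variant URIs in the master playlist are absolute https URLs; relative URIs would come back through the delegate as well and would need the same pass-through handling. Keep a strong reference to the delegate for the lifetime of the asset.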
I am using AVPlayer to play a video from an https URL with a setup like this:
// URL(string:) returns an optional, so it has to be unwrapped before being passed to AVPlayer
player = AVPlayer(url: URL(string: urlString)!)
player?.automaticallyWaitsToMinimizeStalling = false
But since the video is fairly long, there is a short blank-screen delay before the video actually starts to play. I think this is because it is being loaded over HTTPS.
Is there any way to remove that delay by making AVPlayer start playing right away without loading the whole thing?
I set automaticallyWaitsToMinimizeStalling to false, but that does not seem to make a difference.
If anyone has any other suggestions please let me know.
I don't think this has anything to do with loading over HTTPS. What is your video file format? I think what you are seeing is adaptive bitrate streaming behavior.
https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Apple_HTTP_Live_Streaming
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking down streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple. The segmenter is also responsible for producing a set of index files in the M3U8 format which acts as a playlist file for the video chunks. Each playlist pertains to a given bitrate level, and contains the relative or absolute URLs to the chunks with the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
For more information about HTTP Live Streaming, see:
https://developer.apple.com/documentation/http_live_streaming
This tutorial includes some experiments with both an HTTP Live Streaming version and a non-HTTP Live Streaming version of the same video:
https://www.raywenderlich.com/5191-video-streaming-tutorial-for-ios-getting-started
Have you tried using AVPlayerItem's preferredForwardBufferDuration? You can manage how long AVPlayer continues to buffer using this property.
player.currentItem?.preferredForwardBufferDuration = 1
From Apple's own documentation:
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption.
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
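For example, a minimal sketch of the idea (the stream URL and the one-second value are placeholders to experiment with, and the KVO observation is just one way to watch the effect):

import AVFoundation

// Keep the forward buffer small so playback can start sooner,
// and watch whether playback is likely to keep up with that buffer.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream.m3u8")!)
item.preferredForwardBufferDuration = 1   // seconds of media buffered ahead of the playhead

let player = AVPlayer(playerItem: item)

let keepUpObservation = item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { item, _ in
    print("likely to keep up:", item.isPlaybackLikelyToKeepUp)
}

player.play()

A very small buffer tends to make the first frame appear sooner at the cost of more frequent stalls on a slow connection, in line with the documentation quoted above.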
Suggestion:
Since the video is streamed, playback also depends on the network connection, so on a poor connection there is always a chance of showing a blank screen.
One option is to fetch a thumbnail of the video from the server on the previous screen, or to generate one on the app side from the streaming URL. When the streaming screen opens, show the thumbnail to the user together with a loading indicator, and hide the thumbnail once the video actually starts playing.
In that particular case you can place a UIImageView above the AVPlayerLayer view.
That image works as a cover image for your video, matching its first frame, along with a UIActivityIndicatorView added as a subview.
Then hide the image when the video is about to play.
This helps hide the black frame of your video, since the initial buffering state is unavoidable.
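A rough sketch of that overlay approach (the poster asset name and stream URL are placeholders, and using timeControlStatus is just one way to detect the start of playback):

import AVFoundation
import UIKit

// Cover the player with a poster image and a spinner,
// then hide both once playback actually starts.
final class CoverPlayerViewController: UIViewController {
    private let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
    private let coverImageView = UIImageView(image: UIImage(named: "poster"))   // hypothetical asset name
    private let spinner = UIActivityIndicatorView(style: .large)
    private var statusObservation: NSKeyValueObservation?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Player layer underneath, cover image and spinner on top.
        let playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)

        coverImageView.frame = view.bounds
        coverImageView.contentMode = .scaleAspectFill
        view.addSubview(coverImageView)

        spinner.center = view.center
        spinner.startAnimating()
        view.addSubview(spinner)

        // Hide the cover as soon as the player reports it is actually playing.
        statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
            guard player.timeControlStatus == .playing else { return }
            DispatchQueue.main.async {
                self?.coverImageView.isHidden = true
                self?.spinner.stopAnimating()   // hidesWhenStopped is true by default
            }
        }

        player.play()
    }
}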
I wanted to implement a video quality control in a video player in Swift. I have used AVPlayerItem's preferredPeakBitRate, but I am not able to change the quality of the video while it is playing. I have the manifest file URL that contains different bit rates of the video. Kindly suggest a third-party video player that offers manual video quality control, or tell me how I can achieve this using AVPlayer in Swift.
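For context, this is roughly what the preferredPeakBitRate approach looks like (a sketch; the bitrate values and URL are placeholders that would normally come from the BANDWIDTH attributes in the manifest):

import AVFoundation

// Map quality labels to bit-rate caps (placeholder values).
enum StreamQuality: Double {
    case auto = 0              // 0 means no cap; AVPlayer chooses freely
    case low = 500_000
    case medium = 1_500_000
    case high = 4_000_000
}

func apply(_ quality: StreamQuality, to player: AVPlayer) {
    // preferredPeakBitRate is an upper bound in bits per second;
    // the player may take a few segments to actually switch variants.
    player.currentItem?.preferredPeakBitRate = quality.rawValue
}

// Usage, e.g. when the user picks "Medium" from a quality menu:
let player = AVPlayer(url: URL(string: "https://example.com/master.m3u8")!)
apply(.medium, to: player)

Note that preferredPeakBitRate only caps the bitrate, so it can pull quality down but cannot force a higher variant than the network allows, and the change is not applied instantly mid-segment.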
I have some audio files in mp4 format, which I used thinking this would be the best format for use on both iOS and Android devices. However, they do not play with either AVAudioPlayer or AVPlayer. Is there another player that may handle it, or some way (in Swift) to convert them to mp3? Note this is only an issue with mp4 audio-only files; mp4 videos are fine.
AVPlayer CAN play mp4 files, but not if they are encoded as AMR-NB; they need to be encoded as AAC.
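If it helps to confirm which codec a file actually uses, here is a small sketch (the file path is a placeholder) that inspects the audio track's format description:

import AVFoundation
import AudioToolbox

// Check whether the mp4's audio track is AMR-NB or AAC.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/audio.mp4"))   // placeholder path
for track in asset.tracks(withMediaType: .audio) {
    for description in track.formatDescriptions as! [CMFormatDescription] {
        switch CMFormatDescriptionGetMediaSubType(description) {
        case kAudioFormatAMR:
            print("AMR-NB audio: AVPlayer will not play this; re-encode to AAC")
        case kAudioFormatMPEG4AAC:
            print("AAC audio: AVPlayer should handle this")
        case let other:
            print("other audio codec:", other)
        }
    }
}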
I am familiar with Google's Swiffy tool and Macvide. My question is: how can I use one of them, or any other tool, to convert a Flash stream into HTML5 video on the fly?
I imagine what you have in hand is a Flash-based video player, that is, a SWF that loads and plays a video stream. You shouldn't try to convert that SWF with Swiffy; instead, just use an HTML5 video player like Video.js to play the video.