We're using Wowza 3.5 on our server to stream videos, and JW Player as the player.
Streaming works fine when the videos are viewed on an iPhone or iPad.
What we are trying to do now: when a video is playing on an iOS device and the user pauses it, we would like the video to keep buffering (downloading) while it is paused, so that even if the user loses the internet connection they can keep watching the video. Currently it only buffers about 30 seconds to 1 minute.
How could we increase this buffer time, possibly to buffer / download the whole video while the player is paused?
Is this even possible in HLS?
This is not possible when you're playing an HLS video in a webpage with the default browser.
You need to create your own app if you want to control the video buffer when playing HLS.
Wowza also has an add-on called nDVR; with nDVR you can pause, play, and record. I also think you can increase the buffer length in Application.xml, but verify that.
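If you do build a native app, a minimal AVFoundation sketch of the idea might look like this (the URL is a placeholder, and preferredForwardBufferDuration is only a hint; the system decides how much it actually buffers, so downloading the whole video would need Apple's separate offline-HLS APIs such as AVAssetDownloadTask on iOS 10+):

    import AVFoundation

    // Placeholder URL; point this at your Wowza HLS playlist.
    let url = URL(string: "https://example.com/vod/mp4:sample.mp4/playlist.m3u8")!
    let item = AVPlayerItem(url: url)

    // Ask the player to keep roughly 10 minutes buffered ahead of the playhead.
    // This is a hint only; the system may buffer less than requested.
    item.preferredForwardBufferDuration = 600

    let player = AVPlayer(playerItem: item)
    player.play()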
I am using AVPlayer to play a video from an HTTPS URL with a setup like this:
player = AVPlayer(url: URL(string: urlString)!)
player?.automaticallyWaitsToMinimizeStalling = false
But since the video is a bit long, there is a short delay with a blank screen before the video actually starts to play. I think this is because it is being loaded over HTTPS.
Is there anyway to remove that delay by making AVPlayer play the video right away without loading the whole thing?
I set .automaticallyWaitsToMinimizeStalling to false, but that does not seem to make a difference.
If anyone has any other suggestions please let me know.
I don't think this has anything to do with loading over HTTPS. What is your video file format? I think what you are seeing is adaptive bitrate streaming behavior.
https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Apple_HTTP_Live_Streaming
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking down streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple. The segmenter is also responsible for producing a set of index files in the M3U8 format which act as a playlist file for the video chunks. Each playlist pertains to a given bitrate level, and contains the relative or absolute URLs to the chunks with the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
For more information about HTTP Live Streaming
https://developer.apple.com/documentation/http_live_streaming
This tutorial includes some experiments with an HTTP Live Streaming version and a non-HLS version:
https://www.raywenderlich.com/5191-video-streaming-tutorial-for-ios-getting-started
Have you tried using AVPlayerItem's preferredForwardBufferDuration? You can manage how long AVPlayer continues to buffer using this property.
player.currentItem?.preferredForwardBufferDuration = 1
From Apple's own documentation:
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption.
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
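A minimal sketch that combines this with automaticallyWaitsToMinimizeStalling and playImmediately(atRate:) might look like the following; the function name is made up, and whether a 1-second forward buffer is acceptable depends on how much stalling you can tolerate:

    import AVFoundation

    // Sketch: start playback as soon as a minimal buffer is available.
    func startPlayback(from urlString: String) -> AVPlayer? {
        guard let url = URL(string: urlString) else { return nil }

        let item = AVPlayerItem(url: url)
        // Hint that a small forward buffer is enough before playback begins.
        item.preferredForwardBufferDuration = 1

        let player = AVPlayer(playerItem: item)
        player.automaticallyWaitsToMinimizeStalling = false
        // Start right away, even with a shallow buffer (may stall on slow networks).
        player.playImmediately(atRate: 1.0)
        return player
    }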
Suggestion:
As the video is streamed, we also depend on the network connection, so on a poor connection there is always a chance of showing a blank screen.
What we can do is fetch a thumbnail of the video from the server on the previous screen, or generate one on the app side from the streaming URL. When the streaming screen opens, display the thumbnail to the user with a loading indicator, and hide the thumbnail once the video starts streaming.
In that particular case you can place a UIImageView above the AVPlayerLayer view.
That image works as a cover image for your video, ideally matching its first frame, with a UIActivityIndicatorView added as a subview on top.
Then hide that image when the video is about to play.
This helps hide the black frame of your video, since the initial buffering state is unavoidable.
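A rough UIKit sketch of that cover-image approach, assuming AVPlayer directly (the class name, URL, and image name are placeholders):

    import AVFoundation
    import UIKit

    final class CoverPlayerViewController: UIViewController {
        private let player = AVPlayer(url: URL(string: "https://example.com/stream.m3u8")!)
        private let coverImageView = UIImageView(image: UIImage(named: "videoPoster"))
        private let spinner = UIActivityIndicatorView(style: .large)
        private var statusObservation: NSKeyValueObservation?

        override func viewDidLoad() {
            super.viewDidLoad()

            // Player layer at the back.
            let playerLayer = AVPlayerLayer(player: player)
            playerLayer.frame = view.bounds
            view.layer.addSublayer(playerLayer)

            // Cover image (ideally the first frame of the video) above the player layer.
            coverImageView.frame = view.bounds
            coverImageView.contentMode = .scaleAspectFill
            view.addSubview(coverImageView)

            // Activity indicator on top while buffering.
            spinner.center = view.center
            spinner.startAnimating()
            view.addSubview(spinner)

            // Hide the cover as soon as playback actually starts.
            statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
                guard player.timeControlStatus == .playing else { return }
                DispatchQueue.main.async {
                    self?.coverImageView.isHidden = true
                    self?.spinner.stopAnimating()
                }
            }

            player.play()
        }
    }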
I need a custom video player that picks the stream based on internet speed. I have four video URLs (HD, high, medium, and low quality). What I am doing is playing the high-resolution video when the internet speed is above a certain limit, and I want to choose the quality based on the Wi-Fi or 3G speed. The problem is that I am not able to measure the internet speed; I have searched a lot of sites for this. One more point: while playing, I have to check the internet speed every 10 seconds.
Assuming you are using AVFoundation for video playback, an easier solution than creating four video files is to convert the video for HTTP Live Streaming, which lets the player select the most appropriate bitrate stream automatically.
https://developer.apple.com/streaming/
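A minimal sketch of the playback side with a single HLS master playlist (the URL is a placeholder, and the bitrate cap is optional):

    import AVFoundation

    // One master playlist instead of four separate files; it references the
    // HD/high/medium/low variants and the player picks among them automatically.
    let masterURL = URL(string: "https://example.com/video/master.m3u8")!
    let item = AVPlayerItem(url: masterURL)

    // Optional: cap the bitrate (in bits per second), e.g. while on cellular.
    item.preferredPeakBitRate = 2_000_000

    let player = AVPlayer(playerItem: item)
    player.play()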
I'm using AVPlayer to play online video content. The server supports byte ranges according to RFC 2616.
The thing is, it takes too long for AVPlayer to reach a playable status.
For a 1 GB video (duration = 1h 45min) it takes 1-2 minutes to start playback. I notice that seeking works rather well after that.
Playing the same video in the default Android media player takes 15 seconds to start.
It seems as if AVPlayer downloads chunks from different parts of the file for better seek support, but I want the video to start as soon as possible. Is it possible to configure AVPlayer to start playback more quickly?
I'm working on a Smart TV app for Samsung which should use the YouTube API to play videos. Embedded videos work only when the app resolution and the YouTube player size are 960x540 or below;
if I set a higher resolution (1280x720 or 1920x1080), the player gets stuck, behaves really slowly, and videos buffer infinitely.
Has anyone succeeded in embedding yt videos with higher resolution player?
Thx in advance.
The video player works in Full HD resolution in fullscreen regardless of the widget resolution.
If you have trouble with buffering, check your connection speed. Try playing a file from the local network to check that the selected resolution and codecs are handled well by the TV.
I recently ran into this case. The YouTube player works great at 720p resolution if the video is shorter than 10 minutes, but for anything longer, for example 30 minutes, the player gets stuck just as you said.
When changing the app resolution to 540p, the YouTube player works fine again for all videos. I suppose YouTube uses progressive download in its player, and the Smart TV's own storage is not enough to hold a long video rendered at 720p.
The conclusion is that when using the Flash player/YouTube in apps, it is best to use a 540p app resolution.
Thanks all for answering.
In the end I used a different approach, which turned out to be the best solution.
I used 720p resolution and YouTube's cue-video functionality.
Basically I cued the video, and on the "videoCued" event I called the "playVideo" method.
This allowed the player to get ready and initialize before playing the video.
I am streaming a live video (.m3u8) with MPMoviePlayerViewController; however, during playback the video is sometimes lost and only the QuickTime logo is displayed, while the audio keeps playing.
This happens at random times, sometimes never, perhaps when the internet connection is not as strong as it needs to be, but the console doesn't log any errors or changes in playback.
Is there a way to detect when this happens and to recover the video image from the stream?
This is how HTTP live streaming is designed to work. It will progressively choose higher or lower quality streams based on the strength of the internet connection. If the connection is not fast enough the "last resort" is to continue to play audio but no video. The only way to recover the video image in this case is to improve the speed of the internet connection.
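MPMoviePlayerViewController does not expose a hook for this, but if the project can move to AVPlayer, a sketch of detecting the drop via the item's access log might look like this (the URL and the 200 kbps "audio-only" threshold are assumptions to adjust to your own encoding ladder):

    import AVFoundation

    let item = AVPlayerItem(url: URL(string: "https://example.com/live/stream.m3u8")!)
    let player = AVPlayer(playerItem: item)

    // Each new access log entry reports the bitrate of the variant currently playing.
    // Keep a reference to 'observer' and remove it when tearing the player down.
    let observer = NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: item,
        queue: .main
    ) { _ in
        guard let event = item.accessLog()?.events.last else { return }
        if event.indicatedBitrate > 0, event.indicatedBitrate < 200_000 {
            print("Dropped to the audio-only variant; video should return when bandwidth improves.")
        }
    }

    player.play()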