I have a few short video clips and need to determine whether (and when) they are played within a long video file.
Specifically, it is a bunch of ~1-minute video ads baked into a 24-hour TV stream. I want to find out when those ads occur; the ad videos are given beforehand.
Related
I am building an app that streams video content, something like TikTok. You can swipe videos in a table, and when a new cell becomes visible its video starts playing. It works great, except when you compare it to TikTok, Instagram, etc. My videos start streaming pretty fast, but not always: playback is very sensitive to network quality, and sometimes even when the network is great it still buffers too long. In the same conditions, TikTok, Instagram and the like don't seem to have that problem.

I am using JWPlayer as the video hosting service and AVPlayer as the player, and I already do an async preload of assets before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading to the streaming service? (I stream .m3u8 files.) Is there a set of presets that gives optimum streaming quality and start speed? Thanks in advance.
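For reference, a minimal sketch of the kind of async asset preload mentioned above, assuming AVFoundation key loading (the function name and key list are illustrative, not the asker's actual code):

import AVFoundation

// Load the keys AVPlayerItem needs before playback so creating the item
// later doesn't block on network I/O.
func preloadAsset(for url: URL, completion: @escaping (AVURLAsset) -> Void) {
    let asset = AVURLAsset(url: url)
    asset.loadValuesAsynchronously(forKeys: ["playable", "duration"]) {
        DispatchQueue.main.async { completion(asset) }
    }
}

// Usage: create the AVPlayerItem only once the keys are loaded.
// preloadAsset(for: streamURL) { asset in
//     player.replaceCurrentItem(with: AVPlayerItem(asset: asset))
// }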
So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so try to use it as much as possible for iOS devices.
The best practice for mobile streaming is to offer multiple resolutions. The trick is to start with the lowest resolution available to get the video going, then switch to a higher resolution once the connection is determined to be capable of it. Generally this happens quickly enough that the user doesn't really notice; YouTube is the best example of this tactic. HLS does this automatically (the .m3u8 playlist is simply the HLS manifest format that lists the available renditions).
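A hedged illustration of that "start low, then ramp up" idea on the player side (the URL and the 500 kbps cap are assumptions, and the playlist itself must actually contain lower renditions):

import AVFoundation

// Cap the starting bitrate so the player begins on a low rendition,
// then remove the cap once playback is under way.
let streamURL = URL(string: "https://example.com/master.m3u8")!   // placeholder URL
let item = AVPlayerItem(url: streamURL)
item.preferredPeakBitRate = 500_000     // bits per second; tune for your content

let player = AVPlayer(playerItem: item)
player.play()

// Later, once playback is stable:
// item.preferredPeakBitRate = 0        // 0 = no artificial limit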
Assuming you are using a UICollectionView or UITableView, try to start low-resolution streams of every video on screen in the background each time scrolling stops. Not only does this let you do some cool preview features based off the buffer, but when the user taps a video the connection is already established. If that's too slow, try just the middle video.
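A rough sketch of that preloading idea for a table-based feed (class and property names are made up):

import UIKit
import AVFoundation

final class FeedViewController: UITableViewController {
    // Players created ahead of time, keyed by row.
    private var preloadedPlayers: [IndexPath: AVPlayer] = [:]
    private var videoURLs: [URL] = []   // filled from your backend

    override func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
        // Once scrolling stops, quietly warm up the buffers of the visible rows.
        for indexPath in tableView.indexPathsForVisibleRows ?? [] where preloadedPlayers[indexPath] == nil {
            let item = AVPlayerItem(url: videoURLs[indexPath.row])
            item.preferredPeakBitRate = 500_000   // a low rendition is enough for warming up
            let player = AVPlayer(playerItem: item)
            player.isMuted = true
            preloadedPlayers[indexPath] = player
        }
    }
}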
Edit the video in the background before upload so it is only at the maximum resolution you expect it to be played at. No iOS device has a 4K screen resolution, and probably never will, so cut down the amount of data.
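A hedged sketch of that pre-upload downscale using AVAssetExportSession (the preset and output type are assumptions; pick the preset matching your target resolution):

import AVFoundation

// Re-encode a source video at a capped resolution before uploading it.
func exportForUpload(_ sourceURL: URL, to outputURL: URL,
                     completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: sourceURL)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPreset1280x720) else {
        completion(false)
        return
    }
    export.outputURL = outputURL
    export.outputFileType = .mp4
    export.shouldOptimizeForNetworkUse = true   // moves the moov atom up front so playback can start sooner
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}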
Without more specifics, this is all I've got for now. Hope I understood your question correctly. Good luck!
I have an iOS app that downloads videos from Firebase (Cloud Firestore) and shows them in a feed similar to Instagram/TikTok. However, I can't get the videos to be readily available before a user scrolls to them. Any tips would be super helpful. How does TikTok do it? Do they save a whole bunch of videos to file in the background on load?
My current VideoDownloadManager:
Checks if the video is already loaded in the temp cache or a local file, and if not:
1. Downloads the video URL (via Firebase's downloadURL)
2. Returns the video URL for immediate use (before playing it still has some delay for buffering)
3. Stores the video URL in a temp cache (in case the user scrolls away and scrolls back)
4. Begins writing the video to file and removes it from the temp cache on completion
With the current setup, videos play efficiently after they are downloaded. But if a video is not downloaded yet and the user scrolls to it, steps #1 and #2 above take too long to complete and buffer enough to play. I am already using OperationQueues and prioritizing the current video over any other background videos, but this still isn't fast enough.
TikTok videos are almost always readily available as the user scrolls. What's the secret?
Thanks for your help!
I have a few tips for you:
1- When you are loading your feed, start preheating the video URLs on a background thread.
2- Try not to download complete files; just cache or buffer a small amount of the file, like 1 MB.
3- With .mp4 files you can play videos even before they are completely downloaded.
4- Start the full download once the video starts playing, based on the buffer rate or video length.
5- Try to use videos with the smallest file size and bitrate. When you create them, convert them to a convenient format. My suggestion would be:
Video:
video bit rate -> 12.5
video size -> 960x540
conversion format -> h264
Sound:
rate -> 44100
encoding bit rate -> 96000
6- Check whether the video has a buffered range of more than 25% before you start playback (see the sketch after this answer).
7- Don't forget to do the downloads in a temp folder and clean that folder regularly. This helps avoid a huge app size; the consequence of not doing that may be users deleting your app!
For iOS developers: this is my videoConverter.
Also, you can use this caching video player: GSPlayer.
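A small sketch of the buffered-range check from tip 6 (the 25% threshold is the tip's suggestion; the helper name is made up):

import AVFoundation

// Fraction of the item's duration that is already buffered.
func bufferedFraction(of item: AVPlayerItem) -> Double {
    let duration = item.duration.seconds
    guard duration.isFinite, duration > 0 else { return 0 }
    let buffered = item.loadedTimeRanges
        .map { $0.timeRangeValue }
        .reduce(0.0) { $0 + $1.duration.seconds }
    return buffered / duration
}

// Usage:
// if bufferedFraction(of: playerItem) > 0.25 { player.play() }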
I am using AVPlayer to play a video from an https URL, with a setup like this:
player = AVPlayer(url: URL(string: urlString)!)
player?.automaticallyWaitsToMinimizeStalling = false
But since the video is a little long, there is a short blank-screen delay before the video actually starts to play. I think this is because it is being loaded over https.
Is there any way to remove that delay by making AVPlayer start playing the video right away, without loading the whole thing?
I added .automaticallyWaitsToMinimizeStalling but that does not seem to make a difference.
If anyone has any other suggestions please let me know.
I don't think this has anything to do with loading over https. What is your video file format? I think what you are seeing is adaptive bitrate streaming behavior.
https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Apple_HTTP_Live_Streaming
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and video-on-demand content. It works by breaking down streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple. The segmenter is also responsible for producing a set of index files in the M3U8 format which acts as a playlist file for the video chunks. Each playlist pertains to a given bitrate level, and contains the relative or absolute URLs to the chunks with the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
For more information about HTTP Live Streaming
https://developer.apple.com/documentation/http_live_streaming
This tutorial includes some experiments with an HTTP Live Streaming version and a non-HTTP Live Streaming version.
https://www.raywenderlich.com/5191-video-streaming-tutorial-for-ios-getting-started
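For context, the client side of this on iOS is mostly free: handing AVPlayer a master .m3u8 URL is enough, and it does the rendition switching itself (placeholder URL below):

import AVFoundation

// AVPlayer picks a rendition from the master playlist based on measured
// bandwidth and switches as conditions change; no manual bitrate logic needed.
let hlsURL = URL(string: "https://example.com/master.m3u8")!   // placeholder
let player = AVPlayer(url: hlsURL)
player.play()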
Have you tried using AVPlayerItem's preferredForwardBufferDuration? You can manage how long AVPlayer continues to buffer using this property.
player.currentItem?.preferredForwardBufferDuration = 1
From Apple's own documentation:
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption.
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
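Tying that property back to the question's setup, a minimal sketch (the URL and the 1-second value are placeholders to tune against your own stall rate):

import AVFoundation

let item = AVPlayerItem(url: URL(string: "https://example.com/video.mp4")!)   // placeholder URL
item.preferredForwardBufferDuration = 1   // only buffer about 1 second ahead before starting

let player = AVPlayer(playerItem: item)
player.automaticallyWaitsToMinimizeStalling = false   // start as soon as that small buffer exists
player.play()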
Suggestion:
Since the video is streaming, we also depend on the network connection, and on a poor connection there is always a chance of showing a blank screen.
What we can do is fetch a thumbnail of the streaming video from the server on the previous screen, or generate a thumbnail on the app side from the streaming URL. When the streaming screen opens, show the thumbnail to the user along with a loading indicator, and hide the thumbnail once the video starts streaming.
In that particular case you can place a UIImageView above the AVPlayerLayer view.
That image works as a cover image for your video, matching its first frame, with a UIActivityIndicator as a subview.
Then hide that image when the video is about to play.
This helps hide the black frame of your video, since some initial buffering of the video is unavoidable.
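A rough sketch of that cover-image approach (the view and property names are made up, and it assumes an AVPlayerLayer is already in the view hierarchy):

import UIKit
import AVFoundation

final class CoverImagePlayerView: UIView {
    private let coverImageView = UIImageView()
    private let spinner = UIActivityIndicatorView(style: .medium)
    private var statusObservation: NSKeyValueObservation?

    // Show the cover image and spinner, then hide them once frames are rendering.
    func play(_ player: AVPlayer, cover: UIImage?) {
        coverImageView.image = cover
        coverImageView.frame = bounds
        addSubview(coverImageView)              // sits above the AVPlayerLayer
        spinner.center = CGPoint(x: bounds.midX, y: bounds.midY)
        coverImageView.addSubview(spinner)
        spinner.startAnimating()

        statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
            guard player.timeControlStatus == .playing else { return }
            DispatchQueue.main.async {
                self?.spinner.stopAnimating()
                self?.coverImageView.isHidden = true
            }
        }
        player.play()
    }
}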
I want to import a slo-mo video picked from the iPhone's gallery into my app and play it at normal speed at 240 FPS. Currently it plays at slow speed at 30 FPS.
There is a manual workaround as given here
But I want to do it programmatically after picking the video from the gallery.
Also, is there any way to detect whether it is a slo-mo video or a normal video? This question was already asked here, but there is no answer to it.
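For the detection part, a hedged sketch using the Photos framework; the playback helper assumes the slow-motion effect is an edit Photos applies on top of the original high-frame-rate footage, so requesting the original version returns video that plays at normal speed:

import Photos
import AVFoundation

// A PHAsset picked from the library flags high-frame-rate (slo-mo) capture.
func isSloMo(_ asset: PHAsset) -> Bool {
    asset.mediaSubtypes.contains(.videoHighFrameRate)
}

// Requesting the .original version skips the slo-mo edit, so the returned
// AVAsset is the plain high-frame-rate footage.
func requestOriginalVideo(for asset: PHAsset,
                          completion: @escaping (AVAsset?) -> Void) {
    let options = PHVideoRequestOptions()
    options.version = .original
    PHImageManager.default().requestAVAsset(forVideo: asset, options: options) { avAsset, _, _ in
        completion(avAsset)
    }
}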
I'm new to Smooth Streaming. I am able to use Microsoft Smooth Streaming player to play live video that spans across multiple segments. The manifest seems to have information about all segments.
But for playback from the archive, I can point the URL in the HTML to the ISM in one of the segments and play back that particular segment fine; I just don't know how to play back the entire video with the ability to rewind, fast-forward, etc.
Is it possible to do that across multiple segments?
For playing back across multiple segments, I ended up adding the archived segments to the player's playlist and used player.GoToPlaylistItem() and player.SeekToPosition() to fast-forward, rewind, and play across multiple segments in an archive.