Fast video stream start - iOS

I am building an app that streams video content, something like TikTok. You can swipe through videos in a table, and when a new cell becomes visible its video starts playing. It works great until you compare it to TikTok, Instagram, etc. My video starts streaming fairly fast, but not always; it is very sensitive to network quality, and sometimes even when the network is great it still buffers too long. TikTok, Instagram, and the like don't seem to have that problem under the same conditions. I am using JWPlayer as the video hosting service and AVPlayer as the player. I also do an async preload of assets before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading to the streaming service? (I stream .m3u8 files.) Is there a set of presets that enables optimal streaming quality and start speed? Thanks in advance.

So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so use it as much as possible for iOS devices.
The best practice for mobile streaming is offering multiple resolutions. The trick is to start with the lowest resolution available to get the video playing, then switch to a higher resolution once the connection is determined to be capable of it. Generally this happens quickly enough that the user doesn't really notice; YouTube is the best example of this tactic. HLS does this automatically, and the .m3u8 file is simply the HLS playlist format, so it already applies to your streams.
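A hedged sketch of that low-to-high start with AVPlayer: capping `preferredPeakBitRate` steers HLS toward a low variant so the first frames arrive fast, and lifting the cap lets it climb back up. The URL, the 500 kbps cap, and the 3-second delay below are placeholder assumptions, not values from the question.

```swift
import AVFoundation

// Start on a cheap variant so playback begins quickly.
let item = AVPlayerItem(url: URL(string: "https://example.com/stream/master.m3u8")!) // placeholder URL
item.preferredPeakBitRate = 500_000   // bits per second; limits which HLS variant is chosen

let player = AVPlayer(playerItem: item)
player.play()

// Once playback is established, remove the cap (0 = no limit) and let
// HLS adapt upward on its own.
DispatchQueue.main.asyncAfter(deadline: .now() + 3) {
    item.preferredPeakBitRate = 0
}
```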
Assuming you are using a UICollectionView or UITableView, try starting low-resolution streams of every video on screen in the background each time scrolling stops. Not only does this let you do some cool preview features off the buffer, but by the time the user taps a video the connection is already established. If that's too slow, try just the middle video.
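A minimal sketch of that warm-up, assuming a hypothetical `urlsForVisibleCells()` helper; attaching an item to a paused AVPlayer is generally enough to open the connection and begin buffering:

```swift
import AVFoundation
import UIKit

// Paused players keyed by URL; creating them starts connection setup
// and buffering without rendering anything.
var warmPlayers: [URL: AVPlayer] = [:]

func urlsForVisibleCells() -> [URL] { [] }   // hypothetical helper: stream URLs for on-screen cells

func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
    for url in urlsForVisibleCells() where warmPlayers[url] == nil {
        let item = AVPlayerItem(url: url)
        item.preferredForwardBufferDuration = 5     // buffer a few seconds ahead
        item.preferredPeakBitRate = 500_000         // keep the warm-up cheap (assumed cap)
        warmPlayers[url] = AVPlayer(playerItem: item)  // stays paused; just buffers
    }
}
```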
Edit the video in the background before upload so it is capped at the maximum resolution you expect it to be played at. There is no 4K screen on any iOS device, and probably never will be, so cut down the amount of data.
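One way to do that downscale on-device is AVAssetExportSession; the 720p preset below is an example cap, not a recommendation from the answer.

```swift
import AVFoundation

// Re-encode the source at a capped resolution before upload.
func downscaleForUpload(source: URL, destination: URL,
                        completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: source)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPreset1280x720) else {
        completion(false)
        return
    }
    export.outputURL = destination
    export.outputFileType = .mp4
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```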
Without more specifics, this is all I've got for now. Hope I understood your question correctly. Good luck!

Related

Video quality selector on iOS (React Native)

We are trying to create a video quality selector in a mobile app that works the same way as YouTube's: the user selects a quality, the video plays at that quality, and it does not automatically switch. From the backend we have m3u8 video files that contain all the different resolutions the video can play in, along with their URLs. On web and Android we found we can read the contents of these m3u8 files to display the available resolutions and select the corresponding track, e.g. select 720p in the menu and then play the 720p video track.
On iOS, however, we are using react-native-video, which is a wrapper around AVPlayer on the native side, and from my research AVPlayer does not seem to provide access to the individual tracks in the m3u8 file the way Android does. selectedVideoTrack is exposed on react-native-video for iOS but does not work, likely for that reason.
Adjusting the maximum bit rate is provided in the react-native-video API, but to select a quality we would need to set the exact bit rate, or at least a minimum bit rate, and neither of these is exposed by AVPlayer, and thus not by react-native-video either.
One solution I have tried was to create a separate m3u8 file for each resolution the video supports, containing only that option, so the player cannot automatically degrade or improve the resolution (as is the nature of HLS) once a specific quality is selected. Then, when a different quality is selected, I change the underlying URL source of the video and seek to the previous position so it continues where it left off instead of replaying from the beginning. This solution works, but it requires a large change to our current backend infrastructure and seems to cause a noticeable wait while the new URL loads. So I would like to know whether there are better solutions before we are forced to go forward with this one.
So my question is: how are YouTube and other iOS apps with a quality selector doing this? Are they following my method, or am I missing something that lets them work with different resolution videos? Answers can be native code as well as JavaScript/React Native solutions.
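For reference, a minimal native sketch of the URL-swap-and-seek approach described in the question; the per-quality playlist URL is assumed to come from your backend.

```swift
import AVFoundation

// Swap the player's source to a single-variant playlist while
// preserving the current playback position.
func switchQuality(player: AVPlayer, to url: URL) {
    let position = player.currentTime()
    let wasPlaying = player.rate > 0

    // Replace the item with one built from the single-quality playlist.
    player.replaceCurrentItem(with: AVPlayerItem(url: url))

    // Seek back to the saved position, then resume if we were playing.
    player.seek(to: position, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
        if wasPlaying { player.play() }
    }
}
```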

How to get Wi-Fi or 3G internet speed programmatically in iOS

I need a custom video player that picks a stream based on internet speed. I have four video URLs: HD, high, medium, and low quality. What I am doing is playing the high-resolution video when the internet speed is above a certain limit, and I want to choose the stream based on the current Wi-Fi or 3G speed. The problem is that I am not able to measure the internet speed, and I have searched a lot of sites for this. One more point: while playing, I have to check the internet speed every 10 seconds.
Assuming you are using AVFoundation for video playback, an easier solution than serving four separate video files is to convert the video for HTTP Live Streaming, which lets the player select the most appropriate bitrate stream on its own.
https://developer.apple.com/streaming/
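If you still want a speed number every 10 seconds, one option beyond the answer above is to read the throughput AVPlayer itself has observed from the item's access log; the URL below is a placeholder.

```swift
import AVFoundation

// Poll the HLS access log instead of measuring the connection yourself.
let player = AVPlayer(url: URL(string: "https://example.com/video/master.m3u8")!) // placeholder URL
player.play()

Timer.scheduledTimer(withTimeInterval: 10, repeats: true) { _ in
    guard let event = player.currentItem?.accessLog()?.events.last else { return }
    // observedBitrate: measured download throughput in bits per second.
    // indicatedBitrate: bitrate of the variant currently playing.
    print("observed: \(event.observedBitrate) bps, playing variant: \(event.indicatedBitrate) bps")
}
```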

Removing low frequency (hiss) noise from video in iOS

I am recording videos and playing them back using AVFoundation. Everything is perfect except for a hiss that runs through the whole video. You can hear it in every video captured on any iPad; even videos captured with Apple's built-in camera app have it.
To hear it clearly, record a video somewhere as quiet as possible without speaking. It is easy to detect through headphones with the volume at maximum.
After researching, I found that this hiss is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during post-processing of the audio. Low-frequency noise can be removed with a high-pass filter and noise gates. There are applications such as Adobe Audition that can perform this operation. This video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing that does this directly. So I want to know if there is any library, API, or open-source project that can perform this operation. If not, how can I start heading in the right direction? It does look like a complex task.
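Not a full answer, but as a starting point: AVAudioEngine's built-in EQ can apply a high-pass band to the extracted audio. The 80 Hz cutoff and the file path below are assumptions, not values from the question.

```swift
import AVFoundation

// Play a recorded audio file through a high-pass filter to attenuate
// low-frequency noise. This is a playback-time sketch; the same graph
// can be rendered offline for true post-processing.
let engine = AVAudioEngine()
let playerNode = AVAudioPlayerNode()
let eq = AVAudioUnitEQ(numberOfBands: 1)

let band = eq.bands[0]
band.filterType = .highPass   // cut the low end where the noise lives
band.frequency = 80           // cutoff in Hz (assumption; tune by ear)
band.bypass = false

engine.attach(playerNode)
engine.attach(eq)

// Hypothetical file URL standing in for the extracted audio track.
let file = try! AVAudioFile(forReading: URL(fileURLWithPath: "/path/to/audio.caf"))
engine.connect(playerNode, to: eq, format: file.processingFormat)
engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

try! engine.start()           // forced try for sketch brevity
playerNode.scheduleFile(file, at: nil, completionHandler: nil)
playerNode.play()
```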

YouTube API and Samsung Smart TV app resolutions

I'm working on a Smart TV app for Samsung that should use the YouTube API to play videos. Embedded videos work only when the app resolution and the YT player size are 960x540 or below;
if I set a higher resolution (1280x720 or 1920x1080), the player gets stuck, behaves really slowly, and videos buffer infinitely.
Has anyone succeeded in embedding YT videos with a higher-resolution player?
Thanks in advance.
The video player works at Full HD resolution in fullscreen regardless of the widget resolution.
If you have trouble with buffering, check your connection speed. Try playing a file from the local network to confirm that the selected resolution and codecs are handled well by the TV.
I recently ran into this case. The YouTube app works great at 720p resolution if the video is shorter than 10 minutes, but for anything longer, say 30 minutes, the player gets stuck just as you said.
When changing the app resolution to 540p, the YouTube player works great again for all videos. I suppose YouTube uses progressive download in its player, and the Smart TV's own storage is not enough to hold the space a long video needs at 720p.
The conclusion is that when using the Flash player/YouTube in apps, it is best to use a 540p app resolution.
Thanks all for answering.
In the end I used a different approach, which turned out to be the best solution.
I kept the 720p resolution and used the YouTube cue-video functionality.
Basically I cued the video, and on the "videoCued" event I called the "playVideo" method.
This allowed the player to get ready and initialize before playing the video.

iOS Video Creation Size

I am working on an app that runs some processing on the iOS AVFoundation video stream and then generates a video from the processed output.
I'm noticing that if I make the output frames of the video too large, the time to render each frame grows too long and my app gets choppy.
Does anyone have a good suggestion for determining at run time the largest video size I can create without drastically affecting the frame rate? That way, if the app is running on an iPhone 5, it can create higher-resolution videos than if it's running on an iPhone 4.
One thought I had: before recording starts, render a few frames at different resolutions behind the scenes, time how long each render takes, and use the largest size whose render takes less than X. But if there's a better way, I'd love to hear it.
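A rough sketch of that timing idea; the fill operation is a placeholder for the app's real per-frame processing, and the candidate sizes and frame budget are assumptions.

```swift
import UIKit
import QuartzCore

// Render a test frame at each candidate size off-screen, time it, and
// keep the largest size that fits in the per-frame budget.
func largestAffordableSize(candidates: [CGSize], budget: CFTimeInterval) -> CGSize? {
    for size in candidates.sorted(by: { $0.width > $1.width }) {
        let start = CACurrentMediaTime()
        UIGraphicsBeginImageContextWithOptions(size, true, 1)
        // Placeholder workload; substitute the real frame-processing code.
        UIColor.black.setFill()
        UIRectFill(CGRect(origin: .zero, size: size))
        UIGraphicsEndImageContext()
        if CACurrentMediaTime() - start < budget {
            return size   // largest candidate that rendered within budget
        }
    }
    return nil
}

// e.g. a ~1/30 s budget per frame for 30 fps output
let size = largestAffordableSize(
    candidates: [CGSize(width: 1920, height: 1080),
                 CGSize(width: 1280, height: 720),
                 CGSize(width: 640, height: 360)],
    budget: 1.0 / 30.0)
```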
Another option would be to experiment offline with what gives good performance on different devices and hard-code the video resolution per device type, but I'd rather avoid that.
Thanks in advance!
