Video quality selector on iOS (React Native)

We are trying to create a video quality selector in a mobile app that works like YouTube's: the user selects a quality, the video plays at that quality, and it does not automatically switch. From the backend we receive m3u8 (HLS) files that list all the resolutions the video can play at, along with their URLs. On web and Android we have found that we can read the contents of these m3u8 files to display the available resolutions and select the corresponding track, e.g. select 720p in the menu and then play the 720p video track.
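For context, listing those options just means reading the #EXT-X-STREAM-INF entries of the master playlist. A rough, hedged sketch of that parsing; the attribute scan is naive and would need hardening for real playlists (quoted CODECS values contain commas, and variant URIs may be relative):

```swift
import Foundation

// Naive master-playlist parse: pair each #EXT-X-STREAM-INF attribute
// line with the variant URI on the following line.
struct Variant {
    let bandwidth: Int
    let resolution: String?
    let uri: String // may be relative to the master playlist URL
}

func parseMasterPlaylist(_ text: String) -> [Variant] {
    var variants: [Variant] = []
    let lines = text.components(separatedBy: .newlines)
    for (i, line) in lines.enumerated() where line.hasPrefix("#EXT-X-STREAM-INF:") {
        guard i + 1 < lines.count else { continue }
        let attrs = line.dropFirst("#EXT-X-STREAM-INF:".count)
        // Naive comma split; a proper parser must respect quoted values.
        let fields = attrs.components(separatedBy: ",")
        let bandwidth = fields
            .first { $0.hasPrefix("BANDWIDTH=") }
            .flatMap { Int($0.dropFirst("BANDWIDTH=".count)) } ?? 0
        let resolution = fields
            .first { $0.hasPrefix("RESOLUTION=") }
            .map { String($0.dropFirst("RESOLUTION=".count)) }
        variants.append(Variant(bandwidth: bandwidth,
                                resolution: resolution,
                                uri: lines[i + 1]))
    }
    return variants
}
```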
On iOS, however, we are using react-native-video, which is a wrapper around AVPlayer on the native side, and from my research AVPlayer does not provide a way to select individual tracks from the m3u8 file the way Android's player does. selectedVideoTrack is exposed on react-native-video for iOS but does not work, most likely for this reason.
Adjusting the maximum bit rate is provided in the react-native-video API, but to select a quality we would need to set the exact bit rate, or at least a minimum bit rate, and neither is exposed by AVPlayer, so neither is available in react-native-video either.
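For reference, the cap that is exposed corresponds to AVPlayerItem.preferredPeakBitRate (surfaced as the maxBitRate prop in react-native-video). A minimal native sketch, with an illustrative bitrate value:

```swift
import AVFoundation

// Sketch: cap AVPlayer at (roughly) a chosen variant's bandwidth.
// preferredPeakBitRate is an upper bound only: AVPlayer may still
// switch to a lower rendition, so a cap alone cannot pin quality.
func makeCappedPlayer(masterPlaylist: URL) -> AVPlayer {
    let item = AVPlayerItem(url: masterPlaylist)
    item.preferredPeakBitRate = 1_400_000 // e.g. the BANDWIDTH of the 720p variant
    return AVPlayer(playerItem: item)
}
```

Setting the cap just above the target variant's BANDWIDTH prevents upward switching, but nothing stops AVPlayer from switching downward under poor network conditions.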
One solution I have tried is to create a separate m3u8 file for each resolution the video supports, each containing only that one variant, so that HLS cannot automatically degrade or improve the resolution once a specific quality is selected. When a different quality option is selected, I change the underlying URL source of the video and seek to the previous position, so playback resumes where it left off instead of restarting from the beginning. This solution works, but it requires a large change to our current backend infrastructure and introduces a noticeable wait while the new URL loads. So I would like to know whether there are better solutions before we are forced to go forward with this one.
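For concreteness, a minimal native sketch of that swap-and-seek approach, assuming the backend provides one single-variant playlist URL per resolution:

```swift
import AVFoundation

// Swap in a single-variant playlist and resume from the previous
// position. Because the new playlist contains only one rendition,
// HLS has nothing to adapt to.
func switchQuality(of player: AVPlayer, to variantURL: URL) {
    let resumeTime = player.currentTime()
    player.replaceCurrentItem(with: AVPlayerItem(url: variantURL))
    // Zero tolerance avoids snapping to the nearest keyframe,
    // at the cost of a slightly slower seek.
    player.seek(to: resumeTime, toleranceBefore: .zero, toleranceAfter: .zero) { _ in
        player.play()
    }
}
```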
So my question is: how are YouTube and other iOS apps with a quality selector doing this? Are they following my method, or am I missing something that allows them to work with different resolution videos? Answers can be in native code as well as JavaScript/React Native solutions.

Related

Fast video stream start

I am building an app that streams video content, something like TikTok: you can swipe through videos in a table, and when a new cell becomes visible its video starts playing. It works well until you compare it to TikTok, Instagram, etc. My video usually starts streaming quickly, but not always; it is very sensitive to network quality, and sometimes it buffers too long even when the network is good. Under the same conditions, TikTok and Instagram don't seem to have that problem. I am using JWPlayer as the video hosting service and AVPlayer as the player, and I already preload assets asynchronously before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do any special video preparation before uploading to the streaming service? (I also stream m3u8 files.) Is there a set of presets that enables optimal streaming quality and start speed? Thanks in advance.
There are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so use it as much as possible for iOS devices.
Best practice for mobile streaming is to offer multiple resolutions. The trick is to start with the lowest resolution available to get the video started, then switch to a higher resolution once the connection is determined to be capable of it. Generally this happens quickly enough that the user doesn't really notice; YouTube is the best example of this tactic. HLS does this automatically (an m3u8 playlist is the HLS manifest format).
Assuming you are using a UICollectionView or UITableView, try starting low-resolution streams of every video on the screen in the background each time scrolling stops. Not only does this let you do some nice preview features based on the buffer, but when the user taps a video the stream is already established. If that is too slow, try preloading just the middle video.
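A hedged sketch of that idea with AVPlayer, combining the low-start advice above with preloading; the URLs, bitrate cap, and buffer duration are illustrative:

```swift
import AVFoundation

// Pre-buffer the on-screen streams at a low bitrate cap once
// scrolling stops, so the stream is already established when
// the user taps a cell. `visibleURLs` is a hypothetical list
// of the currently visible cells' stream URLs.
func preloadPlayers(for visibleURLs: [URL]) -> [AVPlayer] {
    visibleURLs.map { url in
        let item = AVPlayerItem(url: url)
        item.preferredPeakBitRate = 500_000          // start on a low rendition
        item.preferredForwardBufferDuration = 2      // seconds; keep memory low
        let player = AVPlayer(playerItem: item)
        player.automaticallyWaitsToMinimizeStalling = false // start ASAP
        return player
    }
}
```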
Edit the video in the background before upload so it is no larger than the maximum resolution you expect it to be played at. No iOS device has a 4K screen, so cut down the amount of data.
Without more specifics, this is all I have for now. I hope I understood your question correctly. Good luck!

The YouTube iframe API serves streams at resolutions that will not play back over cellular connections

I hope that the YouTube API team will address the following issue.
YouTube has disabled the ability to request a specific size using the setPlaybackQuality() method.
If I am correct, the YouTube iframe API automatically determines the appropriate resolution / size to serve up (small, medium, large, hd720 etc) depending upon the pixel dimensions of the embedded player.
This is a huge problem over cellular networks.
AT&T, Verizon, T-Mobile, and others have all begun to throttle video streams, and in some cases disable playback altogether, for streams above 480p.
In our case, we are seeing 1.5 - 2 minutes of buffering before playback in the embedded YouTube player at widths above 360px.
In portrait mode this limit would be at least somewhat acceptable, but in full-screen landscape on mobile, the preferred way to watch video, YouTube changes the quality automatically and in most cases serves HD 720p, which almost immediately gets stuck buffering over cellular connections.
We need the ability to request a specific resolution, and/or we need YouTube to serve up video at 480p over cellular connections.
The suggestedQuality parameter of player.setPlaybackQuality(suggestedQuality:String):Void is only a suggestion: the playback quality YouTube chooses depends not only on the pixel dimensions of the embedded player but also varies across users, videos, systems, and other playback conditions.
Setting the parameter value to default instructs YouTube to select the most appropriate playback quality. This effectively reverts the quality level to the default state and nullifies any previous attempts to set playback quality via the cueVideoById, loadVideoById, or setPlaybackQuality functions.
I assume this also takes mobile connections into consideration, but if you believe there is an issue with this API feature, you can contact YouTube here.

Removing low frequency (hiss) noise from video in iOS

I am recording videos and playing them back using AVFoundation. Everything works except for a hiss that is present throughout the video. You can hear this hiss in every video captured from any iPad; even videos captured with Apple's built-in Camera app have it.
To hear it clearly, record a video in as quiet a place as possible without speaking. It is very easy to detect through headphones with the volume at maximum.
After researching, I found out that this hiss is produced by the device's preamplifier and cannot be avoided while recording.
The only possible solution is to remove it during audio post-processing. Hiss like this can be attenuated with filtering (e.g. a low-pass filter to roll off the high frequencies where it sits) combined with a noise gate. Applications such as Adobe Audition can perform this operation; this video shows how it is achieved using Adobe Audition.
I have searched the Apple docs and found nothing that can achieve this directly. So I want to know whether there is any library, API, or open-source project that can perform this operation. If not, how can I start heading in the right direction? It does look like a complex task.
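One direction worth exploring is AVAudioEngine, whose AVAudioUnitEQ can apply this kind of filtering at playback time. A minimal sketch, assuming a local recording; the 8 kHz cutoff is illustrative, not a tuned value, and a real pipeline would also add a noise gate and an offline render to write the result back out:

```swift
import AVFoundation

// Roll off the high frequencies where hiss lives with a one-band
// low-pass EQ, then play the filtered audio.
func playFiltered(fileURL: URL) throws {
    let engine = AVAudioEngine()
    let player = AVAudioPlayerNode()
    let eq = AVAudioUnitEQ(numberOfBands: 1)

    eq.bands[0].filterType = .lowPass
    eq.bands[0].frequency = 8_000 // Hz; illustrative cutoff
    eq.bands[0].bypass = false

    engine.attach(player)
    engine.attach(eq)

    let file = try AVAudioFile(forReading: fileURL)
    engine.connect(player, to: eq, format: file.processingFormat)   // player -> EQ
    engine.connect(eq, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.start()
    player.scheduleFile(file, at: nil)
    player.play()
}
```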

How to control the number of buffered frames in VLCKit?

I have just built the VLC library for iOS (VLCKit) and am using it to display a video stream. I need it to display in real time with the lowest possible latency, so I tried to find a way to reduce the number of buffered frames (or something similar) before they are displayed in a UIView.
I started looking in the MobileVLCKit module, but no property seems to allow me to control that.
I am wondering whether the change can be accomplished in MobileVLCKit itself or in the underlying VLC library. If so, will I need to modify the library and rebuild it? Which parameter do I need to change?
After spending a lot of time looking into the VLC library without success, I tried streaming over the RTSP protocol instead of RTMP, and the real-time behavior of the video improved.
I also found a workaround: setting a timer that forces the player to skip forward through the buffered frames. This may cause judder but keeps the video closer to real time.
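For what it's worth, VLC exposes its cache length as a media option, so modifying and rebuilding the library may not be necessary. A hedged sketch; the 300 ms value, the stream URL, and the function name are illustrative, and too small a cache causes stuttering:

```swift
import MobileVLCKit
import UIKit

// Sketch: shrink VLC's network cache to trade smoothness for latency.
func makeLowLatencyPlayer(streamURL: URL, in videoView: UIView) -> VLCMediaPlayer {
    let player = VLCMediaPlayer()
    let media = VLCMedia(url: streamURL)
    media.addOption(":network-caching=300") // milliseconds of buffering
    player.media = media
    player.drawable = videoView // the UIView to render into
    player.play()
    return player
}
```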

With HTML5 audio limitations on iOS, is it possible to play background music and sound effects at the same time?

I've been reading about the limitations of HTML5 audio on iOS.
Currently, all devices running iOS are limited to playback of a single audio or video stream at any time. Playing more than one video—side by side, partly overlapping, or completely overlaid—is not currently supported on iOS devices. Playing multiple simultaneous audio streams is also not supported. You can change the audio or video source dynamically, however. See “Replacing a Media Source Sequentially” for details.
Apparently I can only play one file at a time. A common technique is to combine all the sounds you need into one file and seek to the part you want to play. This is called an audio sprite.
But here's what's not clear to me: if I use an audio sprite, can I overlap it with itself? For example, can I play the sound of a bullet while background music is playing? Can the sounds of two bullets fire simultaneously?
Recent versions of Mobile Safari support the Web Audio API (http://caniuse.com/audio-api), which allows simultaneous playback.
Check this demo on an iOS device: https://webaudiodemos.appspot.com/TouchPad/index.html
Shameless plug for a simple wrapper: https://github.com/endemic/sona
