Playing part of a video using SMIL on iPad

I have an IIS server with Smooth Streaming Media Services installed and a video file; for example, its length is 100 seconds.
Does SMIL markup have any parameter or mechanism for the following case:
I set a SMIL file as the source of an HTML5 video tag.
In the SMIL file I set attributes such as offset and duration.
The video plays from second 20 to second 50, and the HTML5 video controls show the length as 30 seconds.
It should work on iPad.
Thanks

Based on the answer found here: Does iPhone support SMIL in incoming MMS?, I'm assuming that Apple doesn't really care about or implement SMIL as a standard, whether in MMS or in the browser.

Related

Video quality selector on iOS (React Native)

We are trying to create a video quality selector in a mobile app that works the same way as YouTube's: the quality is selected, and the video then plays in that quality and does not automatically switch. From the backend we have m3u8 video files that contain all the different resolutions the video can play in and their URLs. On web and Android, we have found that we can read the contents of these m3u8 files to display the available resolutions and pick the corresponding track for the resolution selected, e.g. select 720p in the menu and then play the 720p video track.
On iOS, however, we are using react-native-video, which is a wrapper around AVPlayer on the native side and, from my research, does not seem to provide access to select individual tracks from the m3u8 file as is possible on Android. selectedVideoTrack is exposed on react-native-video for iOS but does not work, likely for that reason.
Adjusting the maximum bit rate is provided in the react-native-video API, but in order to select a quality we would need to set an exact bit rate, or at least a minimum bit rate, and neither of these is exposed by AVPlayer and thus not by react-native-video either.
One solution I have tried is to create a separate m3u8 file for each resolution the video supports, with only that option in the playlist, so it cannot automatically degrade or improve the resolution, as HLS otherwise does once a specific quality is selected. Then, when a different quality option is selected, I change the underlying URL source of the video and seek to the previous position so playback resumes where it left off instead of restarting from the beginning. This solution works, but it requires a large change to our current backend infrastructure and seems to cause a noticeable wait while the new URL is loaded. So I would like to know if there are any better solutions before we are forced to go forward with this one.
So my question is: how are YouTube and other iOS apps with a quality selector doing this? Are they following my method, or am I missing something that allows them to work with different resolution videos? Answers can be in native code as well as JavaScript/React Native solutions.
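For reference, here is a minimal native sketch of the two points discussed in this question, assuming a plain AVPlayer (react-native-video wraps the same object): capping the maximum bit rate with AVPlayerItem.preferredPeakBitRate, and the playlist-swap-and-seek approach. The class name and the per-resolution URL are made up for illustration.

import AVFoundation

final class QualitySelectablePlayer {
    let player = AVPlayer()

    // Capping quality: AVPlayerItem only exposes an upper bound on bit rate,
    // not an exact or minimum value, which is why a hard "lock to 720p"
    // cannot be expressed through this property alone.
    func capBitRate(_ bitsPerSecond: Double) {
        player.currentItem?.preferredPeakBitRate = bitsPerSecond
    }

    // Playlist-swap approach: load a per-resolution m3u8, then seek back to
    // where playback was so the video does not restart from the beginning.
    func switchVariant(to url: URL) {
        let resumeTime = player.currentTime()
        let newItem = AVPlayerItem(url: url)
        player.replaceCurrentItem(with: newItem)
        player.seek(to: resumeTime, toleranceBefore: .zero, toleranceAfter: .zero) { [weak self] _ in
            self?.player.play()
        }
    }
}

// Usage (the URL is an assumption, not a real endpoint):
// let qp = QualitySelectablePlayer()
// qp.capBitRate(1_500_000) // roughly caps which variant the player may pick
// qp.switchVariant(to: URL(string: "https://example.com/video_720p.m3u8")!)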

Streaming video from https with AVPlayer causes initial delay

I am using AVPlayer to play a video from an https URL with a setup like this:
player = AVPlayer(url: URL(string: urlString)!)
player?.automaticallyWaitsToMinimizeStalling = false
But since the video is a little long, there is a short blank screen delay before the video actually starts to play. I think this is because it is being loaded from https.
Is there any way to remove that delay by making AVPlayer start playing right away without loading the whole thing?
I added .automaticallyWaitsToMinimizeStalling but that does not seem to make a difference.
If anyone has any other suggestions please let me know.
I don't think this has anything to do with loading from https. What is your video file format? I think you are thinking of adaptive bitrate streaming behavior.
https://en.wikipedia.org/wiki/Adaptive_bitrate_streaming#Apple_HTTP_Live_Streaming
HTTP Live Streaming (HLS) is an HTTP-based media streaming communications protocol implemented by Apple Inc. as part of QuickTime X and iOS. HLS supports both live and Video on demand content. It works by breaking down streams or video assets into several small MPEG2-TS files (video chunks) of varying bit rates and set duration using a stream or file segmenter. One such segmenter implementation is provided by Apple. The segmenter is also responsible for producing a set of index files in the M3U8 format which acts as a playlist file for the video chunks. Each playlist pertains to a given bitrate level, and contains the relative or absolute URLs to the chunks with the relevant bitrate. The client is then responsible for requesting the appropriate playlist depending on the available bandwidth.
For more information about HTTP Live Streaming, see:
https://developer.apple.com/documentation/http_live_streaming
This tutorial includes some experiments with both an HTTP Live Streaming version and a non-HTTP Live Streaming version:
https://www.raywenderlich.com/5191-video-streaming-tutorial-for-ios-getting-started
Have you tried using AVPlayerItem's preferredForwardBufferDuration? You can control how far ahead of the playhead AVPlayer buffers using this property.
player.currentItem?.preferredForwardBufferDuration = 1
From Apple's own documentation:
The duration the player should buffer media from the network ahead of the playhead to guard against playback disruption.
This property defines the preferred forward buffer duration in seconds. If set to 0, the player will choose an appropriate level of buffering for most use cases. Setting this property to a low value will increase the chance that playback will stall and re-buffer, while setting it to a high value will increase demand on system resources.
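Putting these pieces together, here is a minimal sketch (urlString is a placeholder, not a real stream) that applies the buffer hint before playback starts and asks the player to begin immediately instead of waiting to build up a buffer:

import AVFoundation

// urlString is a placeholder for your actual https stream URL.
let urlString = "https://example.com/stream.m3u8"

let item = AVPlayerItem(url: URL(string: urlString)!)
// Request a small forward buffer so playback can begin sooner (0 lets the system decide).
item.preferredForwardBufferDuration = 1

let player = AVPlayer(playerItem: item)
// Start as soon as possible instead of waiting to minimize stalling (iOS 10+).
player.automaticallyWaitsToMinimizeStalling = false
player.playImmediately(atRate: 1.0)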
Suggestion:
Since the video is streamed, we also depend on the network connection, so on a poor connection there is always a chance of showing a blank screen.
One approach: fetch a thumbnail of the streaming video from the server on the previous screen, or generate a thumbnail on the application side from the streaming URL. When the streaming screen opens, display the thumbnail to the user with a loading indicator, and once the video starts streaming, hide the thumbnail.
In that particular case you can place a UIImageView above the AVPlayerLayer view.
That image works as a cover image for your video, matching the first frame, with a UIActivityIndicatorView added as a subview.
Then hide the image when the video is about to play.
This hides the black frame of your video, since the initial buffering state is unavoidable.
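A minimal sketch of that cover-image approach, assuming a view controller that already owns the player and its layer (the "videoCover" asset name is hypothetical); it observes AVPlayer.timeControlStatus and removes the cover and spinner once playback actually begins:

import AVFoundation
import UIKit

final class StreamViewController: UIViewController {
    // The player and its AVPlayerLayer are assumed to be configured elsewhere with the stream URL.
    let player = AVPlayer()
    private let coverImageView = UIImageView()                 // thumbnail / first-frame placeholder
    private let spinner = UIActivityIndicatorView(style: .medium)
    private var statusObservation: NSKeyValueObservation?

    override func viewDidLoad() {
        super.viewDidLoad()

        // The cover image sits above the AVPlayerLayer's view and hides the initial black frame.
        coverImageView.frame = view.bounds
        coverImageView.image = UIImage(named: "videoCover")    // hypothetical asset name
        view.addSubview(coverImageView)

        spinner.center = view.center
        spinner.startAnimating()
        coverImageView.addSubview(spinner)

        // Once the player actually starts playing, drop the cover and the spinner.
        statusObservation = player.observe(\.timeControlStatus, options: [.new]) { [weak self] player, _ in
            guard player.timeControlStatus == .playing else { return }
            DispatchQueue.main.async {
                self?.spinner.stopAnimating()
                self?.coverImageView.isHidden = true
            }
        }

        player.play()
    }
}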

The YouTube iframe API serves streams at resolutions that will not play back over cellular connections

I hope that the YouTube API team will address the following issue.
YouTube has disabled the ability to request a specific size using the setPlaybackQuality() method.
If I am correct, the YouTube iframe API automatically determines the appropriate resolution/size to serve up (small, medium, large, hd720, etc.) depending upon the pixel dimensions of the embedded player.
This is a huge problem over cellular networks.
AT&T, Verizon, T-Mobile and others have all begun to throttle video streams and/or disable playback altogether in some cases for streams above 480p.
In our case, we are seeing 1.5 - 2 minutes of buffering before playback in the embedded YouTube player at widths above 360px.
In portrait mode this limit would at least be somewhat acceptable, but in full-screen landscape on mobile, the preferred way to watch video, YouTube changes the quality automatically and in most cases serves up hd720, which almost immediately gets stuck buffering over cellular connections.
We need the ability to request a specific resolution, and/or we need YouTube to serve up video at 480p over cellular connections.
The suggestedQuality parameter of player.setPlaybackQuality(suggestedQuality:String):Void is only a suggestion: the playback quality actually chosen depends not only on the pixel dimensions of the embedded player but also varies for different users, videos, systems and other playback conditions.
Setting the parameter value to default instructs YouTube to select the most appropriate playback quality.
YouTube selects the appropriate playback quality. This setting effectively reverts the quality level to the default state and nullifies any previous efforts to set playback quality using the cueVideoById, loadVideoById or setPlaybackQuality functions.
I assume this also takes mobile connections into consideration, but if you believe there is an issue with this API feature, you can contact YouTube here.

iPad alpha on video tag

Well, I have a burning issue with the iPad, more specifically Safari on the iPad.
The video tag doesn't play .mov files at all, and converting to .m4v with QuickTime loses my alpha layer.
I'm currently working with a PNG sequence, but it's horrible. Is there any way I can keep my alpha channel and have the video tag work?
Many thanks!
Dave
HTML5 video is able to play .mov files! Usually this happens if the server isn't configured with the correct MIME type for the video. Check my answer here: Playing a movie/DVD on a website. Maybe this solves your .mov problem and you don't have to use .m4v at all.
It turned out that the iPad doesn't support the alpha channel (or at least every method and format I tried wouldn't work), and it wouldn't play the video automatically because Apple has disabled the autoplay attribute.
But I did find a workaround: there appears to be a window at page load in which you can call play() on a video and then stop it, meaning it's ready to play. But you can only get one video in...
Bummer..

Playing mpg file in XNA

Is there any way to play mpg files in XNA? (I want to develop a game in which a video stream plays in the background.)
XNA has built-in video playback. A good place to get started using it might be Catalin's XNA 3.1 Video Sample.
One downside to XNA's built-in functionality is that it has limited format support (specifically WMV9). So you will need to convert your video to that format. Two options for encoding are Windows Movie Maker and Windows Media Encoder (which seems to have recently become Expression Encoder 4).
Once in that format, you can simply add it as content to your project. Then load it as a Video through the content manager, and use VideoPlayer to play it back, calling videoPlayer.GetTexture() to get a texture of the current video frame you can set on the device or pass to spriteBatch.Draw().
