Play segmented, archived video with Microsoft Smooth Streaming

I'm new to Smooth Streaming. I am able to use the Microsoft Smooth Streaming player to play live video that spans multiple segments; the manifest seems to contain information about all of the segments.
For playback from the archive, however, I can point the URL in the HTML at the ISM file in one of the segments and play that particular segment fine, but I don't know how to play back the entire video with the ability to rewind, fast-forward, and so on.
Is it possible to do this across multiple segments?

For playing back across multiple segments, I ended up adding the archived segments to the player's playlist and using player.GoToPlaylistItem() and player.SeekToPosition() to fast-forward, rewind, and play across multiple segments in an archive.

Related

Fast video stream start

I am building an app that streams video content, something like TikTok: you can swipe through videos in a table, and when a new cell becomes visible its video starts playing. It works great, except when you compare it to TikTok, Instagram, etc. My video starts streaming pretty fast, but not always; it is very sensitive to network quality, and sometimes even when the network is great it still buffers too long. TikTok and Instagram don't seem to have that problem under the same conditions. I am using JWPlayer as the video hosting service and AVPlayer as the player, and I am doing an async preload of assets before assigning them to the AVPlayerItem. So my question is: what else can I do to speed up video start? Do I need to do some special video preparation before uploading to the streaming service? (I also stream .m3u8 files.) Is there some set of presets that enables optimum streaming quality and start speed? Thanks in advance.
So there are a few things you can do.
HLS is Apple's preferred method of streaming to an Apple device, so use it as much as possible for iOS devices.
The best practice for mobile streaming is to offer multiple resolutions. The trick is to start with the lowest resolution available so the video starts quickly, then switch to a higher resolution once the connection is determined to be capable of it. Generally this happens so quickly that the user doesn't really notice; YouTube is the best example of this tactic. HLS does this automatically when the playlist offers multiple variants (an .m3u8 file is simply the HLS playlist format) - see the sketch below.
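As a rough sketch of that idea with AVPlayer (the URL is a placeholder, and the bitrate cap and delay are illustrative values, not tuned ones):

    import AVFoundation

    // Hypothetical HLS master playlist URL; substitute your own stream.
    let url = URL(string: "https://example.com/video/master.m3u8")!
    let item = AVPlayerItem(url: url)

    // Cap the initial variant selection (bits per second) so playback
    // can start quickly on a low-bitrate rendition.
    item.preferredPeakBitRate = 500_000

    let player = AVPlayer(playerItem: item)
    player.play()

    // Once playback is underway, lift the cap so AVPlayer can switch
    // up to higher-quality variants as bandwidth allows (0 = no limit).
    DispatchQueue.main.asyncAfter(deadline: .now() + 5) {
        item.preferredPeakBitRate = 0
    }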
Assuming you are using a UICollectionView or UITableView, try starting low-resolution streams of every video on the screen in the background each time scrolling stops. Not only does this allow you to do some cool preview stuff based off the buffer, but when the user taps a video the stream is already established. If that's too slow, try just the middle video. A sketch of this preloading idea follows.
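Here is a minimal sketch of that preloading approach under stated assumptions: a table-based feed, a hypothetical videoURL(for:) helper, and an illustrative bitrate cap.

    import AVFoundation
    import UIKit

    final class FeedViewController: UITableViewController {
        // Keep pre-buffered items around so tapping a cell can start instantly.
        private var preloadedItems: [IndexPath: AVPlayerItem] = [:]

        override func scrollViewDidEndDecelerating(_ scrollView: UIScrollView) {
            for indexPath in tableView.indexPathsForVisibleRows ?? [] {
                guard preloadedItems[indexPath] == nil else { continue }
                let asset = AVURLAsset(url: videoURL(for: indexPath))
                // Warm up the asset asynchronously before attaching it to a player.
                asset.loadValuesAsynchronously(forKeys: ["playable"]) {}
                let item = AVPlayerItem(asset: asset)
                item.preferredPeakBitRate = 500_000 // begin on a low-bitrate variant
                preloadedItems[indexPath] = item
            }
        }

        private func videoURL(for indexPath: IndexPath) -> URL {
            // Placeholder: look up the real stream URL for this row.
            URL(string: "https://example.com/videos/\(indexPath.row)/master.m3u8")!
        }
    }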
Edit the video in the background before upload so it is only at the maximum resolution you expect it to be played at. There are no 4K screens on any iOS device, and there probably never will be, so cut down the amount of data.
Without more specifics, this is all I've got for now. Hope I understood your question correctly. Good luck!

iOS/AVFoundation: How to eliminate (occasional) blank frames between different videos within an AVComposition during playback

The app I'm working on loops a video a specified number of times by adding the same AVAssetTrack (created from the original video URL) multiple times to the same AVComposition at successive intervals. The app similarly inserts a new video clip into an existing composition by 'removing' the time range from the composition's AVMutableCompositionTrack (for AVMediaTypeVideo) and inserting the new clip's AVAssetTrack into the previously removed time range.
However, occasionally (though somewhat rarely) after inserting a new clip as described above into a time range within a repeat of the original looping video, blank frames appear at the video loop's transition points (within the composition) - and only during playback; the video exports correctly without gaps.
This leads me to believe the issue is with the AVPlayer or AVPlayerItem and how the frames are currently buffered for playback, rather than how I'm inserting/ looping the clips or choosing the correct CMTime stamps to do so. The app is doing a bunch of things at once (loop visualization in the UI via an NSTimer, audio playback via Amazing Audio Engine) - could my issue be a result of competition for resources?
One more note: I understand that discrepancies between audio and video in an asset can cause glitches (i.e. the underlying audio is a little bit longer than the video length), but as I'm not adding an audioEncodingTarget to the GPUImageWriter that I'm using to record and save the video, the videos have no audio components.
Any thoughts or directions you can point me in would be greatly appreciated! Many thanks in advance.
Update: the flashes coincide with the "Had to drop a video frame" error logged by the GPUImage library, which according to its creator means the phone is not able to process video fast enough. Could multi-threading solve this?
Update 2: So the flashes actually don't always correspond to the "Had to drop a video frame" error. I have also disabled all of the AVRecorder/Amazing Audio Engine code and the issue still persists, so it is not a problem of resource competition between those engines. I have been logging properties of the AVPlayerItem and notice that isPlaybackLikelyToKeepUp is always NO and isPlaybackBufferFull is always YES.
So the problem is solved - it's sort of frustrating how brutally simple the fix is. I just used a time range one frame shorter when adding the videos to the composition, rather than the AVAssetTrack's full time range. No more flashes. Hopefully the users won't miss that 30th of a second :)

    // Trim one frame (at 30 fps) off the track's duration before inserting it.
    CMTime shortenedDuration = CMTimeSubtract(originalVideoAssetTrack.timeRange.duration, CMTimeMake(1, 30));
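In Swift, a minimal sketch of the same fix might look like the following; the track and insertion-time parameters are assumptions, and the 30 fps frame duration is taken from the snippet above.

    import AVFoundation

    // Insert `sourceTrack` into `compositionTrack` with its duration trimmed
    // by one frame (at 30 fps), so loop seams don't render blank frames.
    func insertTrimmedClip(_ sourceTrack: AVAssetTrack,
                           into compositionTrack: AVMutableCompositionTrack,
                           at insertionTime: CMTime) throws {
        let oneFrame = CMTimeMake(value: 1, timescale: 30)
        let shortened = CMTimeSubtract(sourceTrack.timeRange.duration, oneFrame)
        let range = CMTimeRange(start: sourceTrack.timeRange.start, duration: shortened)
        try compositionTrack.insertTimeRange(range, of: sourceTrack, at: insertionTime)
    }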

With html5 audio limitations on iOS, is it possible to play background music and sound effects at the same time?

I've been reading about the limitations of html5 on iOS.
Currently, all devices running iOS are limited to playback of a single audio or video stream at any time. Playing more than one video—side by side, partly overlapping, or completely overlaid—is not currently supported on iOS devices. Playing multiple simultaneous audio streams is also not supported. You can change the audio or video source dynamically, however. See “Replacing a Media Source Sequentially” for details.
Apparently I can only play one file at a time. A common technique is to have one file, but combine all of the sounds you need into this one file and seek to the parts you want to play. This is called an audio sprite.
But here's what's not clear to me: If I use an audio sprite, can I overlap it with itself? For example, can I have the sound of a bullet while I'm playing background music? Or, can I have the sound of two bullets firing simultaneously?
Recent versions of Mobile Safari (http://caniuse.com/audio-api) support the Web Audio API, which supports simultaneous playback.
Check this demo on an iOS device: https://webaudiodemos.appspot.com/TouchPad/index.html
Shameless plug for a simple wrapper: https://github.com/endemic/sona

iOS, how to play multiple videos on the same view?

I want to play four videos on the same view, each within a small rectangular area. Is that possible?
I think you can do this with AVPlayer and AVMutableComposition.
With AVMutableComposition you can combine multiple sources into one combined playback source - I'm currently working on an application where I use this to combine different audio and video sources and play them back as a single track even though they come from separate files. A sketch of the idea follows.
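A minimal sketch of that combining idea, assuming local clips (the file paths are placeholders and error handling is elided):

    import AVFoundation

    // Append several clips into one AVMutableComposition and play the
    // result back as a single item.
    let clipURLs = [
        URL(fileURLWithPath: "/path/to/clip1.mp4"),
        URL(fileURLWithPath: "/path/to/clip2.mp4"),
    ]

    let composition = AVMutableComposition()
    guard let videoTrack = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) else { fatalError() }

    var cursor = CMTime.zero
    for url in clipURLs {
        let asset = AVAsset(url: url)
        if let sourceTrack = asset.tracks(withMediaType: .video).first {
            // Insert each clip's full range at the current end of the track.
            try? videoTrack.insertTimeRange(sourceTrack.timeRange,
                                            of: sourceTrack,
                                            at: cursor)
            cursor = CMTimeAdd(cursor, sourceTrack.timeRange.duration)
        }
    }

    let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))
    player.play()

Note that this plays the clips back to back. To show four videos at once in separate rectangles, a simpler route is one AVPlayerLayer per player, or an AVMutableVideoComposition whose layer instructions transform each track into its own quadrant.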

How to display and control full-screen video in an XBOX game

I would like to create a game for Xbox 360 which is mostly full-screen HD videos. The player will be given choices during the game to determine which video is to be played.
I need very fine-grained control over the video such as controlling playback speed, seeking to video frames and possibly applying simple effects to the videos.
I also want to be able to use augmented reality to add elements to the videos so I need to be able to render 3d objects over the video.
It would be great if this could be done in XNA; however, XNA only has basic video playback functionality. What other options do I have?
Your options for decoding videos are limited. The VideoPlayer class provides functionality for playing videos from the start, pausing and resuming them, looping them, and setting their audio volume.
As far as displaying videos goes - you have a huge degree of freedom. You basically get each frame of the video as a texture that you can draw as a sprite, or apply to any 3D object. This includes using it as an input to a pixel shader, allowing you to apply all kinds of effects to the video.
The only alternative to the built-in player is to create your own. If you want to target the Xbox 360 this will limit you to managed code only. I am not aware of any suitable video decoder libraries.
For Windows, a little Googling revealed this library, which may be a good starting point.
