How to set up HLS live video streaming from an iOS device

Good day everyone!
So, as the title suggests, I am developing an app with functionality similar to that of Periscope and Facebook Live video streaming. Here is what the end goal is:
A Broadcasting device [user]
EC2 Instance [Hosting an ffmpeg transcoder]
CloudFront Distribution [CDN]
1 to n viewers of the live feed
I've been doing a lot of googling, and what I can't seem to figure out is: as you send chunks of video to the server from the Broadcaster, how do you create an .m3u8 playlist when you don't have all the chunks of video yet (e.g. the device sends its first 5-second chunk of video)?
It seems an .m3u8 file is usually created from an .mp4 file that is already complete and then broken down into chunks... But I'm sending chunks of the video to the server as they are recorded, so how can it generate the .m3u8 file while more chunks are still coming from the Broadcaster, so that the watchers/clients can continuously stitch together the video chunks?
I'll be happy to clarify this question further. Thanks!

If you take a look at the docs for the segment muxer, you'll see that you can specify an .m3u8 playlist as the output and also tell ffmpeg to update that playlist as it goes. It might look something like this:
ffmpeg -i infile.mp4 -c:v copy -c:a copy -map 0 -f ssegment -segment_list playlist.m3u8 -segment_list_type hls -segment_list_size 10 -segment_list_flags +live -segment_time 4 outchunk%07d.ts
Note that segment_list_size is the maximum number of chunks referenced in the .m3u8 file at one time, and segment_list_flags +live tells ffmpeg that this is a live stream.

I think your confusion is that you are trying to send HLS fragments to the server. Don't. Send a stream via another protocol like RTMP, then let the server convert it to HLS.
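For example, on the EC2 instance you could have ffmpeg listen for the incoming RTMP stream and repackage it as live HLS. This is just a rough sketch assuming ffmpeg's built-in RTMP listener, an H.264/AAC input, and placeholder paths/ports; in production you'd more likely put nginx-rtmp or a dedicated media server in front:

# Listen for one incoming RTMP stream and repackage it as a live HLS playlist
ffmpeg -listen 1 -i rtmp://0.0.0.0:1935/live/stream \
    -c:v copy -c:a copy \
    -f hls -hls_time 4 -hls_list_size 10 -hls_flags delete_segments \
    /var/www/stream/playlist.m3u8

Your CloudFront distribution would then serve playlist.m3u8 and the .ts segments to the 1-to-n viewers.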


How can I stream MP4 videos from S3 without AVPlayer downloading the files before playing them?

I have a lot of long (45 mins - 90 mins) MP4 videos in a public S3 bucket and I want to play them in my iOS app using AVPlayer.
I am using AVPlayerViewController to play them but I need to wait several minutes before they start playing as it downloads the whole video rather than streaming it.
I am caching it locally so this is only happening the first time but I would love to stream the video so the user doesn't have to wait for the entire video to download.
Some people are pointing out that I need CloudFront to stream videos, but in the documentation I've read that this is only necessary when you have many people streaming the same file. I'm building an MVP, so I only need a simple solution.
Is there any way to stream an MP4 video from an S3 bucket with AVPlayerViewController without it fully downloading the file before playing it to the user?
TL;DR
AVPlayer does not support 'streaming' (HTTP range requests) as you would define it, so either use an alternative video player that does or use a real media streaming protocol like HLS which is supported by AVPlayer & would start the video before downloading it all.
CloudFront is great for delivery in general but is not strictly needed - you may have seen it mentioned because of CloudFront RTMP distributions, but those have now been discontinued.
Detailed Answer
S3 supports byte-range fetches via HTTP range requests - you can verify this by making a HEAD request to your video file & checking that the Accept-Ranges header exists with a value of bytes (i.e. not 'none').
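For instance, you can check this from the command line with curl (the bucket URL is a placeholder):

# -I sends a HEAD request; look for "Accept-Ranges: bytes" in the response
curl -I "https://your-bucket.s3.amazonaws.com/videos/video1.mp4"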
Load your MP4 file in the browser & notice that it can start playing as soon as you click play. You're also able to jump to the end of the video file and yet you haven't actually downloaded the entire file. HTTP range requests are what allow this mechanism to work. Small chunks of the video can be downloaded as & when the user gets to that part of the video. This saves the file server & the user bandwidth while providing a much better user experience than having the client download the entire file.
The server would need to support byte-range fetches in the first instance before the client can then decide to make range requests (or not to). The key is that, once the server supports it, it is up to the HTTP client to decide whether it wants to fetch the data in chunks or all in one go.
This isn't really 'streaming' as you know it & are referring to in your question but it is more 'downloading the video from the server in chunks and playing it back' using HTTP 206 Partial Content responses.
You can see this in the Network tab of your browser as a series of multiple 206 responses when seeking in the video. The entire video is not downloaded but the video is streamed from whichever position that you skip to.
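You can reproduce one of those requests yourself (again, the URL is a placeholder):

# Ask for the first 1 KiB only; S3 should reply "HTTP/1.1 206 Partial Content"
# with a Content-Range header along the lines of "bytes 0-1023/123456789"
curl -s -o /dev/null -D - -H "Range: bytes=0-1023" \
    "https://your-bucket.s3.amazonaws.com/videos/video1.mp4"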
The problem with AVPlayer
Unfortunately, AVPlayer does not support 'streaming' using HTTP range requests & HTTP 206 Partial Content responses. I've verified this manually by creating a demo iOS app in Xcode.
This has nothing to do with S3 - if you stored these files on any other cloud provider or file server, you'd see that the file is still fully loaded before playing.
The possible solutions
Now that the problem is clear, there are two possible solutions.
Using an alternative video player
The easiest solution is to use an alternative video player that supports byte-range fetches. I'm not an expert in iOS development, so I sadly can't recommend a specific one, but I'm sure there will be a popular library that the industry prefers over the built-in AVPlayer. This would give you your (extremely common) definition of 'streaming'.
Using a video streaming protocol
However, if you must use AVPlayer, the solution is to implement true media streaming with a video streaming protocol - true streaming also allows you to leverage features like adaptive bitrate switching, live audio switching, licensing etc.
There are quite a few of these protocols available like DASH (Dynamic Adaptive Streaming over HTTP), SRT (Secure Reliable Transport) & last but not least, HLS (HTTP Live Streaming).
Today, the most widely used streaming protocol on the internet is HLS, created by Apple themselves (hey, maybe the reason to not support range requests is to force you to use the protocol). Apple's own documentation is really wonderful for delving deeper if you are interested.
Without getting too much into protocol detail, HLS will allow playback to start more quickly in general, fast-forwarding can be much quicker & delivers video as it is being watched for the true streaming experience.
To go ahead with HLS:
Use AWS Elemental MediaConvert to convert your MP4 file to HLS format - the resulting output will be 1 (or more) .M3U8 manifest files in addition to .ts media segment file(s). (For a quick local alternative, see the ffmpeg sketch after these steps.)
Upload the resulting output to S3
Point AVPlayer to the .M3U8 file
let asset = AVURLAsset(url: URL(string: "https://ermiya.s3.eu-west-1.amazonaws.com/videos/video1playlist.m3u8")!)
let item = AVPlayerItem(asset: asset)
...
Enjoy near-instant loading of the video
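If you'd rather not set up MediaConvert for an MVP, a local ffmpeg run can produce equivalent VOD output - a minimal sketch, assuming your MP4 is already H.264/AAC so the streams can be copied without re-encoding (filenames are placeholders):

# Repackage an MP4 into a VOD HLS playlist plus .ts segments
ffmpeg -i video1.mp4 -c:v copy -c:a copy \
    -f hls -hls_time 6 -hls_playlist_type vod \
    -hls_segment_filename 'video1_%03d.ts' video1playlist.m3u8

Upload the resulting playlist and segments to the same S3 prefix so the relative segment URLs keep working.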
CloudFront
Regarding Amazon CloudFront, it isn't required per se & S3 is sufficient in this case, but a quick Google search will turn up plenty of benefits that it provides, especially caching, which can help you save on S3 costs later on.
Conclusion
I would go with converting to HLS if you can, as it will open up more possibilities down the line & is a better true-streaming experience in general; but given the iOS AVPlayer restrictions, using an alternative video player that supports range requests will work just as well.
Whether to use CloudFront or not, will depend on your user base, usage of S3 and other factors.
As you're creating an MVP, I would recommend just doing a batch conversion of your MP4 files to HLS format & not using CloudFront, which would add additional complexity to your cloud configuration.
Like @ErmiyaEskandary said, you could just use HLS to solve your problem, which is probably a good idea, but you should not have to wait for the entire MP4 file to download before playing it with AVPlayer. The issue is actually not with AVPlayer or byte-range requests at all, but rather with how your MP4 files are formatted.
Your MP4 files may be configured incorrectly for streaming. MP4s have a metadata section called the MOOV atom. By default, many encoders put this at the end of the file, in which case the player has to download the entire file before it can begin playing.
For streaming use cases, the MOOV atom needs to be at the front of the file. The player then only has to buffer the MOOV atom, and it can begin playing the video as the data loads.
You can use ffmpeg with the faststart flag enabled to move the MOOV atom to the beginning of the file.
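A remux like this is enough - no re-encoding needed (filenames are placeholders):

# Copy the streams as-is and relocate the MOOV atom to the front of the file
ffmpeg -i input.mp4 -c copy -movflags +faststart output.mp4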

How to download a LIVE HLS m3u8 stream on iOS

HLS streams can be "live" or "VOD". Downloading a VOD HLS stream is easy.
However, I want to download (or record) say 5 minutes of a LIVE HLS stream. Is this possible?
If I do so, I am sure I would have to make significant changes to the m3u8 file... One reason is that live streams do not have a "duration", but the stream I download has to be served as VOD, so it must have a duration. There might be various other changes required that I am not aware of. Presumably the URLs of the .ts segments would also need to be changed.
Any tips or advice (hopefully actual code!)?
Thanks!
PS. Note that this question is not about playing back the stream in offline mode - I know I need an HTTP server for that.
A live playlist uses a sliding window. You need to periodically reload it (roughly every target duration) and download only the new segments as they appear in the list (they will be removed from the live playlist later).
Save the #EXTINF duration for each segment and start writing the segments into a VOD playlist using the same target duration and a media sequence starting at 0.
When you want to stop recording, add the EXT-X-ENDLIST tag at the end.
It doesn't matter how you name your segments as long as you use the same names in the m3u8.
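For illustration, a recording of three 6-second segments might end up as a VOD playlist like this (segment names and durations are made up):

#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:6
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:6.000,
segment000.ts
#EXTINF:6.000,
segment001.ts
#EXTINF:6.000,
segment002.ts
#EXT-X-ENDLIST

As an aside, if you can shell out to ffmpeg (on a server rather than on iOS), it automates the same reload-and-append loop; e.g. ffmpeg -i "https://example.com/live/playlist.m3u8" -c copy -t 300 recording.ts would capture 5 minutes of the live stream (URL is a placeholder).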

FFmpeg save stream to mp3

I have an iOS project that plays online radio streams; it uses FFmpeg for playback. I also added the ability to record streams: I decode the stream via the avcodec_decode_audio4 function and write the output to a .wav file. But these files are too big, because WAV is an uncompressed format, so I want to encode to .mp3 instead.
I have found a couple of ways to convert audio, but only when the audio is already a complete file. I want to encode to a compressed format as soon as I get each chunk of data from the stream, not once the file is complete.
Is it possible?
Can you give me some advice on how to achieve this?
You can use ffmpeg (i.e. the libav* libraries) to encode the audio you're reading with avcodec_decode_audio4 into a file as MP3, as long as your build was configured with LAME (--enable-libmp3lame).
Basically, you configure an mp3 codec, then call avcodec_encode_audio2 (who names these things?) on the progressive output of avcodec_decode_audio4.
The canonical example can be confusing because it also deals with video, but you should be able to tease the details you want out of it.
This post on transcoding audio by arashafiei is broadly helpful.
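If it helps to see the pipeline end-to-end, the command-line equivalent of the decode-then-encode loop described above looks like this (the stream URL is a placeholder):

# Read the radio stream, decode it, and re-encode to MP3 on the fly
ffmpeg -i "http://example.com/radio/stream" -c:a libmp3lame -b:a 128k recording.mp3

In your code you're doing the same thing, with avcodec_decode_audio4 feeding avcodec_encode_audio2 inside your own process.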

Flash Media Server, HLS and FLV

I use RTMP to stream from my iPhone to my server running FMS. I followed some tutorials and now I have the FLV playback file in /webroot/live_recorded.
What I want to do is the following:
1) Stream from iPhone to server using RTMP: DONE
2) Stream back to iPhone using HLS: I don't understand the docs, and I've read hundreds of threads but none helped me. I would like the user to be able to watch the stream from the beginning, as it is stored on my server. Thanks
I can't actually speak for FMS... I work with Wowza, and I suppose you'll need something like its nDVR feature, or have someone write a special module for you that splits the live stream into small recordings, so you'd play a playlist of those recorded files from your iPhone.
Hopefully someone will recommend a true solution, not just some assumptions :)

Transcode/remux FLV and stream on the fly

I'm trying to teach myself a bit about video streaming and transcoding, with some Roku app development on the side. I have a number of video files (mostly in FLV format (H.264/AAC)) that I would like to stream to a client, which in this case is a Roku box (that accepts MP4 (H.264/AAC) and HTTP Live Streaming (HLS)). I'm wondering if it is possible to transcode/remux the FLV files and stream them to the client on the fly, perhaps over HLS?
I have tried using ffmpeg to remux the files and serve them immediately during the transcoding process, but they are unplayable until the write process is complete. I can get the Roku to play my completed MP4 files just fine via Apache/Rails.
But I'm wondering... is it possible to set up a server to transcode/remux a file and immediately have the output file (from ffmpeg/whatever tool I'm using) streamed to the client? If so, what tools are required to accomplish this? Is it possible to use a media file segmenter to chop up a file as it's being transcoded or remuxed?
I'm well aware that the transcoding process is CPU intensive, but I'm not so much worried about the practicality of transcoding and streaming on the fly since this is simply a personal education project (and I have an idle system that is capable of handling this).
Apologies if I'm way off base here, just trying to hack my way through this.
Thanks!
The trick to getting HLS served immediately after a TS segment has been completed is getting the playlist to update dynamically as the data arrives on disk.
What you are trying to do is essentially stream a Live event over HLS, which absolutely can be done, it just takes co-ordination between tools.
The open-source segmenter is able to do this. The trick is to have ffmpeg write out a single unsegmented MPEG-TS stream to a named pipe (or the equivalent for your OS), then have the segmenter read from this pipe and write the segment files to a directory within your shared webspace.
The segmenter repeatedly updates the M3U8 file on disk while processing, so it can be used as a "Live" stream until the task is finished.
When ffmpeg closes its output, the segmenter puts the end tag in the M3U8 and the file becomes "VOD".
The segmenter can be downloaded here
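A rough sketch of that pipeline, with placeholder paths - the segmenter's actual arguments depend on which build you downloaded, so check its README:

# Create the named pipe, have ffmpeg transcode the FLV into it as one
# continuous MPEG-TS stream, and let the segmenter consume the other end
mkfifo /tmp/stream.ts
ffmpeg -i input.flv -c:v libx264 -c:a aac -f mpegts /tmp/stream.ts &
segmenter -i /tmp/stream.ts -d 10 -o /var/www/stream   # hypothetical flags

Note that modern ffmpeg can also do the segmenting and live playlist updates itself via its hls muxer, which removes the need for a separate segmenter entirely.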
