How can I save what AVPlayer is currently playing (both video and audio) from a live HLS stream?
I know how to load and play an m3u8 video file using AVPlayer.
Please note that the HLS stream is live, not video on demand, so I cannot use AVAggregateAssetDownloadTask. In the perfect scenario I would get CMSampleBuffer objects, which are easy to save to a file. AVPlayerItemOutput is not entirely an option either, because I don't see how I would get the audio channel.
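To make the AVPlayerItemOutput point concrete, here is roughly what the video side looks like (a minimal sketch; the stream URL and names are placeholders, not my real code). It hands back CVPixelBuffers for video, but I don't see any comparable way to pull the decoded audio out of the AVPlayerItem.

```swift
import AVFoundation

// Sketch only: placeholder live-stream URL.
let item = AVPlayerItem(url: URL(string: "https://example.com/live/stream.m3u8")!)
let player = AVPlayer(playerItem: item)

// Video frames can be pulled as CVPixelBuffers via AVPlayerItemVideoOutput.
let videoOutput = AVPlayerItemVideoOutput(pixelBufferAttributes: [
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
])
item.add(videoOutput)
player.play()

// Would be called from a display link or timer (polling loop omitted here).
func grabFrame(at hostTime: CFTimeInterval) {
    let itemTime = videoOutput.itemTime(forHostTime: hostTime)
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: nil) else { return }
    // `pixelBuffer` could be appended to an AVAssetWriter as video,
    // but there is no obvious AVPlayerItem output for the decoded audio,
    // which is exactly the gap described above.
    _ = pixelBuffer
}
```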
It doesn't seem to be possible with the current SDK. I've implemented it using ffmpeg.
In an earlier case of mine I played a non-encrypted .m3u8 video in offline mode: I downloaded a .m3u8 video file that is not encrypted and played it back. The video is saved in the .movpkg format. I downloaded it using AVAssetDownloadURLSession (roughly as in the sketch below).
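A minimal sketch of that download flow, for context; the session identifier, the asset title, and the stream URL below are placeholders, not my exact code:

```swift
import AVFoundation

final class HLSDownloader: NSObject, AVAssetDownloadDelegate {
    private var session: AVAssetDownloadURLSession!

    func start() {
        // AVAssetDownloadURLSession requires a background configuration.
        let configuration = URLSessionConfiguration.background(withIdentifier: "hls-download")
        session = AVAssetDownloadURLSession(configuration: configuration,
                                            assetDownloadDelegate: self,
                                            delegateQueue: .main)
        let asset = AVURLAsset(url: URL(string: "https://example.com/vod/clear.m3u8")!)
        let task = session.makeAssetDownloadTask(asset: asset,
                                                 assetTitle: "Clear HLS sample",
                                                 assetArtworkData: nil,
                                                 options: nil)
        task?.resume()
    }

    // The system saves the asset as a .movpkg bundle and reports its location here.
    func urlSession(_ session: URLSession,
                    assetDownloadTask: AVAssetDownloadTask,
                    didFinishDownloadingTo location: URL) {
        // Persist `location`; later it can be played back offline with
        // AVPlayer(playerItem: AVPlayerItem(asset: AVURLAsset(url: location))).
        print("Downloaded to \(location)")
    }
}
```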
But in the case of an encrypted video, I can download the .m3u8 video and it also gets saved with a .movpkg extension. The issue arises when the saved video is played without any internet connection. I know the reason for this: the decryption key is not available offline, so the video cannot be decrypted and played at the player level.
Is there any way to play the saved video file without using the internet (offline)?
Do I need to use Apple FairPlay to play the m3u8 video?
If so, how can I implement it on the server side?
Has anyone used Apple FairPlay?
How can I implement FairPlay in the FFmpeg converter?
Is there any way to override the key delivery in AVPlayer?
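On the last question (overriding key delivery): for streams protected with plain AES-128 keys rather than FairPlay, one approach that is sometimes used is to point the playlist's EXT-X-KEY URI at a custom URL scheme and serve a locally cached key from an AVAssetResourceLoaderDelegate. The sketch below is only illustrative: the "ckey" scheme, the key file location, and the saved .movpkg path are assumptions, and FairPlay-protected content instead needs persistable content keys obtained from a key server while online.

```swift
import AVFoundation

final class OfflineKeyLoader: NSObject, AVAssetResourceLoaderDelegate {

    func resourceLoader(_ resourceLoader: AVAssetResourceLoader,
                        shouldWaitForLoadingOfRequestedResource loadingRequest: AVAssetResourceLoadingRequest) -> Bool {
        // Only handle requests for the custom key scheme; this assumes the
        // playlist's EXT-X-KEY URI was written as ckey://... when the stream was packaged.
        guard loadingRequest.request.url?.scheme == "ckey" else { return false }

        // Load the 16-byte AES key cached on disk while online (placeholder location).
        guard let keyData = try? Data(contentsOf: localKeyFileURL()) else {
            loadingRequest.finishLoading(with: NSError(domain: "OfflineKey", code: -1))
            return true
        }
        loadingRequest.dataRequest?.respond(with: keyData)
        loadingRequest.finishLoading()
        return true
    }

    private func localKeyFileURL() -> URL {
        // Placeholder path for the cached key.
        FileManager.default.temporaryDirectory.appendingPathComponent("stream.key")
    }
}

// Usage sketch: attach the delegate to the saved asset before creating the player item.
let savedMovpkgURL = URL(fileURLWithPath: "/path/to/saved.movpkg") // placeholder path
let asset = AVURLAsset(url: savedMovpkgURL)
let keyLoader = OfflineKeyLoader()
asset.resourceLoader.setDelegate(keyLoader, queue: DispatchQueue(label: "key-loader"))
let playerItem = AVPlayerItem(asset: asset)
```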
So I'm trying to play HLS streams in HTML5 without using Flash. We've tried many video players, but they all rely on a Flash player. My question: is it possible to play HLS streams (any) in HTML5 without using Flash?
(I know of https://github.com/RReverser/mpegts, but it doesn't work on mobile and is pretty laggy.)
An m3u8 file is for live streaming, so please play your URL in AVPlayer.
https://developer.apple.com/streaming/
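If you do take the native AVPlayer route on iOS rather than HTML5, a minimal Swift sketch looks like this (the stream URL below is a placeholder):

```swift
import AVFoundation
import AVKit
import UIKit

// Hand the m3u8 URL straight to AVPlayer and present it with AVPlayerViewController.
func presentLiveStream(from viewController: UIViewController) {
    let streamURL = URL(string: "https://example.com/live/stream.m3u8")!
    let player = AVPlayer(url: streamURL)
    let playerViewController = AVPlayerViewController()
    playerViewController.player = player
    viewController.present(playerViewController, animated: true) {
        player.play()
    }
}
```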
Well, I'm trying to upload a video recorded with ffmpeg, but YouTube fails at processing it.
Here's the video information:
Here's the link https://www.youtube.com/watch?v=7XlxLh0usnY.
It turns out that the problem in my case was that the video stream and the audio stream had different lengths; the video was always shorter than the audio. I had to use ffmpeg to make both streams the same length.
I have a remote MP4 file that has video and sound. I want to stream only the audio track of the MP4 file; I don't need the video and want to decrease the internet usage in my app. I have no idea where to start with this, and Google doesn't seem to help. Is this impossible? Any ideas?
You can't... If you want audio only, you need to split the video and audio on the server, or put streaming software in front of it (like VLC, ffmpeg, or MPlayer) that can re-encode the file in real time (so it can drop the video for the new stream).
It's better, in my view, to process all the files on the server and extract the audio track...
I have managed to read the Matroska container and am able to extract the video and audio streams from an MKV file in my Metro app. Now I don't know how to feed those streams for playback; I want to understand the concept of getting media data onto the display. I have another option: repack the video and audio (with subtitles) into an MP4 container, which MediaElement will play by default, but that won't feel like having a Matroska codec in my Metro app.
So, basically, my question is: how do I use MediaElement (or any display graph) to play the video and audio pulled out of any video container?
Please guide me. Thanks.