Stream only the audio track from an MP4 file - iOS

I have a remote MP4 file that has video and sound. I want to stream only the audio track of the MP4 file; I don't need the video, and I want to decrease the internet usage in my app. I have no idea where to start with this, and Google doesn't seem to help. Is this impossible? Any ideas?

You can't do that on the client... If you want audio only, you need to either split the video and audio on the server, or put streaming software in front of it (like VLC, FFmpeg, or MPlayer) that can re-encode the file in real time, so the video can be dropped from the new stream.
In my opinion, it's better to process all the files on the server and extract the audio track up front.
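Not part of the original answer, but as an illustration of the server-side extraction it suggests, here is a minimal sketch using FFmpeg's libavformat that stream-copies only the audio track of an MP4 into an audio-only file (no re-encoding). The function name and file paths are placeholders.

    // Minimal sketch: keep only the best audio stream of in_path and
    // stream-copy it into out_path (libavformat guesses the output
    // container from the extension, e.g. "audio.m4a").
    #include <libavformat/avformat.h>

    int extract_audio(const char *in_path, const char *out_path) {
        AVFormatContext *in = NULL, *out = NULL;
        int ret, audio_index;

        if ((ret = avformat_open_input(&in, in_path, NULL, NULL)) < 0)
            return ret;
        if ((ret = avformat_find_stream_info(in, NULL)) < 0)
            goto end;

        audio_index = av_find_best_stream(in, AVMEDIA_TYPE_AUDIO, -1, -1, NULL, 0);
        if (audio_index < 0) { ret = audio_index; goto end; }

        avformat_alloc_output_context2(&out, NULL, NULL, out_path);
        if (!out) { ret = AVERROR(ENOMEM); goto end; }

        // Single output stream: a copy of the input's audio parameters.
        AVStream *out_stream = avformat_new_stream(out, NULL);
        avcodec_parameters_copy(out_stream->codecpar,
                                in->streams[audio_index]->codecpar);
        out_stream->codecpar->codec_tag = 0;

        if ((ret = avio_open(&out->pb, out_path, AVIO_FLAG_WRITE)) < 0)
            goto end;
        if ((ret = avformat_write_header(out, NULL)) < 0)
            goto end;

        AVPacket pkt;
        while (av_read_frame(in, &pkt) >= 0) {
            if (pkt.stream_index == audio_index) {
                // Rescale timestamps from the input to the output time base.
                av_packet_rescale_ts(&pkt,
                                     in->streams[audio_index]->time_base,
                                     out_stream->time_base);
                pkt.stream_index = 0;
                av_interleaved_write_frame(out, &pkt);
            }
            av_packet_unref(&pkt);
        }
        av_write_trailer(out);

    end:
        avformat_close_input(&in);
        if (out) {
            avio_closep(&out->pb);
            avformat_free_context(out);
        }
        return ret;
    }

Called as, say, extract_audio("movie.mp4", "audio.m4a"), this produces an audio-only file the app can stream instead of the full MP4.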

Related

Extract CMSampleBuffer from HLS video stream on iOS

How can I save what AVPlayer is currently playing (both video and audio) from Live HLS stream?
I know how to load and play an m3u8 video file using AVPlayer.
Please note that the HLS stream is live, not video on demand, so I cannot use AVAggregateAssetDownloadTask. In the perfect scenario I would get CMSampleBuffer objects, which are easy to save to a file. AVPlayerItemOutput is not entirely an option either, because I cannot see how I would get the audio channel.
It doesn't seem to be possible with the current SDK; I've implemented it using FFmpeg.
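The answer doesn't show its FFmpeg code, so what follows is only a rough sketch of what such a recorder might look like, assuming the playlist URL and output path are placeholders and that plain stream copy (no re-encoding) is acceptable. It is essentially the same remux loop as the audio-only sketch above, except that the network layer is initialised and every stream is kept.

    // Rough sketch (assumed approach, not from the original answer): open the
    // live HLS playlist with libavformat and stream-copy all of its streams
    // into a local MP4 until the caller sets *stop.
    #include <libavformat/avformat.h>
    #include <stdbool.h>

    int record_hls(const char *m3u8_url, const char *out_path, volatile bool *stop) {
        avformat_network_init();                   // needed for http(s) input

        AVFormatContext *in = NULL, *out = NULL;
        if (avformat_open_input(&in, m3u8_url, NULL, NULL) < 0)
            return -1;
        avformat_find_stream_info(in, NULL);

        avformat_alloc_output_context2(&out, NULL, NULL, out_path);
        for (unsigned i = 0; i < in->nb_streams; i++) {
            AVStream *os = avformat_new_stream(out, NULL);
            avcodec_parameters_copy(os->codecpar, in->streams[i]->codecpar);
            os->codecpar->codec_tag = 0;
        }
        avio_open(&out->pb, out_path, AVIO_FLAG_WRITE);
        avformat_write_header(out, NULL);

        AVPacket pkt;
        // av_read_frame blocks while new live segments arrive, so the stop
        // flag is only checked between packets in this sketch.
        while (!*stop && av_read_frame(in, &pkt) >= 0) {
            av_packet_rescale_ts(&pkt,
                                 in->streams[pkt.stream_index]->time_base,
                                 out->streams[pkt.stream_index]->time_base);
            av_interleaved_write_frame(out, &pkt);
            av_packet_unref(&pkt);
        }
        av_write_trailer(out);

        avformat_close_input(&in);
        avio_closep(&out->pb);
        avformat_free_context(out);
        return 0;
    }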

iOS: playing video from a memory stream

My iOS app downloads an encrypted MP4 file from a server into its Documents folder.
I want to decrypt the MP4 into memory and play the video from there.
For security, my app should not write the decrypted MP4 file back to the Documents folder.
How can I play video from a memory stream?
I'm trying FFmpeg... or is there another solution?
Can I customize avio_open2() and avformat_open_input()?
Please help me.
Set AVFormatContext->pb to a self-created AVIOContext that wraps your memory stream. Most important are the read_packet() and seek() callbacks, which your application implements to do the actual packet reading and seeking for the (MP4) demuxer. You can also look at earlier questions along the same lines.
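To make that concrete, here is a minimal sketch, assuming the decrypted MP4 already sits in a single memory buffer. MemoryStream, read_packet(), seek_packet() and open_from_memory() are illustrative names, not FFmpeg API; only the avio_*/avformat_* calls are real.

    #include <libavformat/avformat.h>
    #include <libavutil/mem.h>
    #include <stdio.h>
    #include <string.h>

    typedef struct MemoryStream {
        const uint8_t *data;    // decrypted MP4 bytes
        size_t size;
        size_t pos;
    } MemoryStream;

    // Read callback: hand the demuxer up to buf_size bytes from the buffer.
    static int read_packet(void *opaque, uint8_t *buf, int buf_size) {
        MemoryStream *ms = opaque;
        size_t left = ms->size - ms->pos;
        if (left == 0)
            return AVERROR_EOF;
        int n = buf_size;
        if ((size_t)n > left)
            n = (int)left;
        memcpy(buf, ms->data + ms->pos, n);
        ms->pos += n;
        return n;
    }

    // Seek callback: move the read position; AVSEEK_SIZE reports total size.
    static int64_t seek_packet(void *opaque, int64_t offset, int whence) {
        MemoryStream *ms = opaque;
        if (whence == AVSEEK_SIZE)
            return (int64_t)ms->size;
        if (whence == SEEK_SET)      ms->pos = (size_t)offset;
        else if (whence == SEEK_CUR) ms->pos += (size_t)offset;
        else if (whence == SEEK_END) ms->pos = ms->size + (size_t)offset;
        return (int64_t)ms->pos;
    }

    AVFormatContext *open_from_memory(MemoryStream *ms) {
        const int io_buf_size = 32 * 1024;
        uint8_t *io_buf = av_malloc(io_buf_size);
        AVIOContext *avio = avio_alloc_context(io_buf, io_buf_size,
                                               0 /* read-only */, ms,
                                               read_packet, NULL, seek_packet);

        AVFormatContext *fmt = avformat_alloc_context();
        fmt->pb = avio;                      // hand the demuxer our I/O layer
        fmt->flags |= AVFMT_FLAG_CUSTOM_IO;

        // NULL filename: all input goes through the custom AVIOContext.
        if (avformat_open_input(&fmt, NULL, NULL, NULL) < 0)
            return NULL;
        avformat_find_stream_info(fmt, NULL);
        return fmt;
    }

From the returned context the app can demux and decode as usual. If the buffer is still being filled while playback runs, read_packet() would need to wait for more decrypted bytes instead of returning AVERROR_EOF.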

iOS live streaming

Problem:
I need to download an MP4 video from a media server and simultaneously feed a localhost HTTP server (GCDWebServer) so the video can be streamed.
What I have tried so far:
I am able to download the video file from the media server and then provide the localhost URL for streaming it.
But the download takes a lot of time, so I want to know whether there is a way to segment the video on the fly and keep updating the m3u8 file as each new segment becomes ready.
Why download at all? Basically I have to deal with a custom encrypted video file, so I need to download it, decrypt the stream, and then feed it to my local HTTP server.
I hope the requirements are clear.
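The thread doesn't contain an answer, but one possible way to sketch the on-the-fly segmenting described above is libavformat's hls muxer, which writes .ts segments and rewrites the playlist as packets arrive, so GCDWebServer could serve that directory while the download and decryption are still running. This is an assumption on my part, not something from the original post; the input context is assumed to come from whatever demuxes the decrypted data (for example a custom AVIOContext as in the earlier answer), and the option values are illustrative.

    // Sketch: stream-copy packets from an already-opened input context into
    // FFmpeg's "hls" muxer, which cuts .ts segments and keeps the .m3u8
    // playlist updated as it goes. playlist_path is a placeholder.
    #include <libavformat/avformat.h>

    int segment_to_hls(AVFormatContext *in, const char *playlist_path) {
        AVFormatContext *out = NULL;
        avformat_alloc_output_context2(&out, NULL, "hls", playlist_path);
        if (!out)
            return AVERROR(ENOMEM);

        // Mirror every input stream in the output (stream copy, no re-encode).
        for (unsigned i = 0; i < in->nb_streams; i++) {
            AVStream *os = avformat_new_stream(out, NULL);
            avcodec_parameters_copy(os->codecpar, in->streams[i]->codecpar);
            os->codecpar->codec_tag = 0;
        }

        AVDictionary *opts = NULL;
        av_dict_set(&opts, "hls_time", "6", 0);       // target ~6 s segments
        av_dict_set(&opts, "hls_list_size", "0", 0);  // keep all segments listed
        int ret = avformat_write_header(out, &opts);  // hls muxer opens its own files
        av_dict_free(&opts);
        if (ret < 0)
            return ret;

        AVPacket pkt;
        while (av_read_frame(in, &pkt) >= 0) {
            av_packet_rescale_ts(&pkt,
                                 in->streams[pkt.stream_index]->time_base,
                                 out->streams[pkt.stream_index]->time_base);
            av_interleaved_write_frame(out, &pkt);    // rolls segments + m3u8
            av_packet_unref(&pkt);
        }
        av_write_trailer(out);
        avformat_free_context(out);
        return 0;
    }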

How to play video and audio extracted from an MKV file in a Metro app

I have managed to read the Matroska container and extract the video and audio streams from an MKV file in my Metro app. Now I don't know how to feed those streams into playback; I want to understand the concept of getting media data onto the display. Another option is to repack the video and audio (plus subtitles) into an MP4 container, which MediaElement will play by default, but that wouldn't really amount to having Matroska support in my Metro app.
So basically my question is: how do I use MediaElement, or any playback graph, to play the video and audio pulled out of an arbitrary video container?
Please guide me. Thanks.

How to play streamed audio on iPhone/iPad

I want to play sound from an internet server in my own program, but the sample code Apple supplies for audio playback all opens an audio file and then plays it.
I want to know how I can play PCM data from memory that is received continuously from the internet. Either OpenAL or Audio Queue is fine.
Give these a look:
http://cocoawithlove.com/2008/09/streaming-and-playing-live-mp3-stream.html
http://developer.apple.com/iphone/library/documentation/iPhone/Conceptual/iPhoneOSProgrammingGuide/AudioandVideoTechnologies/AudioandVideoTechnologies.html#//apple_ref/doc/uid/TP40007072-CH19-SW8
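For the "PCM from memory" part, here is a minimal Audio Queue sketch. It assumes 16-bit, 44.1 kHz stereo PCM and a hypothetical pull_pcm() function that hands back whatever bytes have arrived from the network so far; neither of those is in the original question.

    // Minimal sketch: an output Audio Queue whose buffers are refilled from
    // a network-fed PCM buffer (pull_pcm() is a placeholder you implement).
    #include <AudioToolbox/AudioToolbox.h>
    #include <string.h>

    extern size_t pull_pcm(void *dst, size_t max_bytes);   // your network buffer

    static void fill_buffer(void *user, AudioQueueRef q, AudioQueueBufferRef buf) {
        size_t n = pull_pcm(buf->mAudioData, buf->mAudioDataBytesCapacity);
        if (n == 0) {                       // nothing buffered yet: play silence
            n = buf->mAudioDataBytesCapacity;
            memset(buf->mAudioData, 0, n);
        }
        buf->mAudioDataByteSize = (UInt32)n;
        AudioQueueEnqueueBuffer(q, buf, 0, NULL);
    }

    AudioQueueRef start_pcm_playback(void) {
        // Describe the incoming PCM: 44.1 kHz, 16-bit signed, interleaved stereo.
        AudioStreamBasicDescription fmt = {0};
        fmt.mSampleRate       = 44100;
        fmt.mFormatID         = kAudioFormatLinearPCM;
        fmt.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                              | kLinearPCMFormatFlagIsPacked;
        fmt.mChannelsPerFrame = 2;
        fmt.mBitsPerChannel   = 16;
        fmt.mBytesPerFrame    = fmt.mChannelsPerFrame * fmt.mBitsPerChannel / 8;
        fmt.mFramesPerPacket  = 1;
        fmt.mBytesPerPacket   = fmt.mBytesPerFrame;

        AudioQueueRef queue;
        AudioQueueNewOutput(&fmt, fill_buffer, NULL, NULL, NULL, 0, &queue);

        // Prime a few buffers so playback has data immediately.
        for (int i = 0; i < 3; i++) {
            AudioQueueBufferRef buf;
            AudioQueueAllocateBuffer(queue, 32 * 1024, &buf);
            fill_buffer(NULL, queue, buf);
        }
        AudioQueueStart(queue, NULL);
        return queue;
    }

The queue keeps calling fill_buffer() as each buffer finishes playing, so as long as pull_pcm() keeps returning freshly received data, playback continues without ever touching a file.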
