How to play video data coming from a server in iOS

I am working on an iOS application that needs to show video coming from a server. The video data arrives in packets and is in H.264 encoded format. After looking through the AVFoundation options I didn't find anything that plays video data received in packets from a socket connection. If someone could point me in the right direction it would be very helpful. I have been stuck on this for a couple of days.
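One direction worth investigating is decoding the incoming H.264 NAL units yourself with VideoToolbox and rendering the resulting pixel buffers. Below is a minimal sketch, assuming you already de-packetize the SPS/PPS and frame NAL units from your socket protocol; `spsData`, `ppsData` and the renderer hook are placeholders, not part of any answer in the thread.

```c
// Minimal VideoToolbox sketch: build a format description from SPS/PPS and
// create a decompression session. spsData/ppsData/spsSize/ppsSize are
// placeholders for buffers extracted from the incoming packet stream.
#include <VideoToolbox/VideoToolbox.h>
#include <CoreMedia/CoreMedia.h>

static void didDecompress(void *decompressionOutputRefCon,
                          void *sourceFrameRefCon,
                          OSStatus status,
                          VTDecodeInfoFlags infoFlags,
                          CVImageBufferRef imageBuffer,
                          CMTime presentationTimeStamp,
                          CMTime presentationDuration)
{
    if (status != noErr || imageBuffer == NULL) return;
    // Hand the CVPixelBuffer to your renderer (e.g. an OpenGL ES texture).
}

static VTDecompressionSessionRef createSession(const uint8_t *spsData, size_t spsSize,
                                               const uint8_t *ppsData, size_t ppsSize,
                                               CMVideoFormatDescriptionRef *formatOut)
{
    const uint8_t *const parameterSets[2] = { spsData, ppsData };
    const size_t parameterSetSizes[2] = { spsSize, ppsSize };

    // 4-byte NAL length headers (AVCC); convert Annex B start codes accordingly.
    OSStatus status = CMVideoFormatDescriptionCreateFromH264ParameterSets(
        kCFAllocatorDefault, 2, parameterSets, parameterSetSizes, 4, formatOut);
    if (status != noErr) return NULL;

    VTDecompressionOutputCallbackRecord callback = { didDecompress, NULL };
    VTDecompressionSessionRef session = NULL;
    status = VTDecompressionSessionCreate(kCFAllocatorDefault, *formatOut,
                                          NULL, NULL, &callback, &session);
    return (status == noErr) ? session : NULL;
}

// For every subsequent video NAL unit: replace its Annex B start code with a
// 4-byte big-endian length, wrap the bytes in a CMBlockBuffer + CMSampleBuffer
// (CMBlockBufferCreateWithMemoryBlock / CMSampleBufferCreate) and pass it to
// VTDecompressionSessionDecodeFrame(session, sampleBuffer, 0, NULL, NULL).
```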

Related

iOS RTP live audio receiving

I'm trying to receive a live RTP audio stream on my iPhone but I don't know how to start. I'm looking for some samples but I can't find them anywhere.
I have a Windows desktop app which captures audio from the selected audio interface and streams it as µ-law or a-law. This app works as an audio server that serves any incoming connection with that stream. I should mention that I've developed an Android app that receives the stream and works, so I want to replicate this functionality on iOS. On Android we have the "android.net.rtp" package to manage this and transmit or receive data streams over the network.
Is there any kind of equivalent package for iOS to implement this? Could you give me any reference or sample for doing this, or just tell me where to start?
Have a look at the HTTPLiveStreaming library, although its protocol may not be a standard one. You can also check my fork, aelam/HTTPLiveStreaming-1; I'm still working on it, but it can be played by ffplay. You can try it.
Check the file rtp.c in ffmpeg; I think it will help you out.
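Independent of any particular library, the RTP framing itself (RFC 3550) is simple: a fixed 12-byte header precedes the payload, and the static payload types for the formats mentioned above are 0 (PCMU, µ-law) and 8 (PCMA, a-law). A minimal parsing sketch, with illustrative struct and function names of my own:

```c
// Minimal RFC 3550 RTP header parser, e.g. for the µ-law/a-law stream
// described above. Struct and function names are illustrative only.
#include <stdint.h>
#include <stddef.h>
#include <string.h>
#include <arpa/inet.h> /* ntohs, ntohl */

typedef struct {
    uint8_t  version;        /* should be 2 */
    uint8_t  padding;
    uint8_t  extension;
    uint8_t  csrc_count;
    uint8_t  marker;
    uint8_t  payload_type;   /* 0 = PCMU (µ-law), 8 = PCMA (a-law) */
    uint16_t sequence;
    uint32_t timestamp;
    uint32_t ssrc;
    const uint8_t *payload;  /* points into the datagram */
    size_t   payload_len;
} rtp_packet_t;

/* Returns 0 on success, -1 if the datagram is too short or malformed. */
static int rtp_parse(const uint8_t *buf, size_t len, rtp_packet_t *out)
{
    if (len < 12) return -1;

    out->version      = buf[0] >> 6;
    out->padding      = (buf[0] >> 5) & 0x01;
    out->extension    = (buf[0] >> 4) & 0x01;
    out->csrc_count   = buf[0] & 0x0F;
    out->marker       = buf[1] >> 7;
    out->payload_type = buf[1] & 0x7F;

    uint16_t seq; uint32_t ts, ssrc;
    memcpy(&seq,  buf + 2, 2);
    memcpy(&ts,   buf + 4, 4);
    memcpy(&ssrc, buf + 8, 4);
    out->sequence  = ntohs(seq);
    out->timestamp = ntohl(ts);
    out->ssrc      = ntohl(ssrc);

    size_t header_len = 12 + (size_t)out->csrc_count * 4;
    if (out->version != 2 || len < header_len) return -1;

    /* A header extension, if present, would also need to be skipped here. */
    out->payload     = buf + header_len;
    out->payload_len = len - header_len;
    return 0;
}
```

Once the payload bytes are isolated, the µ-law or a-law samples can be decoded and pushed into an AudioQueue or AudioUnit for playback.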

iOS MobileVLCKit and VideoCore conflict

I'm using MobileVLCKit to stream video and audio from a Wowza RTMP server. At the same time I'm using VideoCore to stream audio to the Wowza RTMP server (I closed off the video channel in VideoCore). I'm attempting to make this into a sort of teleconferencing solution. I'm limited to RTMP or RTSP rather than a proper teleconferencing solution (WebRTC or SIP or the like; I am not familiar with these at the moment) because of the limitations on the other end of the line.
The above setup doesn't work. Running either function on its own (streaming video and audio down, or streaming audio up) works fine, but not when they run simultaneously: audio cannot be heard on the other end. In fact, when the app starts with VideoCore streaming audio upstream, as soon as I start the downstream via MobileVLCKit, the audio can no longer be heard on the other end, even though the stream is open. It appears the microphone is somehow wrested away from VideoCore, even though MobileVLCKit should not need the microphone.
However, when I split the two functions into two apps and allowed them to run in the background (audio & AirPlay background mode), the two ran fine together, with one app streaming down video and audio and the other picking up microphone input and streaming it to the other end.
Is there any reason why the two functions appear to be in conflict within the same app, and any ideas on how to resolve the conflict?
I encountered the same problem. Say I have two objects: a VLC player, and an audio processor which listens to the microphone. Operating both at the same time works fine in the simulator, but they conflict on an iPhone device. I think the root cause is that only one of them can hold the right to listen to the microphone, and VLC occupies that right, so my audio processor cannot work. For various reasons I cannot modify the VLC code, so I had to figure out a workaround, and I found one.
The problem comes from VLC occupying the right without even using the microphone, while my audio processor actually does use it. So the way forward is clear: let the VLC player start first, and only then create the other object instance (the audio processor in my case) that needs to listen to the microphone. Since the audio processor comes after the VLC player, it takes back the right to listen to the microphone, and both work properly.
For your reference; I hope it can help you.

iOS audio input and output via RTP stream

I want to create an app which can play sound received via an RTP stream, and capture sound from the microphone and send it via an RTP stream. The problem is that I have no idea where to start. Can someone suggest a library for RTP streaming on iOS devices? I took a look at Live555, but I didn't understand how it works.
Thanks

How to display RTSP from IP Camera/CCTV in iOS

There is obviously a way to do this because so many applications are already doing it; NetCamViewer and iCamviewer to name just two.
I have searched and searched, but I'm not finding anything of value that gives a hint as to how this is done. I'm reaching out hoping that someone will give me a clue.
I'm trying to connect to a video security camera (Y-CAM), which supports the RTSP protocol, and display the video from my iPhone/iPad application. The camera has an IP address and I can view the video from a web browser and from QuickTime running on my Mac. The problem is that RTSP is not supported on iOS, so even trying to connect using Safari on an iPad doesn't work.
I've read that some are trying to use Live555, but I haven't seen an article that describes whether it has been done successfully and how.
An alternative is to capture the RTSP stream on a server, convert it to an HTTP Live stream and then connect to the HTTP Live stream from iOS. Unfortunately, this hasn't proved as easy as it sounds.
I'd prefer to go directly to the camera like the other applications I've seen do. The RTSP-to-HTTP-Live conversion is a fallback if I have to.
Any hints are greatly appreciated. Thanks!
This is wrong :) or rather, not necessary: there is no need to capture the RTSP stream on a server, convert it to an HTTP Live stream and then connect to that stream from iOS.
You should use the ffmpeg library, as it can connect to any streaming server (supporting RTSP, MMS, TCP, UDP, RTMP, ...) and then draw pictures to the screen (for drawing you can use OpenGL ES, or a UIImage also works).
First of all, use avformat_open_input to connect to your camera's IP address,
then use avcodec_find_decoder and avcodec_open2 to find and open the codecs (you should call them for both audio and video).
Then, in a while loop, read packets from the server using av_read_frame.
When you get a frame, if it is audio, send it to an AudioUnit or AudioQueue;
if it is video, convert it from YUV to RGB format using sws_scale and draw the picture to the screen.
That's all.
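A rough sketch of that flow, video path only, using the current FFmpeg send/receive decode API; error handling and the audio branch are left out for brevity:

```c
// Sketch of the open -> find decoder -> read -> decode -> sws_scale flow
// described above, video path only.
#include <libavformat/avformat.h>
#include <libavcodec/avcodec.h>
#include <libswscale/swscale.h>

static void play_stream(const char *url)
{
    AVFormatContext *fmt = NULL;
    if (avformat_open_input(&fmt, url, NULL, NULL) < 0) return;      /* step 1 */
    if (avformat_find_stream_info(fmt, NULL) < 0) return;

    int video_idx = av_find_best_stream(fmt, AVMEDIA_TYPE_VIDEO, -1, -1, NULL, 0);
    if (video_idx < 0) return;

    AVCodecParameters *par = fmt->streams[video_idx]->codecpar;
    const AVCodec *dec = avcodec_find_decoder(par->codec_id);         /* step 2 */
    AVCodecContext *ctx = avcodec_alloc_context3(dec);
    avcodec_parameters_to_context(ctx, par);
    avcodec_open2(ctx, dec, NULL);

    struct SwsContext *sws = NULL;
    AVPacket *pkt = av_packet_alloc();
    AVFrame *frame = av_frame_alloc();

    while (av_read_frame(fmt, pkt) >= 0) {                            /* step 3 */
        if (pkt->stream_index == video_idx &&
            avcodec_send_packet(ctx, pkt) >= 0) {
            while (avcodec_receive_frame(ctx, frame) >= 0) {
                if (!sws)
                    sws = sws_getContext(ctx->width, ctx->height, ctx->pix_fmt,
                                         ctx->width, ctx->height, AV_PIX_FMT_RGBA,
                                         SWS_BILINEAR, NULL, NULL, NULL);
                /* step 4: call sws_scale(sws, frame->data, frame->linesize, 0,
                   ctx->height, rgb_planes, rgb_linesize) into an RGBA buffer,
                   then draw it with OpenGL ES or wrap it in a UIImage.
                   Audio packets would go to an AudioUnit/AudioQueue instead. */
            }
        }
        av_packet_unref(pkt);
    }

    av_frame_free(&frame);
    av_packet_free(&pkt);
    sws_freeContext(sws);
    avcodec_free_context(&ctx);
    avformat_close_input(&fmt);
}
```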
Also look at this wrapper (http://www.videostreamsdk.com); it's built on the ffmpeg library and supports iOS.
You really need to search Stack Overflow before posting; this question has been asked many times. Yes, Live555 sort of works, and some of us have gotten it to work.
There are other players too, including ours: http://www.streammore.tv/
You can find an open source FFmpeg decoder for iOS (and some samples) on GitHub: https://github.com/mooncatventures-group
A sample use of this library: http://sol3.typepad.com/exotic_particles/
There are two general technologies for displaying RTSP video in iOS Safari:
RTSP / HLS (H.264+AAC)
RTSP / Websocket (H.264+AAC ==> MPEG+G.711 or H.264+?)
For HLS you can consider Wowza server.
For Websocket playback in iOS Safari you can use WCS4 server.
The main idea behind websocket playback is direct HTML5 rendering to an HTML page Canvas element and to the web audio context. In the case of MPEG playback, video decoding is done on the iOS Safari side using plain JavaScript.
Another option is to install a WebRTC plugin with getUserMedia support and play the stream via WebRTC. In that case you will still need a server-side RTSP-to-WebRTC transcoder.

Capture video on iOS device and live stream it to a server (or another mobile)

I want to be able to record footage using my iOS device and stream it directly to a server.
There are quite a few articles on S.O. that talk about this, but I'm not sure any have answered the question very well.
Should I be using HTTP Live Streaming, or is this just for sending data to an iPhone?
Should I be using AVCaptureSession to grab the video (a segment at a time?), sending each segment to the server?
Should I be using AVCaptureVideoDataOutput and ffmpeg for streaming?
I'm a little lost with all this, so any sample code or docs or links would be really appreciated.
Thanks for your help guys.
Duncan
You have to choose a network protocol for that purpose and find an appropriate media server to receive and process the stream. If the RTMP format is OK for your project, check the angl library, which supports RTMP streaming from iOS. Currently it's compatible with iOS 6 and 7.