I'm working on an app that connects to a security camera. The camera has its own SIP server (Asterisk).
I'm having a very hard time finding a reliable iOS library to connect to the camera.
Can anyone recommend a high-quality SIP library that will stream video? I've tried several so far and none of them are fit for the task (I don't want to mention them by name).
Or is there another way to access the video (using WebRTC, or possibly AVFoundation via the Asterisk server)?
I do not have a lot of experience with hardware, so I'm a bit lost.
What you are looking for is called an MCU (Multipoint Control Unit). There are some free ones available for video, but they are all in early beta and very hard to set up.
I want to build an iOS application with which people can make video or audio calls to each other. Stable calling is my goal, which means I need as few connection interruptions as possible. I also need the application to be lightweight (not too large an app size because of video libraries).
I've been googling "ios video chat" for the last few days. From that research, the most popular frameworks (technologies, libraries) for video/audio calling appear to be XMPP and WebRTC (am I right, or do you have something better?)
XMPP - Client/server TCP communication
WebRTC - P2P Connection
The information about these libraries confuses me, so which one should I use for good performance, a lightweight application, and stability?
Any ideas?
XMPP is about signaling (reaching from A to B, indicating the desire to have a "call", disconnecting, etc).
WebRTC is about media (actually sending voice and video).
You need both signaling and media in your app.
For media, use WebRTC. There's nothing else that will make sense. On iOS it is kind of tricky at the moment, as iOS 11 already incorporates WebRTC (in Safari/WebKit), so how this will apply to and help your development is yet to be seen (see here).
My suggestion is to aim for a web app and then figure out if you need to go for a fully native implementation and port WebRTC to iOS - or just use a webview inside an app (Cordova or Crosswalk should do).
For signaling, you can use XMPP. Or anything else for that matter. My own personal preference is a proprietary protocol. Look at Matrix or SimpleWebRTC for that.
Also, don't forget that you will need to deal with STUN and TURN for NAT traversal, but that's a simpler thing to handle.
XMPP Framework: https://github.com/robbiehanson/XMPPFramework/wiki/IntroToFramework
WebRTC Native Code: https://webrtc.org/native-code/ios/
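To make the media side concrete, here is a minimal sketch of bringing up a peer connection with the WebRTC iOS framework (assuming the GoogleWebRTC CocoaPod; the exact factory methods vary a bit between releases). The signaling transport is whatever you choose, represented here by a hypothetical sendToPeer closure:

```swift
import WebRTC  // GoogleWebRTC CocoaPod; method names vary slightly between releases

// Minimal media-side sketch. sendToPeer is a hypothetical stand-in for
// whatever signaling channel you pick (XMPP, WebSocket, ...).
func startCall(factory: RTCPeerConnectionFactory,
               sendToPeer: @escaping (String) -> Void) -> RTCPeerConnection {
    let config = RTCConfiguration()
    // NAT traversal: a public STUN server; production apps normally add
    // their own TURN server (with credentials) as a relay fallback.
    config.iceServers = [RTCIceServer(urlStrings: ["stun:stun.l.google.com:19302"])]

    let constraints = RTCMediaConstraints(mandatoryConstraints: nil,
                                          optionalConstraints: nil)
    // A real app passes a delegate here to collect ICE candidates and
    // forward them over the same signaling channel.
    let pc = factory.peerConnection(with: config, constraints: constraints, delegate: nil)

    // Local media: one audio track and one video track in a stream.
    let stream = factory.mediaStream(withStreamId: "stream0")
    stream.addAudioTrack(factory.audioTrack(withTrackId: "audio0"))
    stream.addVideoTrack(factory.videoTrack(with: factory.videoSource(), trackId: "video0"))
    pc.add(stream)

    // Create the SDP offer and hand it to the signaling channel.
    pc.offer(for: constraints) { sdp, error in
        guard let sdp = sdp, error == nil else { return }
        pc.setLocalDescription(sdp) { _ in
            sendToPeer(sdp.sdp)  // deliver to the callee via XMPP etc.
        }
    }
    return pc
}
```

The callee's side does the mirror image: set the received offer as the remote description, create an answer, and send it back over the same signaling channel.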
It's not about which is best; it's about what fulfills your requirements.
Just out of general curiosity, I wanted to see whether it is possible to stream live audio between iOS devices, either over Bluetooth or by "enslaving" one device over a local network, giving basically the same experience as a phone/Skype call. I have found tutorials/information on how to stream a saved file, and solutions that rely on a server, but not exactly what I am looking for.
Anything to get me started, whether solutions or information, would be much appreciated.
A Little Background On Why I Have To Do This
I am currently optimising an app to improve the transfer of media files to the WiFi speakers that our team developed. Our previous solution used the iPhone as an HTTP server and let the speakers connect to it and download music. Unfortunately, a lot of problems occurred, such as frequently slow transfer speeds, file-read failures, and, when the user issues a "seek" command, the speakers having to download the whole file just to seek to that particular time before playback starts. This is a very bad experience for our users.
What I Need
To solve the problem mentioned above, we thought of replacing the HTTP server with an RTP server running on the iPhone, and letting the WiFi speakers stream music from it. However, from what I read on other Q&A platforms, the iPhone does not support transferring data over RTP. I also tried searching here on Stack Overflow but was not able to find an answer that solves my problem.
My Question
Is it possible to run an RTP server on iPhone and is there any demo about this that I can refer to?
Any suggestions would be highly appreciated.
Please take a look at this link: http://dss.macosforge.org/
It is the official Darwin Streaming Server from Apple.
However, I'm not sure it can work on iOS.
Best regards,
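One more note on the original seek problem: it usually comes from the HTTP server not honoring byte-range requests, not from HTTP itself. Before switching to RTP, it may be worth trying a server that supports Range headers, in which case the speakers can seek by requesting only the bytes they need. A minimal sketch using the third-party GCDWebServer library (my suggestion, not something from the original setup):

```swift
import Foundation
import GCDWebServer  // third-party library (CocoaPods); my suggested alternative

// Serve the app's Documents folder over HTTP with byte-range support,
// so a speaker can seek by requesting only the bytes it needs.
let musicDirectory = NSSearchPathForDirectoriesInDomains(
    .documentDirectory, .userDomainMask, true)[0]

let server = GCDWebServer()
server.addGETHandler(forBasePath: "/",
                     directoryPath: musicDirectory,
                     indexFilename: nil,
                     cacheAge: 0,
                     allowRangeRequests: true)  // the key part for seeking

if server.start(withPort: 8080, bonjourName: "MusicServer") {
    print("Serving music on port 8080")
}
```

If true low-latency streaming is still required, RTP packetization would, as far as I know, have to be implemented on top of raw sockets, since iOS ships no public RTP server API.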
I've been searching for a while on Stack Overflow and around the web for a solution to my video-streaming problem. I need to stream live video captured from the camera (no high quality required) from an iOS device to a remote PC, one-way only, i.e., the iOS device will be sending a video stream to the server/PC but not the other way around.
After some googling and documentation browsing, it appears there are two major standards/protocols that can be used:
Apple's HTTP Live Streaming (HLS)
Adobe's RTMP
Again, my requirement is that the iPhone/iPad will be streaming the video. From what appears on Apple's website, I understand that HLS is meant to be used with encoding on the server side and decoding on the iOS side. As for RTMP, most libraries that allow iOS streaming have commercial licenses and closed code, or require you to go through their P2P infrastructure (for instance angl.tv or tokbox.com/opentok/quick-start). As for HLS, no encoding libraries seem to exist on the iOS side.
So my questions are:
Do you know of any SDK/library, preferably open and free, that I could integrate to stream captured video from within my app?
If not, do you think developing a custom library would be a risky jungle-crossing endeavour? My guess is to go through AVFoundation, capture camera frames, compress them frame by frame, and send them over HTTP. Does that sound crazy performance- and bandwidth-wise? Note that in that case I would need an HLS or RTMP encoder either way.
I thank you very much in advance dear friends.
Mehdi.
I have developed such a library, and you can find it at github.com/jgh-/VideoCore
I am updating this answer because I have created a simplified iOS API that will allow you to easily set up a camera/mic RTMP session. You can find it at https://github.com/jgh-/VideoCore/blob/master/api/iOS/VCSimpleSession.h.
Additionally, VideoCore is now available in CocoaPods.
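For reference, a rough Swift sketch of what a VCSimpleSession setup looks like, based on that header (the RTMP URL and stream key are placeholders, and the Swift spellings are bridged from Objective-C, so treat them as approximate):

```swift
import UIKit
import VideoCore  // https://github.com/jgh-/VideoCore via CocoaPods

final class BroadcastViewController: UIViewController {
    var session: VCSimpleSession!

    override func viewDidLoad() {
        super.viewDidLoad()
        // 720p @ 30 fps at roughly 1 Mbps. Names are bridged from the
        // Objective-C header, so the exact Swift spelling may differ.
        session = VCSimpleSession(videoSize: CGSize(width: 1280, height: 720),
                                  frameRate: 30,
                                  bitrate: 1_000_000,
                                  useInterfaceOrientation: false)
        session.previewView.frame = view.bounds
        view.addSubview(session.previewView)

        // Placeholder RTMP endpoint and stream key.
        session.startRtmpSession(withURL: "rtmp://example.com/live",
                                 andStreamKey: "myStream")
    }
}
```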
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS app and sends it via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples (API link). Make sure you check out their example at the end, as it shows how you get a CMSampleBufferRef data object back.
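A short sketch of that first step, assuming modern Swift and AVFoundation (the class and queue names here are my own):

```swift
import AVFoundation

// Sketch of step 1: a capture session that hands you CMSampleBuffers
// from the camera; encoding and sending (steps 2 and 3) happen in the
// delegate callback.
final class CameraFeed: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()

    func start() throws {
        guard let camera = AVCaptureDevice.default(for: .video) else { return }
        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }

        session.startRunning()
    }

    // Called once per captured frame with a CMSampleBuffer.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // Encode/compress the frame here, then push it onto your stream.
    }
}
```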
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use the trick of playing a muted sound every 10 seconds; this is more or less how all the alarm-clock apps in the App Store work. Here's a tutorial. =)
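For the fourth problem there is also a supported one-liner: disabling the idle timer keeps the screen on and the device awake while your app is in the foreground (the muted-sound trick is only needed to keep running in the background):

```swift
import UIKit

// Supported alternative while the app stays in the foreground:
// disable the idle timer so the device never auto-locks.
UIApplication.shared.isIdleTimerDisabled = true
```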
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming (a sample playlist is sketched after this list).
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voilà! You've got live streaming video.
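To make step 2 concrete, the chopping produces short MPEG-TS segment files plus an .m3u8 index playlist, which is what the on-phone web server serves. A minimal playlist looks roughly like this (segment names and durations are placeholders; a live stream keeps appending segments and advancing #EXT-X-MEDIA-SEQUENCE instead of writing an #EXT-X-ENDLIST):

```
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:10
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:10.0,
segment0.ts
#EXTINF:10.0,
segment1.ts
#EXTINF:10.0,
segment2.ts
```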
Last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend, if you'd like to take a look.