I'm essentially working as a junior software engineer, and I've been tasked with creating an iPad application capable of receiving a real-time video stream from a UAV. The camera hardware hasn't been determined yet, so I need to put together a spec that will work with my iOS app.
However, I feel massively underqualified to do so: I don't have much understanding of media streaming, nor the iOS experience needed to write a decoder to play the video data - I started with Obj-C about 4 months ago when I joined the company.
I apologise if this isn't the best outlet for such a request, but could anyone shed some light on the process of receiving low-latency video streams, or is it far beyond my current ability?
Thanks for any advice!
If you need to play the video you are receiving or do any playback operations, this should be what you are looking for: Writing an app to stream video to iPhone
You can also record the video; see the documentation from Apple.
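For a quick sense of the playback side, a minimal sketch using MPMoviePlayerController to play an HTTP Live Stream could look like the following (the URL is a placeholder for whatever endpoint your camera/encoder ends up exposing, and this assumes you're inside a view controller):

    #import <MediaPlayer/MediaPlayer.h>

    // Minimal sketch: play an HTTP Live Stream with MPMoviePlayerController.
    // The URL below is a placeholder, not a real endpoint.
    NSURL *streamURL = [NSURL URLWithString:@"http://example.com/uav/stream.m3u8"];
    MPMoviePlayerController *player =
        [[MPMoviePlayerController alloc] initWithContentURL:streamURL];
    player.movieSourceType = MPMovieSourceTypeStreaming;
    player.view.frame = self.view.bounds;   // assumes a view controller context
    [self.view addSubview:player.view];
    [player play];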
I'm working on an app that connects to a security camera. The camera has its own SIP server (Asterisk).
I'm having a very hard time finding a reliable iOS library to connect to the camera.
Can anyone recommend a high-quality SIP library that will stream video? I've tried several so far and none of them are fit for the task (I don't want to mention them by name).
Or is there another way to access the video (using WebRTC, or possibly AVFoundation via the Asterisk server)?
I do not have a lot of experience with hardware, so I'm a bit lost.
What you are looking for is called an MCU (Multipoint Control Unit). There are some free ones available for video, but they are all early beta and very hard to set up.
Just from my own general curiosity, I was wanting to see if it is possible to stream LIVE audio between iOS devices, either over Bluetooth or by "enslaving" one device over a local network - basically the same experience as a phone/Skype call. I have found tutorials/information on how to stream a saved file, and solutions that use a server-side component, but not exactly what I am looking for.
Any solutions, information, or whatnot to get me started would be much appreciated.
I'm having some trouble working with the iOS Audio frameworks to create a simple app. I would like to record audio through the Microphone and play it back to the user while recording.
I have tried each of the audio framework layers (AVFoundation, the Audio Queue API, and RemoteIO), but have only found old documentation and broken examples. It seems like a simple request that AVFoundation should handle, but I have explored the following other SO questions and still find myself circling for hours trying to get the hang of this. Here is what I have reviewed:
iOS: Sample code for simultaneous record and playback (other SO users also state that the accepted answer is not concrete and is difficult to implement, even with a delay of ~70 ms).
Record and play audio Simultaneously (from 2010 and very high-level; I downloaded the sample code and couldn't find a working example that does simultaneous playback and recording).
Adjust the length of an AudioUnit Buffer (RemoteIO is so confusing to me right now, is this really required?)
I have also downloaded and reviewed both the SpeakHere and AurioTouch sample projects from Apple. I promise I wouldn't post without hours of Googling and struggling - you can see that "record audio and playback iOS simultaneously" returns many dated and non-working examples. I know that I and the community could really benefit from some updated documentation and examples in the audio section. RemoteIO seems too advanced for such a simple task. Thanks again for your help and consideration.
The appropriate way to do this is via the AudioUnit APIs, even though it seems like a common scenario that higher-level APIs should handle.
I wrote a small demo app using AudioUnit. You're free to try it out and modify it to suit your purpose. The demo app does record audio and play it back simultaneously, but it's recommended to use earphones to hear the effect.
The RemoteIO Audio Unit is the only way to play back what is being recorded with low latency. RemoteIO is low latency because it runs its audio callbacks on a separate dedicated real-time thread, which is why it is fast, but also why it is a bit more complex to code. All the other iOS audio APIs are built on top of RemoteIO and thus add latency.
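To make that concrete, here is a rough pass-through sketch built on the RemoteIO unit: the render callback pulls whatever the microphone just captured straight into the output buffers. Mono 16-bit 44.1 kHz is assumed, SetupRemoteIO is my own name, error checking is omitted, and the audio session is assumed to be configured for play-and-record already (see below):

    #import <AudioUnit/AudioUnit.h>

    // Output bus 0 asks for samples; we satisfy it by rendering from input
    // bus 1 (the microphone), giving immediate monitoring of the recording.
    static OSStatus PassThroughCallback(void *inRefCon,
                                        AudioUnitRenderActionFlags *ioActionFlags,
                                        const AudioTimeStamp *inTimeStamp,
                                        UInt32 inBusNumber,
                                        UInt32 inNumberFrames,
                                        AudioBufferList *ioData)
    {
        AudioUnit rio = *(AudioUnit *)inRefCon;
        return AudioUnitRender(rio, ioActionFlags, inTimeStamp,
                               1 /* input bus */, inNumberFrames, ioData);
    }

    // rio must point at long-lived storage (e.g. an ivar), since the
    // callback dereferences it for the lifetime of the unit.
    void SetupRemoteIO(AudioUnit *rio)
    {
        AudioComponentDescription desc = {
            .componentType         = kAudioUnitType_Output,
            .componentSubType      = kAudioUnitSubType_RemoteIO,
            .componentManufacturer = kAudioUnitManufacturer_Apple,
        };
        AudioComponentInstanceNew(AudioComponentFindNext(NULL, &desc), rio);

        // Recording on bus 1 is off by default; playback on bus 0 is on.
        UInt32 one = 1;
        AudioUnitSetProperty(*rio, kAudioOutputUnitProperty_EnableIO,
                             kAudioUnitScope_Input, 1, &one, sizeof(one));

        // One canonical format on both sides: mono, 16-bit, 44.1 kHz PCM.
        AudioStreamBasicDescription fmt = {
            .mSampleRate       = 44100.0,
            .mFormatID         = kAudioFormatLinearPCM,
            .mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked,
            .mChannelsPerFrame = 1,
            .mBitsPerChannel   = 16,
            .mBytesPerFrame    = 2,
            .mBytesPerPacket   = 2,
            .mFramesPerPacket  = 1,
        };
        AudioUnitSetProperty(*rio, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Output, 1, &fmt, sizeof(fmt));
        AudioUnitSetProperty(*rio, kAudioUnitProperty_StreamFormat,
                             kAudioUnitScope_Input, 0, &fmt, sizeof(fmt));

        AURenderCallbackStruct cb = { PassThroughCallback, rio };
        AudioUnitSetProperty(*rio, kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input, 0, &cb, sizeof(cb));

        AudioUnitInitialize(*rio);
        AudioOutputUnitStart(*rio);
    }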
You will also need to configure the app's Audio Session APIs to request low latency with the appropriate audio session type. The foreground app can request and get audio input and output latencies as low as 5.6 milliseconds on most iOS devices most of the time.
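Something along these lines should do for the session setup (the 0.005 s is just a request; the system reports back what it actually granted):

    #import <AVFoundation/AVFoundation.h>

    AVAudioSession *session = [AVAudioSession sharedInstance];
    NSError *error = nil;
    // PlayAndRecord is required to have input and output open at the same time.
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    // Ask for ~5 ms I/O buffers; the hardware may round this up.
    [session setPreferredIOBufferDuration:0.005 error:&error];
    [session setActive:YES error:&error];
    NSLog(@"Granted I/O buffer duration: %f s", session.IOBufferDuration);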
After a lot of searching on the web, I am going to ask my question here:
I want to stream audio between iPods with no (or minimal) buffering. I am thinking of something like RTP or WebRTC for iOS.
The app I will develop enables one iPod to capture the microphone and stream the audio to the other 50-100 iPods (broadcast), but all of this has to be LIVE.
One speaker's talk is recorded live, and the other iPods can hear the speaker at the same time with very little delay...
Is such a thing possible? Are there frameworks or software existing to handle this? I prefer Objective-C and HTML5.
I would be really happy to get any help. Thanks in advance.
I'd like to stream video from the camera on an iOS device to a receiver via wifi, in effect turning the device into a wireless webcam. Is there a way to build a small app that captures video input on an iOS device and sends it out via an RTSP stream or similar?
As this is an ad hoc experiment, I'm not concerned about App Store guidelines and can jailbreak if necessary.
If I interpret your question correctly, you more or less need to solve four problems:
Get the camera feed.
Convert/encode this to the right format.
Stream the data.
Prevent the phone from locking itself and going into deep sleep.
The first one is fairly simple, and Apple has, as always, provided good documentation and examples -> API link. Make sure you check out their example at the end, as it hands you back a CMSampleBufferRef data object for each frame.
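As a rough sketch of that first step (the class name and queue label are mine; error handling and canAddInput:/canAddOutput: checks are omitted):

    #import <AVFoundation/AVFoundation.h>

    @interface CameraGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation CameraGrabber

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];

        AVCaptureDevice *camera =
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
        [self.session addInput:input];

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        dispatch_queue_t queue = dispatch_queue_create("camera.frames", NULL);
        [output setSampleBufferDelegate:self queue:queue];
        [self.session addOutput:output];

        [self.session startRunning];
    }

    // Every captured frame lands here as a CMSampleBufferRef, which is
    // the object you then encode and hand to the networking layer.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        // encode / enqueue sampleBuffer for streaming
    }

    @end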
For the second and third parts, you should check out the CFNetwork framework, and especially CFFTPStream for streaming over FTP.
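As a hedged sketch of what the CFFTPStream side could look like (server address, path, and credentials are placeholders; a real implementation would write from a has-space-available stream callback rather than blocking like this):

    #import <CFNetwork/CFNetwork.h>

    CFURLRef url = CFURLCreateWithString(kCFAllocatorDefault,
        CFSTR("ftp://192.168.1.10/upload/frame0001.jpg"), NULL);  // placeholder
    CFWriteStreamRef ftp = CFWriteStreamCreateWithFTPURL(kCFAllocatorDefault, url);
    CFWriteStreamSetProperty(ftp, kCFStreamPropertyFTPUserName, CFSTR("user"));
    CFWriteStreamSetProperty(ftp, kCFStreamPropertyFTPPassword, CFSTR("secret"));

    if (CFWriteStreamOpen(ftp)) {
        // Write the encoded frame's bytes as they become available.
        const UInt8 bytes[] = { 0x00 };  // stand-in for real encoded data
        CFWriteStreamWrite(ftp, bytes, sizeof(bytes));
        CFWriteStreamClose(ftp);
    }
    CFRelease(ftp);
    CFRelease(url);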
If you are only building this for yourself, you can always turn off the Auto-Lock feature in Settings. If, on the other hand, you would like to distribute it to other users, you could use the trick of playing a muted sound every 10 seconds. This is more or less how all the alarm clock apps in the App Store work. Here's a tutorial. =)
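For completeness, there is also a documented flag that keeps the device awake while your app is frontmost, which may be enough instead of the sound trick:

    #import <UIKit/UIKit.h>

    // Disable the idle timer so the screen never auto-locks while streaming.
    [UIApplication sharedApplication].idleTimerDisabled = YES;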
I hope I helped a little bit at least.
Good luck and best regards!
I'm 70% of the way to doing the same thing. Here's how I did it:
Capture content from video input
Chop the video into files for use in HTTP Live Streaming (see the playlist sketch after this list).
Spin up a web server on the iPhone and make the video files available.
Connect to the IP address of the phone and voila! You've got live streaming video.
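For reference, the index file the segmenter produces in step 2 is just a plain-text .m3u8 playlist along these lines (segment names and durations are made up; for a live stream you keep appending entries and omit the end tag):

    #EXTM3U
    #EXT-X-VERSION:3
    #EXT-X-TARGETDURATION:10
    #EXT-X-MEDIA-SEQUENCE:0
    #EXTINF:10.0,
    segment0.ts
    #EXTINF:10.0,
    segment1.ts
    #EXT-X-ENDLIST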
The last time I touched the code, I was trying to debug why my live streaming wasn't working. I'll try to get my source code posted on GitHub this weekend if you'd like to take a look.