External video input into iOS devices

I am working on an experimental project that needs to connect an external video camera to an iPhone.
I found out that we can connect an iPhone to an external interface like an Arduino using a Redpark cable, which ships with an SDK. But I am not sure how iOS would handle the raw data coming from the external camera.
I am wondering whether AVFoundation can handle this part, since it lets you specify the input device, but I am not sure how to point it at an external device.
Or is there another framework that can handle this task?
I am looking for a tutorial or sample project from which I can learn more about this.

The decoding you need to do depends entirely on the camera you will use.
But, given the data rate limitation of the serial cable you are considering, you will be practically limited to using a camera that can provide a low-bit-rate H.264 stream.
Decoding such a stream can be done with the ffmpeg library. Instructions for integrating it into an iOS project can be found in this SO question.
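On the AVFoundation part of the question: a camera attached over a serial cable will not show up as an AVCaptureDevice, so for that route you have to decode the stream yourself as described above. Purely for comparison, the sketch below shows how capture devices are enumerated when the system does expose the camera; note that the .external device type only exists on newer iOS/iPadOS releases, so treat that as an assumption about your deployment target.

```swift
import AVFoundation

// Sketch: list capture devices, including external (UVC) cameras where the OS
// exposes them. `.external` is assumed to be available (newer iPadOS); older
// systems only return the built-in cameras.
func listVideoDevices() -> [AVCaptureDevice] {
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInWideAngleCamera, .external],
        mediaType: .video,
        position: .unspecified
    )
    return discovery.devices
}

// Usage: pick an external device if one is present and wire it into a session.
func makeExternalCameraSession() throws -> AVCaptureSession {
    let session = AVCaptureSession()
    guard let device = listVideoDevices().first(where: { $0.deviceType == .external })
    else { throw NSError(domain: "NoExternalCamera", code: -1) }
    let input = try AVCaptureDeviceInput(device: device)
    if session.canAddInput(input) { session.addInput(input) }
    return session
}
```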

Related

P2P Video in iOS + Unity

What would be the best way to implement p2p video (no audio) streaming between two iOS clients in real time? This would need to run inside a Unity3D (or perhaps Cocos3D) game engine.
I've looked at some WebRTC-based solutions like IceLink and OpenTok, but I don't have much experience with these technologies. Can someone recommend a de facto solution for this type of task?
You can use the OpenTok WebRTC-based platform to enable video (and audio) communication between two or more peers.
OpenTok has native SDKs for Android and iOS, so it should work for you since you are targeting iOS.
To use it from another engine such as Unity3D or Cocos3D, OpenTok exposes the sent and received video frames (RGB or YUV) to the client, so you can take that frame data and render it in any view inside the game engine using, for example, OpenGL.
Since everything is implemented in the SDKs and backed by the OpenTok platform, enabling video communication is mostly a matter of interacting with the SDKs, so it shouldn't be too hard.
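As an illustration of that last point, here is a hedged sketch of a custom OpenTok video renderer that hands decoded frames to the game engine instead of drawing them in a UIView. The names follow the OpenTok iOS SDK's custom-renderer API (OTVideoRender, OTVideoFrame, the subscriber's videoRender property); verify the details against the SDK headers you actually build with.

```swift
import OpenTok

// Sketch: a renderer the OpenTok SDK calls with every decoded frame, so the
// frame data can be forwarded to the game engine (e.g. uploaded as an
// OpenGL/Metal texture) instead of being drawn by the SDK.
final class EngineVideoRenderer: NSObject, OTVideoRender {
    // Engine-side hook; the signature here is a placeholder.
    var onFrame: ((OTVideoFrame) -> Void)?

    func renderVideoFrame(_ frame: OTVideoFrame) {
        // `frame` carries the image dimensions (frame.format) and the raw
        // pixel planes (Y/U/V for I420, or interleaved data for ARGB).
        // Copy what you need and return quickly -- this is invoked on an SDK
        // thread for every frame.
        onFrame?(frame)
    }
}

// Usage (sketch): attach the renderer to a subscriber so its frames bypass
// the default view:
//   let renderer = EngineVideoRenderer()
//   renderer.onFrame = { frame in /* hand off to the engine */ }
//   subscriber.videoRender = renderer
```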

Video Streaming in iOS through WebRTC

I am trying to build an audio/video streaming app that works cross-platform on iOS and Android mobile devices.
No matter how deeply I Google, I keep ending up with suggestions that point me towards the OpenTok/TokBox API, which is exactly what I wish to avoid.
I've checked a few demos, but WebRTC/HTML5 does not seem to work for streaming video/audio in iOS browsers. For example, the https://apprtc.appspot.com demo does not work in Safari or Opera Mini on iOS.
When I try http://dev.opera.com/articles/media-capture-in-mobile-browsers/demo/ ... I can capture an image using the default iOS camera picker from my browser, but streaming video fails.
It seems that getUserMedia() is not supported by any browser on iOS.
Moreover, I am planning to run this in a WebView inside a native iOS app, which seems even further out of reach.
I'd appreciate a pointer towards anything that helps me build a video streaming app (hopefully using HTML5) that works uniformly on iOS and Android (without TokBox).
You might want to look into Ericsson's Bowser app: http://www.ericsson.com/research-blog/context-aware-communication/bowser-openwebrtc-released-open-source. It claims to provide WebRTC on Android and iOS. Apparently the app is currently under review in the App Store, so if you wait it may just be a case of downloading it. However, it's also open source, so if you can't wait you can build it yourself: https://github.com/ericssonresearch/bowser.
getUserMedia and the WebRTC peer-to-peer connection APIs are not supported on iOS.
One of the reasons is that, at the moment, efforts around WebRTC focus on the VP8 video codec, which Apple and Microsoft do not support natively. Support in the near future is unlikely, with Microsoft pushing for its own standard.
Doing what you want on iOS requires a native, iOS-compatible solution such as OpenCV, which supports video capture. Tutorials on implementing an OpenCV-based solution are easy to find online.
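If you go the native route, the capture side can also be done with plain AVFoundation rather than OpenCV; a minimal sketch follows (what you do with the frames afterwards depends on the streaming stack you pick):

```swift
import AVFoundation

// Sketch: grab the back camera and receive raw frames via a sample-buffer
// delegate. Encoding and sending the frames to a peer is left to whatever
// streaming layer you choose.
final class CameraCapture: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    private let queue = DispatchQueue(label: "camera.frames")

    func start() throws {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video, position: .back),
              let input = try? AVCaptureDeviceInput(device: camera)
        else { throw NSError(domain: "CameraCapture", code: -1) }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: queue)
        session.addOutput(output)
        session.startRunning()
    }

    // Called once per captured frame.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // e.g. CMSampleBufferGetImageBuffer(sampleBuffer) yields a
        // CVPixelBuffer you can encode or forward to the peer.
    }
}
```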
Good news: this will be supported as of Safari 11.0.
https://developer.apple.com/library/content/releasenotes/General/WhatsNewInSafari/Safari_11_0/Safari_11_0.html

Using the ImageCaptureCore framework on iOS

I'm using the ImageCaptureCore framework to control a DSLR camera connected via USB in a Cocoa application. Now I'd like to do the same on iOS (camera connected to an iPad via a "Lightning to USB Camera Adapter") and wondered which framework to use.
I'm not going to submit the App to the AppStore, so using a private framework is totally fine.
I searched for appropriate headers in iOS-Runtime-Headers, but only found ImageCapture. Any hint in the right direction is most welcome.
As of iOS 13.0, the ImageCaptureCore framework is available on iOS and iPadOS and can be used to:
Discover connected cameras and scanners
View and modify the folders, files, and metadata on a connected camera
Take photos directly on a connected camera using tethered capture
Perform overview scans and scans on a connected scanner
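A minimal sketch of that flow, assuming iOS 13 or later and a camera attached through the USB adapter (opening a session, downloading files, or doing tethered capture additionally requires a delegate conforming to ICCameraDeviceDelegate, which is omitted here for brevity):

```swift
import ImageCaptureCore

// Sketch: browse for connected cameras with ImageCaptureCore (iOS 13+).
final class CameraBrowser: NSObject, ICDeviceBrowserDelegate {
    private let browser = ICDeviceBrowser()
    private(set) var cameras: [ICCameraDevice] = []

    func start() {
        browser.delegate = self
        browser.start()
    }

    // MARK: ICDeviceBrowserDelegate

    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {
        guard let camera = device as? ICCameraDevice else { return }
        cameras.append(camera)
        // Next steps (not shown): set camera.delegate to an object conforming
        // to ICCameraDeviceDelegate, then call camera.requestOpenSession() to
        // browse files or trigger tethered capture.
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {
        cameras.removeAll { $0 === device }
    }
}
```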
Since no framework like ImageCaptureCore is available on the iOS platform, there are three options, I think:
As stated in this question, there may be a chance of talking to the device at a low level. I don't know whether there are any frameworks or standards that cover cameras or taking photos with them.
Depending on the camera you want to trigger, there might also be a way of connecting your iPhone via cable to the DSLR's trigger port. The different manufacturers' specifications can be found here.
However, I think you don't just want to trigger the camera but also to transfer the captured images.
You could use an SD card with an integrated access point, connect to it, and copy the images via a high-level protocol.

Streaming AAC+ (inside FLV) on iOS using AIR

For a streaming radio station, I have an AAC+ audio stream, inside an FLV container, delivered via HTTP. An example URL is http://3023.live.streamtheworld.com/ALTROCK_S01A_AAC. I wrote a simple AIR app (using the latest AIR and Flex SDKs) to play this stream, and it works fine on PC and Android, but doesn't play anything when deployed to the iOS simulator or a device (i.e., the bytes are loaded but there is no sound).
This is similar to Can FLV AAC stream be played in Android, but for iOS.
I wanted to use AIR in this scenario, since I need to listen for the cue points in the FLV - this is easy to do if you're playing Flash in a web browser, so AIR seemed like the natural choice. I have also looked at http://code.google.com/p/haxecast/ and https://code.google.com/p/project-thunder-snow/ but they all seem to use the same basic idea (parse the FLV using NetStream in "data generation mode" and feed the AAC+ data to a Video object) - and so they all hit the same wall on iOS.
I also came across this post which seems possibly related although it's not quite the same situation (e.g., it's not FLV).
Is AIR on iOS supposed to support this scenario- namely, streaming AAC+/FLV audio via HTTP?
EDIT: This post also appears to hit the same obstacle - so a lot of people are asking about this situation. Anyone from Adobe have any insight?
After much further research I've concluded that AIR on iOS just doesn't support this, and you have to build a native app (or at least use a framework other than AIR) instead.
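If you do drop down to native code, the demuxing step the AIR approach relied on (pulling the AAC payloads and cue points out of the FLV tags) is simple enough to do by hand. Here is a hedged sketch based on the public FLV tag layout; the actual audio decoding and playback (AudioConverter/AudioQueue, or wrapping the frames in ADTS) and proper error handling are left out:

```swift
import Foundation

// Sketch: extract raw AAC payloads from an FLV byte stream.
// FLV layout: 9-byte header ("FLV", version, flags, header size), a 4-byte
// PreviousTagSize0, then tags of the form
//   TagType(1) | DataSize(3) | Timestamp(3+1) | StreamID(3) | body | PreviousTagSize(4)
// where TagType 8 = audio, 9 = video, 18 = script data (cue points live here).
func extractAACPayloads(from data: Data) -> [Data] {
    var payloads: [Data] = []
    guard data.count > 13, data.prefix(3) == Data("FLV".utf8) else { return payloads }
    var offset = 13
    while offset + 11 <= data.count {
        let tagType = data[offset]
        let dataSize = Int(data[offset + 1]) << 16
                     | Int(data[offset + 2]) << 8
                     | Int(data[offset + 3])
        let bodyStart = offset + 11
        let bodyEnd = bodyStart + dataSize
        guard bodyEnd <= data.count else { break }          // incomplete tag
        if tagType == 8, dataSize >= 2 {
            let soundFormat = data[bodyStart] >> 4          // 10 = AAC
            let aacPacketType = data[bodyStart + 1]         // 0 = config, 1 = raw frame
            if soundFormat == 10 && aacPacketType == 1 {
                payloads.append(data.subdata(in: (bodyStart + 2)..<bodyEnd))
            }
        }
        offset = bodyEnd + 4                                // skip trailing PreviousTagSize
    }
    return payloads
}
```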

Using gPhoto on iOS to communicate with digital cameras over USB

I want bidirectional USB communication between an iOS device and a digital camera using gPhoto2. gPhoto2 "abstracts communication ports and camera protocol, to allow a complete modularity."
Issues I've found:
Apple's strict requirements for apps interfacing with the iOS hardware layer will inevitably lead to rejection in the App Store approval process. MFi may mitigate this issue.
Getting full access to the Lightning/30-pin dock connector to send/receive USB packets may require a private iOS framework such as IOKit, and that will get my binary rejected from the App Store.
Connecting a camera via Lightning/30-pin launches a PTPCamera-like task so the Photos app can take over and import photos. That task must be killed to get full USB access on OS X, so I'm guessing it's similar on iOS, and killing a task from an app's sandbox seems impossible.
Compiling gPhoto2 for iOS is inherently difficult since I can't dynamically link the gphoto2 library, and thus I must compile it as a static library.
Those are some of the issues I've run into. Is this project worth pursuing? Do you think it's even possible?
Yes, I know there are other solutions, such as using a Wi-Fi router or a custom-built Bluetooth module plugged into the camera to shuttle USB packets to and from the iOS device.
