I'm using the ImageCaptureCore framework to control a DSLR camera connected via USB in a Cocoa application. Now I'd like to do the same on iOS (camera connected to an iPad via a "Lightning to USB Camera Adapter") and wondered which framework to use.
I'm not going to submit the App to the AppStore, so using a private framework is totally fine.
I searched for appropriate headers in iOS-Runtime-Headers, but only found ImageCapture. Any hint in the right direction is most welcome.
As of iOS 13.0+, the ImageCaptureCore framework is available on iPadOS, and you can use it to:
Discover connected cameras and scanners
View and modify the folders, files, and metadata on a connected camera
Take photos directly on a connected camera using tethered capture
Perform overview scans and scans on a connected scanner
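If you're targeting that, discovery goes through ICDeviceBrowser. Here is a minimal sketch, assuming iPadOS 13+ (error handling and session management omitted):

```swift
import ImageCaptureCore

// Minimal sketch (iPadOS 13+): discover cameras attached via the USB adapter.
final class CameraBrowser: NSObject, ICDeviceBrowserDelegate {
    private let browser = ICDeviceBrowser()

    func start() {
        browser.delegate = self
        browser.start()   // begins delivering didAdd/didRemove callbacks
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didAdd device: ICDevice, moreComing: Bool) {
        guard let camera = device as? ICCameraDevice else { return }
        print("Found camera: \(camera.name ?? "unknown")")
        // From here you could open a session and browse the camera's contents.
    }

    func deviceBrowser(_ browser: ICDeviceBrowser, didRemove device: ICDevice, moreGoing: Bool) {
        print("Camera removed: \(device.name ?? "unknown")")
    }
}
```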
Since there is no framework like ImageCaptureCore available on the iOS platform, I think there are three options:
As stated in this question, there may be a chance of talking to the device at a low level. I don't know whether there are any frameworks or standards that would work for cameras or for taking photos with them.
Depending on the camera you want to trigger, there might also be the option of connecting your iPhone via a cable to the trigger port of the DSLR. The pinout specifications for the different manufacturers can be found here.
However, I assume you don't just want to trigger the camera but also to transfer the captured images.
You could use an SD card with an integrated access point, connect to it, and copy the images over a high-level protocol.
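To make that last option concrete: many such cards (Toshiba's FlashAir, for example) simply serve the card's contents over HTTP once you join their access point, so copying an image is an ordinary download. A rough sketch; the host name and DCIM path are assumptions that depend on the card:

```swift
import Foundation

// Rough sketch: pull a photo off a WiFi-enabled SD card that serves files over HTTP.
// "flashair" and the DCIM path are assumptions; check your card's documentation.
// Plain HTTP also needs an App Transport Security exception in Info.plist.
let fileURL = URL(string: "http://flashair/DCIM/100__TSB/IMG_0001.JPG")!

let task = URLSession.shared.downloadTask(with: fileURL) { tempURL, response, error in
    guard let tempURL = tempURL, error == nil else {
        print("Download failed: \(String(describing: error))")
        return
    }
    // Move the downloaded file somewhere the app can keep it.
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(fileURL.lastPathComponent)
    try? FileManager.default.moveItem(at: tempURL, to: destination)
    print("Saved image to \(destination.path)")
}
task.resume()
```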
I have a requirement in my iOS app:
1. Can we use an external camera to capture and record videos in my app?
2. I am sure it's not possible to connect an external camera through USB without an MFi license.
3. So I want to go with the WiFi option.
Please comment. Any library or reference would be really helpful.
I owned a Sony QX-100 and played around with their Camera Remote API (https://developer.sony.com/develop/cameras/); a functional example is included, and you can use it as a reference.
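For what it's worth, the Camera Remote API is basically JSON-RPC over HTTP, so triggering a shot from iOS is a single POST once you know the camera's endpoint. A rough sketch; the 10.0.0.1:8080 endpoint is typical for Sony's WiFi-direct mode, but it should really be read from the camera's device-description response:

```swift
import Foundation

// Rough sketch: trigger a shot via Sony's Camera Remote API (JSON-RPC over HTTP).
// The endpoint below is typical for Sony cameras in WiFi-direct mode, but the
// real URL comes from the camera's device description (and plain HTTP needs an
// App Transport Security exception in Info.plist).
let url = URL(string: "http://10.0.0.1:8080/sony/camera")!
var request = URLRequest(url: url)
request.httpMethod = "POST"

let payload: [String: Any] = [
    "method": "actTakePicture",
    "params": [],
    "id": 1,
    "version": "1.0"
]
request.httpBody = try? JSONSerialization.data(withJSONObject: payload)

URLSession.shared.dataTask(with: request) { data, _, error in
    if let data = data, let reply = String(data: data, encoding: .utf8) {
        print("Camera replied: \(reply)")   // the reply contains the URL of the captured image
    } else if let error = error {
        print("Request failed: \(error)")
    }
}.resume()
```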
I'm trying to connect a USB camera to iOS in Xcode using Swift or Objective-C.
I saw online that you could try IOKit.framework, but I believe that's for the Mac, which I'm not using. I also saw something about NSString converters, but I'm unsure that will work either. I'm trying to do it on iOS.
I also have a WiFi adapter that connects to my Nikon DSLR and sends photos to a PC over WiFi, but I'm not sure how to get the iOS device to connect and download these images.
Thanks in advance!
Looks like this isn't possible, at least for what you want.
Communicating with an external accessory requires you to work closely
with the accessory manufacturer to understand the services provided by
that accessory. Manufacturers must build explicit support into their
accessory hardware for communicating with iOS. As part of this
support, an accessory must support at least one command protocol.
These USB devices obviously have not been designed with iOS in mind, so there is a very small chance you can get them to work the Apple way.
Alternatively, you can look into jailbreaking and that sort of thing with the cameraconnectionkit library. That's way out of my scope though, so good luck!
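For completeness, here is roughly what the MFi route looks like from the app side; only accessories whose protocol strings are declared under UISupportedExternalAccessoryProtocols in your Info.plist will show up (a minimal Swift sketch):

```swift
import ExternalAccessory

// Minimal sketch: list the MFi accessories the app is allowed to talk to.
// An accessory only appears here if it implements one of the protocol strings
// declared under UISupportedExternalAccessoryProtocols in the app's Info.plist.
for accessory in EAAccessoryManager.shared().connectedAccessories {
    print("\(accessory.name): protocols \(accessory.protocolStrings)")
}
```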
I am searching for a WiFi or Bluetooth camera that can be controlled from an iPhone using an app that I will create. I mean I want a programmable camera for the iPhone; does such a camera exist?
If you have any info or a product, please provide a link.
You could buy a Raspberry Pi with a camera module and build one yourself :)! You could then connect to it via WiFi. There are plenty of resources out there, so just Google it.
I am working on an experimental project that needs to connect an external video camera to an iPhone.
I found out that we can connect the iPhone to an external interface like an Arduino using a Redpark cable that ships together with an SDK. But I am not sure how iOS would handle the raw data coming from the external camera.
I am wondering if AVFoundation can handle this part, because we can specify the input device. But I am not sure how to point it to an external device.
Or are there any other frameworks that can handle this task?
I am looking for a tutorial or sample project so I can learn more about this.
The decoding you need to do depends entirely on the camera you will use.
But, given the data rate limitation of the serial cable you are considering, you will be practically limited to a camera that can provide a low-bit-rate H.264 stream.
Decoding such a stream can be done with the ffmpeg library. Instructions for integrating it into an iOS project can be found in this SO question.
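As an aside, and not what the answer above suggests: Apple's own VideoToolbox can also decode an H.264 elementary stream. A rough sketch of the first step, building a format description from the SPS/PPS NAL units you would parse out of the incoming data (a VTDecompressionSession can then be created from it):

```swift
import CoreMedia

// Rough sketch of a native alternative to ffmpeg: build a CMVideoFormatDescription
// from the H.264 SPS/PPS NAL units (without start codes) parsed out of the stream.
// A VTDecompressionSession created from this description can then decode the frames.
func makeFormatDescription(sps: [UInt8], pps: [UInt8]) -> CMVideoFormatDescription? {
    var formatDescription: CMVideoFormatDescription?
    sps.withUnsafeBufferPointer { spsBuffer in
        pps.withUnsafeBufferPointer { ppsBuffer in
            let parameterSets: [UnsafePointer<UInt8>] = [spsBuffer.baseAddress!,
                                                         ppsBuffer.baseAddress!]
            let parameterSetSizes: [Int] = [sps.count, pps.count]
            _ = CMVideoFormatDescriptionCreateFromH264ParameterSets(
                allocator: kCFAllocatorDefault,
                parameterSetCount: 2,
                parameterSetPointers: parameterSets,
                parameterSetSizes: parameterSetSizes,
                nalUnitHeaderLength: 4,          // 4-byte AVCC length prefixes
                formatDescriptionOut: &formatDescription)
        }
    }
    return formatDescription
}
```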
I want bidirectional USB communication between an iOS device and a digital camera using gPhoto2. gPhoto2 "abstracts communication ports and camera protocol, to allow a complete modularity."
Issues I've found:
Apple's strict requirements for apps interfacing with the iOS hardware layer will inevitably lead to rejection in the App Store approval process. MFi may mediate this issue.
Getting full access to the Lightning/30-pin dock connector to send/receive USB packets may require a private iOS library such as IOKit, and that will get my binary rejected from the App Store.
Connecting a camera via Lightning/30-pin launches a PTPCamera-like task so that the Photos app can take over and import photos. That task must be killed to get full USB access on OS X, so I'm guessing it's similar on iOS, and killing a task from an app's sandbox seems impossible.
Compiling gPhoto2 for iOS is inherently difficult since I can't dynamically link the gphoto2 library, and thus I must compile it as a static library.
Those are some of the issues I've run into. Is this project worth pursuing? Do you think it's even possible?
Yes, I know there are other solutions, such as using a WiFi router or a custom-built Bluetooth module plugged into the camera to shuttle USB packets to and from the iOS device.