How to capture a photo automatically on iPhone and iPad - ios

How to capture photo automatically in android phone? asks how to take a picture automatically, without user interaction. This feature is needed in many applications. For example, when you take a picture of a document, you expect the camera to capture it automatically once the full document (or all four of its corners) is inside the frame. So my question is: how can this be done on iPhone or iPad?
Recently I have been working with Cordova; does anyone know whether plugins already exist for this kind of camera operation? Thanks
EDIT:
This operation will be done in an app that is given full access to the camera; the task is how to develop such an app.

Instead of capturing a photo, you should capture video frames. When a captured frame satisfies your requirements, stop capturing video and proceed.
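A minimal sketch of that approach with AVFoundation, assuming you supply the frameSatisfiesRequirements: check yourself (e.g. your document corner detection); the FrameGrabber class name is just for illustration:

    #import <AVFoundation/AVFoundation.h>

    @interface FrameGrabber : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
    @property (nonatomic, strong) AVCaptureSession *session;
    @end

    @implementation FrameGrabber

    - (void)start {
        self.session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
        if (input && [self.session canAddInput:input]) {
            [self.session addInput:input];
        }

        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("frame.queue", DISPATCH_QUEUE_SERIAL)];
        if ([self.session canAddOutput:output]) {
            [self.session addOutput:output];
        }
        [self.session startRunning];
    }

    // Called once per captured video frame.
    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection {
        CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
        if ([self frameSatisfiesRequirements:pixelBuffer]) {
            [self.session stopRunning]; // keep this frame and proceed
        }
    }

    // Hypothetical: replace with your own detection (e.g. find the document's corners).
    - (BOOL)frameSatisfiesRequirements:(CVPixelBufferRef)pixelBuffer {
        return NO;
    }

    @end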

Related

How can I leverage the camera to detect certain occurrences?

This is kind of what a barcode scanner does, except I do not wish to detect a barcode (I will write the code for what I want to detect). How do I even set up the camera so it acts as a continuous scanner? Like the user just presses a play button and the camera automatically scans for stuff? Just as an example, say I wish to run the scanner until the camera encounters a frame that is pure black, at which point it displays the message "detected all black".
There is an older Apple Technical Q&A that details how to use AVFoundation to continuously generate low resolution UIImages from a video capture session that you could then sample and use for your detection:
https://developer.apple.com/library/ios/qa/qa1702/_index.html
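For the "all black" example specifically, rather than going through UIImage you can inspect the pixel buffer directly in the sample buffer callback. A sketch, assuming the output is configured for kCVPixelFormatType_32BGRA as in QA1702 (the threshold of 10 is an arbitrary choice):

    #import <CoreVideo/CoreVideo.h>

    // Returns YES when the frame is (almost) entirely black.
    static BOOL FrameIsAllBlack(CVPixelBufferRef pixelBuffer) {
        CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
        uint8_t *base = (uint8_t *)CVPixelBufferGetBaseAddress(pixelBuffer);
        size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
        size_t width = CVPixelBufferGetWidth(pixelBuffer);
        size_t height = CVPixelBufferGetHeight(pixelBuffer);

        uint64_t total = 0;
        for (size_t y = 0; y < height; y++) {
            uint8_t *row = base + y * bytesPerRow;
            for (size_t x = 0; x < width; x++) {
                // BGRA layout: sum the three color channels, skip alpha.
                total += row[x * 4] + row[x * 4 + 1] + row[x * 4 + 2];
            }
        }
        CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

        double mean = (double)total / (double)(width * height * 3);
        return mean < 10.0; // arbitrary darkness threshold
    }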

Can iPhone 5, 6 or 6+ take a PICTURE with both cameras at the same time?

I found some answers about using the front and back cameras at the same time for AUDIO/VIDEO recording, which is impossible.
In detail here:
Can the iPhone4 record from both front and rear-facing camera at the same time?
However, is it possible to use both cameras at the same time to take pictures for iOS?
No, this is definitely not possible, I'm afraid.
Only one camera session can be used at a time with AVCaptureSession (the lower-level API for camera interaction on iOS).
If you try to run multiple sessions (one for each camera), as soon as one session begins, the other will stop.
You could quickly alternate between sessions, but the images will not be taken in sync.
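If alternating is acceptable for your use case, here is a sketch of swapping a single session between cameras, using the pre-iOS-10 devicesWithMediaType: API (the actual still capture is left out):

    #import <AVFoundation/AVFoundation.h>

    // Reconfigure an existing session to use the camera at `position`
    // (AVCaptureDevicePositionFront or AVCaptureDevicePositionBack).
    static void SwitchSessionToCamera(AVCaptureSession *session,
                                      AVCaptureDevicePosition position) {
        [session beginConfiguration];
        for (AVCaptureDeviceInput *input in [session.inputs copy]) {
            [session removeInput:input];
        }
        AVCaptureDevice *camera = nil;
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == position) { camera = device; break; }
        }
        NSError *error = nil;
        AVCaptureDeviceInput *newInput =
            [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
        if (newInput && [session canAddInput:newInput]) {
            [session addInput:newInput];
        }
        [session commitConfiguration];
    }

Even so, the two shots are separated by the time the session needs to reconfigure, so they are never truly simultaneous.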

Disable camera shaking in iOS

I am creating a simple camera app and I want to add 'image stabilization' so that when hands shake, the camera does not twitch. Is this possible in iOS?
You can do this by getting the raw image from the camera and only using a subset of the raw image frame, programmatically picking a new subset of each raw image to use for the next frame, as sketched below. Needless to say, this is a large amount of work and should only be undertaken if you know what you are doing or want to have the most impressive video/picture-taking app.
The iPhone 6+ has this built into the hardware and is, I believe, what the previous comment's AVFoundation link is talking about.
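A sketch of the cropping half of that idea, assuming you have already computed a per-frame (dx, dy) offset from your own motion estimation; StabilizedCrop is a hypothetical helper:

    #import <UIKit/UIKit.h>

    // Crop a shifted subregion of the raw frame, leaving `margin` points of
    // slack on every side so the crop window can move against hand shake.
    // (For Retina frames you would scale the rect by rawFrame.scale.)
    static UIImage *StabilizedCrop(UIImage *rawFrame, CGFloat dx, CGFloat dy, CGFloat margin) {
        CGFloat w = rawFrame.size.width;
        CGFloat h = rawFrame.size.height;
        // Shift the crop window opposite to the detected movement.
        CGRect crop = CGRectMake(margin - dx, margin - dy,
                                 w - 2 * margin, h - 2 * margin);
        CGImageRef cropped = CGImageCreateWithImageInRect(rawFrame.CGImage, crop);
        UIImage *result = [UIImage imageWithCGImage:cropped];
        CGImageRelease(cropped);
        return result;
    }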

iOS 7+: Is it possible to capture video from the front camera while showing another video on the screen?

I have a task.
There is an iOS device, and there is an app I should create.
The app plays a video file (a local file on the device) while the front camera captures the user's face.
Playing the video and capturing the user's face via the front camera happen simultaneously.
I see that FaceTime and Skype for iOS can do this. But the former was created by Apple (they can do whatever they like on their own devices), while the latter is owned by Microsoft (big companies/big money are sometimes allowed more than ordinary developers).
Moreover, I have doubts about video capture coexisting with video playback at the same time.
So I am not sure that this task is 100% implementable and publishable.
Is it possible on iOS 7+?
Is it allowed by Apple (I mean that there are many technical possibilities on iOS, but only some of them are OK with Apple, especially during the review process)?
Are there good technical references?
I believe so. A search on the App Store shows a number of video conferencing apps:
Zoom cloud
Polycom
VidyoMobile
Fuze
Just search for "video conferencing".
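Technically it works: AVPlayer playback and an AVCaptureSession can run at the same time, and video conferencing apps rely on exactly that. A minimal sketch using iOS 7 era APIs (movie.mp4 is a hypothetical bundled file; error handling omitted):

    #import <AVFoundation/AVFoundation.h>
    #import <UIKit/UIKit.h>

    @interface PlayAndCaptureViewController : UIViewController
    @property (nonatomic, strong) AVPlayer *player;
    @property (nonatomic, strong) AVCaptureSession *captureSession;
    @end

    @implementation PlayAndCaptureViewController

    - (void)viewDidLoad {
        [super viewDidLoad];

        // 1. Show a local video file on screen.
        NSURL *fileURL = [[NSBundle mainBundle] URLForResource:@"movie" withExtension:@"mp4"];
        self.player = [AVPlayer playerWithURL:fileURL];
        AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
        playerLayer.frame = self.view.bounds;
        [self.view.layer addSublayer:playerLayer];
        [self.player play];

        // 2. Simultaneously capture from the front camera.
        self.captureSession = [[AVCaptureSession alloc] init];
        AVCaptureDevice *front = nil;
        for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (device.position == AVCaptureDevicePositionFront) { front = device; break; }
        }
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:front error:nil];
        if (input && [self.captureSession canAddInput:input]) {
            [self.captureSession addInput:input];
        }
        // Attach an AVCaptureVideoDataOutput or AVCaptureMovieFileOutput here,
        // depending on what you want to do with the captured face.
        [self.captureSession startRunning];
    }

    @end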

AV Foundation camera preview layer gets zoomed in, how to zoom out?

The application I am currently working on has, as its main functionality, continuous QR/bar code scanning using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize the AVCaptureSession, AVCaptureVideoDataOutput, and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html.
My problem is that when I run the camera preview, the image I see through the video device is much larger (about 1.5x) than the image seen through the iPhone's still camera. Our customer needs to hold the iPhone about 5 cm from the barcode while scanning, but at that distance the whole QR code is not visible and decoding fails.
Why does the video camera on the iPhone 4 enlarge the image (as seen through the AVCaptureVideoPreviewLayer)?
This is a function of the AVCaptureSession video preset, accessible by using the .sessionPreset property. For example, after configuring your captureSession, but before starting it, you would add
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
See the documentation here:
iOS Reference Document
The default preset for video is 1280x720 (I think), which is a lower resolution than the maximum supported by the camera. By using the "Photo" preset, you're getting the raw camera data.
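Put in context, the configuration order looks like this (a sketch; the canSetSessionPreset: check is just a defensive addition):

    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];
    // ... add your AVCaptureDeviceInput and AVCaptureVideoDataOutput here ...
    if ([captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto; // full field of view
    }
    [captureSession startRunning];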
You see the same behaviour with the built-in iPhone Camera app. Switch between still and video capture modes and you'll notice that the default zoom level changes. You see a wider view in still mode, whereas video mode zooms in a bit.
My guess is that continuous video capture needs to use a smaller area of the camera sensor to work optimally. If it used the whole sensor perhaps the system couldn't sustain 30 fps. Using a smaller area of the sensor gives the effect of "zooming in" to the scene.
I am answering my own question again. This was not answered even on the Apple developer forums, so I filed a technical support request directly with Apple, and they replied that this is a known issue that will be fixed in a future release. So there is nothing we can do but wait and see.
