Stock iOS camera app seems to have a wider field of view

I'm working on a video app for iOS, and everything seems to work fine, except that the stock iPhone camera app appears to have a wider field of view than ours. I've tried the different AVCaptureVideoPreviewLayer gravity settings (AVLayerVideoGravityResize, AVLayerVideoGravityResizeAspect and AVLayerVideoGravityResizeAspectFill) to no avail.
In fact, other camera apps (such as the camera used when taking a photo in Facebook) produce output much closer to ours than the stock camera app does.
Does anyone have any information on why this might be? I've been scouring the internet and haven't been able to find anything.

It turns out the resolution of the still-image camera is different from the resolution of the video camera; see for instance https://discussions.apple.com/thread/3490649?tstart=0
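If you want your preview to match the wider field of view of the still camera, the usual workaround (echoed in the session-preset answer further down) is to run the capture session with the photo preset instead of a video preset. A minimal sketch, assuming session is your already-configured AVCaptureSession:

    #import <AVFoundation/AVFoundation.h>

    // Assumes "session" is an already-configured AVCaptureSession.
    // The photo preset uses the full sensor area, so the preview's field of
    // view is closer to what the stock Camera app shows in still mode than
    // the cropped 16:9 video presets are.
    if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        [session beginConfiguration];
        session.sessionPreset = AVCaptureSessionPresetPhoto;
        [session commitConfiguration];
    }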

Related

Objective-C iPhone: take photos with both cameras simultaneously

I need to take one picture from the front camera and another from the rear camera. I have read that it isn't possible to do both at the same time, but do you know whether it is possible to switch between the cameras in the minimum time and try to take one front and one rear picture?
EDIT:
As I said before, I want to capture from both cameras at the same time. I know that it is not possible on iPhone devices, but I tried switching cameras very quickly. The iPhone wastes a lot of time switching between cameras. The ideal would be to show the rear camera in the preview and record frames from it, while recording frames from the front camera at the same time without previewing it, and without losing the preview.
Thanks in advance.
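For reference, swapping an AVCaptureSession's input between the front and back cameras looks roughly like the sketch below (the method name is just a placeholder); the hardware switch inside commitConfiguration is where the delay the question complains about comes from.

    #import <AVFoundation/AVFoundation.h>

    // Placeholder helper: swap the session's current camera input for the
    // device at the requested position (front or back).
    - (void)switchSession:(AVCaptureSession *)session
       toCameraAtPosition:(AVCaptureDevicePosition)position
    {
        AVCaptureDevice *device = nil;
        for (AVCaptureDevice *d in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
            if (d.position == position) { device = d; break; }
        }
        if (!device) return;

        NSError *error = nil;
        AVCaptureDeviceInput *newInput =
            [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
        if (!newInput) return;

        // Swap inputs inside one configuration block so the session keeps
        // running; the actual hardware switch still takes a noticeable
        // fraction of a second.
        [session beginConfiguration];
        for (AVCaptureInput *oldInput in [session.inputs copy]) {
            [session removeInput:oldInput];
        }
        if ([session canAddInput:newInput]) {
            [session addInput:newInput];
        }
        [session commitConfiguration];
    }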

How to capture a photo automatically in iPhone and iPad

"How to capture photo automatically in Android phone?" is about taking a picture automatically, without user interaction. This feature is needed in many applications. For example, when you are going to take a picture of a document, you expect the camera to take it automatically once the full document (or its four corners) is inside the frame. So my question is: how can this be done on an iPhone or iPad?
Recently I have been working with Cordova; does anyone know whether any plugins already exist for this kind of camera operation? Thanks
EDIT:
This operation will be done in an app that has full access to the camera, and the task is how to develop such an app.
Instead of capturing a photo, you should capture video frames. When a captured frame satisfies your requirements, stop capturing video and proceed.
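A minimal sketch of that approach, assuming a running AVCaptureSession held in self.session, with frameContainsDocument: and processCapturedFrame: as placeholders for your own detection and handling logic:

    #import <AVFoundation/AVFoundation.h>

    // The controller adopts AVCaptureVideoDataOutputSampleBufferDelegate.
    - (void)addFrameOutputToSession:(AVCaptureSession *)session
    {
        AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
        output.alwaysDiscardsLateVideoFrames = YES;
        [output setSampleBufferDelegate:self
                                  queue:dispatch_queue_create("frames", DISPATCH_QUEUE_SERIAL)];
        if ([session canAddOutput:output]) {
            [session addOutput:output];
        }
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
    {
        // frameContainsDocument: stands in for your own check, e.g. finding
        // the four corners of the document in the frame.
        if ([self frameContainsDocument:sampleBuffer]) {
            // In practice, dispatch stopRunning off the capture queue.
            [self.session stopRunning];               // stop delivering frames
            [self processCapturedFrame:sampleBuffer]; // keep/convert this frame
        }
    }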

Disable camera shaking in iOS

I am creating a simple camera app and I want to add 'image stabilization' so that when the user's hands shake, the camera does not twitch. Is this possible in iOS?
You can do this by getting the raw image from the camera and using only a subset of each raw frame, programmatically picking a new subset for each frame. Needless to say, this is a large amount of work and should only be undertaken if you know what you are doing or want to have the most impressive video/picture-taking app.
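A very rough sketch of that cropping idea, using Core Image; how you estimate the per-frame offset (gyroscope data, feature tracking, etc.) is the hard part and is only a placeholder parameter here:

    #import <CoreImage/CoreImage.h>

    // Crop a window slightly smaller than the full frame, shifted against
    // the measured shake, so the visible output stays steady.
    - (CIImage *)stabilizedImageFromPixelBuffer:(CVPixelBufferRef)pixelBuffer
                                         offset:(CGPoint)offset
    {
        CIImage *full = [CIImage imageWithCVPixelBuffer:pixelBuffer];
        CGRect extent = full.extent;

        // Leave ~5% slack on every side so there is room to shift the window.
        CGFloat margin = 0.05 * CGRectGetWidth(extent);
        CGRect window = CGRectInset(extent, margin, margin);
        window = CGRectOffset(window, offset.x, offset.y);

        // A real implementation would also clamp the window so it never
        // leaves the frame.
        return [full imageByCroppingToRect:window];
    }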
The iPhone 6 Plus has this built into the hardware, and I believe that is what the previous comment's AVFoundation link is talking about.
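If what you are after is the system-provided video stabilization (as opposed to the 6 Plus's optical hardware), AVFoundation exposes it on the video connection; a minimal sketch, assuming movieOutput is an AVCaptureMovieFileOutput already added to your session (iOS 8+):

    AVCaptureConnection *connection =
        [movieOutput connectionWithMediaType:AVMediaTypeVideo];
    if (connection.supportsVideoStabilization) {
        // Let the system pick a stabilization mode whenever the active
        // device/format supports one.
        connection.preferredVideoStabilizationMode =
            AVCaptureVideoStabilizationModeAuto;
    }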

GPUImage presents vertical black bars when using iPhone front camera

I am trying to integrate GPUImage into my iPhone app to take photos with real-time filters. But I found an annoying problem: when I use the front camera to take photos, there are always some vertical black bars in the result image.
The rear camera never has this problem.
It has nothing to do with filters; I have tried without any filter.
At first I thought it had something to do with my iPhone's hardware, but then I searched the internet and found that other developers have hit the same problem (with no solution).
see image here:
http://i.stack.imgur.com/rPAJ9.jpg
Maybe this can help you out:
I had the exact same problem. In GPUImageStillCamera.m, I commented out "[self captureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:[[photoOutput connections] objectAtIndex:0]];" and now the glitches are gone. That's the only way I was able to fix the problem without forcing GPUImageOpenGLESContext's +supportsFastTextureUpload to return NO.

AV Foundation camera preview layer gets zoomed in, how to zoom out?

The application I am currently working on has, as its main function, continuous QR/bar code scanning using the ZXing library (http://code.google.com/p/zxing/). For continuous frame capture I initialize the AVCaptureSession, AVCaptureVideoDataOutput and AVCaptureVideoPreviewLayer as described in Apple's Q&A http://developer.apple.com/iphone/library/qa/qa2010/qa1702.html.
My problem is that when I run the camera preview, the image I see through the video device is much larger (about 1.5x) than the image seen through the iPhone's still camera. Our customer needs to hold the iPhone about 5 cm from the bar code when scanning, but at that distance the whole QR code isn't visible and decoding fails.
Why does the video camera on the iPhone 4 enlarge the image (as seen through the AVCaptureVideoPreviewLayer)?
This is a function of the AVCaptureSession video preset, accessible by using the .sessionPreset property. For example, after configuring your captureSession, but before starting it, you would add
captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
See the documentation here:
iOS Reference Document
The default preset for video is 1280x720 (I think), which is a lower resolution than the maximum supported by the camera. By using the "Photo" preset, you're getting the raw camera data.
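Putting that together with the QA1702-style setup the question describes, a rough sketch (the device and gravity choices are just examples):

    AVCaptureSession *captureSession = [[AVCaptureSession alloc] init];

    // Use the photo preset so the preview shows the still camera's wider
    // field of view instead of the cropped video presets.
    if ([captureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto;
    }

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:&error];
    if (input && [captureSession canAddInput:input]) {
        [captureSession addInput:input];
    }

    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:captureSession];
    previewLayer.videoGravity = AVLayerVideoGravityResizeAspect;

    [captureSession startRunning];

(Whether scanning performance holds up with the photo preset is worth testing on the target device.)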
You see the same behaviour with the built-in iPhone Camera app. Switch between still and video capture modes and you'll notice that the default zoom level changes. You see a wider view in still mode, whereas video mode zooms in a bit.
My guess is that continuous video capture needs to use a smaller area of the camera sensor to work optimally. If it used the whole sensor perhaps the system couldn't sustain 30 fps. Using a smaller area of the sensor gives the effect of "zooming in" to the scene.
I am answering my own question again. This was not answered even on the Apple Developer forums, so I filed a technical support request directly with Apple, and they replied that this is a known issue that will be fixed in a future release. So there is nothing we can do but wait and see.
