Setting white balance and exposure mode for the iPhone camera + enum defaults - iOS

I am using the back camera of an iPhone 4 and going through the standard, lengthy process of creating an AVCaptureSession and adding an AVCaptureDevice to it.
Before attaching that camera's AVCaptureDeviceInput to the session, I am testing my understanding of white balance and exposure, so I am trying this:
NSError *error = nil;
if ([self.theCaptureDevice lockForConfiguration:&error]) {
    [self.theCaptureDevice setWhiteBalanceMode:AVCaptureWhiteBalanceModeLocked];
    [self.theCaptureDevice setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
    [self.theCaptureDevice unlockForConfiguration];
}
1- Given that the white balance modes are defined in an enum, I would have expected the default to always be zero, since the typedef'd enum variable was never assigned a value. However, if I breakpoint and po the value in the debugger, I find that the default white balance mode is actually 2. Unfortunately, the AVCaptureDevice.h header does not say what the defaults are for the various camera settings.
2- This might sound silly, but can I assume that once I quit the app, all settings for white balance and exposure mode go back to their defaults? So that if I launch another app right afterwards, the camera device is not somehow stuck with those "hardware settings".

After a bit more research and help, I found the answers:
1- All camera settings (white balance, focus and exposure) default to their "continuous" modes, so that the camera continuously adjusts all of them. Check AVCaptureDevice.h for the enum values.
2- Apps run in silos. The camera stops when the app using it moves to the background. When a new app opens the camera, the defaults above are applied again.
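As a hedged illustration of both answers (device lookup assumes a camera is present; run on a real device), the defaults can be read back in Swift roughly like this:

```swift
import AVFoundation

// Sketch: read back the default modes after obtaining a fresh capture device.
// The raw values are .locked = 0, .autoWhiteBalance = 1,
// .continuousAutoWhiteBalance = 2 -- which is why po prints 2 for a
// freshly opened device, per the answer above.
if let device = AVCaptureDevice.default(for: .video) {
    print(device.whiteBalanceMode.rawValue)  // typically 2 (continuous)
    print(device.exposureMode.rawValue)      // typically 2 (continuous)
    print(device.focusMode.rawValue)         // typically 2 (continuous)
}
```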

Related

Avoid scene monitoring in AVFoundation?

In my current project I need very precise control over the AVCaptureDevice and its lighting settings (ISO, exposure, flashMode and even torchMode). I am not trying to get "high quality" photos in the typical photographic sense, but it is really important for me to control the camera settings precisely in order to get usable photos.
This poses no problem, as long as I am keeping the flash turned off.
But when I set the flashMode to .on, scene monitoring is enabled in the capturePhoto() method, which determines the flash intensity and auto-focuses while using the torch, even though torchMode is set to .off.
https://developer.apple.com/documentation/avfoundation/avcapturephotooutput/1778634-photosettingsforscenemonitoring
The flash is required in every photo taken, regardless of lighting conditions. So my question is: is there a way to avoid scene monitoring and have the capturePhoto() method always use the flash without the torch?
Thanks for your help!
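For reference, a minimal Swift sketch of the capture call the question describes (`photoOutput` and the delegate are assumed to come from an already-configured, running session):

```swift
import AVFoundation

// Sketch: force the flash on for every capture. With flashMode = .on,
// AVCapturePhotoOutput runs its own pre-flash metering sequence before
// the capture; the question above is whether that sequence can be skipped.
func captureWithFlash(using photoOutput: AVCapturePhotoOutput,
                      delegate: AVCapturePhotoCaptureDelegate) {
    let settings = AVCapturePhotoSettings()
    settings.flashMode = .on   // fire the flash regardless of the scene
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```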

Is it possible to create a custom flashMode in AVFoundation?

Today I found out that settings such as a custom exposure mode or a fixed lens position in focus mode, which I set on the AVCaptureDevice in my AVCaptureSession, conflict with the flashMode = .on option of AVCapturePhotoSettings, as it alters the previously specified ISO and exposure duration values etc.
So my question is: Is it possible to specify a custom flashMode that does not analyze the viewed scene and uses strictly pre-defined settings?
For me it is more valuable to be in control of these settings, even though it might lead to a loss of photo quality.
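A sketch of the kind of manual configuration the question describes (the ISO and duration values are illustrative placeholders, not recommendations; `lockForConfiguration()` can throw):

```swift
import AVFoundation

// Sketch: lock exposure and focus to fixed values. These are the settings
// that the question reports being overridden once flashMode = .on is used.
func applyManualSettings(to device: AVCaptureDevice) throws {
    try device.lockForConfiguration()
    device.setExposureModeCustom(
        duration: CMTime(value: 1, timescale: 60),  // 1/60 s, placeholder
        iso: 100,                                   // placeholder ISO
        completionHandler: nil)
    device.setFocusModeLocked(lensPosition: 0.5,    // placeholder position
                              completionHandler: nil)
    device.unlockForConfiguration()
}
```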

AVCaptureSession captures black/dark frames after changing preset

I'm developing an app which supports both still image and video capture with AVFoundation. Capturing them requires different AVCaptureSession presets. I check with canSetSessionPreset(), begin the change with beginConfiguration(), set the required preset via sessionPreset and finish with commitConfiguration().
I found that if I capture a still image with AVCaptureStillImageOutput immediately after changing the preset, it returns no errors, but the resulting image is sometimes black or very dark.
If I start capturing video with AVCaptureMovieFileOutput immediately after changing the preset, the first several frames of the resulting file are also black or very dark at times.
Right after changing the preset the screen flickers, likely because the camera is adjusting the exposure. So it looks like, immediately after a preset change, the camera starts metering exposure from a very fast shutter speed, which produces the black/dark frames.
Both problems go away if I insert a 0.1-second delay between changing the preset and starting the capture, but that's ugly, and no one can guarantee it will work every time on every device.
Is there a clean solution to this problem?
This is for future users...
It was happening for me when I set the sessionPreset to high and then, as soon as recording started, changed the video output connection and set the focus. Moving those changes into the camera setup code, before starting the capture, fixed it.
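One alternative to a fixed delay (a sketch under the assumption that the dark frames come from auto-exposure settling, as the question suggests; not guaranteed on every device) is to key-value observe the device's isAdjustingExposure property and start capturing only once the adjustment finishes:

```swift
import AVFoundation

// Sketch: wait for auto-exposure to settle after a preset change,
// instead of sleeping for an arbitrary 0.1 s.
final class ExposureWaiter: NSObject {
    private var observation: NSKeyValueObservation?

    func waitForExposure(on device: AVCaptureDevice,
                         then start: @escaping () -> Void) {
        // If the camera is not adjusting, it is safe to start right away.
        guard device.isAdjustingExposure else { start(); return }
        observation = device.observe(\.isAdjustingExposure,
                                     options: [.new]) { [weak self] _, change in
            if change.newValue == false {
                self?.observation = nil
                start()  // exposure has settled; begin capture now
            }
        }
    }
}
```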

Rear and front camera open together

Can we open the rear and front cameras together? I saw an app in iTunes that takes pictures with the front and rear cameras together. I have done R&D but found no proper solution for this feature. Is it possible to take pictures from the front and rear cameras at the same time?
No, you can't open both cameras at the same time. The only option you have is to toggle between the front and back cameras.
Even if you use an NSTimer to toggle every 0.1 seconds to create the impression that both cameras are running at once, it will hurt camera performance and you will not get a clear camera view.
You need to rely on AVFoundation if you want more control over the camera/video session.
Unfortunately, AVCaptureSession's startRunning method runs on a serial queue, which doesn't allow parallel camera sessions to run.
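The toggling approach described above can be sketched like this (device lookup assumes both cameras exist; throwing error handling is left to the caller):

```swift
import AVFoundation

// Sketch: swap a single session's input between the front and back cameras.
func toggleCamera(on session: AVCaptureSession,
                  current: AVCaptureDeviceInput) throws -> AVCaptureDeviceInput {
    let newPosition: AVCaptureDevice.Position =
        current.device.position == .back ? .front : .back
    guard let newDevice = AVCaptureDevice.default(
            .builtInWideAngleCamera, for: .video, position: newPosition) else {
        return current  // requested camera not available; keep the old input
    }
    let newInput = try AVCaptureDeviceInput(device: newDevice)
    session.beginConfiguration()
    session.removeInput(current)
    if session.canAddInput(newInput) {
        session.addInput(newInput)
        session.commitConfiguration()
        return newInput
    } else {
        session.addInput(current)   // roll back if the swap is rejected
        session.commitConfiguration()
        return current
    }
}
```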

How do I programmatically configure the accelerometer for the iPhone?

I'm not sure how to set the null point of the iPhone's accelerometer in Xcode. I have an app with a UIImage that moves around the screen, and it works well.
What I want to do is set it permanently in code so that when the iPhone is held as you would when reading a text (i.e. at a 45-degree angle), it is level (e.g. the image doesn't move). I don't need calibration options for the user; I just need to set it in code.
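One way to sketch this is with CoreMotion's device-motion attitude rather than raw accelerometer values (an assumption on my part; the 45-degree null pitch and the moveImage callback are illustrative, not from the original question):

```swift
import CoreMotion

// Sketch: treat a 45-degree "reading" pitch as the level/null point,
// so the image stays still when the phone is held in that pose.
let motionManager = CMMotionManager()
let nullPitch = 45.0 * .pi / 180.0  // desired resting angle, in radians

func startTracking(moveImage: @escaping (_ dx: Double, _ dy: Double) -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Pitch relative to the 45-degree null point: zero in the
        // reading pose, so no movement is applied there.
        let dy = motion.attitude.pitch - nullPitch
        let dx = motion.attitude.roll
        moveImage(dx, dy)
    }
}
```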
