How can you adjust the white balance setting for a custom iOS camera app?

I want to manually adjust the white balance using a slider before I start recording video from the camera. I have looked at the AVFoundation framework, but it does not let me pick a value for the white balance. What frameworks/classes do I need to adjust the white balance in this way?

I haven't been able to find any info on setting the camera's white balance (though I don't know for sure that it's not possible). But, you can always post-process with the white balance Core Image filter (aka CIWhitePointAdjust).
You can read about applying Core Image filters here.
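For example, here is a minimal sketch of that post-processing route in Swift; the helper name adjustWhitePoint is just for illustration:

import CoreImage
import UIKit

// Illustrative helper: shift an image's white point toward the given color
// using the CIWhitePointAdjust filter mentioned above.
func adjustWhitePoint(of image: UIImage, to whitePoint: CIColor) -> UIImage? {
    guard let input = CIImage(image: image),
          let filter = CIFilter(name: "CIWhitePointAdjust") else { return nil }
    filter.setValue(input, forKey: kCIInputImageKey)
    filter.setValue(whitePoint, forKey: kCIInputColorKey)
    guard let output = filter.outputImage else { return nil }
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}

A slider could then map its value to a warmer or cooler CIColor and pass it in as whitePoint.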

Related

Different methods of displaying camera under SceneKit

I'm developing an AR application that can use a few different engines. One of them is based on SceneKit (not ARKit).
I used to make the SCNView background transparent and just display an AVCaptureVideoPreviewLayer under it. But this created a problem later: it turns out that if you use a clear backgroundColor for the SCNView and then add a floor node with diffuse.contents = UIColor.clear (a transparent floor), shadows are not displayed on it. And the goal for now is to have shadows in this engine.
I think the best way to get shadows working is to set the camera preview as SCNScene.background.contents. For this I tried using AVCaptureDevice.default(for: .video). This worked, but it has one issue: you can't use the video format you want, because SceneKit automatically changes the format when the device is assigned. I even asked Apple for help using one of the two technical support requests you can send them, but they replied that for now there is no public API that would let me use this with the format I would like. On iPhone 6s the format changes to 30 FPS, and I need 60 FPS. So this option is no good.
Is there some other way to assign the camera preview to the scene's background property? From what I read, I can also use a CALayer for this property, so I tried assigning an AVCaptureVideoPreviewLayer, but this resulted in a black color only, and no video. I updated the layer's frame to the correct size, but it didn't work anyway. Maybe I did something wrong, and there is a way to use this AVCaptureVideoPreviewLayer or something else?
Can you suggest some possible solutions? I know I could use ARKit, and I do for another engine, but for this particular one I need to keep using SceneKit.
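For reference, a minimal sketch of the working approach described in the question, assuming an SCNView named sceneView:

import SceneKit
import AVFoundation

// Minimal sketch: hand the capture device to SceneKit and let it render the
// camera feed as the scene background (iOS 11+). Caveat from the question:
// SceneKit picks the capture format itself, so 60 FPS cannot be forced this way.
func showCameraBackground(in sceneView: SCNView) {
    guard let camera = AVCaptureDevice.default(for: .video) else { return }
    sceneView.scene?.background.contents = camera
}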

iOS, Objective-C auto image processing filters

I'm making a photo app, and sometimes the lighting is off in certain areas and the picture isn't clear. I was wondering if there is a feature that can auto-adjust the brightness, contrast, exposure, and saturation of a picture, like in Photoshop.
I don't want to manually adjust images like the sample code given by Apple:
https://developer.apple.com/library/ios/samplecode/GLImageProcessing/Introduction/Intro.html
I want something that will auto-adjust or correct the photo.
As an alternative, you could use AVFoundation to build your own camera implementation, set the image quality to high, and use the autofocus or tap-to-focus feature. Otherwise, I am almost certain you cannot set these properties. The UIImagePickerController included in the SDK is really expensive memory-wise and gives you an image instead of raw data (another benefit of using AVFoundation). This is a good tutorial for this, in case you would like to check it out:
http://www.musicalgeometry.com/?p=1297
Apparently someone has created it on Github: https://github.com/proth/UIImage-PRAutoAdjust
Once imported, I used it as follows:
self.imageView.image = [self.imageView.image autoAdjustImage];
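If you'd rather not depend on a third-party category, Core Image's built-in auto-enhance does much the same thing. A minimal sketch in Swift (the Objective-C equivalent is -autoAdjustmentFilters on CIImage):

import CoreImage
import UIKit

// Sketch of Core Image's one-shot auto-enhance: it inspects the image and returns
// pre-configured filters (exposure, vibrance, red-eye, and so on) to chain together.
func autoEnhanced(_ image: UIImage) -> UIImage? {
    guard var ciImage = CIImage(image: image) else { return nil }
    for filter in ciImage.autoAdjustmentFilters() {
        // Each filter arrives with its parameters already tuned for this image;
        // only the input image needs to be set.
        filter.setValue(ciImage, forKey: kCIInputImageKey)
        if let output = filter.outputImage { ciImage = output }
    }
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}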

iOS camera preview color temperature

Is it somehow possible to get the white balance color temperature (and tint) from a camera preview or from a saved picture?
I am able to get other exposure values in real time based on this SO thread, like f-stop, exposure time, ISO, etc. The white balance always returns just 0, probably meaning "auto white balance". When I save an image from the live preview, the EXIF data still shows the white balance as zero.
I need to get the white balance color temperature in kelvins that the image/live camera preview was balanced to. I read some stuff about hidden APIs to get/set the color temperature, but I cannot use hidden APIs. Any ideas if/how this is possible on iOS 7? Thank you.
No, I'm afraid it's not possible (at least not without the hidden APIs to which you refer, and those don't use kelvins but some internal scale). And yes, 0 is the code for auto white balance (1 would be manual).
It seems it's now possible to get these values with the current API; you can check this out if you're still interested:
https://developer.apple.com/documentation/avfoundation/avcapturedevice/whitebalancetemperatureandtintvalues
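For example, a minimal sketch (iOS 8+) of reading the current temperature and tint from a configured capture device; the function name is just for illustration:

import AVFoundation

// Sketch: convert the device's current white balance gains, as chosen by
// auto white balance, into temperature (in kelvins) and tint.
func currentTemperatureAndTint(of device: AVCaptureDevice) -> AVCaptureDevice.WhiteBalanceTemperatureAndTintValues {
    let gains = device.deviceWhiteBalanceGains
    let values = device.temperatureAndTintValues(for: gains)
    print("Temperature: \(values.temperature) K, tint: \(values.tint)")
    return values
}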

cvCaptureFromCAM() / cvQueryFrame(): disable automatic image correction?

I'm using the two OpenCV functions mentioned above to retrieve frames from my webcam. No additional properties are set; it's just running with default parameters.
While reading frames in a loop, I can see that the image changes: brightness and contrast seem to be adjusted automatically. It definitely seems to be an OpenCV operation, because the scene in front of the camera is constantly lit and does not change.
So how can I disable this automatic correction? I could not find a property that seems able to do that job.
You should try playing around with these three parameters:
CV_CAP_PROP_BRIGHTNESS Brightness of the image (only for cameras)
CV_CAP_PROP_CONTRAST Contrast of the image (only for cameras)
CV_CAP_PROP_SATURATION Saturation of the image (only for cameras)
Try setting them all to 50. If that doesn't help, also try changing the other camera capture parameters from the documentation.
To answer this myself: OpenCV is buggy or outdated here.
It seems to be impossible to get images in the camera's native resolution; they're always 640x480, and forcing another value by setting the width and height properties does not change anything.
It seems to be impossible to disable the automatic image correction; the properties mentioned above do not seem to work.
The brightness/contrast properties don't seem to work either, or at least I could not find any good values for them, or the automatic image correction always overrides them.
To sum it up: I would not recommend using OpenCV for more advanced image capturing.

Blur effect in a view of iOS

I want to use a UIImagePicker to display a camera preview. Over this preview I want to place an overlay view with controls.
Is it possible to apply any effects to the preview displayed from the camera? In particular, I need to apply a blur effect to the camera preview.
So I want a blurred preview from the camera plus an overlay view with controls. If I decide to capture a still image from the camera, I need it original, without the blur effect. So the blur effect must be applied only to the preview.
Is this possible with such a configuration, or with AVFoundation used for accessing the camera preview, or in some other way, or is it impossible at all?
With AVFoundation you could do almost everything you want, since you can obtain single frames from the camera and process them. But it could lead you to a dead end: applying a blur to an image in real time is a pretty intensive task with laggy video results, and could cost you hours of coding. I would suggest you use the solution of James Webster or OpenGL shaders. Take a look at this awesome free library written by one of my favorite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you do not find the right filter, it will probably lead you to a correct implementation of what you want to do.
The right filter is a Gaussian blur, of course. I don't know if it is supported, but you could implement it yourself.
I almost forgot to say that since iOS 5 you have full access to the Accelerate framework, made by Apple; you should look into that as well.
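For reference, a minimal sketch of that AVFoundation approach in Swift, using Core Image's Gaussian blur on each preview frame. Class and queue names are just for illustration, and error handling is omitted; the blur touches only the displayed frames, so a separate still/photo output would capture the unblurred image:

import AVFoundation
import CoreImage
import UIKit

// Minimal sketch: blur each preview frame with Core Image before display.
// The sample buffers themselves are untouched, so a separate still-image
// output would capture the original picture.
final class PreviewController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let ciContext = CIContext()
    private let previewImageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        previewImageView.frame = view.bounds
        view.addSubview(previewImageView)

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "preview.frames"))
        if session.canAddOutput(output) { session.addOutput(output) }
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let source = CIImage(cvPixelBuffer: pixelBuffer)
        // Clamp before blurring so the edges don't fade to transparent, then crop back.
        let blurred = source.clampedToExtent()
            .applyingGaussianBlur(sigma: 10)
            .cropped(to: source.extent)
        guard let cgImage = ciContext.createCGImage(blurred, from: blurred.extent) else { return }
        DispatchQueue.main.async {
            self.previewImageView.image = UIImage(cgImage: cgImage)
        }
    }
}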
From the reasonably limited amount of work I've done with UIImagePicker, I don't think it is possible to apply the blur to the preview image using programmatic filters.
What you might be able to do is use the overlay to simulate blur. You could do this, for example, by adding an overlay that contains an image of semi-transparent frosted glass.
