iOS camera preview color temperature

Is it somehow possible to get the white balance color temperature (and tint) from a camera preview or from a saved picture?
I am able to get other exposure values in real time based on this SO thread, like f-stop, exposure time, ISO, etc. The white balance always returns just 0, probably meaning "auto white balance". When I save an image from the live preview, the EXIF data still shows the white balance as just zero.
I need to get the white balance color temperature, in Kelvin, that the image/live camera preview was balanced to. I read some stuff about hidden APIs to get/set the color temperature, but I cannot use hidden APIs. Any ideas if/how this is possible on iOS 7? Thank you.

No, I'm afraid it's not possible (at least not without the hidden APIs to which you refer—and they don't use degK, but some internal system). And yes, 0 is the code for Auto white balance (1 would be manual).

It seems it is now possible to get these values with the current API; you can check this out if you're still interested.
https://developer.apple.com/documentation/avfoundation/avcapturedevice/whitebalancetemperatureandtintvalues
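For reference, here is a rough Swift sketch (assuming an active AVCaptureDevice and iOS 8 or later; the function name is just for illustration) of how the device's current gains can be converted into a temperature/tint pair:

    import AVFoundation

    // Read the device's current white balance as RGB gains and convert
    // them to a colour temperature (in Kelvin) and a tint value.
    func currentWhiteBalance(for device: AVCaptureDevice) -> (temperature: Float, tint: Float) {
        let gains = device.deviceWhiteBalanceGains
        let values = device.temperatureAndTintValues(for: gains)
        return (values.temperature, values.tint)
    }

The gains (and therefore the derived temperature) can also be observed with KVO while the device is in auto white balance, so the value keeps updating as the preview changes.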

Related

Device brightness with video on screen affecting white color

Video demonstration (please watch, 18s): https://youtu.be/4HtkadKEnWM
When I reduce the brightness of the device, iOS seems to be changing white to off-white. For my use case, it is important that the view background is actually white so the video appears to seamlessly blend into the background. Interestingly, when I take a screenshot or screen recording this effect does not appear.
I am on light mode and don't have any custom color space settings.
Does anyone have any ideas on how to fix this? Thank you!
This comes down to a misunderstanding of colours.
White doesn't really exist as an absolute; it is defined as the (non-specular) reflection of a standard illuminant (the ISO standard for paper specifies roughly 90% reflectance).
So white depends on brightness: under the same light you can perceive the same surface as black, grey, white, or super-white. Our eyes simply adapt quickly to a reference white.
For emissive devices this is trickier, because the emission doesn't depend on ambient light. That's why, for years now, devices have had ambient light sensors so the software can tell the screen to reduce brightness (and/or change the white point; in that case "white balance" is about colour, not brightness).
If someone dims the screen, it is probably because there is less ambient light, and they are no longer seeing white "correctly". In any case, the eyes (and brain) adapt quickly.
In short: white is a uniform colour at maximum brightness. If you dim the screen it will look grey for a short time, because you still have the previous idea of maximum brightness in mind, but you adapt.
Note: we have two kinds of adaptation, brightness and colour. Good filmmakers can use this deliberately, and colourists must account for it (e.g. on long grey scenes, so we don't adapt and start seeing them as white).
Note: hardware may have two kinds of brightness: the classic one encoded in the RGB values, and a global one (the backlight, even if modern displays no longer have a literal backlight). A screenshot captures only the RGB values, not the corrected output. And if you take a photo, every camera meters the scene and exposes automatically (the average brightness of an image is rendered as mid-grey), so again you see no difference. If you really want to capture the difference, put your camera in manual exposure mode and take the two photos with identical settings.
After communicating with Apple support, we discovered that the video_full_range_flag was not being set by ffmpeg. Re-encoding the videos using the following ffmpeg command worked: ffmpeg -i input.mp4 -map 0 -c copy -bsf:v h264_metadata=video_full_range_flag=1 output.mp4.

Increasing the brightness level for .continuousAutoExposure Mode

I implemented a custom camera with the AVCaptureSession.Preset.high preset and I'm using .continuousAutoExposure. Everything works as expected; however, the brightness of the picture is sometimes quite low.
I have researched the official documentation and found out that I am able to set a custom ISO with setExposureModeCustomWithDuration. Unfortunately, doing it this way means losing the desired automation of the exposure.
My question now is: is there a way to increase the overall brightness percentage of the .continuousAutoExposure mode? I only need to increase the exposure by around 5%, but I also need to stick with the .continuousAutoExposure mode.
The trick is to use the exposure target bias of the AVCaptureDevice instance: exposureTargetOffset itself is read-only, but you can use KVO to observe changes in captureDevice.exposureTargetOffset and then call setExposureTargetBias(_:completionHandler:) to push the auto exposure toward your required level while staying in .continuousAutoExposure mode. For more details, check this answer.
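A rough Swift sketch of that approach (the +0.5 EV default and the class name are illustrative assumptions, not values from the original answer):

    import AVFoundation

    // Stay in .continuousAutoExposure and bias the metering target upward.
    final class ExposureBooster {
        private let device: AVCaptureDevice
        private var offsetObservation: NSKeyValueObservation?

        init(device: AVCaptureDevice) {
            self.device = device
        }

        func boostExposure(byEV bias: Float = 0.5) throws {
            try device.lockForConfiguration()
            if device.isExposureModeSupported(.continuousAutoExposure) {
                device.exposureMode = .continuousAutoExposure
            }
            // Clamp the requested bias to the device's supported range.
            let clamped = max(device.minExposureTargetBias,
                              min(device.maxExposureTargetBias, bias))
            device.setExposureTargetBias(clamped, completionHandler: nil)
            device.unlockForConfiguration()

            // Optionally watch how far the auto exposure currently is from its target.
            offsetObservation = device.observe(\.exposureTargetOffset, options: [.new]) { _, change in
                if let offset = change.newValue {
                    print("exposureTargetOffset: \(offset) EV")
                }
            }
        }
    }

Because setExposureTargetBias only shifts the metering target, the device keeps choosing ISO and shutter speed automatically, which is exactly what the question asks for.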

iOS: Is there a way to alter the color of every pixel on the screen?

How does Apple alter the color of every single pixel on the screen (i.e. grayscale / inversion of colors), regardless of what object the color belongs to? It obviously isn't reading background color properties, since it affects images as well.
How would one approach this?
To clarify my question: how can I change the intensity / hue of every pixel on the screen, similar to how f.lux does it?
How does Apple alter the color of every single pixel on the screen?
Apple probably uses an API called CGSetDisplayTransferByTable which is not publicly available on iOS.
The display transfer table controls how each possible value in each of the three RGB channels is displayed on screen and can convert it to a different value. It works similarly to Photoshop's "Curves" tool. By using the right transfer table it's possible to invert the screen, adjust the hue, or enhance contrast.
Since the transfer table is part of the graphics hardware and is always active, there's zero performance overhead involved. On Mac OS there are actually two transfer tables: one for the application and one for the OS.
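For illustration only, here is a macOS sketch of that API (it is public there; nothing equivalent is exposed on iOS):

    import CoreGraphics

    // macOS-only sketch: invert every pixel on the main display by
    // installing an inverted transfer table (a flipped "Curves" line).
    func invertMainDisplay() {
        let display = CGMainDisplayID()
        let capacity = Int(CGDisplayGammaTableCapacity(display))

        // Build a ramp that maps 0 -> 1 and 1 -> 0.
        let ramp: [CGGammaValue] = (0..<capacity).map {
            1.0 - CGGammaValue($0) / CGGammaValue(capacity - 1)
        }
        // The same table is applied to all three channels here; per-channel
        // tables could instead shift hue or boost contrast.
        _ = CGSetDisplayTransferByTable(display, UInt32(capacity), ramp, ramp, ramp)

        // CGDisplayRestoreColorSyncSettings() would undo the change.
    }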
how can I change the intensity / hue of every pixel on the screen
Without jailbreak, you can't.

iOS, Objective C auto image processing filters

I'm doing a photo app and sometimes the lighting is off in certain areas and the picture isn't clear. I was wondering if there was a feature that can auto adjust the brightness, contrast, exposure, saturation of a picture like in photoshop.
I don't want to manually adjust images as in the sample code given by Apple:
https://developer.apple.com/library/ios/samplecode/GLImageProcessing/Introduction/Intro.html
I want something that will auto adjust or correct the photo.
As an alternative, you could use AVFoundation to build your own camera implementation, set the image quality to high, and use autofocus or tap-to-focus. Otherwise, I am almost certain you cannot set these properties; the UIImagePickerController included in the SDK is really expensive memory-wise and gives you a UIImage instead of raw data (another benefit of using AVFoundation). This is a good tutorial in case you would like to check it out:
http://www.musicalgeometry.com/?p=1297
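A minimal Swift sketch of that custom-camera setup (preview layer, error handling and the capture call itself are omitted; the function name is illustrative):

    import AVFoundation

    // Capture session with a high-quality preset and continuous autofocus.
    func makeCaptureSession() throws -> AVCaptureSession {
        let session = AVCaptureSession()
        session.sessionPreset = .photo

        guard let camera = AVCaptureDevice.default(for: .video) else {
            throw NSError(domain: "Camera", code: -1)
        }

        try camera.lockForConfiguration()
        if camera.isFocusModeSupported(.continuousAutoFocus) {
            camera.focusMode = .continuousAutoFocus
        }
        camera.unlockForConfiguration()

        let input = try AVCaptureDeviceInput(device: camera)
        if session.canAddInput(input) { session.addInput(input) }

        let photoOutput = AVCapturePhotoOutput()
        if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }

        return session
    }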
Apparently someone has created it on Github: https://github.com/proth/UIImage-PRAutoAdjust
Once imported, I used it as follows:
self.imageView.image = [self.imageView.image autoAdjustImage];
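If you'd rather avoid the third-party category, a rough Swift sketch using Core Image's built-in auto-adjustment filters does essentially the same thing (the function name is illustrative):

    import UIKit
    import CoreImage

    // Auto-enhance with Core Image's pre-configured adjustment filters
    // (red-eye, vibrance, tone curve, etc.).
    func autoEnhanced(_ image: UIImage) -> UIImage? {
        guard var ciImage = CIImage(image: image) else { return nil }

        // Core Image analyses the image and returns ready-to-apply filters.
        for filter in ciImage.autoAdjustmentFilters() {
            filter.setValue(ciImage, forKey: kCIInputImageKey)
            if let output = filter.outputImage {
                ciImage = output
            }
        }

        let context = CIContext()
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
    }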

How can you adjust white balance setting for a custom iOS Camera App?

I want to manually adjust the white balance using a slider before I start recording video from the camera. I have looked at the AVFoundation framework, but it does not let me pick a value for the WB. What frameworks/classes do I need to adjust the WB in this way?
I haven't been able to find any info on setting the camera's white balance (though I don't know for sure that it's not possible). But, you can always post-process with the white balance Core Image filter (aka CIWhitePointAdjust).
You can read about applying Core Image filters here.
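A rough sketch of that post-processing step in Swift (the mapping from a 0...1 slider value to the white point colour is an illustrative assumption):

    import UIKit
    import CoreImage

    // Shift an image's white point with CIWhitePointAdjust. Lower warmth
    // values pull toward blue, higher values toward orange.
    func adjustWhitePoint(of image: UIImage, warmth: CGFloat) -> UIImage? {
        guard let ciImage = CIImage(image: image),
              let filter = CIFilter(name: "CIWhitePointAdjust") else { return nil }

        filter.setValue(ciImage, forKey: kCIInputImageKey)
        filter.setValue(CIColor(red: 0.8 + 0.2 * warmth,
                                green: 0.9,
                                blue: 1.0 - 0.2 * warmth),
                        forKey: kCIInputColorKey)

        guard let output = filter.outputImage else { return nil }
        let context = CIContext()
        guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }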
