I have a project where the background color of a UIView looks fine on an iPad Mini 1. The color is FEEB04.
But when testing on an iPad Air 2, the gamma looks too bright.
What could be wrong?
Well, Apple has a history of using gamma to suit its target audience's perceived needs. This could be a continuation of that: altering gamma as a substitute for more involved methods of countering the simultaneous contrast effect (also explained in the gamma FAQ) that occurs when staring at your iPad in lighter or darker conditions than anticipated.
That's something that could be "wrong".
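For intuition, here is a minimal Swift sketch of what a gamma difference does to the same pixel value; the 2.2 and 1.8 exponents are illustrative assumptions, not measured values for either iPad.

    import Foundation

    // A pixel value driven through two different display gammas: the panel
    // with the lower exponent renders the same signal noticeably brighter.
    // Signal and luminance are normalized to 0...1.
    func displayedLuminance(signal: Double, gamma: Double) -> Double {
        pow(signal, gamma)
    }

    let green = 0xEB / 255.0  // the green component of FEEB04
    print(displayedLuminance(signal: green, gamma: 2.2)) // ~0.84, darker rendering
    print(displayedLuminance(signal: green, gamma: 1.8)) // ~0.86, brighter rendering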
Video demonstration (please watch, 18s): https://youtu.be/4HtkadKEnWM
When I reduce the brightness of the device, iOS seems to be changing white to off-white. For my use case, it is important that the view background is actually white so the video appears to seamlessly blend into the background. Interestingly, when I take a screenshot or screen recording this effect does not appear.
I am on light mode and don't have any custom color space settings.
Does anyone have any ideas on how to fix this? Thank you!
You have a misunderstanding of colours.
White doesn't really exist; it is defined as the (non-specular) reflection of a standard illuminant (the ISO standard for paper specifies nearly 90% reflectance).
So white depends on brightness: under the same light you can see black, grey, white, or super-white. Our eyes simply adapt quickly to a reference white.
For emissive devices this is trickier, because emission doesn't depend on ambient light. That is why, for years now, devices have had some sort of ambient light detection, so that software can tell the screen to reduce brightness (and/or change the white balance; in this context "white balance" is about colour, not brightness).
If a person dims the screen, it is probably because there is less ambient light, and that person is not seeing white correctly. In any case, the eyes (and brain) will adapt quickly.
In short: white is a uniform colour at maximum brightness. If you dim the screen, it will appear grey for a short time, because we had a different idea of maximum brightness, but we adapt.
Note: we have two kinds of adaptation: brightness and colour. Good filmmakers may use this to good effect, but colourists must also handle it (on long grey scenes, so that we don't adapt and start seeing them as white).
Note: hardware may have two kinds of brightness: the classic one encoded in the RGB values, and a global brightness (the backlight, even though many screens no longer have a literal backlight). A screenshot captures just the RGB values, not the corrected output. And if you take a photo instead, every camera meters the scene and exposes automatically (the average brightness of an image is rendered as mid-grey), so again you see no difference. (If you really want to see it, set your camera to manual exposure and take the two photos with identical settings.)
After communicating with Apple support, we discovered that the video_full_range_flag was not being set by ffmpeg. Re-encoding the videos using the following ffmpeg command worked: ffmpeg -i input.mp4 -map 0 -c copy -bsf:v h264_metadata=video_full_range_flag=1 output.mp4.
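In case it helps anyone verify the fix, here is a minimal Swift sketch (my own, not something from Apple support) that reads the flag back out of a re-encoded file with AVFoundation; it assumes iOS 15+ and a local file URL.

    import AVFoundation

    // Reads back whether the first video track is flagged as full range,
    // so you can confirm the ffmpeg step above actually took effect.
    func isFullRangeVideo(url: URL) async throws -> Bool? {
        let asset = AVURLAsset(url: url)
        guard let track = try await asset.loadTracks(withMediaType: .video).first,
              let desc = try await track.load(.formatDescriptions).first else {
            return nil // no video track or no format description
        }
        let flag = CMFormatDescriptionGetExtension(
            desc, extensionKey: kCMFormatDescriptionExtension_FullRangeVideo)
        return flag as? Bool // CFBoolean bridges to Bool; nil if the key is absent
    }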
When I test my Unity 2D game on my iPhone X, all background and sprite elements on the screen have a blue "halo" when my character moves. I have looked into transparency issues on mobile, but this seems really strange: the blue halo appears only when the background is black. Anything brighter and it is absolutely fine. So I doubt it's a transparency issue, given that it appears only against a dark background.
It is visible only on mobile, so taking a screenshot is useless.
If anyone wants to test do the following. Download or open the image attached here to full screen. Zoom in just a bit so the shapes are taking most of the screen. Start moving the image left and right. Slow and fast and you should see a blueish after-image around the edges. This should happen only on some OLED mobile screens.
If anyone ever encounters this: the effect I described is an after-image from the OLED screen on the iPhone X. I haven't tested other OLED devices, but I assume that, depending on the software, other models can experience it too. The black levels are incredible, but in a high-contrast situation between light and dark, an after-image is created around the edges of the contrast zone.
How to fix this?
Simply do not use full-black backgrounds or elements. In a game, a near-black color is indistinguishable from a true black (0, 0, 0 RGB). This might be a common game-design principle I was unaware of, and maybe I'm the only person who used 0,0,0 in the first place, but I hope anyone who hits the same issue reads this and fixes it easily.
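If it's useful, here is a tiny Swift sketch of the workaround (the game above is Unity/C#, so this is just the idea; the 0.02 floor is an arbitrary assumption to tune by eye):

    import UIKit

    // Clamp a colour away from true black so OLED pixels never switch fully
    // off, avoiding the high-contrast after-image around edges.
    func nearBlack(_ color: UIColor, floor: CGFloat = 0.02) -> UIColor {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        color.getRed(&r, green: &g, blue: &b, alpha: &a)
        return UIColor(red: max(r, floor), green: max(g, floor),
                       blue: max(b, floor), alpha: a)
    }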
This post is regarding an app that I already have in the App Store. It hasn't been updated for months, so I'm certain that the App Store version was working before the iOS 11.2 update. Here is the problem/bug:
In the app you can choose between different types of dice to roll in AR. You can select the regular dice, which are white/grey with black dots. The dice node is an SCNBox with 6 images attached to its sides. These images look like this:
and I know that these are the images that get added as the node's materials:
But when I roll the dice it looks like this😩:
THE DICE ARE RED!?!?!😱 but somehow the black dots still appear
So the question I would like to ask you is this: do you know what might have caused this? I couldn't find anything in the iOS 11.2 release notes that should affect the dice in any way.
EDIT:
Tried setting material.diffuse.textureComponents = .red
The dice are now white, but the colors are wrong. The dots are not black anymore and the white/grey gradient is inverted.
This can happen when the Asset Catalog compiler automatically converts the images to ones with a grayscale colour space. In that case, setting .textureComponents to SCNColorMaskRed solves the issue.
The idea is that the red colour channel will then also be used for the green and blue colour channels, instead of those channels having a value of 0.0.
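For reference, a minimal Swift sketch of that fix ("diceFace1" is a hypothetical asset name):

    import SceneKit

    // If the compiled texture ended up single-channel (grayscale), sample
    // only the red channel and let SceneKit reuse it for green and blue.
    let material = SCNMaterial()
    material.diffuse.contents = UIImage(named: "diceFace1")
    material.diffuse.textureComponents = .red // SCNColorMask.red, iOS 11+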
In an iOS application I need to recognize colors through the camera, but while analyzing the problem I noticed that different kinds of light make the colors observed in the captured picture a little different from the real ones. For example, under a bright neon light, a light blue looks like a gray.
What is the cause, and what kind of approach could I follow to solve this problem of "fake colors"?
The colors are not fake, they are just different from what you expect them to be. As @Piglet said, this has a lot to do with the physics of light, and white balance may help.
If you want to read more about it look at:
Color Rendering Index
Color Metamerism
Sensitivity Metamerism Index
Color Constancy
These all refer to the physics behind why different illuminations create different colors. There is also the camera color pipeline that contributes its share, so you can also read about white balance and tone mapping...
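As a practical starting point on the white-balance side, here is a minimal Swift sketch that locks the capture device's white balance so successive readings are at least consistent; device is assumed to be your already-configured AVCaptureDevice.

    import AVFoundation

    // Lock white balance at the current gains so colour measurements stop
    // drifting as the auto algorithm reacts to the lighting.
    func lockWhiteBalance(on device: AVCaptureDevice) throws {
        try device.lockForConfiguration()
        defer { device.unlockForConfiguration() }
        if device.isWhiteBalanceModeSupported(.locked) {
            let gains = device.deviceWhiteBalanceGains
            device.setWhiteBalanceModeLocked(with: gains, completionHandler: nil)
        }
    }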
What would be a better approach in Corona for a static background in a 2D scrolling game?
Let's say the game level "size" is the equivalent of two screens wide and two screens deep.
Q1 - Would one large background image be an OK approach? This would probably be easier, since you could prepare the artwork in Photoshop. Or is there a significant performance advantage in Corona to having a small image "pattern" and repeating it (in Lua code) to create the backdrop?
Q2 - If the one-large-background-image approach is OK, am I right to assume that one might have to sacrifice image resolution, given the size (2 screens wide by 2 screens deep), to stay correct on higher-resolution devices? That is, for the iPad 3 say, where your configuration would normally pick up the 3x image version (for other, smaller images such as play icons), for the background you might have to stay with the 1x or 2x image size. Otherwise it may hit the texture limit (I've read "Most devices have a maximum texture size of 2048x2048"). Is this correct / does this make sense?
I have used both approaches in my games.
Advantages of tiled mode:
You can make huge backgrounds.
Can be made to use less memory (especially with smallish tiles repeated a lot, like real-world wallpaper).
Allows for some interesting effects (like parallax scrolling).
Problems of tiled mode:
Uses more CPU performance.
Might be buggy and hard to make behave correctly (for example, in one of my games gaps showed between tiles, but only on the iPad with Retina display... it required some heavy math hackery to make it work).
It is hard to make complex and awesome backgrounds (the reason my point-and-click adventure games don't use tiled backgrounds).
Note that some devices have a limit on texture size in pixels; this may be the ultimate limit on how large a single-texture background can be.
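Corona itself is scripted in Lua, but to keep one language in this thread, here is the tiling idea sketched in Swift/SpriteKit; the tile asset name and the two-screens-by-two-screens level size are assumptions taken from the question.

    import SpriteKit

    // Cover a level two screens wide and two screens deep by repeating a
    // small tile texture, instead of one huge texture that could exceed a
    // device's maximum texture size.
    func addTiledBackground(to scene: SKScene, tileImageNamed name: String) {
        let tile = SKTexture(imageNamed: name) // hypothetical asset name
        let tileSize = tile.size()
        let cols = Int(ceil(scene.size.width  * 2 / tileSize.width))
        let rows = Int(ceil(scene.size.height * 2 / tileSize.height))
        for col in 0..<cols {
            for row in 0..<rows {
                let node = SKSpriteNode(texture: tile)
                node.anchorPoint = .zero
                node.position = CGPoint(x: CGFloat(col) * tileSize.width,
                                        y: CGFloat(row) * tileSize.height)
                node.zPosition = -1 // keep the backdrop behind gameplay nodes
                scene.addChild(node)
            }
        }
    }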