OpenCV Colour Detection Error

I am writing a script on the Raspberry Pi to detect the majority colour in a frame from a webcam, and I seem to be having an issue. The following image shows me holding up my phone with a blank red image on it, yet I seem to be getting an orange colour instead.
When I angle the phone, however, I do get the expected red colour.
I am not sure why this is the case.
I am using a Logitech C920 webcam that emits a blue light when activated, and I also have the monitor on. I am wondering whether the light from these two is causing the issue, and whether angling the phone means those lights no longer hit it head-on and so no longer disturb the image.
I am still not very experienced in this area, so I would appreciate explanations and possible workarounds for my problem.
Thanks
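For reference, here is a minimal sketch of the kind of script described, assuming Python with the cv2 bindings (the actual code isn't shown in the question, and camera index 0 is a guess). It grabs one frame, converts it to HSV and reports the most common hue; working with hue rather than raw BGR separates colour from brightness and makes the red-versus-orange difference easy to inspect:

import cv2
import numpy as np

cap = cv2.VideoCapture(0)            # Logitech C920 assumed to be camera 0
ret, frame = cap.read()
cap.release()
assert ret, "could not grab a frame from the webcam"

hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
# Histogram of the hue channel; OpenCV uses a 0-179 hue range.
hist = cv2.calcHist([hsv], [0], None, [180], [0, 180])
dominant_hue = int(np.argmax(hist))
print("Dominant hue:", dominant_hue)  # pure red sits near 0 or 179, orange roughly 10-25

If the reported hue lands in the 10-25 band for a frame that should be red, that matches the orange result described above.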

There are a few things that can mess this up:
As you already mention, the light from the monitor and the camera.
The iPhone screen is an active display, so flicker and sync effects might also come into play.
Reflection from the iPhone screen.
If your camera has automatic control for exposure and color balance etc., the picture quality can change as you move around.
I suggest using a colored piece of non-glossy paper so that you can rule out the effects of the phone's display.
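If the camera's automatic adjustments turn out to be part of the problem, they can often be locked from OpenCV itself. A hedged sketch, again assuming Python and a backend (V4L2 on the Raspberry Pi) that honours these properties; support varies by driver, and some of these calls may silently do nothing:

import cv2

cap = cv2.VideoCapture(0)
# Lock white balance and exposure so the colours don't drift as the phone moves.
cap.set(cv2.CAP_PROP_AUTO_WB, 0)        # disable automatic white balance
cap.set(cv2.CAP_PROP_AUTO_EXPOSURE, 1)  # manual exposure; the exact value differs between drivers (0.25 on older V4L2)
cap.set(cv2.CAP_PROP_EXPOSURE, 100)     # fixed exposure; units are driver specific

With everything locked, the measured hue should stay put when the phone is tilted, which makes it easier to tell whether the shift comes from the camera or from the screen.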

Related

1024x768 resolution with nearest neighbor scaling

I'm using Arch Linux with Cinnamon and a 4K or rather UHD screen (3840x2160) and would like to use a 1024x768 resolution with black bars on
the left and right side and nearest neighbor scaling instead of bilinear filtering which is the default for all monitors that I've ever seen.
There are ways to more or less get this to work by actually running 3840x2160 but rendering at 1024x768 and scaling it up.
This can be done with xrandr or nvidia-settings.
I also managed to get some black bars going.
So what worked best for me so far was this command:
nvidia-settings -a CurrentMetaMode="DP-2: 3840x2160_60 {ViewPortIn=1024x768, ViewPortOut=2880x2160+480+0, ResamplingMethod=Nearest}"
This gives me the crisp upscaling and black bars.
There's one problem though: the right side of the screen is "cut off".
That means when I maximize windows, it acts as if I were still using a 16:9 resolution, leaving the right part of, say, a browser inaccessible.
In games that scroll when you put the mouse at the edge of the screen, scrolling stops working on the right side, while the left, top and bottom still work.
Does anyone know about this problem or has a better solution?
I'm open to anything, for example using some WINE settings to pull this off. Since this is mainly for playing old games, a completely different approach through WINE would be totally fine as a solution.
I've already tried all kinds of things over the last few days. At this point I would jump into the air if somebody knows of a way to get this to work.

Unity 2D: after-image from OLED screens in a high contrast situation

When I test my Unity 2D game on my iPhone X, all background and sprite elements on the screen have a blue "halo" when moving my character. I have explored the issue with transparency on mobile, but the issue seems really strange. The blue halo appears only when the background is black; anything brighter and it is absolutely fine. So I doubt it's a transparency issue, given that it appears only when a dark background is present.
It is visible only on mobile, so taking a screenshot is useless.
If anyone wants to test, do the following: download or open the image attached here in full screen, zoom in just a bit so the shapes take up most of the screen, then move the image left and right, slow and fast, and you should see a blueish after-image around the edges. This should happen only on some OLED mobile screens.
If anyone ever encounters this: the result I described is an after-image effect from the OLED screen on the iPhone X. I haven't tested on other OLED devices, but I assume other models can show it too, depending on the software. The black levels are incredible, but in a high-contrast situation between light and dark, an after-image is created around the edges of the contrast zone.
How to fix this?
Simply do not use fully black backgrounds or elements. In a game, a near-black color is indistinguishable from true black (0, 0, 0 RGB). This might be a common game design principle I was unaware of, and I may be the only person careless enough to use 0,0,0 in the first place, but I hope anyone who runs into the same issue reads this and fixes it easily.

Any way to control the Quad LED individually in iOS swift?

iPhones now have four LEDs to produce the flashlight.
I want to programmatically turn each LED on or off to produce different light effects, or even just specify the balance between the pink LEDs and the white LEDs.
Is there any way to get tighter control over those in Swift?
Digging through the API didn't show much...
Thanks!

How to track an opened hand in any environment with RGB camera?

I want to make a movable camera that tracks an open hand (facing toward the floor). It just needs to track the open hand, but it also has to know its rotation (2D rotation).
This is what I have looked into so far:
Contour: as the camera is movable, the background is unknown and even the lighting is not fixed, so it's hard to get a clean hand segmentation in real time.
Haar: it seems this just returns a rect and can't deal with rotation.
Feature detection: a hand doesn't have enough detail for this.
I am using the OpenCV Unity plugin to do this.
EDIT
https://www.codeproject.com/Articles/826377/Rapid-Object-Detection-in-Csharp
I see another library can do something like this. Can OpenCV also do this?
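For the rotation part in particular, once the hand is segmented (by skin colour, background subtraction or anything else), OpenCV can give a 2D angle straight from the contour. A small sketch in Python for illustration; the same calls (inRange, findContours, minAreaRect) exist in the OpenCV Unity plugin, and the skin-colour range below is only a placeholder that will not survive arbitrary lighting, which is exactly the hard part already mentioned:

import cv2
import numpy as np

frame = cv2.imread("hand.png")                     # stand-in for a live camera frame
hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)

# Placeholder skin-colour range; a real system needs something more robust.
mask = cv2.inRange(hsv, (0, 30, 60), (20, 150, 255))
mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

# [-2] keeps this working with both the OpenCV 3.x and 4.x return conventions.
contours = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)[-2]
if contours:
    hand = max(contours, key=cv2.contourArea)      # assume the hand is the largest blob
    (cx, cy), (w, h), angle = cv2.minAreaRect(hand)
    print("centre:", (cx, cy), "rotation:", angle) # 2D rotation of the fitted box

Note that minAreaRect only gives orientation up to 180 degrees, so if the pointing direction matters, the wrist and fingertip ends would still have to be disambiguated separately.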

Turn off part of the iPhone Screen

I'm trying to work on an iOS app that turns off the screen (so that when you are in the dark it doesn't show any light at all) but still lets me turn certain pixels of the screen on and off.
Does anyone know a way to do this? I am assuming it's all or nothing: either the whole screen is off, or it's all on, but maybe someone knows better.
I am guessing that the best I could do is turn the background black so that as little light as possible comes through. However, I tried this on my iPad and it still gives off too much light in a pitch-black room. I want, for instance, only one pixel to shine brightly and the rest not to glow at all. And if this is not possible, is there any way to mimic it?
As far as I'm aware the screen is either on or off; there's no control for switching specific regions on or off.
You also can't change brightness at the pixel level, because the screen isn't illuminated that way: an LED backlight lights the whole panel, so you can only control the overall screen brightness, not individual pixels.
I'm afraid I agree with #garretmurray. What you're trying to do just isn't possible. You're on the best track though.
You can change the background color of the view you're on to black, and use the following to kill the screen's brightness:
[[UIScreen mainScreen] setBrightness:0.0f];
What you're trying to do is not possible.
