Ambient light not realistic - lighting

Do you have an idea how to make my light sources more realistic and ambient? The light from the windows is just a material with color and luminance, but the lamps (highlighted) are made with a basic Light. As you can see, a lamp doesn't light the area around it; it only creates a "light ball" that also passes through objects (the bridge, for example), and enabling shadows didn't fix the problem. Also, I turned off Global Illumination because it just slowed my render, but the quality was EXACTLY the same as without GI. Any suggestions would help. (Btw, I added Fog to my Physical Sky to create a moody atmosphere.)

Related

Rendering White in USDZ files

Problem: USDZ files appear at about 80% white (light grey) even if perfect (0,0,0) is set on the texture files.
Troubleshooting: tested with and without AO files, tested our own USDZs, and also created a simple project in Reality Converter with a primitive object using full-white glossy paint.
Q: Is it possible to render bright whites in AR with ARKit Quick Look?
AR Quick Look uses PBR (Physically Based Rendering) to render your models. In real life, white objects never really look fully white because they only reflect the light they receive, so I'm not surprised you can't get a fully white object.
There is an exception to this, both in PBR and in real life, and that is light sources.
PBR shaders have an "emissive" channel. Try setting that channel to white and it'll probably look a lot whiter.
Doing a bit of googling, I found this Sketchfab resource that suggests exactly this:
Quick Look does not have a Shadeless mode, so 3D scans and other models set to Shadeless may look darker than expected. A workaround could be to duplicate the base color texture in the emission channel.
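To make the emissive-channel suggestion concrete, here is a minimal sketch of what a UsdPreviewSurface material with both the base color and the emission set to white could look like in .usda source. The prim and material names here are made up for illustration; your exporter or Reality Converter will generate its own.

```
def Material "WhiteEmissive"
{
    token outputs:surface.connect = </WhiteEmissive/Surface.outputs:surface>

    def Shader "Surface"
    {
        uniform token info:id = "UsdPreviewSurface"
        color3f inputs:diffuseColor = (1, 1, 1)
        color3f inputs:emissiveColor = (1, 1, 1)
        token outputs:surface
    }
}
```

The emissive contribution is added on top of the lit result, so the object can reach full white even when the environment lighting alone would leave it grey.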

OpenCV Colour Detection Error

I am writing a script on the Raspberry Pi to detect the majority colour featured in a frame from a webcam, and I seem to be having an issue. The following image is me holding up my phone with a blank red image on it, yet I seem to be getting an orange colour instead.
When I angle the phone, I do in fact get the red colour expected.
I am not sure why this is the case.
I am using a Logitech C920 webcam that emits a blue light when activated, and I also have the monitor on. I am wondering whether the light from these two is causing the issue, and whether, when I angle the phone, these lights no longer hit it head-on and thus no longer disturb the image.
I am still not heavily experienced in this area, so I would enjoy hearing explanations and possible workarounds for my problem.
Thanks
There are a few things that can mess this up:
As you already mentioned, the light from the monitor and the camera.
The iPhone screen is a display, so flicker and sync issues might also come into play.
Reflection from the iPhone screen.
If your camera has automatic control of exposure, color balance, etc., the picture can change as you move around.
I suggest using a colored piece of non-glossy paper so that you can remove the iPhone display's effects.
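One practical detail when classifying "majority colour": red sits right at the hue wraparound (0/360 degrees in HSV), so a naive single-range hue test easily misreads glare-shifted red as orange. Below is a small stdlib-only sketch (the function name and the degree thresholds are my own choices, not from any library) that classifies a pixel by hue while handling the red wraparound explicitly:

```python
import colorsys

def dominant_hue_name(r, g, b):
    """Classify an (r, g, b) pixel (0-255 per channel) by hue name.

    Red straddles the 0/360-degree hue boundary, so it needs two
    ranges; low saturation/value pixels are treated as neutral.
    """
    h, s, v = colorsys.rgb_to_hsv(r / 255, g / 255, b / 255)
    if v < 0.15:
        return "black"
    if s < 0.15:
        return "white/gray"
    deg = h * 360  # colorsys returns hue in [0, 1)
    if deg < 15 or deg >= 345:   # wraparound: both ends are red
        return "red"
    if deg < 45:
        return "orange"
    if deg < 70:
        return "yellow"
    if deg < 170:
        return "green"
    if deg < 260:
        return "blue"
    return "magenta/purple"
```

For a whole frame, you would run this per pixel (or on a downsampled frame) and take the most frequent label; note how a glare-desaturated red like (230, 130, 40) already lands in the orange band, which matches the symptom described above.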

Fake colors through camera

In an iOS application I need to recognize colors through the camera, but while analyzing the problem I noticed that different kinds of light make the colors observed in the captured picture a little different from the real ones. For example, under a bright neon light a light blue looks like a gray.
What is the cause, and what kind of approach could I follow to solve this problem of "fake colors"?
The colors are not fake, they are just different from what you expect them to be. As @Piglet said, this has a lot to do with the physics of light, and white balance may help.
If you want to read more about it look at:
Color Rendering Index
Color Metamerism
Sensitivity Metamerism Index
Color Constancy
These all refer to the physics behind why different illuminants produce different colors. The camera's color pipeline also contributes its share, so you can also read about white balance and tone mapping...
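The simplest white-balance correction mentioned above is the gray-world assumption: the average of a scene is assumed to be neutral gray, so each channel is scaled until the channel averages match. A minimal stdlib-only sketch (the function name is mine; real pipelines operate on image arrays, not tuple lists):

```python
def gray_world_balance(pixels):
    """Gray-world white balance.

    pixels: list of (r, g, b) tuples, 0-255 per channel.
    Scales each channel so the per-channel averages equal their
    common mean, then clamps to the 0-255 range.
    """
    n = len(pixels)
    avg = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(avg) / 3
    gains = [gray / a if a else 1.0 for a in avg]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]
```

For example, a uniformly blue-tinted patch (100, 100, 140) comes back neutral, which is exactly the "light blue looks gray / gray looks blue" confusion being corrected from the opposite direction.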

Add sunflare kind of lighting effect on image

I am looking for a sunflare-like lighting effect on an image. In my case the light source is a bulb instead of the Sun, and I understand that changing the light source changes the ray effect, and that different types of bulbs produce different kinds of light beams.
Here is an example of a focus-light effect: http://www.photoshopessentials.com/photo-effects/focus-light/
I have looked into standard photo filters and OpenCV but didn't find anything obvious. I am looking for an approach and direction to achieve it.
My knowledge is limited to iOS business apps; I have only heard the names of frameworks like OpenGL, SceneKit and MetalKit. So I would prefer a solution that fits my knowledge stack, but I would love to know all possible solutions.
Let me know if you want me to explain more.
Any help would be appreciated.
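The core of a bulb-glare effect is simple regardless of framework: build a radial brightness falloff around the light position and add it to the image, clamping at white. Here is a language-agnostic sketch in plain Python (the function name and the linear falloff are my own illustrative choices; on iOS the same idea maps to additive compositing, e.g. with Core Image or a shader):

```python
import math

def add_glare(img, cx, cy, radius, strength=255):
    """Add a radial 'bulb glare' to a grayscale image.

    img: list of rows of ints 0-255.  Brightness falls off linearly
    from (cx, cy) to zero at `radius`, is added to each pixel, and
    the sum is clamped at white (255) -- i.e. additive blending.
    """
    out = []
    for y, row in enumerate(img):
        new_row = []
        for x, v in enumerate(row):
            d = math.hypot(x - cx, y - cy)
            glow = max(0.0, 1.0 - d / radius) * strength
            new_row.append(min(255, round(v + glow)))
        out.append(new_row)
    return out
```

Different "bulbs" then become different falloff curves (Gaussian for a soft lamp, star-shaped streaks for a flare), but the additive clamp-at-white step stays the same.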

SKEmitterNode how to maintain same effect with blend mode "add" across different backgrounds

I have a really cool effect that I like, which I made using SKS files in Xcode and the blend mode 'add'. I didn't realize it at the time, but after looking at the Apple docs I saw that the effect actually depends on the background color; specifically:
Adds the pixel values of the particle and underlying images. Creates a white pixel if this value is greater than 1
Now, I want the same effect across every background color, but as far as I know the only way to do that is to use the "Alpha" blend mode, and that only gives me solid colors. These are the graphics I want to apply across all the different background colors:
How can I go about getting this effect across all the different background colors? I'm using the default spark particle file.
UPDATE:
I'm leaving this question unanswered until either Apple comes up with a way to do what I want or someone else finds one.
Due to the unique nature of particle systems AND the very limited masking facilities of SpriteKit, I don't think this can be done.
Inversion masking, available in an un-nested way free of the mess that masking in SpriteKit currently is, would instantly solve this problem.
The way to do this, ordinarily, without inversion masking, would be to run two instances of the exact same particle system: one acting as a mask to cut out the excess black, and one providing the visual elements you see over the black, with the pair then composited (as a whole) over your background.
Here's KnightOfDragon suffering with the individuality of particle systems for another use case: Duplicating a particle emitter effect in Sprite Kit
