GPUImage presents vertical black bars when using the iPhone front camera (iOS)

I am trying to integrate GPUImage into my iPhone app to take photos with real-time filters, but I have hit an annoying problem: when I use the front camera to take photos, there are always some vertical black bars in the resulting image.
The rear camera never has this problem.
It has nothing to do with filters; I have tried without any filter.
At first I thought it was something to do with my iPhone's hardware, but after searching the internet I found that other developers have hit the same problem (with no solution).
see image here:
http://i.stack.imgur.com/rPAJ9.jpg

Maybe this can help you out:
I had the same exact problem. In GPUImageStillCamera.m, I commented out the line "[self captureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer fromConnection:[[photoOutput connections] objectAtIndex:0]];" and now the glitches are gone. That was the only way I could fix the problem without forcing GPUImageOpenGLESContext's +supportsFastTextureUpload to return NO.
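For reference, here is a sketch of where that line sits in GPUImageStillCamera.m; the surrounding method and the variable names are inferred from the snippet quoted above and may differ between GPUImage versions:

    // GPUImageStillCamera.m (sketch; names inferred from the quoted line, not exact source)
    [photoOutput captureStillImageAsynchronouslyFromConnection:[[photoOutput connections] objectAtIndex:0]
                                             completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        // Commenting out this manual re-injection of the sample buffer is what removes
        // the vertical black bars from front-camera captures:
        // [self captureOutput:photoOutput didOutputSampleBuffer:imageSampleBuffer
        //       fromConnection:[[photoOutput connections] objectAtIndex:0]];

        // ...rest of the completion handler (filtering, callbacks) unchanged...
    }];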

Related

Unity 2D: after-image from OLED screens in a high contrast situation

When I test my Unity 2D game on my iPhone X, all background and sprite elements on screen have a blue "halo" when my character moves. I have looked into transparency issues on mobile, but this one seems really strange: the blue halo appears only when the background is black; anything brighter is absolutely fine. So I doubt it is a transparency issue, given that it appears only against a dark background.
It is visible only on mobile, so taking a screenshot is useless.
If anyone wants to test, do the following: download or open the image attached here full screen, zoom in a bit so the shapes take up most of the screen, and move the image left and right, both slowly and quickly. You should see a blueish after-image around the edges. This should happen only on some OLED mobile screens.
If anyone ever encounters this: the result I mentioned is an after-image effect from the OLED screen of the iPhone X. I haven't tested on other OLED devices, but I assume that, depending on the software, other models can experience this too. The black levels are incredible, but when you have a high-contrast boundary between light and dark areas, an after-image is created around the edges of the contrast zone.
How to fix this?
Simply do not use full black backgrounds or elements. Near-black colours in a game are indistinguishable from true black (0, 0, 0 RGB). This might be a common game design principle I am unaware of, and I may be the only person stupid enough to use 0, 0, 0 in the first place, but anyway, I hope anyone with the same issue reads this and fixes it easily.

OpenCV Colour Detection Error

I am writing a script on the Raspberry Pi to detect the majority colour in a webcam frame, and I seem to be having an issue. In the following image I am holding up my phone with a blank red image on it, yet I seem to be getting an orange colour instead.
When I angle the phone, however, I do in fact get the expected red colour.
I am not sure why this is the case.
I am using a Logitech C920 webcam that emits a blue light when activated, and I also have the monitor on. I am wondering whether the light from these two is causing the issue, and whether, when I angle the phone, those lights no longer hit it head-on and thus no longer disturb the image.
I am still not very experienced in this area, so I would appreciate hearing explanations and possible workarounds for my problem.
Thanks
There are a few things that can mess this up:
As you already mention, the light from the monitor and the camera.
The iPhone screen is a display, so flicker and sync might also come into play.
Reflection from the iPhone screen.
If your camera has automatic control for exposure and color balance etc., the picture quality can change as you move around.
I suggest using a colored piece of non-glossy paper so that you can remove the iPhone display's effects.

Stock iOS camera app seems to have wider field of view

I'm working on a video app for iOS, and everything seems to work fine, except that the stock iPhone camera app appears to have a wider field of view than we get. I've tried playing with the different AVCaptureVideoPreviewLayer gravity settings (AVLayerVideoGravityResize, AVLayerVideoGravityResizeAspect and AVLayerVideoGravityResizeAspectFill) to no avail.
In fact, other camera apps (such as taking a photo in Facebook) produce output much closer to ours than the stock camera app does.
Does anyone have any information on why something like this might be? I've been scouring the internet and haven't been able to find anything.
It turns out the resolution of the still image camera is different from the resolution of the video camera; see for instance https://discussions.apple.com/thread/3490649?tstart=0
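If you are driving the camera through your own AVCaptureSession, one thing worth checking is the session preset: the 16:9 video presets crop the sensor, while the photo preset uses the full 4:3 still-image pipeline and so appears wider. A minimal sketch, assuming a session you have already created and attached a camera input to:

    // Assumes an AVCaptureSession named "session" that already has the camera as an input.
    // Video presets such as AVCaptureSessionPreset1920x1080 crop the sensor to 16:9,
    // which looks "zoomed in" next to the stock camera app. The photo preset uses the
    // full 4:3 still-image pipeline and therefore shows a wider field of view.
    if ([session canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
        [session beginConfiguration];
        session.sessionPreset = AVCaptureSessionPresetPhoto;
        [session commitConfiguration];
    }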

Image tracking - tracking a screen with a camera

I want to track the relative position of a camera aimed at a computer screen.
I can’t control what is displayed on the computer screen but I can receive screen dumps whenever something changes on the screen. Those screen dumps can hopefully be used to find the screen when analyzing the video from the camera.
I have seen many videos on YouTube for face, logo, or single-coloured object tracking using OpenCV, but I'm unsure whether those methods would work for finding and tracking a more detailed image like a screen dump.
Maybe Template Matching is the way to go? But I need to find the screen even at an angle.
Basically I don’t know where to begin and need help from people with experience in this field to find the best way for achieving what I want.
Thanks
Using feature matching should do the trick (SIFT/SURF/ORB/...).
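To make that concrete, here is a minimal sketch of the ORB route using OpenCV's C++ API (on iOS this would live in an Objective-C++ .mm file); screenDump and cameraFrame are assumed to be grayscale cv::Mat images you already have:

    #include <opencv2/features2d.hpp>
    #include <opencv2/calib3d.hpp>

    cv::Ptr<cv::ORB> orb = cv::ORB::create(1000);            // detect up to 1000 keypoints

    std::vector<cv::KeyPoint> kpScreen, kpFrame;
    cv::Mat descScreen, descFrame;
    orb->detectAndCompute(screenDump, cv::noArray(), kpScreen, descScreen);
    orb->detectAndCompute(cameraFrame, cv::noArray(), kpFrame, descFrame);

    // ORB descriptors are binary, so match with Hamming distance; crossCheck keeps mutual best matches.
    cv::BFMatcher matcher(cv::NORM_HAMMING, true);
    std::vector<cv::DMatch> matches;
    matcher.match(descScreen, descFrame, matches);

    // A RANSAC homography copes with the screen being viewed at an angle.
    std::vector<cv::Point2f> ptsScreen, ptsFrame;
    for (const cv::DMatch &m : matches) {
        ptsScreen.push_back(kpScreen[m.queryIdx].pt);
        ptsFrame.push_back(kpFrame[m.trainIdx].pt);
    }
    cv::Mat H = cv::findHomography(ptsScreen, ptsFrame, cv::RANSAC);
    // H maps screen-dump coordinates into the camera frame, i.e. where the screen is and how it is oriented.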

Blur effect in a view of iOS

I want to use a UIImagePicker to display a camera preview. Over this preview I want to place an overlay view with controls.
Is it possible to apply any effects to the preview which will be displayed from camera? I particularly need to apply a blur effect to the camera preview.
So I want to have a blurred preview from the camera and an overlay view with controls on top. If I decide to capture a still image from the camera, I need it to be the original, without the blur effect; the blur must be applied only to the preview.
Is this possible with such a configuration, or perhaps by using AVFoundation to access the camera preview, or in some other way, or is it not possible at all?
With AVFoundation you could do almost everything you want, since you can obtain single frames from the camera and process them, but it could lead you to a dead end: applying a blur to an image in real time is a pretty intensive task, with laggy video as a result, and it could cost you hours of coding. I would suggest you use the solution from James Webster, or OpenGL shaders. Take a look at this awesome free library written by one of my favourite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you do not find the right filter, it will probably lead you to a correct implementation of what you want to do.
The right filter is a Gaussian blur, of course; I don't know if it is supported out of the box, but you could write it yourself.
I almost forgot to say that from iOS 5 you have full access to Apple's Accelerate framework; you should look into that as well.
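As a rough sketch of the GPUImage route (assuming the library is already in the project; the exact blur-radius property name, blurSize or blurRadiusInPixels, depends on the GPUImage version):

    // Blur only the live preview: frames reaching the preview view pass through the
    // filter, but the camera output itself is untouched.
    GPUImageVideoCamera *camera =
        [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                            cameraPosition:AVCaptureDevicePositionBack];
    camera.outputImageOrientation = UIInterfaceOrientationPortrait;

    GPUImageGaussianBlurFilter *blur = [[GPUImageGaussianBlurFilter alloc] init];

    GPUImageView *previewView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [self.view addSubview:previewView];

    [camera addTarget:blur];
    [blur addTarget:previewView];
    [camera startCameraCapture];

Because the blur sits between the camera and the preview view only, a still photo taken from the camera side of the chain (for instance with GPUImageStillCamera and a plain passthrough filter as a second target) remains unblurred.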
From the reasonably limited amount of work I've done with UIImagePicker, I don't think it is possible to apply the blur to the image you see using programmatic filters.
What you might be able to do is use the overlay view to approximate the blur. You could do this, for example, by adding an overlay that contains an image of semi-transparent frosted glass.
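If you stay with UIImagePicker, that overlay idea might look roughly like this; the frosted-glass asset name is a placeholder you would supply yourself:

    // Fake the blur with a semi-transparent "frosted glass" image laid over the live preview.
    UIImagePickerController *picker = [[UIImagePickerController alloc] init];
    picker.sourceType = UIImagePickerControllerSourceTypeCamera;

    UIImageView *frosted = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"FrostedGlass"]]; // placeholder asset
    frosted.frame = picker.view.bounds;
    frosted.alpha = 0.6f;                 // let some of the preview show through
    frosted.userInteractionEnabled = NO;  // don't block the camera controls

    picker.cameraOverlayView = frosted;   // the overlay is not composited into captured photos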
