How to put the camera input on the screen and blur it? (iOS)

I'm fairly new to iOS development.
What I want to achieve is to put the stream from the camera into a UIView subclass (and size it with a frame).
So I don't need controls or the ability to capture images, just what the camera sees on the screen.
Furthermore, I want that view to be blurred. Is there a way (or a library) to put a Gaussian blur on that video stream?
Thank you!

You can use GPUImage (https://github.com/BradLarson/GPUImage) and try the realtime effects it provides. That will solve your problem for sure.
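As a sketch of what that pipeline looks like (this uses GPUImage's own Objective-C API bridged into Swift; the blur radius is just an example value, not a recommendation):

```swift
import GPUImage
import AVFoundation
import UIKit

// Camera source -> Gaussian blur filter -> on-screen view.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.high.rawValue,
                                 cameraPosition: .back)
camera.outputImageOrientation = .portrait

let blur = GPUImageGaussianBlurFilter()
blur.blurRadiusInPixels = 8.0   // example radius; tune to taste

// GPUImageView is a UIView subclass, so it can be framed and sized
// like any other view, as the question asks.
let previewView = GPUImageView(frame: CGRect(x: 0, y: 0, width: 320, height: 480))

camera.addTarget(blur)
blur.addTarget(previewView)
camera.startCameraCapture()
```

Because the blur runs as an OpenGL shader on the GPU, this stays realtime even on older devices.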

To display the camera on screen without controls you will need to use AVFoundation. Take a look at Apple's SquareCam sample code.
As for the blur, a simpler solution might be creating a semi-transparent image with a blur effect and placing it above the camera view.
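The AVFoundation part can be sketched as follows (modern Swift API, whereas SquareCam itself is Objective-C; the class name CameraPreviewController is hypothetical):

```swift
import AVFoundation
import UIKit

// A minimal controls-free camera preview inside a view controller.
class CameraPreviewController: UIViewController {
    let session = AVCaptureSession()

    override func viewDidLoad() {
        super.viewDidLoad()
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // The preview layer is just a CALayer, so it can be framed and
        // sized like any other layer, with no capture UI attached.
        let preview = AVCaptureVideoPreviewLayer(session: session)
        preview.frame = view.bounds
        preview.videoGravity = .resizeAspectFill
        view.layer.addSublayer(preview)

        session.startRunning()
    }
}
```

A semi-transparent "frosted" image view added on top of this view then fakes the blur without any per-frame processing.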

Related

Fake Animate a single image file in iOS possible?

Over the years software has been introduced to give a fake animation effect to a single photo image so the image appears moving as in an animated gif. I'm not talking about rotation or translation or animated gifs from multiple photos but rather the mimicking of video or Live Photos from a single photo by automagically perturbing the layers or pixels. After Effects, for example, lets you do this.
Does anyone know if something like this is possible with iOS libraries such as Core Animation?
This is what UIView's motion effects do. Check UIInterpolatingMotionEffectType.
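A minimal sketch of that API (the helper name and offset amount are just illustrative): UIInterpolatingMotionEffect shifts a view's properties as the device tilts, which gives a single static image a subtle parallax "movement".

```swift
import UIKit

// Attach a tilt-driven parallax effect to any view.
func addParallax(to view: UIView, amount: CGFloat = 15) {
    let x = UIInterpolatingMotionEffect(keyPath: "center.x",
                                        type: .tiltAlongHorizontalAxis)
    x.minimumRelativeValue = -amount
    x.maximumRelativeValue = amount

    let y = UIInterpolatingMotionEffect(keyPath: "center.y",
                                        type: .tiltAlongVerticalAxis)
    y.minimumRelativeValue = -amount
    y.maximumRelativeValue = amount

    // Group both axes so they animate together.
    let group = UIMotionEffectGroup()
    group.motionEffects = [x, y]
    view.addMotionEffect(group)
}
```

Note this reacts to device tilt rather than playing on its own, so it is closer to the home-screen parallax than to an After Effects-style animation.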

Making a Black & White only camera with UIImagePickerController

How can I make a black&white camera with UIImagePickerController? Is there another way to produce a camera app without using UIImagePickerController?
Is there another way without using a filter? I want the camera to be black&white while the user is trying to take a photo.
You can achieve it by using the AVFoundation framework. Some sample code is also available, which will help you understand it better.
Please check the Managing White Balance section; it will help with the implementation.
Please let me know if you are still facing the same problem.
You can use the UIImagePickerController as usual but just apply a black and white filter to the image when it is taken. Check out this question
You can start with AVCaptureSession and AVCaptureVideoPreviewLayer to take a picture without using UIImagePickerController, and apply custom filters to the captured frames to make the preview black & white.
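A sketch of the filtered-frames approach (modern Swift AVFoundation; the class name MonoCameraController is hypothetical). Instead of filtering the preview layer directly, each frame is grabbed via AVCaptureVideoDataOutput, desaturated with Core Image, and drawn into an image view:

```swift
import AVFoundation
import CoreImage
import UIKit

class MonoCameraController: UIViewController, AVCaptureVideoDataOutputSampleBufferDelegate {
    let session = AVCaptureSession()
    let context = CIContext()
    let imageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        imageView.frame = view.bounds
        view.addSubview(imageView)

        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)

        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frames"))
        session.addOutput(output)
        session.startRunning()
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let buffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Saturation 0 turns the frame black & white.
        let mono = CIImage(cvPixelBuffer: buffer)
            .applyingFilter("CIColorControls", parameters: [kCIInputSaturationKey: 0])
        guard let cgImage = context.createCGImage(mono, from: mono.extent) else { return }
        DispatchQueue.main.async { self.imageView.image = UIImage(cgImage: cgImage) }
    }
}
```

Because the filter runs on the delegate's frames rather than on the saved photo, the live view is black & white while the user composes the shot, which is exactly what the question asks for.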

Simple way to capture a high quality image with AVCapture

All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example from Apple (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html). It contains everything you need for high-quality image capture. I recently copied the code from this project myself when I was solving the same task. It works well :)
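For reference, the single-shot capture can be sketched like this in modern Swift with AVCapturePhotoOutput (the successor to the AVCaptureStillImageOutput that SquareCam uses; the class name PhotoGrabber is hypothetical):

```swift
import AVFoundation
import UIKit

class PhotoGrabber: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    let output = AVCapturePhotoOutput()
    var onCapture: ((UIImage) -> Void)?

    func start() {
        session.sessionPreset = .photo   // highest-quality still preset
        guard let camera = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: camera) else { return }
        session.addInput(input)
        session.addOutput(output)
        session.startRunning()
    }

    func snap() {
        output.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Nothing is written to the camera roll and no shutter UI is
        // shown; the image goes straight to the caller for editing.
        guard let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        onCapture?(image)
    }
}
```

Since there is no UIImagePickerController involved, there is no shutter animation, and you are free to present the result in your own editing view controller.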

iOS: UIImagePickerController Overlay property Detects CameraSource Change

Background: I am implementing a face mask to help people focus their camera and to produce a uniform result across every picture. Sadly, the face mask needs to adjust its size while switching between front and back facing camera to provide a great guideline for people.
Problem: I have been trying to detect this switch between camera to adjust my face mask accordingly. I have not yet found how to detect it.
Additional Info: I have tried looking into delegate and/or subclassing the pickerController. There are no methods visible for this detection. My last resort would be having a thread keep on checking camera source and adjust if needed. I welcome anything better :)
I would take a look at the UIImagePickerController documentation, specifically the cameraDevice property.
https://developer.apple.com/library/ios/#documentation/UIKit/Reference/UIImagePickerController_Class/UIImagePickerController_Class.pdf
You can create an observer to run a selector when it changes:
http://farwestab.wordpress.com/2010/09/09/using-observers-on-ios/
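A sketch of that observer approach in modern Swift, assuming cameraDevice is KVO-compliant as the linked post suggests (the class name MaskedPicker, the faceMask view, and the frame values are all hypothetical):

```swift
import UIKit

class MaskedPicker: NSObject {
    let picker = UIImagePickerController()
    let faceMask = UIView()
    private var observation: NSKeyValueObservation?

    override init() {
        super.init()
        picker.sourceType = .camera
        picker.showsCameraControls = false
        picker.cameraOverlayView = faceMask

        // Resize the overlay whenever the user flips cameras, instead
        // of polling the camera source on a background thread.
        observation = picker.observe(\.cameraDevice, options: [.new]) { [weak self] picker, _ in
            self?.faceMask.frame = (picker.cameraDevice == .front)
                ? CGRect(x: 40, y: 80, width: 240, height: 320)   // front FOV
                : CGRect(x: 20, y: 60, width: 280, height: 380)   // rear FOV
        }
    }
}
```

This avoids the polling-thread last resort mentioned in the question: the block fires only when the property actually changes.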

Blur effect in a view of iOS

I want to use a UIImagePickerController to display a camera preview. Over this preview I want to place an overlay view with controls.
Is it possible to apply any effects to the preview displayed from the camera? I particularly need to apply a blur effect to the camera preview.
So I want a blurred preview from the camera plus an overlay view with controls. If I decide to capture a still image from the camera, I need it to be the original, without the blur effect. So the blur must be applied only to the preview.
Is this possible using such configuration or maybe with AVFoundation being used for accessing the camera preview or maybe somehow else, or that's impossible at all?
With AVFoundation you can do almost everything you want, since you can obtain single frames from the camera and process them. But it can lead you to a dead end: applying a blur to an image in realtime is a pretty intensive task, with laggy video as the result, and it could cost you hours of coding. I would suggest you use James Webster's solution or OpenGL shaders. Take a look at this awesome free library written by one of my favorite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you do not find the right filter there, it will probably lead you to a correct implementation of what you want to do.
The right filter is Gaussian blur, of course; I don't know if it is supported, but you could build it yourself.
I almost forgot to say that since iOS 5 you have full access to the Accelerate framework, made by Apple; you should look into that as well.
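If you do go the per-frame route, the blur step itself can be sketched with Core Image (function name and default radius are illustrative). The key to the question's requirement is to blur only the preview frames while capturing the still from the unfiltered buffer:

```swift
import CoreImage
import UIKit

let context = CIContext()

// Blur one camera frame for display; the original CIImage (and the
// buffer it came from) stays untouched for the actual capture.
func blurredPreviewImage(from frame: CIImage, radius: Double = 10) -> UIImage? {
    let blurred = frame.applyingFilter("CIGaussianBlur",
                                       parameters: [kCIInputRadiusKey: radius])
    // CIGaussianBlur expands the image extent; crop back to the
    // original frame so the preview doesn't show soft edges.
    guard let cgImage = context.createCGImage(blurred, from: frame.extent) else { return nil }
    return UIImage(cgImage: cgImage)
}
```

Whether this keeps up at full frame rate depends on the device; if it stutters, the GPU-shader route (GPUImage) is the safer bet.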
From the reasonably limited amount of work I've done with UIImagePicker, I don't think it is possible to apply the blur to the image you see using programmatic filters.
What you might be able to do is use the overlay to simulate blur. You could do this, for example, by adding an overlay which contains an image of semi-transparent frosted glass.
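That overlay trick is only a few lines (the asset name "frosted" and the alpha value are hypothetical):

```swift
import UIKit

// Lay a semi-transparent "frosted glass" image over the camera preview.
func addFrostedOverlay(over cameraView: UIView) {
    let overlay = UIImageView(image: UIImage(named: "frosted"))
    overlay.frame = cameraView.bounds
    overlay.alpha = 0.6                    // let some of the preview show through
    overlay.contentMode = .scaleAspectFill
    cameraView.addSubview(overlay)
}
```

It is not a true blur of the live frames, but it is cheap, needs no per-frame processing, and leaves the captured still image untouched.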
