Making a Black & White only camera with UIImagePickerController - iOS

How can I make a black & white camera with UIImagePickerController? Is there another way to build a camera app without using UIImagePickerController?
Is there a way to do it without applying a filter after the fact? I want the camera preview to be black & white while the user is composing the photo.

You can achieve this with the AVFoundation framework. Sample code is also available that will help you understand it better.
In particular, check the Managing White Balance section of the documentation for guidance on the implementation.
Please let me know if you are still facing the same problem.

You can use UIImagePickerController as usual and simply apply a black & white filter to the image once it has been taken, as in the sketch below. Check out this question
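A minimal sketch of that post-capture approach, assuming Core Image's built-in CIPhotoEffectMono filter; the function name makeMonochrome is just for illustration. Call it from your imagePickerController(_:didFinishPickingMediaWithInfo:) delegate method with the picked image.

```swift
import UIKit
import CoreImage

// Run the picked photo through the built-in CIPhotoEffectMono filter.
// Call with info[.originalImage] from the picker delegate callback.
func makeMonochrome(_ image: UIImage) -> UIImage? {
    guard let ciImage = CIImage(image: image),
          let filter = CIFilter(name: "CIPhotoEffectMono") else { return nil }
    filter.setValue(ciImage, forKey: kCIInputImageKey)

    let context = CIContext()
    guard let output = filter.outputImage,
          let cgImage = context.createCGImage(output, from: output.extent) else { return nil }

    // Preserve the original scale and orientation.
    return UIImage(cgImage: cgImage, scale: image.scale, orientation: image.imageOrientation)
}
```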

You can start with AVCaptureSession and AVCaptureVideoPreviewLayer to take pictures without UIImagePickerController. Note that on iOS you cannot attach a filter to the preview layer itself; instead, pull frames from the session and filter each one for display, as sketched below.
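A hedged sketch of that pipeline: an AVCaptureVideoDataOutput feeds frames through the built-in CIPhotoEffectMono filter into a plain UIImageView. The class name MonoCameraController is illustrative; permission checks (NSCameraUsageDescription) and error handling are omitted for brevity.

```swift
import UIKit
import AVFoundation
import CoreImage

final class MonoCameraController: UIViewController,
                                  AVCaptureVideoDataOutputSampleBufferDelegate {
    private let session = AVCaptureSession()
    private let context = CIContext()
    private let previewImageView = UIImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        previewImageView.frame = view.bounds
        view.addSubview(previewImageView)

        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)

        // Deliver raw frames to us instead of a preview layer.
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "camera.frames"))
        guard session.canAddOutput(output) else { return }
        session.addOutput(output)
        session.startRunning()
    }

    // Called for every frame; filter it and push it to the preview.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer),
              let filter = CIFilter(name: "CIPhotoEffectMono") else { return }
        filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)

        guard let mono = filter.outputImage,
              let cgImage = context.createCGImage(mono, from: mono.extent) else { return }

        DispatchQueue.main.async {
            self.previewImageView.image = UIImage(cgImage: cgImage)
        }
    }
}
```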

Related

I want to capture square video in iOS with custom controls

I am able to show a square preview of the recording using the AVCaptureVideoPreviewLayer class, but the video saved to the library is rectangular. I want a square video instead of a rectangular one. I have used a composition class to crop the video, and it works, but it takes too much time. The Vine app manages to produce square video output.
Please give me a suggestion.
It's a late answer, but it may help others: see PBJVision, which records video in a square format.
https://github.com/piemonte/PBJVision
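For reference, a hedged sketch of the composition route the question mentions: crop at export time with AVAssetExportSession and a square renderSize. The function name and URLs are placeholders, and rotated tracks (preferredTransform) are not handled here.

```swift
import AVFoundation

// Export a recorded movie as a centered square crop.
func exportSquare(from sourceURL: URL, to outputURL: URL,
                  completion: @escaping () -> Void) {
    let asset = AVAsset(url: sourceURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let side = min(track.naturalSize.width, track.naturalSize.height)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRange(start: .zero, duration: asset.duration)

    // Shift the frame so the square sits in the middle of the longer dimension.
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: track)
    let tx = (track.naturalSize.width - side) / 2
    let ty = (track.naturalSize.height - side) / 2
    layerInstruction.setTransform(CGAffineTransform(translationX: -tx, y: -ty), at: .zero)
    instruction.layerInstructions = [layerInstruction]

    let composition = AVMutableVideoComposition()
    composition.renderSize = CGSize(width: side, height: side)
    composition.frameDuration = CMTime(value: 1, timescale: 30)
    composition.instructions = [instruction]

    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetHighestQuality) else { return }
    export.videoComposition = composition
    export.outputURL = outputURL
    export.outputFileType = .mov
    export.exportAsynchronously(completionHandler: completion)
}
```

This re-encodes the whole clip, which matches the "taking too much time" complaint; PBJVision avoids that by recording square frames in the first place.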

UIImage from AVCaptureSession

What I'm trying to achieve is grabbing a still frame from an AVCaptureSession for something like a QR code scanner. I have a view with an AVCaptureVideoPreviewLayer sublayer.
I already tried using AVCaptureStillImageOutput, which worked, but it plays the shutter sound, and since there's no way to mute that I can't use it. After that I tried taking a screenshot of the entire screen, which also failed, because a screenshot can't capture an AVCaptureVideoPreviewLayer. Now I'm rather lost; the only real alternative would be to feed a video stream into my OCR library, but that would lag too much and be a lot of work.
Are there any other options?
Here's a tech note describing exactly what you want to achieve:
How to capture video frames from the camera as images using AV Foundation on iOS
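A minimal sketch of the tech note's approach, assuming you already run an AVCaptureSession: attach an AVCaptureVideoDataOutput and keep the most recent frame around, so grabbing a still is just reading a property and no shutter sound is involved. The FrameGrabber class name is illustrative.

```swift
import UIKit
import AVFoundation
import CoreImage

final class FrameGrabber: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let context = CIContext()
    private(set) var latestFrame: UIImage?   // read this when you need a "still"

    // Hook the grabber into an existing, configured session.
    func attach(to session: AVCaptureSession) {
        let output = AVCaptureVideoDataOutput()
        output.setSampleBufferDelegate(self, queue: DispatchQueue(label: "frame.grabber"))
        if session.canAddOutput(output) { session.addOutput(output) }
    }

    // Convert each incoming sample buffer to a UIImage and cache it.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else { return }
        latestFrame = UIImage(cgImage: cgImage)
    }
}
```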

How to put the Camera input on the screen and blur it?

I'm fairly new to iOS development.
What I want to achieve is to put the stream from the camera in a UIView subclass (and size it with a frame).
I don't need controls or the ability to capture images; I just want what the camera sees on the screen.
Furthermore, I want that view to be blurred. Is there a way (or a library) to put a Gaussian blur on that video stream?
Thank you!
You can use GPUImage (https://github.com/BradLarson/GPUImage); try the realtime effects it provides. That will solve your problem for sure.
To display the camera on screen without controls you will need to use AVFoundation. Take a look at Apple's SquareCam sample code.
As for the blur, a simpler solution might be creating a semi-transparent image with a blur effect and placing it above the camera view.
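A hedged sketch of the GPUImage route, using the 1.x API as bridged to Swift; class and property names such as GPUImageVideoCamera and blurRadiusInPixels follow that version, so adjust for the release you link against.

```swift
import UIKit
import AVFoundation
import GPUImage

// Shows a live, blurred camera feed with no controls:
// camera -> Gaussian blur filter -> on-screen GPUImageView.
final class BlurredPreviewController: UIViewController {
    private var camera: GPUImageVideoCamera!

    override func viewDidLoad() {
        super.viewDidLoad()

        camera = GPUImageVideoCamera(sessionPreset: AVCaptureSession.Preset.hd1280x720.rawValue,
                                     cameraPosition: .back)
        camera.outputImageOrientation = .portrait

        let blur = GPUImageGaussianBlurFilter()
        blur.blurRadiusInPixels = 8.0   // tune the blur strength to taste

        let previewView = GPUImageView(frame: view.bounds)
        view.addSubview(previewView)

        camera.addTarget(blur)
        blur.addTarget(previewView)
        camera.startCameraCapture()
    }
}
```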

Simple way to capture a high quality image with AVCapture

All I need to do is capture an image, and all I can find is complicated code on capturing video or multiple frames. I can't use UIImagePickerController because I do not want to see the camera shutter animation and I have a custom overlay, and my app is landscape only. What is the simplest way to manually capture an image from the front or back live camera view in the correct orientation? I don't want to save it to the camera roll, I want to present it in a view controller for editing.
Take a look at the SquareCam example from Apple (http://developer.apple.com/library/ios/#samplecode/SquareCam/Introduction/Intro.html). It contains everything you need for high-quality image capture. I recently copied code from that project myself to solve the same task as you. It works well :)
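SquareCam is built around the older AVCaptureStillImageOutput; on current SDKs the equivalent one-shot capture goes through AVCapturePhotoOutput (iOS 10+). A hedged sketch, with the StillCapture class name and onCapture callback invented for illustration:

```swift
import UIKit
import AVFoundation

final class StillCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let session = AVCaptureSession()
    private let photoOutput = AVCapturePhotoOutput()
    var onCapture: ((UIImage) -> Void)?

    // Set up the session once; attach a preview layer to `session` yourself.
    func configure() {
        session.sessionPreset = .photo
        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input),
              session.canAddOutput(photoOutput) else { return }
        session.addInput(input)
        session.addOutput(photoOutput)
        session.startRunning()
    }

    // One-shot, high-quality capture with no picker UI or shutter animation of your own.
    func takePhoto() {
        photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        guard error == nil,
              let data = photo.fileDataRepresentation(),
              let image = UIImage(data: data) else { return }
        onCapture?(image)   // hand off to your editing view controller; nothing is saved to the camera roll
    }
}
```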

Blur effect in a view of iOS

I want to use a UIImagePickerController to display a camera preview, and over this preview I want to place an overlay view with controls.
Is it possible to apply effects to the preview displayed from the camera? In particular, I need to apply a blur effect to the camera preview.
So I want a blurred preview from the camera plus an overlay view with controls. If I then capture a still image from the camera, I need the original without the blur effect; the blur must be applied only to the preview.
Is this possible with such a configuration, or with AVFoundation used to access the camera preview, or in some other way, or is it impossible altogether?
With AVFoundation you can do almost everything you want, since you can obtain single frames from the camera and process them. But it can lead you to a dead end: applying a blur to an image in real time is a fairly intensive task, with laggy video as the likely result, and it could cost you hours of coding. I would suggest James WebSster's solution, or OpenGL shaders. Take a look at this awesome free library written by one of my favorite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you do not find the right filter there, it will probably lead you to a correct implementation of what you want to do.
The right filter is of course a Gaussian blur. I don't know whether it is supported out of the box, but you could write it yourself.
I almost forgot to say that in iOS 5 you have full access to Apple's Accelerate framework; you should look into that as well.
From the reasonably limited amount of work I've done with UIImagePickerController, I don't think it is possible to apply the blur to the image you see using programmatic filters.
What you might be able to do is use the overlay to approximate a blur. You could do this, for example, by adding an overlay which contains a semi-transparent image of frosted glass.
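A short sketch of the dual-output idea that falls out of the AVFoundation suggestion above: one AVCaptureSession feeds both a video data output, whose frames you blur for the preview as in the earlier examples, and a photo output that captures the unfiltered original on demand. Names here are illustrative.

```swift
import AVFoundation

let session = AVCaptureSession()
let frameOutput = AVCaptureVideoDataOutput()   // blur these frames for the on-screen preview
let photoOutput = AVCapturePhotoOutput()       // untouched stills come from here

if let device = AVCaptureDevice.default(for: .video),
   let input = try? AVCaptureDeviceInput(device: device) {
    if session.canAddInput(input) { session.addInput(input) }
    if session.canAddOutput(frameOutput) { session.addOutput(frameOutput) }
    if session.canAddOutput(photoOutput) { session.addOutput(photoOutput) }
    session.startRunning()
}
```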
