Create custom UIImagePickerController to set aspect ratio of camera - ios

Stack Overflow has perhaps 1,000 threads addressing and re-addressing cropping images on iOS, many of which contain answers claiming to work just like Instagram (my guess is the authors never opened Instagram). So instead of simply using the word Instagram, let me describe the functionality I am trying to build:
I want to create a custom UIImagePickerController such that:
CAMERA: I can either set the exact dimensions of the image the camera takes, or, on the very next screen (i.e. retake/use image), present a custom rectangle that the user can move to crop the image just taken.
GALLERY: (same as above) set the dimensions of the frame the user will use to crop the image.
So far, one answer on SO comes close: it points to https://github.com/gekitz/GKImagePicker.
But the crucial problem with that project is that it only works with the gallery. It does not work with the camera.
Lastly, if you look at the Instagram app on iOS 7, it takes complete control of the picture-taking experience. How do I do that? I don't want my users to go through the whole standard iOS UIImagePickerController flow and then, on top of that, go through my own cropper just to load an image. That is simply a terrible user experience. How many steps or screens should it take to load a picture? I know I can't be the only developer who would love to solve this problem, and I don't mind being the one to find the solution and then share it with the world. But at present I don't even know where to start.
Does anyone have any ideas where I might start?
BTW: the following are not answers:
how to change the UIImagePickerController crop frame
Set dimensions for UIImagePickerController "move and scale" cropbox
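One place to start is dropping UIImagePickerController entirely and driving the camera with AVFoundation, which is how apps that control the whole capture flow do it. The sketch below (my own, not from any answer here; the class name SquareCameraController is made up, and error handling is abbreviated) shows a square preview layer and a capture method that crops the full-resolution still to the same square:

```objectivec
// Minimal custom camera sketch using AVFoundation (iOS 7 era APIs).
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface SquareCameraController : UIViewController
@property (nonatomic, strong) AVCaptureSession *session;
@property (nonatomic, strong) AVCaptureStillImageOutput *stillOutput;
@end

@implementation SquareCameraController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetPhoto;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if (input) [self.session addInput:input];

    self.stillOutput = [[AVCaptureStillImageOutput alloc] init];
    [self.session addOutput:self.stillOutput];

    // A square preview layer: what it shows is exactly what gets cropped.
    AVCaptureVideoPreviewLayer *preview =
        [AVCaptureVideoPreviewLayer layerWithSession:self.session];
    CGFloat side = CGRectGetWidth(self.view.bounds);
    preview.frame = CGRectMake(0, 80, side, side);
    preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:preview];

    [self.session startRunning];
}

- (void)capture {
    AVCaptureConnection *connection =
        [self.stillOutput connectionWithMediaType:AVMediaTypeVideo];
    [self.stillOutput captureStillImageAsynchronouslyFromConnection:connection
        completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
            if (!buffer) return;
            NSData *data =
                [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
            UIImage *full = [UIImage imageWithData:data];
            // Crop the full-resolution image to a centered square.
            CGFloat s = MIN(full.size.width, full.size.height) * full.scale;
            CGRect crop = CGRectMake((full.size.width * full.scale - s) / 2,
                                     (full.size.height * full.scale - s) / 2,
                                     s, s);
            CGImageRef cg = CGImageCreateWithImageInRect(full.CGImage, crop);
            UIImage *square = [UIImage imageWithCGImage:cg
                                                  scale:full.scale
                                            orientation:full.imageOrientation];
            CGImageRelease(cg);
            // Hand `square` to the rest of the app here.
        }];
}
@end
```

Because the preview and the crop use the same rectangle, the user never sees a separate crop screen at all; a movable crop rectangle could be layered on top of the same preview if you want one.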

Related

iOS, Objective C auto image processing filters

I'm building a photo app, and sometimes the lighting is off in certain areas and the picture isn't clear. I was wondering if there is a feature that can auto-adjust the brightness, contrast, exposure and saturation of a picture, like in Photoshop.
I don't want to manually adjust images as in the sample code given by Apple:
https://developer.apple.com/library/ios/samplecode/GLImageProcessing/Introduction/Intro.html
I want something that will auto-adjust or correct the photo.
As an alternative, you could use AVFoundation to build your own camera implementation, set the image quality to high, and add autofocus or tap-to-focus. I am almost certain you cannot set these properties on UIImagePickerController; besides, the picker included in the SDK is really expensive memory-wise and gives you a UIImage instead of raw data (another benefit of using AVFoundation). This is a good tutorial in case you would like to check it out:
http://www.musicalgeometry.com/?p=1297
Apparently someone has created it on GitHub: https://github.com/proth/UIImage-PRAutoAdjust
Once imported, I used it as follows:
self.imageView.image = [self.imageView.image autoAdjustImage];
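For reference, a category like that essentially boils down to Core Image's own auto-enhancement API, available since iOS 5. The sketch below is my own illustration (the method name autoAdjustedImage is made up and is not the library's API):

```objectivec
// CIImage can suggest its own enhancement filters; chaining them gives an
// automatic brightness/contrast/saturation correction.
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

@implementation UIImage (AutoAdjustSketch)

- (UIImage *)autoAdjustedImage {
    CIImage *ciImage = [[CIImage alloc] initWithImage:self];
    // Ask Core Image for its recommended filters (red-eye off for speed).
    NSArray *filters = [ciImage autoAdjustmentFiltersWithOptions:
                           @{kCIImageAutoAdjustRedEye : @NO}];
    for (CIFilter *filter in filters) {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = filter.outputImage;
    }
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cg = [context createCGImage:ciImage fromRect:ciImage.extent];
    UIImage *result = [UIImage imageWithCGImage:cg
                                          scale:self.scale
                                    orientation:self.imageOrientation];
    CGImageRelease(cg);
    return result;
}

@end
```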

Image tracking - tracking a screen with a camera

I want to track the relative position of a camera aimed at a computer screen.
I can’t control what is displayed on the computer screen but I can receive screen dumps whenever something changes on the screen. Those screen dumps can hopefully be used to find the screen when analyzing the video from the camera.
I have seen many videos on YouTube of tracking faces, logos or single-colored objects using OpenCV, but I'm unsure those methods would work for finding and tracking a more detailed image like a screen dump.
Maybe template matching is the way to go? But I need to find the screen even at an angle.
Basically I don’t know where to begin and need help from people with experience in this field to find the best way for achieving what I want.
Thanks
Using feature matching (SIFT/SURF/ORB/...) should do the trick.
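To sketch what that answer means: detect keypoints in the screen dump and in each camera frame, match their descriptors, and estimate a homography with RANSAC, which handles the screen being viewed at an angle. The snippet below is my own illustration using the OpenCV 2.4 C++ API (usable from an Objective-C++ .mm file on iOS); the function name findScreenHomography is made up:

```objectivec
// ORB feature matching between a screen dump and a camera frame.
// Both inputs are grayscale cv::Mat images.
#include <opencv2/features2d/features2d.hpp>
#include <opencv2/calib3d/calib3d.hpp>

cv::Mat findScreenHomography(const cv::Mat &screenDump, const cv::Mat &frame)
{
    cv::ORB orb(500);  // detect up to 500 keypoints
    std::vector<cv::KeyPoint> kp1, kp2;
    cv::Mat desc1, desc2;
    orb(screenDump, cv::Mat(), kp1, desc1);
    orb(frame, cv::Mat(), kp2, desc2);

    // Hamming distance for binary ORB descriptors; cross-check filters
    // out one-directional matches.
    cv::BFMatcher matcher(cv::NORM_HAMMING, true);
    std::vector<cv::DMatch> matches;
    matcher.match(desc1, desc2, matches);

    std::vector<cv::Point2f> src, dst;
    for (size_t i = 0; i < matches.size(); ++i) {
        src.push_back(kp1[matches[i].queryIdx].pt);
        dst.push_back(kp2[matches[i].trainIdx].pt);
    }
    if (src.size() < 4) return cv::Mat();  // homography needs >= 4 pairs

    // RANSAC discards outlier matches, so the screen can be located
    // even under perspective distortion.
    return cv::findHomography(src, dst, CV_RANSAC, 3.0);
}
```

Plain template matching, by contrast, is not perspective-invariant, which is why the feature-matching route fits the "at an angle" requirement better.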

Instagram like photo editing feature

I have seen apps like Instagram and many more that can take a photo and change the look of the image: fades, brightness, grayscale and various other image-processing effects.
I want to try this out, so can someone help me find a good tutorial to begin with?
For now, my program can take a picture from the camera, but I am unable to process the image the way Instagram does.
Here is an open source image filter library that might help: https://github.com/OmidH/Filtrr
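As a taste of what such a library does under the hood, an Instagram-style "look" is usually just a chain of color filters. The sketch below is my own Core Image illustration (not Filtrr code; the function name ApplyVintageLook and the parameter values are made up):

```objectivec
// A faded, warm look built from two chained Core Image filters.
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

UIImage *ApplyVintageLook(UIImage *input)
{
    CIImage *image = [[CIImage alloc] initWithImage:input];

    // Desaturate slightly and lift the brightness for a faded look.
    CIFilter *controls = [CIFilter filterWithName:@"CIColorControls"];
    [controls setValue:image forKey:kCIInputImageKey];
    [controls setValue:@0.6 forKey:kCIInputSaturationKey];
    [controls setValue:@0.05 forKey:kCIInputBrightnessKey];
    image = controls.outputImage;

    // Warm the tones with a touch of sepia.
    CIFilter *sepia = [CIFilter filterWithName:@"CISepiaTone"];
    [sepia setValue:image forKey:kCIInputImageKey];
    [sepia setValue:@0.3 forKey:kCIInputIntensityKey];
    image = sepia.outputImage;

    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef cg = [context createCGImage:image fromRect:image.extent];
    UIImage *result = [UIImage imageWithCGImage:cg];
    CGImageRelease(cg);
    return result;
}
```

Swapping filters and tweaking the parameter values is essentially how different "preset" effects are built.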

Blur effect in a view of iOS

I want to use a UIImagePicker to display a camera preview. Over this preview I want to place an overlay view with controls.
Is it possible to apply any effects to the preview displayed from the camera? In particular, I need to apply a blur effect to the camera preview.
So I want a blurred preview from the camera plus an overlay view with controls. If I decide to capture a still image from the camera, I need it in its original, unblurred form; the blur effect must be applied only to the preview.
Is this possible with such a configuration, or perhaps with AVFoundation for accessing the camera preview, or in some other way, or is it impossible altogether?
With AVFoundation you could do almost everything you want, since you can obtain single frames from the camera and process them. But it could lead you to a dead end: applying a blur to an image in real time is a pretty intensive task, with laggy video as a result, and that could cost you hours of coding. I would suggest you use the solution of James WebSster, or OpenGL shaders. Take a look at this awesome free library written by one of my favorite gurus, Brad: http://www.sunsetlakesoftware.com/2012/02/12/introducing-gpuimage-framework. Even if you do not find the right filter there, it will probably lead you to a correct implementation of what you want to do.
The right filter is, of course, a Gaussian blur. I don't know whether it is supported out of the box, but you could implement it yourself.
I almost forgot to say that since iOS 5 you have full access to Apple's Accelerate framework; you should look into that as well.
From the reasonably limited amount of work I've done with UIImagePicker, I don't think it is possible to apply the blur to the image you see using programmatic filters.
What you might be able to do is use the overlay to simulate blur, for example by adding an overlay that contains an image of semi-transparent frosted glass.
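The AVFoundation route mentioned above can be sketched concretely: feed preview frames through a Core Image blur for display only, while the capture session itself stays untouched, so any still image taken from the same session is sharp. This is my own sketch (the class name BlurredPreviewController is made up; session and delegate wiring are abbreviated, and the blur radius is an arbitrary choice):

```objectivec
// Blur each preview frame with CIGaussianBlur; stills remain unblurred.
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

@interface BlurredPreviewController : UIViewController
    <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) UIImageView *previewView; // shows blurred frames
@property (nonatomic, strong) CIContext *ciContext;
@end

@implementation BlurredPreviewController

// Called for every frame delivered by an AVCaptureVideoDataOutput.
- (void)captureOutput:(AVCaptureOutput *)output
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
           fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    // Blur only this display copy; the session still delivers sharp frames.
    CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
    [blur setValue:frame forKey:kCIInputImageKey];
    [blur setValue:@8.0 forKey:kCIInputRadiusKey];

    CGImageRef cg = [self.ciContext createCGImage:blur.outputImage
                                         fromRect:frame.extent];
    UIImage *blurred = [UIImage imageWithCGImage:cg];
    CGImageRelease(cg);

    dispatch_async(dispatch_get_main_queue(), ^{
        self.previewView.image = blurred;
    });
}

@end
```

As the answer above warns, doing this per frame on the CPU/GPU boundary can lag on older devices; GPUImage's shader-based blur is the faster path.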

Image over an Image

I want to put a big image over a small image. The condition is that the image on top has a specific rectangular area where the second image will be displayed. I want the small image to be displayed inside the big image, not over it. I don't know whether this is possible; if it is, can anyone give me guidance, sample code or a link?
Thanks a lot.
Here is a solution: why don't you put the small image on top of the big image? Will that work? That way the illusion is the same, that the small image is inside. Otherwise you have to play around with alpha transparency.
PS: Rupesh, you should also go back to the 13 questions you asked prior to this one and accept at least some answers. Otherwise chances are you will not get many answers to your new questions, because you are not rewarding the people who take time to answer them with positive karma.
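Both options in that answer can be sketched briefly. This is my own illustration (the function names and rectangle values are placeholders, not from the answer):

```objectivec
// Two ways to show a small image "inside" a big one.
#import <UIKit/UIKit.h>

// 1) The illusion: add the small image view as a subview of the big one.
void AddSmallImageOnTop(UIImageView *bigView, UIImage *smallImage)
{
    UIImageView *smallView =
        [[UIImageView alloc] initWithFrame:CGRectMake(40, 60, 100, 80)];
    smallView.image = smallImage;
    [bigView addSubview:smallView];
}

// 2) True compositing: draw both images into one graphics context,
//    producing a single combined UIImage.
UIImage *ComposeImages(UIImage *big, UIImage *small, CGRect holeRect)
{
    UIGraphicsBeginImageContextWithOptions(big.size, NO, big.scale);
    [big drawInRect:CGRectMake(0, 0, big.size.width, big.size.height)];
    [small drawInRect:holeRect];  // the "specific rectangular area"
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combined;
}
```

The subview approach is enough if the result only needs to be displayed; the compositing approach is needed if the combined picture must be saved or shared as one image.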