Cropping an image by selecting an object and color matching - iOS

We are developing an app where we need to crop an image according to a selected object's area. The user draws a line, and we need to select the object and crop it. The crop needs to work like the app YourMoji.
So far we have tried to get the color of the pixels along the line, compare those with the color of every pixel in the image, and build a path from the matches to clip the image. But this is going almost nowhere.
Is it possible to crop an image this way, or are we going about it the wrong way? Can anyone provide a way to do this, or suggest how to modify our approach so far? Any advice and suggestions will be greatly appreciated!
Thanks in advance.

I guess what you want is the image segmentation algorithm called Graph Cut.
Here are two GitHub repositories that may help:
GraphCut
GrabCutIOS

I'm not exactly clued up on image manipulation, but the first algorithm that comes to mind is something like this:
Take the average of the pixels in the line (as you have)
Since you appear to want faces, you might want to weight reds and blues over green. Not much green in faces of any skin tone.
For each pixel, if its colour falls outside a given threshold around your selected average, remove it / make it transparent (see the sketch after this list).
Perhaps the closer to the original line (or centroid), the less strict the threshold becomes.
I'd then provide the user with some tools for:
Sensitivity: how large the threshold is
Eraser: to remove parts of the image that your algorithm missed
Paintbrush: to replace parts of the image that your algorithm incorrectly removed.
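
A minimal sketch of that thresholding step in Swift with Core Graphics, assuming an RGBA8 CGImage and an average colour already sampled along the user's line. The function name, the plain Euclidean distance metric, and its 0...~441 scale are illustrative choices, not from the answer above:

    import CoreGraphics

    // Returns a copy of `image` with every pixel whose colour is farther than
    // `threshold` from `average` made transparent. Distance is plain Euclidean
    // RGB distance; weight the channels here if you want to favour skin tones,
    // e.g. by down-weighting green as suggested above.
    func maskPixels(in image: CGImage,
                    average: (r: Double, g: Double, b: Double),
                    threshold: Double) -> CGImage? {
        let width = image.width, height = image.height
        let bytesPerRow = width * 4
        var pixels = [UInt8](repeating: 0, count: bytesPerRow * height)

        return pixels.withUnsafeMutableBytes { buffer -> CGImage? in
            // Draw the source image into an RGBA8 bitmap we can read and mutate.
            guard let ctx = CGContext(data: buffer.baseAddress,
                                      width: width, height: height,
                                      bitsPerComponent: 8, bytesPerRow: bytesPerRow,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return nil }
            ctx.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))

            let p = buffer.bindMemory(to: UInt8.self)
            for i in stride(from: 0, to: p.count, by: 4) {
                let dr = Double(p[i])     - average.r
                let dg = Double(p[i + 1]) - average.g
                let db = Double(p[i + 2]) - average.b
                if (dr * dr + dg * dg + db * db).squareRoot() > threshold {
                    // Outside the threshold: make the pixel fully transparent.
                    p[i] = 0; p[i + 1] = 0; p[i + 2] = 0; p[i + 3] = 0
                }
            }
            return ctx.makeImage()
        }
    }

The sensitivity tool then simply maps to `threshold`, and the eraser/paintbrush tools can write cleared or restored pixels into the same buffer.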

Related

Comparing Images via pixels

I was playing around with images and came across a little game I tried to create.
You see an image (for simplicity, let's say a circle) and you have to redraw the circle as exactly as possible on top of it.
All of that works in my little project already.
I want to be able to tell what percentage of the image was recreated correctly (again, for simplicity, no colours needed; it's always black and white).
Could I just count the black pixels overlaying the image, subtract the ones that are not, and divide by the number of black pixels in the original?
This would look like this I guess:
ratio = (correctPixelCount - wrongPixelCount) / originalPixelCount
If yes, how would I go about getting each pixel and comparing them?
If no, what else could I do?
PS: I already tried an image-comparison CocoaPod called AIImageCompare.
Unfortunately it crashes for some unknown reason.
Thank you!
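
No answer was recorded here, but as a sketch of the counting idea from the question: read both drawings into RGBA8 buffers and tally pixels. Everything below (the helper names, the `blackCutoff` of 64, checking only the red channel of a black-and-white image) is an assumption for illustration:

    import CoreGraphics

    // Reads a CGImage into an RGBA8 pixel buffer.
    func rgbaPixels(of image: CGImage) -> [UInt8]? {
        let w = image.width, h = image.height
        var pixels = [UInt8](repeating: 0, count: w * h * 4)
        let ok = pixels.withUnsafeMutableBytes { buffer -> Bool in
            guard let ctx = CGContext(data: buffer.baseAddress, width: w, height: h,
                                      bitsPerComponent: 8, bytesPerRow: w * 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return false }
            ctx.draw(image, in: CGRect(x: 0, y: 0, width: w, height: h))
            return true
        }
        return ok ? pixels : nil
    }

    // ratio = (correctPixelCount - wrongPixelCount) / originalPixelCount
    func matchRatio(original: CGImage, drawing: CGImage,
                    blackCutoff: UInt8 = 64) -> Double? {
        guard original.width == drawing.width, original.height == drawing.height,
              let a = rgbaPixels(of: original), let b = rgbaPixels(of: drawing)
        else { return nil }

        var originalBlack = 0, correct = 0, wrong = 0
        for i in stride(from: 0, to: a.count, by: 4) {
            let origIsBlack = a[i] < blackCutoff    // red channel suffices for B/W
            let drawnIsBlack = b[i] < blackCutoff
            if origIsBlack { originalBlack += 1 }
            if drawnIsBlack {
                if origIsBlack { correct += 1 } else { wrong += 1 }
            }
        }
        guard originalBlack > 0 else { return nil }
        return Double(correct - wrong) / Double(originalBlack)
    }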

How to add rain effect to a picture?

Given a picture, I would like to modify it to create the effect of rain on glass. What steps should I take to achieve this goal?
Suppose we want to add the effect of a single drop of water at a given point in an image; some pixels around that point should be modified in some way. How should those pixels be modified?
A simple way is to make a transparent image that acts as an overlay. That looks like the common approach for water-drop effects in GIMP.
example:
http://natural-drops.deviantart.com/art/drop-of-rain-373710307
I think that in order to produce an optically correct image you would need full 3D information about the environment, because most of the optics equations needed to simulate a correct image involve the distance of each object.
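
A minimal sketch of the overlay approach in Swift, assuming a mostly-transparent droplet PNG in the app bundle; the "raindrops-overlay" asset name is a placeholder, not a real resource:

    import UIKit

    // Composites a semi-transparent droplet overlay on top of a photo.
    // "raindrops-overlay" is a placeholder asset: a mostly-transparent PNG
    // of pre-rendered drops, like the deviantart example above.
    func addRainOverlay(to photo: UIImage) -> UIImage? {
        guard let overlay = UIImage(named: "raindrops-overlay") else { return nil }
        let renderer = UIGraphicsImageRenderer(size: photo.size)
        return renderer.image { _ in
            let rect = CGRect(origin: .zero, size: photo.size)
            photo.draw(in: rect)
            // Stretch the overlay across the photo; normal blending keeps the
            // drops' own alpha, so the photo shows through between drops.
            overlay.draw(in: rect, blendMode: .normal, alpha: 0.9)
        }
    }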

How to do Water color painting in iOS using CoreGraphics? [duplicate]

I have been working with OpenGL in iOS, setting the colors with glColor4f(r,g,b,a) and then drawing my own color on a white UIImageView. I basically have a brush, which is moved around by the user's touch, and it paints the color onto the canvas. But this color needs to look like water paint (smudged color).
Does anyone understand/know how to get a watercolor effect like this app does, and how the background UIImageView has a texture on it?
https://itunes.apple.com/us/app/hello-watercolor/id539414526?mt=8
or check out the water paint in this: http://www.fiftythree.com/paper
I created a bounty on this as I am really having a hard time grasping how to derive such smooth, flowing colors from normal colors. Even if you point me in the right direction, or to some sample code on how to get the water-paint effect, it would be really helpful ^_^
As a bonus, it would also be helpful if you could point out how to make the canvas it is painted on look realistic and blended with the paint. Does blending/GLSL have anything to do with this?
Is there any sample project on this?
If you are still struggling with the basics of getting realistic-looking watercolors working, you may want to experiment/prototype in Photoshop first.
http://www.zoepiel.com/tutorials/watercolor/ shows some very effective tricks for creating watercolor images with simple tools.
The most interesting one is to multiply a group of watercolor layers with a greyscale watercolor paper image. The texture of the paper makes some parts remain white, and other parts saturate with color, just like real watercolor.
Each layer remains 'wet' in the sense that the colors within it blend, but the layers are 'dry' with respect to each other.
She also explains some of her brush and blur settings and shows what they do.
Once you can produce the desired effect in Photoshop, you'll have a clear specification of what you want to do and you'll be quite a bit closer to programming it.
Looking at the examples you posted, it looks like they are using a simple Gaussian Blur with a radius of double your brush size. This may be an incomplete solution, but it's at least the first level.
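
As a sketch of those two tricks combined in Swift, assuming Core Image is acceptable: blur the stroke layer with a Gaussian whose radius is twice the brush size, then multiply it with a greyscale paper texture. The function and parameter names are illustrative:

    import UIKit
    import CoreImage
    import CoreImage.CIFilterBuiltins

    // Blurs a painted stroke layer and multiplies it with a paper texture,
    // approximating the Photoshop recipe described above.
    func watercolorComposite(strokes: UIImage, paper: UIImage,
                             brushSize: CGFloat) -> UIImage? {
        guard let strokesCI = CIImage(image: strokes),
              let paperCI = CIImage(image: paper) else { return nil }

        // 1. Soften the stroke edges: Gaussian blur, radius = 2 x brush size.
        let blur = CIFilter.gaussianBlur()
        blur.inputImage = strokesCI
        blur.radius = Float(brushSize * 2)
        guard let blurred = blur.outputImage?.cropped(to: strokesCI.extent)
        else { return nil }

        // 2. Multiply with the paper: white paper keeps the paint bright,
        //    dark grain makes pigment "pool", like real watercolor.
        let multiply = CIFilter.multiplyCompositing()
        multiply.inputImage = blurred
        multiply.backgroundImage = paperCI
        guard let output = multiply.outputImage else { return nil }

        let context = CIContext()
        guard let cg = context.createCGImage(output, from: paperCI.extent)
        else { return nil }
        return UIImage(cgImage: cg)
    }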

flood fill performance issue on iPad

I am using the 4-way flood-fill algorithm.
I have a transparent image with a black outline.
That is the starting image (without color).
And after filling in the color, the image looks like this.
Please help me and let me know what I can do to get a proper fill.
I have used and implemented flood fill myself in other projects; the algorithm goes through the whole drawing, looking for closed spaces, and then fills inside (or outside) them.
Your problem happens with every tool in the world that fills a drawing, and the cause is the same: the spaces are not 100% closed.
The flood-fill algorithm goes pixel by pixel and stops when it detects a black pixel. For example, the arm of the scuba diver is not thick enough, or it has holes in it, and the flood fill manages to leak through it rather than treating it as a boundary.
Nobody here can tell you why unless we take your project and analyse it, so the best I can offer is a guideline about where your error could be.
I tried the code with an image that has a very precisely defined border around it (from here) and it seems to work OK with that image. I suspect that if you zoom into your image there is some grey aliasing around the edges which won't get filled. Perhaps the algorithm has a threshold function that can be tweaked?
Try setting the andTolerance value (I tried 4, which seemed to improve my example):
    // Call function to flood fill and get new image with filled color
    UIImage *image1 = [self.image floodFillFromPoint:tpoint withColor:newcolor andTolerance:4];
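
For reference, a minimal Swift sketch of a 4-way flood fill with a per-channel tolerance test, so that grey anti-aliased edge pixels still count as fillable. This illustrates the idea; it is not the pod's actual implementation:

    import Foundation

    // 4-way flood fill over an RGBA8 buffer. A pixel is treated as a boundary
    // only if it differs from the start pixel by more than `tolerance` per
    // channel, so slightly grey edge pixels are still filled.
    func floodFill(pixels: inout [UInt8], width: Int, height: Int,
                   startX: Int, startY: Int,
                   fill: (r: UInt8, g: UInt8, b: UInt8), tolerance: Int) {
        func index(_ x: Int, _ y: Int) -> Int { (y * width + x) * 4 }
        let s = index(startX, startY)
        let target = (pixels[s], pixels[s + 1], pixels[s + 2])

        func matches(_ i: Int) -> Bool {
            abs(Int(pixels[i])     - Int(target.0)) <= tolerance &&
            abs(Int(pixels[i + 1]) - Int(target.1)) <= tolerance &&
            abs(Int(pixels[i + 2]) - Int(target.2)) <= tolerance
        }

        var stack = [(startX, startY)]
        var visited = [Bool](repeating: false, count: width * height)
        while let point = stack.popLast() {
            let (x, y) = point
            guard x >= 0, x < width, y >= 0, y < height else { continue }
            if visited[y * width + x] { continue }
            visited[y * width + x] = true
            let i = index(x, y)
            guard matches(i) else { continue }   // boundary pixel: stop here
            pixels[i] = fill.r; pixels[i + 1] = fill.g; pixels[i + 2] = fill.b
            pixels[i + 3] = 255
            stack.append((x + 1, y)); stack.append((x - 1, y))
            stack.append((x, y + 1)); stack.append((x, y - 1))
        }
    }

Raising `tolerance` here plays the same role as the andTolerance argument above: it lets the fill absorb anti-aliased grey pixels instead of stopping short of the outline.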
