iOS: draw a line that removes another line

In my code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawImage];

    UIGraphicsBeginImageContext(drawImage.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 10.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), r, g, b, a);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;

    mouseMoved++;
    if (mouseMoved == 10) {
        mouseMoved = 0;
    }
}
I'm able to draw a line in drawImage, but can I also draw a line that removes the other lines, like an eraser that clears what has already been drawn? Is that possible?

If you want to "erase" pixels by drawing a line that replaces what's underneath with transparency, then you need to make your stroke colour fully transparent but also change the blend mode to kCGBlendModeCopy (with the default blend mode, drawing with a fully transparent stroke colour has of course no effect).
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 0, 0, 0);
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeCopy);
After that, you can draw the "eraser" line just like you are doing in your code snippet (CGContextMoveToPoint, CGContextAddLineToPoint, CGContextStrokePath).
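Putting it together, a minimal sketch of what the questioner's touchesMoved: could look like in "eraser" mode (this assumes the same drawImage image view and lastPoint ivar as the code above; only the stroke colour and blend mode differ from the original snippet):
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawImage];

    UIGraphicsBeginImageContext(drawImage.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];

    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 10.0);

    // Eraser: a fully transparent stroke, copied straight over the destination pixels.
    CGContextSetRGBStrokeColor(context, 0, 0, 0, 0);
    CGContextSetBlendMode(context, kCGBlendModeCopy);

    CGContextBeginPath(context);
    CGContextMoveToPoint(context, lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(context, currentPoint.x, currentPoint.y);
    CGContextStrokePath(context);

    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    lastPoint = currentPoint;
}
Note that the erased pixels only show up as "holes" if the image view itself is not opaque, so whatever sits behind it can show through.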

Related

How to erase from a UIImageView

I have a screen that acts like a drawing pad - you select a color and then, as you scribble across the view, paint is drawn in. I need a better eraser than the one I have right now. If I just set my eraser to [UIColor clearColor], it appears to erase, but there is a drawing under my finger drawing that is processed from a piece of hardware that acts like a wacom/stylus pad, and if I draw under the piece that is "erased" (with clearColor) there is an alpha difference with the layer beneath my finger drawing. I'll provide some screenshots.
You'll notice the little grey scribbles on the white line. They were erased - I need this to not be opacity .6 clearColor; I need it to be completely 0. Any ideas?
Here is my touches moved code:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self];

    UIGraphicsBeginImageContext(self.frame.size);
    [self.image drawInRect:CGRectMake(0, 0, self.frame.size.width, self.frame.size.height)];
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 30.0);
    CGFloat *components = CGColorGetComponents(_strokeColor.CGColor);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), components[0], components[1], components[2], 1);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    self.image = UIGraphicsGetImageFromCurrentImageContext();
    [self setAlpha:.6];
    UIGraphicsEndImageContext();
    lastPoint = currentPoint;
}

iOS drawing on an image, but my touch gesture cuts off in the middle of movement

I am developing some functionality that will let the user "draw" on an image by placing their finger on the screen and moving it around. I have the following code (taken from the tutorial https://www.raywenderlich.com/18840/how-to-make-a-simple-drawing-app-with-uikit):
// Draws a line from point1 to point2
- (void)drawOnImage:(CGPoint *)point1 :(CGPoint *)point2 {
    UIGraphicsBeginImageContext(self.view.frame.size);
    [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), point1->x, point1->y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), point2->x, point2->y);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1.0);
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
// Beginning of the drawing
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.view];
    if (currentAction == DRAW) {
        [self backupImage];
        mouseSwiped = NO;
        // disable the renderImageView so that the gestures don't interfere
        [self drawOnImage:&lastPoint :&lastPoint];
    }
}

// When lines are being drawn on the image
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    if (currentAction == DRAW) {
        mouseSwiped = YES;
        UITouch *touch = [touches anyObject];
        CGPoint currentPoint = [touch locationInView:self.view];

        UIGraphicsBeginImageContext(self.view.frame.size);
        [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)]; // I switched the tempDrawImage to mainImage
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1.0);
        CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeNormal);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext(); // I switched the tempDrawImage to mainImage
        UIGraphicsEndImageContext();

        lastPoint = currentPoint;
    }
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    if (!mouseSwiped) {
        UIGraphicsBeginImageContext(self.view.frame.size);
        [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)]; // I switched the tempDrawImage to mainImage
        CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
        CGContextSetLineWidth(UIGraphicsGetCurrentContext(), brush);
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), red, green, blue, 1);
        CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
        CGContextStrokePath(UIGraphicsGetCurrentContext());
        CGContextFlush(UIGraphicsGetCurrentContext());
        self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext(); // I switched the tempDrawImage to mainImage
        UIGraphicsEndImageContext();
    }

    if (currentAction == DRAW) {
        UIGraphicsBeginImageContext(self.mainImage.frame.size);
        [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height) blendMode:kCGBlendModeNormal alpha:1.0];
        [self.mainImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height) blendMode:kCGBlendModeNormal alpha:1]; // I switched the tempDrawImage to mainImage
        self.mainImage.image = UIGraphicsGetImageFromCurrentImageContext();
        //self.tempDrawImage.image = nil;
        UIGraphicsEndImageContext();
    }
}
If the screen is being touched very lightly when drawing, the drawing sometimes stops: midway through a stroke the line stops drawing, and in order to start drawing again you have to lift your finger off the screen and press again. This confuses me a great deal, because I would think that if the touch gesture was not recognized for a second then the drawing would stop briefly and continue once it was recognized again. That isn't what happens, though; you have to lift your finger off the screen and re-apply it. Does anyone have any ideas on what could cause this, or on how to fix it?

CGContext drawing on image not working

This is my code, and it produces a very weird drawing when executed. Moreover, the image slowly disappears by sliding down the image view. Please help me with this.
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [[event allTouches] anyObject];
    // if ([touch tapCount] == 2)
    // {
    //     imageView.image = nil;
    // }
    location = [touch locationInView:touch.view];
    lastClick = [NSDate date];

    lastPoint = [touch locationInView:self.view];
    lastPoint.y -= 0;

    [super touchesBegan:touches withEvent:event];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(imageView.image.size);
    [imageView.image drawInRect:CGRectMake(0, 44, imageView.image.size.width, imageView.image.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    // lastPoint = currentPoint;
}
Moreover, the lines it's drawing have a weird shape and they keep disappearing.
Your image is shifting because you hardcoded a 44-point vertical offset on each redraw.
The weird drawing is most likely the result of a coordinate-system mismatch: you receive the touch location in view coordinates, but draw in image coordinates. The easiest way to fix this is to create the context with the view's size instead of the image's size: simply use imageView.bounds.size instead of imageView.image.size. Note that I assume your image view uses the "Scale to Fill" content mode.
Whole drawing code after changes:
UIGraphicsBeginImageContext(self.imageView.bounds.size);
[self.imageView.image drawInRect:CGRectMake(0, 0, self.imageView.bounds.size.width, self.imageView.bounds.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 5.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 1, 0, 1);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), self.lastPoint.x, self.lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
self.imageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Also, your solution is not optimal in terms of performance. I suggest drawing the in-progress path in the view itself instead of re-rendering the image view's image on every touch move.
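To illustrate that suggestion, here is a minimal sketch of a drawRect:-based approach; the StrokeView class name and currentPath property are made up for illustration, and a real app would still flatten finished strokes into an image (or layer) in touchesEnded: so the path does not grow without bound:
// Hypothetical UIView subclass: the in-progress stroke is kept as a UIBezierPath
// and stroked in drawRect:, instead of re-rendering a UIImage on every move.
@interface StrokeView : UIView
@property (nonatomic, strong) UIBezierPath *currentPath;
@end

@implementation StrokeView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    self.currentPath = [UIBezierPath bezierPath];
    self.currentPath.lineWidth = 5.0;
    self.currentPath.lineCapStyle = kCGLineCapRound;
    [self.currentPath moveToPoint:p];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint p = [[touches anyObject] locationInView:self];
    [self.currentPath addLineToPoint:p];
    [self setNeedsDisplay]; // just schedule a redraw; no image round-trip per move
}

- (void)drawRect:(CGRect)rect {
    [[UIColor greenColor] setStroke];
    [self.currentPath stroke];
}

@end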

Making an eraser tool for a paint app in iOS

I am creating a paint app and I want to know how to implement the eraser tool. I don't want my eraser to simply paint in white, because I want to allow users to change the background color. Also, is it possible to set the hardness of the brush? If yes, please tell me how.
Thank you
Here's what I've done so far:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    lastPoint = [touch locationInView:self.view];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 10);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 10);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0.0, 0.0, 0.0, 1.0);
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}

- (IBAction)clear:(id)sender {
    drawImage.image = nil;
}
OK, here's what I did for the eraser tool.
I added this line of code:
CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
So the code will be something like this:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:self.view];

    UIGraphicsBeginImageContext(self.view.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height)];
    // I added this
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), sizeSlider.value);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeClear);
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    lastPoint = currentPoint;
}
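One detail worth spelling out: when the user switches back from the eraser to the brush, the blend mode has to be reset, otherwise normal strokes will keep erasing. A minimal sketch of how the two modes might be selected inside the same drawing method, assuming a hypothetical isErasing flag and brushColor property:
CGContextRef context = UIGraphicsGetCurrentContext();
if (isErasing) {
    // Eraser: punch the stroke out to transparency.
    CGContextSetBlendMode(context, kCGBlendModeClear);
} else {
    // Brush: draw normally with the currently selected color.
    CGContextSetBlendMode(context, kCGBlendModeNormal);
    CGContextSetStrokeColorWithColor(context, brushColor.CGColor);
}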
I have made a paint application; it is available on iTunes as the "Easy Doodle" app. There is no special logic for the eraser: just get the RGB values of the background and pass them to
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), rValue, gValue, bValue, 1.0);
according to the background image selected.
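For a solid-color canvas (rather than a photo background), that approach could look like the sketch below; canvasColor is a hypothetical UIColor property holding the currently selected background color, and CGColorGetComponents is used the same way as in the earlier snippet:
// "Erase" by stroking with the canvas color (only convincing over a solid background).
const CGFloat *components = CGColorGetComponents(self.canvasColor.CGColor);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(),
                           components[0], components[1], components[2], 1.0);
Note that this assumes canvasColor was created in an RGB color space; grayscale colors such as whiteColor expose only two components.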
OK, so I had been looking for a solution for creating an eraser for a week. I had to change the way I looked at it, because no matter what I tried, erasing something would erase the background image too.
However, in the end I did find a solution that suited me (where I could erase the paint while keeping the perception that the background image is untouched), and hopefully it will help others.
So here goes...
1) Create 2 image views in the UIView.
2) Set both image views to the background image that you want.
3) I created a segmented control to determine whether the user wants to erase.
4) Added the following function, called while the eraser segment is selected:
func drawLineForRubber(fromPoint: CGPoint, toPoint: CGPoint) {
    // 1
    UIGraphicsBeginImageContext(view.frame.size)
    let context = UIGraphicsGetCurrentContext()
    mainImageView.image?.drawInRect(CGRect(x: 0, y: 0, width: view.frame.size.width, height: view.frame.size.height))

    // 2
    CGContextMoveToPoint(context, fromPoint.x, fromPoint.y)
    CGContextAddLineToPoint(context, toPoint.x, toPoint.y)

    // 3
    CGContextSetBlendMode(context, CGBlendMode.Clear)
    CGContextSetLineCap(context, CGLineCap.Round)
    CGContextSetLineWidth(context, 10.0)

    // this line is very important as it paints the screen clear
    CGContextSetRGBStrokeColor(context, red, green, blue, 0.0)
    CGContextSetBlendMode(context, CGBlendMode.Clear)

    // 4
    CGContextStrokePath(context)

    // 5
    mainImageView.image = UIGraphicsGetImageFromCurrentImageContext()
    mainImageView.alpha = opacity
    UIGraphicsEndImageContext()
}
As you will see, there is a line in there that sets CGContextSetRGBStrokeColor with an alpha value of 0.0. This is important because when the user erases, he is erasing the paint from the top image view with a transparent color. However, since you have another image view underneath it, it looks like the paint is being rubbed out without affecting the background image.
In reality it is rubbed out, but the image view behind makes it look like it hasn't been. When you want to export what you have painted together with the background image (you can do it in a number of ways, but I went with the following), just combine both image views, since the
1st image view will have the paint you painted and the
2nd image view will have the original background image you wanted.
func share() {
    let layer = UIApplication.sharedApplication().keyWindow!.layer
    let scale = UIScreen.mainScreen().scale
    UIGraphicsBeginImageContextWithOptions(layer.frame.size, false, scale) // reconsider size property for your screenshot
    layer.renderInContext(UIGraphicsGetCurrentContext()!)
    let screenshot = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()

    UIGraphicsBeginImageContext(mainImageView.bounds.size)
    mainImageView.image?.drawInRect(CGRect(x: 0, y: 0,
        width: mainImageView.frame.size.width, height: mainImageView.frame.size.height))
    UIGraphicsEndImageContext()

    let activity = UIActivityViewController(activityItems: [screenshot], applicationActivities: nil)
    presentViewController(activity, animated: true, completion: nil)
}
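The prose above talks about combining the two image views into one image for export; a minimal Objective-C sketch of that compositing step (backgroundImageView is an assumed name for the bottom image view that holds the untouched background) might look like:
// Hypothetical compositing step: draw the untouched background first,
// then the partially erased paint layer on top, and capture the result.
UIGraphicsBeginImageContextWithOptions(self.mainImageView.bounds.size, NO, 0.0);
CGRect bounds = self.mainImageView.bounds;
[self.backgroundImageView.image drawInRect:bounds];
[self.mainImageView.image drawInRect:bounds blendMode:kCGBlendModeNormal alpha:1.0];
UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();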
Hope this helps others along the way.. man it took me ages to get my head around it =)

iOS: color with a pattern image

I have this code to colour with a simple line:
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    mouseSwiped = YES;
    UITouch *touch = [touches anyObject];
    CGPoint currentPoint = [touch locationInView:drawImage];

    UIGraphicsBeginImageContext(drawImage.frame.size);
    [drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 15.0);

    // eraser
    /*
    if (eraser) {
        CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 0, 0, 0);
        CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeCopy);
    }
    */

    CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 0, 0, 0, 1.0); // black
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    lastPoint = currentPoint;
}
With this code I'm able to color in a view or image view; I can choose the color and the size. But now I want to use a specific pattern to color this view, a specific PNG applied when touchesMoved is called. Can you help me?
Check out https://stackoverflow.com/a/707329/108574 if you want to draw patterns (tiled images).
However, you are doing it the wrong way in your code: you aren't supposed to draw to the graphics context during UI events. Drawing routines should live inside - (void)drawRect:(CGRect)rect of a UIView instance. When a UI event changes what needs to be drawn, record the new state somewhere and then send - (void)setNeedsDisplay to the UIView instance so it redraws.
Try colorWithPatternImage:
Creates and returns a color object using the specified image.
+ (UIColor *)colorWithPatternImage:(UIImage *)image
(see the UIColor class reference)
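A pattern-backed color can then be plugged into the same stroking code as the question; a minimal sketch, assuming a tileable image named "pattern.png" in the bundle:
// Stroke with a tiled image instead of a flat color.
UIImage *patternImage = [UIImage imageNamed:@"pattern.png"]; // assumed asset name
UIColor *patternColor = [UIColor colorWithPatternImage:patternImage];
CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), patternColor.CGColor);
// ...then CGContextMoveToPoint / CGContextAddLineToPoint / CGContextStrokePath as before.
The pattern tiles relative to the context's origin rather than to the stroke, so the texture should look continuous across overlapping strokes.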
