Undo and Redo functionality in an iOS drawing application

I am working on a drawing app and I am able to draw with my finger. Now I am trying to implement clear, undo, and redo functionality.
In my view controller I have two IBAction methods, "clearAll" and "undo". I have created a custom class (drawing.h and .m) where I have written the functions for handling touch events. Below are my functions.
The problem is that undo and redo work, but after an undo or redo every line is redrawn in the last selected color.

I think the mistake is in my touchesEnded: method: the last object is not being removed from the array correctly.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
UIGraphicsBeginImageContext(self.tempimage.bounds.size);
[self.tempimage.layer renderInContext:UIGraphicsGetCurrentContext()];
rawImage = UIGraphicsGetImageFromCurrentImageContext();
[self.tempimage.image drawInRect:CGRectMake(0, 0, tempimage.frame.size.width, tempimage.frame.size.height) blendMode:blendmode alpha:opacity];
UIGraphicsEndImageContext();
#if PUSHTOFILE
lineIndex++;
[self performSelectorInBackground:@selector(writeFilesBG) withObject:nil];
#else
NSDictionary *lineInfo = [NSDictionary dictionaryWithObjectsAndKeys:rawImage, @"IMAGE", nil];
[pointsArray addObject:lineInfo];
UIBezierPath *_path=[pointsArray lastObject];
[_stack addObject:_path];
[pointsArray removeLastObject];
[self.tempimage setNeedsDisplay];
#endif
}
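For reference, a common way to fix this kind of color bug is to store each finished stroke together with the color it was drawn with, and replay every stroke with its own stored color on undo/redo, instead of re-stroking everything with the currently selected color. A sketch only; `Stroke`, `strokes`, `redoStack`, `currentPath`, and `currentColor` are illustrative names, not from the code above:

```objc
// Sketch: each finished stroke keeps the color it was drawn with.
// Assumes a Stroke helper class with `path` and `color` properties,
// plus mutable arrays `strokes` (drawn) and `redoStack` (undone).

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    Stroke *stroke = [[Stroke alloc] init];
    stroke.path  = [self.currentPath copy];
    stroke.color = self.currentColor;   // capture the color *now*, per stroke
    [self.strokes addObject:stroke];
    [self.redoStack removeAllObjects];  // a new stroke invalidates redo
    [self setNeedsDisplay];
}

- (void)undo {
    if (self.strokes.count == 0) return;
    [self.redoStack addObject:self.strokes.lastObject];
    [self.strokes removeLastObject];
    [self setNeedsDisplay];
}

- (void)redo {
    if (self.redoStack.count == 0) return;
    [self.strokes addObject:self.redoStack.lastObject];
    [self.redoStack removeLastObject];
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect {
    for (Stroke *s in self.strokes) {
        [s.color setStroke];            // each stroke uses its own color
        [s.path stroke];
    }
}
```

Because each stroke carries its own color, undoing and redoing cannot retroactively recolor earlier lines.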

Related

UIBezierPath drawing takes up 100% of CPU

I have a UIBezierPath that is drawn on the screen every time the user's touch moves. This works fine in the simulator on a Mac Pro, but as soon as I moved it to a physical device, the drawing began to lag noticeably. Using Instruments, I checked the CPU usage, and it hits 100% while the user is drawing.
This becomes quite a problem because I have an NSTimer that is supposed to fire at 30 fps, but when the CPU is overloaded by the drawing, the timer fires at only around 10 fps.
How can I optimize the drawing so that it doesn't take 100% of the CPU?
UITouch* lastTouch;
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
if(lastTouch != nil)
return;
lastTouch = [touches anyObject];
[path moveToPoint:[lastTouch locationInView:self]];
[self drawDot];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
if([touches anyObject] == lastTouch)
[self drawDot];
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
if([touches anyObject] == lastTouch)
lastTouch = nil;
}
- (void)drawDot
{
if(!self.canDraw)
return;
[path addLineToPoint:[lastTouch locationInView:self]];
[self setNeedsDisplayInRect:CGRectMake([lastTouch locationInView:self].x-30, [lastTouch locationInView:self].y-30, 60, 60)];
}
- (void)drawRect:(CGRect)rect
{
CGContextRef context = UIGraphicsGetCurrentContext();
CGColorRef gray = [UIColor grayColor].CGColor;
CGContextSetStrokeColorWithColor(context, gray);
[shape stroke];
CGColorRef green = [UIColor greenColor].CGColor;
CGContextSetStrokeColorWithColor(context, green);
[path stroke];
}
You shouldn't really be using -drawRect: in modern code; it's 30 years old, designed for very old hardware (a 25 MHz CPU with 16 MB of RAM and no GPU), and has performance bottlenecks on modern hardware.
Instead you should be using Core Animation or OpenGL for all drawing. Core Animation can be very similar to drawRect.
Add this to your view's init method:
self.layer.contentsScale = [UIScreen mainScreen].scale;
And implement:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
// basically the same code as you've got in drawRect:, although I recommend
// trying Core Graphics (CGContextMoveToPoint(), CGContextAddLineToPoint(),
// CGContextStrokePath(), etc)
}
And to make it re-draw (inside touchesMoved/etc):
[self.layer setNeedsDisplay];
I would also update your touch event code to just append to an NSMutableArray of points (perhaps encoded in an [NSValue valueWithCGPoint:location]). That way you aren't creating a graphics path while responding to touch events.
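A minimal sketch of that point-buffering idea, assuming a `points` NSMutableArray ivar (the name is mine) and the `lastTouch` variable from the question:

```objc
// Sketch: buffer raw touch points; rebuild the stroke in drawLayer:inContext:.
// self.points is an assumed NSMutableArray of NSValue-wrapped CGPoints.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches anyObject] != lastTouch) return;
    CGPoint p = [lastTouch locationInView:self];
    [self.points addObject:[NSValue valueWithCGPoint:p]]; // no path work in the touch handler
    [self.layer setNeedsDisplay];
}

- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    if (self.points.count < 2) return;
    CGContextSetStrokeColorWithColor(ctx, [UIColor greenColor].CGColor);
    CGContextSetLineWidth(ctx, 5);
    CGPoint first = [self.points[0] CGPointValue];
    CGContextMoveToPoint(ctx, first.x, first.y);
    for (NSUInteger i = 1; i < self.points.count; i++) {
        CGPoint p = [self.points[i] CGPointValue];
        CGContextAddLineToPoint(ctx, p.x, p.y);
    }
    CGContextStrokePath(ctx);
}
```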
A good place to look for efficient drawRect: techniques is this WWDC video from a few years ago: https://developer.apple.com/videos/wwdc/2012/?id=238 . About 26 minutes in, he starts debugging a very simple painting app which is very similar to what you've described. At minute 30 he starts optimizing beyond the first optimization, setNeedsDisplayInRect: (which you're already doing).
Short story: he uses an image as a backing store for what's already been drawn, so that in each drawRect: call he draws only one image plus a very short additional line segment, instead of stroking extremely long paths every time.
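A minimal sketch of that backing-store technique; the ivar names `backingImage` and `currentSegment` are mine, not from the video:

```objc
// Sketch: draw the cached image, then stroke only the newest short segment.
- (void)drawRect:(CGRect)rect {
    [self.backingImage drawInRect:self.bounds]; // everything drawn so far
    [[UIColor blackColor] setStroke];
    [self.currentSegment stroke];               // just a move + a few addLines
}

// Periodically (e.g. in touchesEnded:) flatten the segment into the image,
// so the live path never grows long.
- (void)flattenToBackingImage {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [self.backingImage drawInRect:self.bounds];
    [[UIColor blackColor] setStroke];
    [self.currentSegment stroke];
    self.backingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self.currentSegment removeAllPoints];
}
```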

How to write sketch app on iOS

I'm trying to create a sketch app that can draw shapes/paths with a finger.
What I've done so far is create a UIBezierPath when the touch starts, and draw the path while the finger moves.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
CGPoint locationInDrawRect = [mytouch locationInView:self.drawingView];
[self.drawingView.currentPath addLineToPoint:locationInDrawRect];
[self.drawingView setNeedsDisplay];
}
When the touch is done, I save it to an array of UIBezierPath objects.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
[self.drawingView.pathsArray addObject:self.drawingView.currentPath]; //add latest current path to pathArray
[self.drawingView clearCurrentPath]; //clear current path for next line
[self.drawingView setNeedsDisplay];
}
In drawRect:, I draw the current path and the paths in the array using a for loop.
- (void)drawRect:(CGRect)rect
{
if(!self.currentPath.empty){
[self.currentPath stroke];
}
for (UIBezierPath *path in self.pathsArray){
[path stroke];
}
}
This works for a couple of path objects, but it gets slow when the array holds more than 5 paths.
I tried to limit the area to render using the setNeedsDisplayInRect: method,
[self.drawingView setNeedsDisplayInRect:rectToUpdateDraw];
and to render the paths in the array only when the rect is the full canvas size, which is when the touch ends.
But this draws weird, distorted lines and still gets slow when there are many objects in the array.
I don't know how to solve this problem and need some help.
Thank you!

UIBezierPath drawing issues

I created a class called myBezierPaths, with two member variables of type UIBezierPath and UIColor, and then I tried to use an object of this class to draw the bezier path, but the path is not drawn. Below is my code.
//Bezierpath.h
@interface BezierPath : NSObject
{
UIBezierPath *m_bezierPath;
UIColor *m_pathColor;
}
@property (nonatomic, strong) UIBezierPath *bezierPath;
@property (nonatomic, strong) UIColor *pathColor;
@end
//drawingView.m
- (void)drawRect:(CGRect)rect
{
for (BezierPath *pathobj in m_pathArray)
{
[pathobj.pathColor setStroke];
[pathobj.pathColor setFill];
[pathobj.bezierPath strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
}
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
self.myPath = [[BezierPath alloc] init];
myPath.bezierPath.lineWidth = 5;
myPath.bezierPath.lineCapStyle = kCGLineCapRound;
myPath.bezierPath.flatness = 0.0;
myPath.bezierPath.lineJoinStyle = kCGLineJoinRound;
myPath.bezierPath.miterLimit = 200.0;
self.myPath.pathColor = [UIColor redColor];
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
[myPath.bezierPath moveToPoint:[mytouch locationInView:self]];
[m_pathArray addObject:self.myPath];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
[self.myPath.bezierPath addLineToPoint:[mytouch locationInView:self]];
[self setNeedsDisplay];
}
Hope I am clear with my explanation; waiting for a reply.
Regards,
Ranjit
First, in your drawRect: you don't actually fill the shape. You do stroke it (essentially drawing a border), but you need to call [pathobj.bezierPath fill]; to actually fill the shape with a color.
There's a lot of code you're not showing, so it's hard to tell where the problem might be. I would set a breakpoint in the drawRect: method to determine whether all of the variables are there. For example, you show what I assume to be an array, m_pathArray. Are you sure it's initialized? Are you sure the properties of BezierPath are working correctly? I notice that in touchesBegan you set pathColor but only read the bezierPath property. Has it been initialized? Where do you create the m_bezierPath instance variable? Do you create it in BezierPath's init method, or lazily instantiate it on access?
So those are a few things you could check.
You don't show a working touchesMoved event handler, but I would expect something like this; otherwise all you are doing is creating a set of paths of length zero.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
[self.myPath.bezierPath addLineToPoint:[mytouch locationInView:self]];
}
@Ranjit I need some help, I have the same problem...
I want to change the colors of the UIBezierPath, but when I select a new color, my previous path also automatically gets recolored with it.
In my first image I drew a line with red selected, but when I select yellow to draw another line, the previous line also changes to yellow (image 2).

Smoother freehand drawing experience (iOS)

I am making a math-related activity in which the user can draw with their fingers for scratch work as they try to solve the math question. However, I notice that when I move my finger quickly, the line lags behind my finger somewhat noticeably. I was wondering if there is some performance issue I have overlooked, or if touchesMoved simply doesn't fire often enough (it is perfectly smooth and wonderful if you don't move fast). I am using UIBezierPath. First I create it in my init method like this:
myPath=[[UIBezierPath alloc]init];
myPath.lineCapStyle=kCGLineCapSquare;
myPath.lineJoinStyle = kCGLineJoinBevel;
myPath.lineWidth=5;
myPath.flatness = 0.4;
Then in drawRect:
- (void)drawRect:(CGRect)rect
{
[brushPattern setStroke];
if(baseImageView.image)
{
CGContextRef c = UIGraphicsGetCurrentContext();
[baseImageView.layer renderInContext:c];
}
CGBlendMode blendMode = self.erase ? kCGBlendModeClear : kCGBlendModeNormal;
[myPath strokeWithBlendMode:blendMode alpha:1.0];
}
baseImageView is what I use to save the result so that I don't have to draw many paths (it gets really slow after a while). Here is my touch logic:
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
[myPath moveToPoint:[mytouch locationInView:self]];
}
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
UITouch *mytouch=[[touches allObjects] objectAtIndex:0];
[myPath addLineToPoint:[mytouch locationInView:self]];
[self setNeedsDisplay];
}
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0f);
CGContextRef c = UIGraphicsGetCurrentContext();
[self.layer renderInContext:c];
baseImageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
[myPath removeAllPoints];
[self setNeedsDisplay];
}
This project is going to be released as an enterprise app, so it will only be installed on iPad 2. Target iOS is 5.0. Any suggestions about how I can squeeze a little more speed out of this would be appreciated.
Of course you should start by running it under Instruments and looking for your hotspots. Then you need to make changes and re-evaluate to see their impact; otherwise you're just guessing. That said, some notes from experience:
Adding lots of elements to a path can get very expensive. I would not be surprised if your addLineToPoint: turns out to be a hotspot. It has been for me.
Rather than backing your system with a UIImageView, I would probably render into a CGLayer. CGLayers are optimized for rendering into a specific context.
Why accumulate the path at all rather than just rendering it into the layer at each step? That way your path would never be more than two elements (move, addLine). Typically the two-stage approach is used so you can handle undo or the like.
Make sure that you're turning off any UIBezierPath features you don't want. In particular, look at the section "Accessing Draw Properties" in the docs. You may consider switching to CGMutablePath rather than UIBezierPath. It's not actually faster when configured the same, but its default settings turn more things off, so by default it's faster. (You're already setting most of these; you'll want to experiment a little in Instruments to see what impact they make.)
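A rough sketch combining the CGLayer and short-path suggestions above (the `drawingLayer`, `lastPoint`, and `currentPoint` ivars are assumed names; illustrative only, not a drop-in replacement):

```objc
// Sketch: accumulate the drawing in a CGLayer; each drawRect: strokes only
// the newest two-point segment into the layer, then blits the layer.
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    if (self.drawingLayer == NULL) {
        // Create the layer lazily from the first drawing context we see.
        self.drawingLayer = CGLayerCreateWithContext(ctx, self.bounds.size, NULL);
    }
    CGContextRef layerCtx = CGLayerGetContext(self.drawingLayer);
    CGContextSetLineWidth(layerCtx, 5);
    CGContextSetLineCap(layerCtx, kCGLineCapRound);
    CGContextMoveToPoint(layerCtx, self.lastPoint.x, self.lastPoint.y);
    CGContextAddLineToPoint(layerCtx, self.currentPoint.x, self.currentPoint.y);
    CGContextStrokePath(layerCtx);   // the live path is never more than two points

    CGContextDrawLayerInRect(ctx, self.bounds, self.drawingLayer);
}
```

The trade-off is that the layer is a flattened bitmap, so per-stroke undo needs separate bookkeeping (e.g. the two-stage path approach mentioned above).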
http://mobile.tutsplus.com/tutorials/iphone/ios-sdk_freehand-drawing/
This link shows exactly how to make a curve smoother, step by step: it simply adds some intermediate points (in the touchesMoved method) to the curves to smooth them.
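The core trick is to stroke quadratic curves through the midpoints of successive touch samples, using the raw points as control points. A sketch, assuming `previousPoint`/`previousPreviousPoint` ivars (names are mine) and the question's `myPath`:

```objc
// Sketch: smooth freehand input with midpoint quadratic curves.
static CGPoint midPoint(CGPoint a, CGPoint b) {
    return CGPointMake((a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    self.previousPreviousPoint = self.previousPoint;
    self.previousPoint = [touch previousLocationInView:self];
    CGPoint currentPoint = [touch locationInView:self];

    // Curve from the last midpoint to the new midpoint; the raw previous
    // point acts as the control point, which rounds off sharp corners.
    CGPoint mid1 = midPoint(self.previousPoint, self.previousPreviousPoint);
    CGPoint mid2 = midPoint(currentPoint, self.previousPoint);
    [myPath moveToPoint:mid1];
    [myPath addQuadCurveToPoint:mid2 controlPoint:self.previousPoint];
    [self setNeedsDisplay];
}
```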

iOS Image editing open source library - or just some tips

For an iOS app I'm writing, I'd like to take a photo from the photo library and then let the user "clean it up", essentially deleting parts of it that are not needed. For example, suppose the user chooses a photo of a person; my app only needs the head, and everything else should be deleted, so the user needs to clean the photo up by deleting the background, the body, or other people in the photo. Imagine a Photoshop-like experience but with only one tool: the eraser.
I'm looking for open source libraries, or examples or just tips of how to get started with that.
I know how to use a UIImagePickerController to select an image so the missing part is the actual editing. As a complete noob I'd be happy to get some advice on what would be a reasonable approach to this, preferably with some sample code or even a reusable library.
I suppose, at a high level, what I want to do is start with a rectangular image, make sure it has an alpha channel, and then, as the user touches parts of the image to delete them, "delete" more pixels from the image by changing their alpha to 0. But that's too high-level a description, and I'm not even sure it's correct... Another reasonable requirement is undo support.
Another approach that comes to mind is using the original image and a mask image which the user edits while touching the screen and when "done", somehow compile the two images to one image with alpha. Of course, this is an implementation detail and the user need not know that there are two images on the screen.
If possible, I'd like to stay at the UIImage or UIImageView or Core Graphics levels and not have to mess with OpenGL ES. My gut feeling is that the higher graphics levels should be performant enough and easy to understand, maintainable clean code is a consideration...
Any advice is appreciated, thanks!
This turned out to be pretty easy, thanks to @Rog's pointers.
I'll paste my solution below. This goes in the controller code:
#pragma mark - touches
- (void) clipImageCircle:(CGPoint)point radius:(CGFloat)radius {
UIBezierPath* uiBezierPath = [UIBezierPath bezierPathWithArcCenter:point radius:radius startAngle:0 endAngle:2 * M_PI clockwise:NO];
CGPathRef erasePath = uiBezierPath.CGPath;
UIImage *img = imageView.image;
CGSize s = img.size;
UIGraphicsBeginImageContext(s);
CGContextRef g = UIGraphicsGetCurrentContext();
CGContextAddPath(g, erasePath);
CGContextAddRect(g,CGRectMake(0, 0, s.width, s.height));
CGContextEOClip(g);
[img drawAtPoint:CGPointZero];
imageView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
}
- (void) receiveTouch:(CGPoint)point {
NSLog(@"%@", NSStringFromCGPoint(point));
[self clipImageCircle:point radius:20];
}
- (void) endTouch {
NSLog(@"END TOUCH");
}
- (void) touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
// We currently support only single touch events
UITouch* touch = [touches anyObject];
CGPoint point = [touch locationInView:imageView];
if ([imageView hitTest:point withEvent:event]) {
[self receiveTouch:point];
}
}
- (void) touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
[self endTouch];
}
- (void) touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
[self endTouch];
}
- (void) touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
UITouch* touch = [touches anyObject];
CGPoint point = [touch locationInView:imageView];
if ([imageView hitTest:point withEvent:event]) {
[self receiveTouch:point];
}
}
You will need to get well acquainted with Quartz 2D / CoreGraphics. This guide is a good start for you http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/Introduction/Introduction.html
The task you have described can be as simple or as complicated as you want. From letting the user use their fingers to erase the area around the photo by dragging their finger around (easy) to you trying to detect high contrast areas that help you guess where to cut (pretty complicated).
If you choose the former, you will essentially want to create a clipping mask based on user touches so have a look at the touchesBegan, touchesMoved and touchesEnded methods of UIView.
For the clipping mask, this is probably a good simple example to get you started How erase part of UIImage
Good luck with it, it sounds like a fun (if not challenging) project.