How to write a sketch app on iOS

I'm trying to create a sketch app that can draw shapes/paths with a finger.
What I've done so far is create a UIBezierPath when the touch starts, and extend the path while the finger moves.
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    CGPoint locationInDrawRect = [mytouch locationInView:self.drawingView];
    [self.drawingView.currentPath addLineToPoint:locationInDrawRect];
    [self.drawingView setNeedsDisplay];
}
When the touch ends, I save the path to an array of UIBezierPath objects.
-(void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.drawingView.pathsArray addObject:self.drawingView.currentPath]; // add latest current path to pathsArray
    [self.drawingView clearCurrentPath]; // clear current path for the next line
    [self.drawingView setNeedsDisplay];
}
In drawRect:, I draw the current path and the saved paths in a for loop.
- (void)drawRect:(CGRect)rect
{
    if (!self.currentPath.empty) {
        [self.currentPath stroke];
    }
    for (UIBezierPath *path in self.pathsArray) {
        [path stroke];
    }
}
This works for a couple of path objects, but it gets slow when the array holds more than 5 paths.
I tried to limit the area to redraw using the setNeedsDisplayInRect: method:
[self.drawingView setNeedsDisplayInRect:rectToUpdateDraw];
and to stroke the paths in the array only when the rect is the full canvas size, which is the case when the touch has ended.
But this draws oddly shaped lines and also gets slow when there are many objects in the array.
I don't know how I can solve this problem and need some help.
Thank you!

Related

UIBezierPath drawing takes up 100% of CPU

I have a UIBezierPath that is redrawn on the screen every time the user's touch moves. This works fine in the simulator on my Mac Pro, but as soon as I moved it to a physical device, the drawing began to lag a lot. Using Instruments, I checked the CPU usage, and it hits 100% while the user is drawing.
This becomes quite a problem because I have an NSTimer that is supposed to fire at 30 fps, but when the CPU is overloaded by the drawing, the timer fires at only around 10 fps.
How can I optimize the drawing so that it doesn't take 100% of the CPU?
UITouch *lastTouch;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (lastTouch != nil)
        return;
    lastTouch = [touches anyObject];
    [path moveToPoint:[lastTouch locationInView:self]];
    [self drawDot];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches anyObject] == lastTouch)
        [self drawDot];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches anyObject] == lastTouch)
        lastTouch = nil;
}

- (void)drawDot
{
    if (!self.canDraw)
        return;
    [path addLineToPoint:[lastTouch locationInView:self]];
    [self setNeedsDisplayInRect:CGRectMake([lastTouch locationInView:self].x - 30,
                                           [lastTouch locationInView:self].y - 30,
                                           60, 60)];
}
- (void)drawRect:(CGRect)rect
{
    CGColorRef green = [UIColor greenColor].CGColor;
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGColorRef gray = [UIColor grayColor].CGColor;

    CGContextSetStrokeColor(context, CGColorGetComponents(gray));
    [shape stroke];

    CGContextSetStrokeColor(context, CGColorGetComponents(green));
    [path stroke];
}
You shouldn't really be using -drawRect in modern code - it's 30 years old, designed for very old hardware (a 25 MHz CPU with 16 MB of RAM and no GPU), and it has performance bottlenecks on modern hardware.
Instead you should be using Core Animation or OpenGL for all drawing. Core Animation can be very similar to drawRect.
Add this to your view's init method:
self.layer.contentsScale = [UIScreen mainScreen].scale;
And implement:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    // basically the same code as you've got in drawRect:, although I recommend
    // trying Core Graphics (CGContextMoveToPoint(), CGContextAddLineToPoint(),
    // CGContextStrokePath(), etc.)
}
And to make it re-draw (inside touchesMoved/etc):
[self.layer setNeedsDisplay];
I would also update your touch event code to just append to an NSMutableArray of points (perhaps wrapped in an [NSValue valueWithCGPoint:location]). That way you aren't building a graphics path while responding to touch events.
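Put together, a rough, untested sketch of those two suggestions might look like this (CanvasView and the points property are names I'm making up, not from your code):

#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

@interface CanvasView : UIView
@property (nonatomic, strong) NSMutableArray *points; // NSValue-wrapped CGPoints
@end

@implementation CanvasView

- (instancetype)initWithFrame:(CGRect)frame
{
    if ((self = [super initWithFrame:frame])) {
        _points = [NSMutableArray array];
        self.layer.contentsScale = [UIScreen mainScreen].scale;
    }
    return self;
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Just record the point; no path building while handling the touch.
    CGPoint p = [[touches anyObject] locationInView:self];
    [self.points addObject:[NSValue valueWithCGPoint:p]];
    [self.layer setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    // Intentionally empty; all drawing happens in drawLayer:inContext:.
}

- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    if (self.points.count < 2) {
        return;
    }
    CGContextSetStrokeColorWithColor(ctx, [UIColor greenColor].CGColor);
    CGContextSetLineWidth(ctx, 2);

    CGPoint first = [self.points[0] CGPointValue];
    CGContextMoveToPoint(ctx, first.x, first.y);
    for (NSUInteger i = 1; i < self.points.count; i++) {
        CGPoint p = [self.points[i] CGPointValue];
        CGContextAddLineToPoint(ctx, p.x, p.y);
    }
    CGContextStrokePath(ctx);
}

@end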
A good place to look for efficient drawRect: techniques is this WWDC 2012 video: https://developer.apple.com/videos/wwdc/2012/?id=238. About 26 minutes in, he starts debugging a very simple painting app that is very similar to what you've described. At around minute 30 he starts optimizing, beyond the first optimization of setNeedsDisplayInRect: (which you're already doing).
Short story: he uses an image as a backing store for what has already been drawn, so that on each drawRect: call he only draws one image plus a few short new line segments, instead of stroking extremely long paths every time.
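A rough sketch of that backing-store idea (not the code from the video; the class, property names and colors here are my own, and it's untested):

#import <UIKit/UIKit.h>

// Keep everything already drawn in one UIImage and only stroke the
// in-progress path on each redraw.
@interface IncrementalDrawingView : UIView
@property (nonatomic, strong) UIImage *incrementalImage; // strokes committed so far
@property (nonatomic, strong) UIBezierPath *currentPath; // stroke being drawn now
@end

@implementation IncrementalDrawingView

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.currentPath = [UIBezierPath bezierPath];
    self.currentPath.lineWidth = 2;
    [self.currentPath moveToPoint:[[touches anyObject] locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.currentPath addLineToPoint:[[touches anyObject] locationInView:self]];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Flatten the finished stroke into the backing image, so drawRect:
    // never has to re-stroke old paths again.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [self.incrementalImage drawInRect:self.bounds];
    [[UIColor blackColor] setStroke];
    [self.currentPath stroke];
    self.incrementalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    self.currentPath = nil;
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    // One image blit plus (at most) the short current stroke.
    [self.incrementalImage drawInRect:self.bounds];
    [[UIColor blackColor] setStroke];
    [self.currentPath stroke];
}

@end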

Cocos2d ccDrawLine performance issue

I use cocos2d 2.0 and Xcode 4.5. I am trying to learn how to draw a line. I can draw a line, but after I have drawn a few lines a serious performance issue occurs in the Simulator.
The Simulator starts to freeze and draws lines very slowly, and worst of all, I guess because -(void)draw is called every frame, the label on the screen becomes bold.
Before lines: (screenshot)
After lines: (screenshot)
I use the following code:
.m
-(id) init
{
    if( (self = [super init]) ) {
        CCLabelTTF *label = [CCLabelTTF labelWithString:@"Simple Line Demo" fontName:@"Marker Felt" fontSize:32];
        label.position = ccp(240, 300);
        [self addChild:label];

        _naughtytoucharray = [[NSMutableArray alloc] init];
        self.isTouchEnabled = YES;
    }
    return self;
}
-(BOOL) ccTouchBegan:(UITouch *)touch withEvent:(UIEvent *)event
{
    BOOL isTouching;
    // determine if it's a touch you want, then return the result
    return isTouching;
}

-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];

    CGPoint new_location = [touch locationInView:[touch view]];
    new_location = [[CCDirector sharedDirector] convertToGL:new_location];

    CGPoint oldTouchLocation = [touch previousLocationInView:touch.view];
    oldTouchLocation = [[CCDirector sharedDirector] convertToGL:oldTouchLocation];
    oldTouchLocation = [self convertToNodeSpace:oldTouchLocation];

    // add my touches to the naughty touch array
    [_naughtytoucharray addObject:NSStringFromCGPoint(new_location)];
    [_naughtytoucharray addObject:NSStringFromCGPoint(oldTouchLocation)];
}

-(void) draw
{
    [super draw];
    ccDrawColor4F(1.0f, 0.0f, 0.0f, 100.0f);
    for(int i = 0; i < [_naughtytoucharray count]; i += 2)
    {
        CGPoint start = CGPointFromString([_naughtytoucharray objectAtIndex:i]);
        CGPoint end = CGPointFromString([_naughtytoucharray objectAtIndex:i+1]);
        ccDrawLine(start, end);
    }
}

- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
I have seen a few air traffic control games, such as Flight Control and ATC Mania, that work really well.
Does this performance issue occur because of ccDrawLine/UITouch, or is it a common issue?
What might Flight Control and ATC Mania be using for line drawing?
Thanks in advance.
EDIT:
OK, I guess the problem is not ccDrawLine. The problem is that I call ManageTraffic *line = [ManageTraffic node]; every time a touch ends, which calls the node's init and overrides the scene:
- (void)ccTouchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    ManageTraffic *line = [ManageTraffic node];
    [self addChild:line z:99 tag:999];
}
There are three things going on:
1. You assess performance on the Simulator. Test it on a device, as Ben says.
2. You store points as strings and convert the strings back to CGPoints. That is terribly inefficient.
3. ccDrawLine is not exactly efficient. For a couple dozen line segments it's OK; in your case maybe not (see below).
For #2, create a point class with just a CGPoint property and use that to store points in the array. That removes the string conversion (or packing into NSData).
For #3, make sure new points are only added if the new point is at least n points away from the previous point. For example, a distance of 10 should reduce the number of points while still allowing relatively fine line detail.
Also regarding #3, I notice you add both the current and the previous point to the array. Why? You only need to add the new point, and then draw from index 0 to 1, from 1 to 2, and so on; you only have to special-case the situation where there is just one point. The previous touch event's location is always the next touch event's previousLocation, so you're storing twice as many points as you need to.
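A minimal sketch of #2 and #3 combined (untested; the class and constant names are mine, not from your code):

// A tiny wrapper so CGPoints can live in an NSArray without
// converting to and from strings.
@interface LinePoint : NSObject
@property (nonatomic, assign) CGPoint point;
@end

@implementation LinePoint
@end

// Inside the drawing layer's implementation:
static const CGFloat kMinPointDistance = 10.0f;

-(void) ccTouchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint location = [[CCDirector sharedDirector] convertToGL:[touch locationInView:touch.view]];

    // Only store the new point if the finger has moved at least
    // kMinPointDistance since the last stored point.
    LinePoint *last = [_naughtytoucharray lastObject];
    if (last != nil && hypotf(location.x - last.point.x, location.y - last.point.y) < kMinPointDistance) {
        return;
    }

    LinePoint *p = [[LinePoint alloc] init];
    p.point = location;
    [_naughtytoucharray addObject:p];
}

// -draw then strokes consecutive pairs from _naughtytoucharray:
// ccDrawLine(point[i], point[i+1]) for i = 0 .. count - 2.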

UIBezierPath drawing issues

I created a class called BezierPath, with two member variables of type UIBezierPath and UIColor, and then I tried to use an object of this class to draw the bezier path, but the path is not getting drawn. Below is my code:
// BezierPath.h
@interface BezierPath : NSObject
{
    UIBezierPath *m_bezierPath;
    UIColor *m_pathColor;
}

@property (nonatomic, strong) UIBezierPath *bezierPath;
@property (nonatomic, strong) UIColor *pathColor;

@end
// drawingView.m
- (void)drawRect:(CGRect)rect
{
    for (BezierPath *pathobj in m_pathArray)
    {
        [pathobj.pathColor setStroke];
        [pathobj.pathColor setFill];
        [pathobj.bezierPath strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
    }
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    self.myPath = [[BezierPath alloc] init];
    myPath.bezierPath.lineWidth = 5;
    myPath.bezierPath.lineCapStyle = kCGLineCapRound;
    myPath.bezierPath.flatness = 0.0;
    myPath.bezierPath.lineJoinStyle = kCGLineJoinRound;
    myPath.bezierPath.miterLimit = 200.0;
    self.myPath.pathColor = [UIColor redColor];

    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath.bezierPath moveToPoint:[mytouch locationInView:self]];
    [m_pathArray addObject:self.myPath];
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [self.myPath.bezierPath addLineToPoint:[mytouch locationInView:self]];
    [self setNeedsDisplay];
}
I hope I am clear with my explanation; waiting for a reply.
Regards,
Ranjit
First, in your drawRect you don't actually fill the shape. You do stroke it (basically draw a border) but you need to call [pathObj.bezierPath fill]; to actually fill the shape with a color.
There's a lot of code you're not showing, so it's hard to tell where the problem might be. I would set a breakpoint in the drawRect: method to determine whether all of the variables are there. For example, you show what I assume to be an array, m_pathArray. Are you sure it's initialized? Are you sure the properties of BezierPath are working correctly? I notice that in touchesBegan: you set the pathColor but only read the bezierPath property. Has that been initialized? Where do you create the m_bezierPath instance variable? Do you create it in BezierPath's init method or lazily instantiate it on access?
So those are a few things you could check.
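For example, here is a minimal, untested sketch of the lazy-instantiation option, assuming the property is meant to be backed by the m_bezierPath ivar:

// BezierPath.m
@implementation BezierPath

@synthesize bezierPath = m_bezierPath;
@synthesize pathColor = m_pathColor;

- (UIBezierPath *)bezierPath
{
    // Create the path the first time it is accessed, so that
    // moveToPoint:/addLineToPoint: in the view never hit a nil path.
    if (m_bezierPath == nil) {
        m_bezierPath = [UIBezierPath bezierPath];
    }
    return m_bezierPath;
}

@end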
You don't show your touchesMoved event handler, but I would expect something like this. Otherwise all you are doing is creating a set of paths of length zero:
-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [self.myPath.bezierPath addLineToPoint:[mytouch locationInView:self]];
}
@Ranjit I need some help, I have the same problem ...
I want to change the colors of the UIBezierPath, but when I select a color, my previous path also automatically gets colored with the new color.
In my first image I draw a line and the selected color is red, but when I select yellow to draw another line, the previous line also changes to yellow (image no. 2).

UIBezierPath not drawing a smooth curve

I am using UIBezierPath for drawing, and I have written the code in the touch events and it is working, but my curves are not smooth: when I move my finger around and draw a curve, it is not smooth.
- (void)drawRect:(CGRect)rect
{
    [[UIColor redColor] setStroke];
    for (UIBezierPath *_path in pathArray)
        [_path strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
}

#pragma mark - Touch Methods

-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    myPath = [[UIBezierPath alloc] init];
    myPath.lineWidth = 5;
    myPath.lineCapStyle = kCGLineCapRound;
    myPath.flatness = 0.0;

    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath moveToPoint:[mytouch locationInView:self]];
    [pathArray addObject:myPath];
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath addLineToPoint:[mytouch locationInView:self]];
    [self setNeedsDisplay];
}
Here is the image: (screenshot)
In the above image, if you look at the letters 'a' and 'd' you will see that the curve is not smooth. What should I do to get a smooth curve?
Well, you are using addLineToPoint:, so obviously you get lines, not curves. You would be interested in the methods that give you smooth curves: addCurveToPoint: or addQuadCurveToPoint:controlPoint:. But as you can see from the API, besides the points you are actually drawing with your finger, you also need control points, which do not come for free with the drawing. Even Photoshop asks you to move them around when editing curvature. In other words, making your hand drawing smooth "automagically" involves quite a bit of mathematics. Google "smoothing hand drawn" and you will get at least these questions to start with:
Smoothing a hand-drawn free shape
Smoothing a hand-drawn curve
It is really not iOS specific at all.
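One common low-effort trick (just a sketch, not the only approach): in touchesMoved:, instead of addLineToPoint:, add a quadratic curve to the midpoint between the previous and current touch locations, using the previous location as the control point. Something like:

// Inside the drawing view's implementation.
static CGPoint midPoint(CGPoint a, CGPoint b)
{
    return CGPointMake((a.x + b.x) * 0.5f, (a.y + b.y) * 0.5f);
}

-(void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    CGPoint current = [mytouch locationInView:self];
    CGPoint previous = [mytouch previousLocationInView:self];

    // Curving to the midpoint rounds off the corners that straight
    // segments would otherwise produce.
    [myPath addQuadCurveToPoint:midPoint(previous, current) controlPoint:previous];
    [self setNeedsDisplay];
}

It is not mathematically perfect smoothing, but it removes most of the visible corners with very little code.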
Just use this line, it will solve your problem: myPath.miterLimit = -10;
Change the value to anything you need; it takes a float value.

iOS Image editing open source library - or just some tips

For an iOS app I'm writing, I'd like to take a photo from the photo library and then let the user "clean it up", essentially deleting parts of it that are not needed. For example, suppose the user chooses a photo of a person; my app only needs the head, and everything else should be deleted, so the user needs to clean up the photo by deleting the background, the body, or other people in the photo. Imagine a Photoshop-like experience, but with only one tool: the eraser.
I'm looking for open source libraries, examples, or just tips on how to get started with this.
I know how to use a UIImagePickerController to select an image, so the missing part is the actual editing. As a complete noob, I'd be happy to get some advice on what would be a reasonable approach to this, preferably with some sample code or even a reusable library.
I suppose, at a high level, what I want to do is start with a rectangular image, make sure it has an alpha channel, and then, as the user touches parts of the image to delete them, "delete" more pixels from the image by setting their alpha to 0. But that's a very high-level description which I'm not even sure is correct... Another reasonable requirement is undo support.
Another approach that comes to mind is to use the original image plus a mask image that the user edits by touching the screen; when "done", the two images are somehow composited into one image with alpha. Of course, this is an implementation detail, and the user need not know that there are two images on the screen.
If possible, I'd like to stay at the UIImage, UIImageView, or Core Graphics level and not have to mess with OpenGL ES. My gut feeling is that the higher-level graphics APIs should be performant enough, and easy-to-understand, maintainable, clean code is a consideration...
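To make the first idea a bit more concrete, this is roughly what I imagine (just an untested sketch with a made-up method name): draw the image into a bitmap context, then punch transparent holes with kCGBlendModeClear wherever the finger goes.

- (UIImage *)imageByErasingCircleAtPoint:(CGPoint)point
                                  radius:(CGFloat)radius
                                 inImage:(UIImage *)image
{
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [image drawAtPoint:CGPointZero];

    // Clear the touched circle so those pixels end up with alpha 0.
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetBlendMode(ctx, kCGBlendModeClear);
    CGContextFillEllipseInRect(ctx, CGRectMake(point.x - radius, point.y - radius,
                                               radius * 2, radius * 2));

    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}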
Any advice is appreciated, thanks!
This turned out to be pretty easy, thanks to @Rog's pointers.
I'll paste my solution below. This goes in the controller code:
#pragma mark - touches
- (void)clipImageCircle:(CGPoint)point radius:(CGFloat)radius {
    UIBezierPath *uiBezierPath = [UIBezierPath bezierPathWithArcCenter:point radius:radius startAngle:0 endAngle:2 * M_PI clockwise:NO];
    CGPathRef erasePath = uiBezierPath.CGPath;

    UIImage *img = imageView.image;
    CGSize s = img.size;
    UIGraphicsBeginImageContext(s);
    CGContextRef g = UIGraphicsGetCurrentContext();
    CGContextAddPath(g, erasePath);
    CGContextAddRect(g, CGRectMake(0, 0, s.width, s.height));
    CGContextEOClip(g);
    [img drawAtPoint:CGPointZero];
    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
- (void)receiveTouch:(CGPoint)point {
    NSLog(@"%@", NSStringFromCGPoint(point));
    [self clipImageCircle:point radius:20];
}

- (void)endTouch {
    NSLog(@"END TOUCH");
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    // We currently support only single touch events
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:imageView];
    if ([imageView hitTest:point withEvent:event]) {
        [self receiveTouch:point];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
    [self endTouch];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    [self endTouch];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:imageView];
    if ([imageView hitTest:point withEvent:event]) {
        [self receiveTouch:point];
    }
}
You will need to get well acquainted with Quartz 2D / Core Graphics. This guide is a good start: http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/Introduction/Introduction.html
The task you have described can be as simple or as complicated as you want: from letting the user erase the area around the photo by dragging a finger around (easy), to trying to detect high-contrast areas that help you guess where to cut (pretty complicated).
If you choose the former, you will essentially want to create a clipping mask based on user touches, so have a look at the touchesBegan:, touchesMoved: and touchesEnded: methods of UIView.
For the clipping mask, this is probably a good, simple example to get you started: How erase part of UIImage
Good luck with it, it sounds like a fun (if not challenging) project.
