I am using UIBezierPath for drawing and have wired it up to the touch events. The code works, but my curves are not smooth: when I move my finger around to draw a curve, the result looks jagged.
- (void)drawRect:(CGRect)rect
{
    [[UIColor redColor] setStroke];
    for (UIBezierPath *_path in pathArray) {
        [_path strokeWithBlendMode:kCGBlendModeNormal alpha:1.0];
    }
}
#pragma mark - Touch Methods
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    myPath = [[UIBezierPath alloc] init];
    myPath.lineWidth = 5;
    myPath.lineCapStyle = kCGLineCapRound;
    myPath.flatness = 0.0;
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath moveToPoint:[mytouch locationInView:self]];
    [pathArray addObject:myPath];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath addLineToPoint:[mytouch locationInView:self]];
    [self setNeedsDisplay];
}
Here is the image:
In the image above, if you look at the letters "a" and "d" you can see that the curves are not smooth. What should I do to get a smooth curve?
Well, you are using addLineToPoint:, so naturally you get straight line segments, not curves. The methods that give you smooth curves are addCurveToPoint: and addQuadCurveToPoint:. But as you can see from the API, besides the points you actually draw with your finger, you also need control points, and those do not come for free with the drawing. Even Photoshop asks you to move control points around when adjusting curvature. In other words, smoothing a hand drawing "automagically" involves a fair amount of mathematics. Search for "smoothing hand drawn" and you will find at least these questions to start with:
Smoothing a hand-drawn free shape
Smoothing a hand-drawn curve
It is really not iOS specific at all.
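That said, if you want a cheap approximation without going deep into curve fitting, one widely used trick is to stop calling addLineToPoint: and instead add a quadratic curve to the midpoint of the last two touch points, using the previous point as the control point. Here is a minimal sketch of that idea on top of your existing code; it assumes your myPath and pathArray ivars plus a new previousPoint ivar, and the helper name midPoint is made up for the example:

static CGPoint midPoint(CGPoint p1, CGPoint p2)
{
    // Midpoint of two touch samples; used as the on-curve point below.
    return CGPointMake((p1.x + p2.x) * 0.5, (p1.y + p2.y) * 0.5);
}

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    myPath = [[UIBezierPath alloc] init];
    myPath.lineWidth = 5;
    myPath.lineCapStyle = kCGLineCapRound;
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    previousPoint = [mytouch locationInView:self];
    [myPath moveToPoint:previousPoint];
    [pathArray addObject:myPath];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    CGPoint currentPoint = [mytouch locationInView:self];

    // Curve to the midpoint, with the previous touch as the control point.
    // Consecutive segments then share a tangent, which hides the corners.
    [myPath addQuadCurveToPoint:midPoint(previousPoint, currentPoint)
                   controlPoint:previousPoint];
    previousPoint = currentPoint;
    [self setNeedsDisplay];
}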
Just use this line, it will solve your problem: myPath.miterLimit = -10;.
Change the value if you need to; the property takes a float.
Related
In an iOS application, I want to draw continuous curved lines like those shown in the image below. Here is my code, but it only draws a single straight line.
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();

    // set the line properties
    CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 30);
    CGContextSetAlpha(context, 0.6);

    // draw the line
    CGContextMoveToPoint(context, startPoint.x, startPoint.y);
    CGContextAddLineToPoint(context, endPoint.x, endPoint.y);
    CGContextStrokePath(context);
}
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint current = [touch locationInView:self];
    startPoint = current;
    arrPoints = [[NSMutableArray alloc] init];
    [arrPoints addObject:NSStringFromCGPoint(startPoint)];
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint p = [touch locationInView:self];
    endPoint = p;
    [arrPoints addObject:NSStringFromCGPoint(endPoint)];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [self touchesMoved:touches withEvent:event];
}
Here is the image of what I want to achieve: suppose there are five views, and I want to draw a continuous line starting from the first view and passing through the second, third, and so on, and at the same time I want the line to curve at each view.
Your code is already building an array of points.
Now you need to modify your drawRect method to draw line segments between all the points, not just the latest one.
You will probably get better performance if you build a UIBezierPath out of your line segments and draw that in one go.
The result of this will be a continuous series of short line segments that approximate a curve. If the user moves their finger quickly, the line segments will be longer and the curve will look choppier.
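As a rough sketch of that drawRect: (not tested against your exact class, and assuming arrPoints still holds the NSStringFromCGPoint() strings from your question):

- (void)drawRect:(CGRect)rect
{
    if (arrPoints.count < 2) {
        return;
    }

    // One path through every recorded touch point, stroked in a single call.
    UIBezierPath *path = [UIBezierPath bezierPath];
    path.lineWidth = 30;
    path.lineCapStyle = kCGLineCapRound;
    path.lineJoinStyle = kCGLineJoinRound;

    [path moveToPoint:CGPointFromString(arrPoints[0])];
    for (NSUInteger i = 1; i < arrPoints.count; i++) {
        [path addLineToPoint:CGPointFromString(arrPoints[i])];
    }

    [[UIColor redColor] setStroke];
    [path strokeWithBlendMode:kCGBlendModeNormal alpha:0.6];
}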
Once you get that working, there are techniques you can use to smooth the resulting curve. Erica Sadun's excellent "The Core iOS Developer's Cookbook" has a recipe called "Smoothing" covering this very topic. It does exactly what you want: it takes a freehand drawing by the user and smooths it.
I have a couple of projects on Github that use Erica Sadun's line smoothing technique.
The project KeyframeViewAnimations draws a curve that passes through a set of predefined points.
The project "RandomBlobs" draws a closed curve which is a smoothed version of a polygon.
Both of these include Dr. Sadun's curve-smoothing code, but again, the chapter from her book is the best fit for your needs.
I have a UIBezierPath that is redrawn on screen every time the user's touch moves. This works fine in the simulator on my Mac Pro, but as soon as I moved it to a physical device, the drawing began to lag badly. Using Instruments, I checked the CPU usage, and it hits 100% while the user is drawing.
This becomes quite the problem because I have an NSTimer that is supposed to fire at 30fps, but when the CPU is overloaded by the drawing, the timer fires at only around 10fps.
How can I optimize the drawing so that it doesn't take 100% of the CPU?
UITouch *lastTouch;

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    if (lastTouch != nil)
        return;
    lastTouch = [touches anyObject];
    [path moveToPoint:[lastTouch locationInView:self]];
    [self drawDot];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches anyObject] == lastTouch)
        [self drawDot];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    if ([touches anyObject] == lastTouch)
        lastTouch = nil;
}

- (void)drawDot
{
    if (!self.canDraw)
        return;
    [path addLineToPoint:[lastTouch locationInView:self]];
    [self setNeedsDisplayInRect:CGRectMake([lastTouch locationInView:self].x - 30,
                                           [lastTouch locationInView:self].y - 30, 60, 60)];
}
- (void)drawRect:(CGRect)rect
{
    CGColorRef green = [UIColor greenColor].CGColor;
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGColorRef gray = [UIColor grayColor].CGColor;
    CGContextSetStrokeColor(context, CGColorGetComponents(gray));
    [shape stroke];
    CGContextSetStrokeColor(context, CGColorGetComponents(green));
    [path stroke];
}
You shouldn't really be using -drawRect: in modern code - it's 30 years old and was designed for very old hardware (a 25 MHz CPU with 16 MB of RAM and no GPU), and it has performance bottlenecks on modern hardware.
Instead you should be using Core Animation or OpenGL for all drawing. Core Animation can be very similar to drawRect.
Add this to your view's init method:
self.layer.contentsScale = [UIScreen mainScreen].scale;
And implement:
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    // basically the same code as you've got in drawRect:, although I recommend
    // trying Core Graphics (CGContextMoveToPoint(), CGContextAddLineToPoint(),
    // CGContextStrokePath(), etc.)
}
And to make it re-draw (inside touchesMoved/etc):
[self.layer setNeedsDisplay];
I would also update your touch event code to just append to an NSMutableArray of points (perhaps encoded with [NSValue valueWithCGPoint:location]). That way you aren't building a graphics path while responding to touch events.
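A sketch of what that might look like; the points array is a hypothetical NSMutableArray property, and the stroke settings are placeholders:

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Only record the point here; no path building in the touch handler.
    CGPoint location = [[touches anyObject] locationInView:self];
    [self.points addObject:[NSValue valueWithCGPoint:location]];
    [self.layer setNeedsDisplay];
}

- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    if (self.points.count < 2) {
        return;
    }
    CGContextSetStrokeColorWithColor(ctx, [UIColor greenColor].CGColor);
    CGContextSetLineWidth(ctx, 5);
    CGContextSetLineCap(ctx, kCGLineCapRound);

    // Rebuild the stroke from the stored points each time the layer redraws.
    CGPoint first = [self.points[0] CGPointValue];
    CGContextMoveToPoint(ctx, first.x, first.y);
    for (NSUInteger i = 1; i < self.points.count; i++) {
        CGPoint p = [self.points[i] CGPointValue];
        CGContextAddLineToPoint(ctx, p.x, p.y);
    }
    CGContextStrokePath(ctx);
}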
A good place to learn about efficient drawRect: is this WWDC video from a few years ago: https://developer.apple.com/videos/wwdc/2012/?id=238. About 26 minutes in, he starts debugging a very simple painting app that is very similar to what you've described. At minute 30 he starts optimizing beyond the first optimization of setNeedsDisplayInRect: (which you're already doing).
Short story: he uses an image as a backing store for what's already been drawn, so that on each drawRect: call he only draws one image plus a very short additional line segment, instead of stroking extremely long paths every time.
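Very roughly, that backing-store idea looks like this (incrementalImage and currentPath are hypothetical ivars; the session itself goes into much more detail):

- (void)drawRect:(CGRect)rect
{
    // Everything already finished lives in one image; only the in-progress
    // stroke is a live path.
    [self.incrementalImage drawInRect:self.bounds];
    [[UIColor greenColor] setStroke];
    [self.currentPath stroke];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    // Bake the finished stroke into the backing image, then start fresh.
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
    [self.incrementalImage drawInRect:self.bounds];
    [[UIColor greenColor] setStroke];
    [self.currentPath stroke];
    self.incrementalImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    [self.currentPath removeAllPoints];
    [self setNeedsDisplay];
}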
I'm trying to create a sketch app that can draw shapes/paths with a finger.
What I've done so far is create a UIBezierPath when the touch starts and extend the path as the finger moves.
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    CGPoint locationInDrawRect = [mytouch locationInView:self.drawingView];
    [self.drawingView.currentPath addLineToPoint:locationInDrawRect];
    [self.drawingView setNeedsDisplay];
}
When the touch ends, I save the path to an array of UIBezierPaths.
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self.drawingView.pathsArray addObject:self.drawingView.currentPath]; // add latest current path to pathsArray
    [self.drawingView clearCurrentPath]; // clear current path for next line
    [self.drawingView setNeedsDisplay];
}
In drawRect:, I stroke the current path and the paths in the array using a for loop:
- (void)drawRect:(CGRect)rect
{
    if (!self.currentPath.empty) {
        [self.currentPath stroke];
    }
    for (UIBezierPath *path in self.pathsArray) {
        [path stroke];
    }
}
This works for a couple of path objects, but it gets slow once the array holds more than five paths.
I tried to limit the area to render with the setNeedsDisplayInRect: method:
[self.drawingView setNeedsDisplayInRect:rectToUpdateDraw];
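where rectToUpdateDraw is built from the previous and current touch points, padded by the stroke width, roughly like this:

// previousPoint and currentPoint are the last two touch locations;
// the padding accounts for the line width and round caps.
CGFloat pad = self.drawingView.currentPath.lineWidth;
CGRect rectToUpdateDraw = CGRectMake(MIN(previousPoint.x, currentPoint.x) - pad,
                                     MIN(previousPoint.y, currentPoint.y) - pad,
                                     fabs(previousPoint.x - currentPoint.x) + 2 * pad,
                                     fabs(previousPoint.y - currentPoint.y) + 2 * pad);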
I render the paths in the array only when the rect is the full canvas size, which is when the touch ends.
But this draws odd-looking lines and still gets slow when there are many objects in the array.
I don't know how I can solve this problem and need some help.
Thank you!
I am making a math-related activity in which the user can draw with their fingers for scratch work as they try to solve the math question. However, I notice that when I move my finger quickly, the line lags noticeably behind my finger. I was wondering if there was some performance issue I had overlooked, or if touchesMoved simply doesn't fire often enough (it is perfectly smooth and wonderful if you don't move fast). I am using UIBezierPath. First I create it in my init method like this:
myPath=[[UIBezierPath alloc]init];
myPath.lineCapStyle=kCGLineCapSquare;
myPath.lineJoinStyle = kCGLineJoinBevel;
myPath.lineWidth=5;
myPath.flatness = 0.4;
Then in drawRect:
- (void)drawRect:(CGRect)rect
{
    [brushPattern setStroke];
    if (baseImageView.image) {
        CGContextRef c = UIGraphicsGetCurrentContext();
        [baseImageView.layer renderInContext:c];
    }
    CGBlendMode blendMode = self.erase ? kCGBlendModeClear : kCGBlendModeNormal;
    [myPath strokeWithBlendMode:blendMode alpha:1.0];
}
baseImageView is what I use to save the result so that I don't have to draw many paths (which gets really slow after a while). Here is my touch logic:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath moveToPoint:[mytouch locationInView:self]];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *mytouch = [[touches allObjects] objectAtIndex:0];
    [myPath addLineToPoint:[mytouch locationInView:self]];
    [self setNeedsDisplay];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0f);
    CGContextRef c = UIGraphicsGetCurrentContext();
    [self.layer renderInContext:c];
    baseImageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [myPath removeAllPoints];
    [self setNeedsDisplay];
}
This project is going to be released as an enterprise app, so it will only be installed on iPad 2. Target iOS is 5.0. Any suggestions about how I can squeeze a little more speed out of this would be appreciated.
Of course you should start by running it under Instruments and looking for your hotspots. Then you need to make changes and re-evaluate to see their impact. Otherwise you're just guessing. That said, some notes from experience:
Adding lots of elements to a path can get very expensive. I would not be surprised if your addLineToPoint: turns out to be a hotspot. It has been for me.
Rather than backing your system with a UIImageView, I would probably render into a CGLayer. CGLayers are optimized for rendering into a specific context.
Why accumulate the path at all rather than just rendering it into the layer at each step? That way your path would never be more than two elements (move, addLine). Typically the two-stage approach is used so you can handle undo or the like.
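To make those two suggestions concrete, here is an untested sketch of the CGLayer variant, where each new segment is stroked into the layer as it arrives and drawRect: only composites the layer (strokeLayer and lastPoint are hypothetical ivars):

- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();

    if (strokeLayer == NULL) {
        // The layer is created from, and optimized for, the destination context.
        strokeLayer = CGLayerCreateWithContext(context, self.bounds.size, NULL);
        CGContextRef layerContext = CGLayerGetContext(strokeLayer);
        CGContextSetStrokeColorWithColor(layerContext, [UIColor greenColor].CGColor);
        CGContextSetLineWidth(layerContext, 5);
        CGContextSetLineCap(layerContext, kCGLineCapRound);
    }

    CGContextDrawLayerInRect(context, self.bounds, strokeLayer);
}

- (void)appendSegmentTo:(CGPoint)point
{
    if (strokeLayer != NULL) {
        // Stroke only the newest two-point segment into the layer; the path
        // handed to Core Graphics never grows beyond move + addLine.
        CGContextRef layerContext = CGLayerGetContext(strokeLayer);
        CGContextMoveToPoint(layerContext, lastPoint.x, lastPoint.y);
        CGContextAddLineToPoint(layerContext, point.x, point.y);
        CGContextStrokePath(layerContext);
    }
    lastPoint = point;
    [self setNeedsDisplay];
}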
Make sure that you're turning off any UIBezierPath features you don't want. In particular, look at the section "Accessing Draw Properties" in the docs. You may consider switching to CGMutablePath rather than UIBezierPath. It's not actually faster when configured the same, but its default settings turn more things off, so by default it's faster. (You're already setting most of these; you'll want to experiment a little in Instruments to see what impact they make.)
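For comparison, a CGMutablePath version of the question's touch handling might look like this (mutablePath is a hypothetical CGMutablePathRef ivar; remember to CGPathRelease it when you're done with it):

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[[touches allObjects] objectAtIndex:0] locationInView:self];
    if (mutablePath == NULL) {
        mutablePath = CGPathCreateMutable();
    }
    CGPathMoveToPoint(mutablePath, NULL, p.x, p.y);
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    CGPoint p = [[[touches allObjects] objectAtIndex:0] locationInView:self];
    CGPathAddLineToPoint(mutablePath, NULL, p.x, p.y);
    [self setNeedsDisplay];
}

- (void)drawRect:(CGRect)rect
{
    // Stroke the raw CGPath directly; none of UIBezierPath's extra state applies.
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
    CGContextSetLineWidth(context, 5);
    CGContextAddPath(context, mutablePath);
    CGContextStrokePath(context);
}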
http://mobile.tutsplus.com/tutorials/iphone/ios-sdk_freehand-drawing/
This link shows exactly how to make a curve smoother. The tutorial goes through it step by step and explains how to add intermediate points (in the touchesMoved method) to the curves to make them smoother.
For an iOS app I'm writing, I'd like to take a photo from the photo library and then let the user "clean it up", essentially deleting parts of it that are not needed. For example, suppose the user chooses a photo of a person; my app only needs the head, so everything else should be deleted, and the user cleans the photo up by erasing the background, the body, or other people in the photo. Imagine a Photoshop-like experience but with only one tool: the eraser.
I'm looking for open source libraries, or examples or just tips of how to get started with that.
I know how to use a UIImagePickerController to select an image so the missing part is the actual editing. As a complete noob I'd be happy to get some advice on what would be a reasonable approach to this, preferably with some sample code or even a reusable library.
I suppose, at a high level, what I want to do is start with a rectangular image, make sure it has an alpha channel, and then, as the user touches parts of the image to delete them, "delete" more pixels by changing their alpha to 0. But that's too high-level a description, and I'm not even sure it's correct... Another reasonable requirement is undo support.
Another approach that comes to mind is using the original image plus a mask image which the user edits by touching the screen; when they are "done", the two images are somehow composited into one image with alpha. Of course, this is an implementation detail and the user need not know that there are two images on the screen.
If possible, I'd like to stay at the UIImage/UIImageView/Core Graphics level and not have to mess with OpenGL ES. My gut feeling is that the higher-level graphics APIs should be performant enough, and easy-to-understand, maintainable, clean code is a consideration...
Any advice is appreciated, thanks!
This turned out to be pretty easy, thanks to @Rog's pointers.
I'll paste my solution below. This goes in the controller code:
#pragma mark - touches
- (void)clipImageCircle:(CGPoint)point radius:(CGFloat)radius
{
    // Build a circular path around the touch point; this is the area to erase.
    UIBezierPath *uiBezierPath = [UIBezierPath bezierPathWithArcCenter:point radius:radius startAngle:0 endAngle:2 * M_PI clockwise:NO];
    CGPathRef erasePath = uiBezierPath.CGPath;

    UIImage *img = imageView.image;
    CGSize s = img.size;
    UIGraphicsBeginImageContext(s);
    CGContextRef g = UIGraphicsGetCurrentContext();

    // Clip to everything except the circle (even-odd rule: rect minus circle),
    // then redraw the image so the circle ends up transparent.
    CGContextAddPath(g, erasePath);
    CGContextAddRect(g, CGRectMake(0, 0, s.width, s.height));
    CGContextEOClip(g);
    [img drawAtPoint:CGPointZero];

    imageView.image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}

- (void)receiveTouch:(CGPoint)point
{
    NSLog(@"%@", NSStringFromCGPoint(point));
    [self clipImageCircle:point radius:20];
}

- (void)endTouch
{
    NSLog(@"END TOUCH");
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
    // We currently support only single touch events
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:imageView];
    if ([imageView hitTest:point withEvent:event]) {
        [self receiveTouch:point];
    }
}

- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self endTouch];
}

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
{
    [self endTouch];
}

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event
{
    UITouch *touch = [touches anyObject];
    CGPoint point = [touch locationInView:imageView];
    if ([imageView hitTest:point withEvent:event]) {
        [self receiveTouch:point];
    }
}
You will need to get well acquainted with Quartz 2D / Core Graphics. This guide is a good place to start: http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/drawingwithquartz2d/Introduction/Introduction.html
The task you have described can be as simple or as complicated as you want, from letting the user erase the area around the subject by dragging a finger (easy) to trying to detect high-contrast areas that help you guess where to cut (pretty complicated).
If you choose the former, you will essentially want to create a clipping mask based on user touches, so have a look at the touchesBegan, touchesMoved and touchesEnded methods of UIView.
For the clipping mask, this is probably a good simple example to get you started How erase part of UIImage
Good luck with it, it sounds like a fun (if not challenging) project.