How can I smooth the line between 2 points in iOS?

I already know how to draw a line between 2 points, but the line doesn't look smooth. What can I do to make it smoother? Thank you.
- (void)drawLineFrom:(CGPoint)start To:(CGPoint)end {
    // begin image context
    UIGraphicsBeginImageContext(self.imageLineView.frame.size);
    CGContextRef context = UIGraphicsGetCurrentContext();
    // draw the existing image into the context
    [self.imageLineView.image drawInRect:CGRectMake(0, 0, self.imageLineView.frame.size.width, self.imageLineView.frame.size.height)];
    // set line properties
    CGContextSetLineCap(context, kCGLineCapRound);
    CGContextSetLineWidth(context, 2.0f);
    CGContextSetRGBStrokeColor(context, 1.0f, 0.0f, 0.0f, 1.0f);
    // move context to start point
    CGContextMoveToPoint(context, start.x, start.y);
    // define path to end point
    CGContextAddLineToPoint(context, end.x, end.y);
    // stroke path
    CGContextStrokePath(context);
    // flush context to be sure all drawing operations were processed
    CGContextFlush(context);
    // get UIImage from context and pass it to our image view
    self.imageLineView.image = UIGraphicsGetImageFromCurrentImageContext();
    // end image context
    UIGraphicsEndImageContext();
}

You can draw a smooth line using a Bézier path.
You can find more information in Apple's documentation: Bézier Paths.
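For example, a common approach is to collect the touch points and stroke quadratic curve segments through their midpoints instead of straight segments. A minimal sketch (the method name and the points array are illustrative, not from the question's code):

// Build a smoothed path by curving through the midpoints of the touch points.
// `points` is assumed to be an NSArray of NSValue-wrapped CGPoints.
- (UIBezierPath *)smoothedPathForPoints:(NSArray<NSValue *> *)points {
    UIBezierPath *path = [UIBezierPath bezierPath];
    if (points.count < 2) return path;
    [path moveToPoint:[points[0] CGPointValue]];
    for (NSUInteger i = 1; i + 1 < points.count; i++) {
        CGPoint current = [points[i] CGPointValue];
        CGPoint next = [points[i + 1] CGPointValue];
        // Use the current point as the control point and curve to the midpoint;
        // this rounds off the corners between consecutive segments.
        CGPoint mid = CGPointMake((current.x + next.x) / 2, (current.y + next.y) / 2);
        [path addQuadCurveToPoint:mid controlPoint:current];
    }
    [path addLineToPoint:[[points lastObject] CGPointValue]];
    path.lineCapStyle = kCGLineCapRound;
    path.lineJoinStyle = kCGLineJoinRound;
    return path;
}

You can then stroke the returned path with [path stroke] (after setting a stroke color), or add its CGPath to the context in place of the single CGContextAddLineToPoint call.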

Try enabling anti-aliasing: you'll need to set the UIViewEdgeAntialiasing key to YES in your Info.plist.
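If you are drawing into your own bitmap context, you can also request antialiasing on the context itself before stroking; a minimal sketch, assuming the same setup as the drawLineFrom:To: method above:

CGContextRef context = UIGraphicsGetCurrentContext();
// Both calls matter: 'allows' gates antialiasing for the context,
// 'should' turns it on for subsequent drawing.
CGContextSetAllowsAntialiasing(context, true);
CGContextSetShouldAntialias(context, true);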

Related

Drawing line on iPhone X

I have drawing functionality in my app, over a photo. It works on every device except iPhone X. On iPhone X the lines fade and move upwards with each finger movement; only the upper 10-20 percent of the view works fine. Following is the code to draw a line.
- (void)drawLineNew {
    UIGraphicsBeginImageContext(self.bounds.size);
    [self.viewImage drawInRect:self.bounds];
    CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
    CGContextSetStrokeColorWithColor(UIGraphicsGetCurrentContext(), self.selectedColor.CGColor);
    CGContextSetLineWidth(UIGraphicsGetCurrentContext(), _lineWidth);
    CGContextBeginPath(UIGraphicsGetCurrentContext());
    CGContextMoveToPoint(UIGraphicsGetCurrentContext(), previousPoint.x, previousPoint.y);
    CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
    CGContextStrokePath(UIGraphicsGetCurrentContext());
    self.viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    [self setNeedsDisplay];
}
After hours of trial and error I made it work. The code I wrote is correct and should work as-is; the only issue was the frame size of the drawing view (self).
The view's height was a fractional float, and that fractional offset made the drawing creep upwards with each stroke. I applied the lroundf function and it's working.
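For example, the fix was roughly this (a sketch; drawingView stands in for the actual view):

CGRect frame = drawingView.frame;
// Round the fractional width/height to whole points so the off-screen
// image buffer and the view stay pixel-aligned.
frame.size.width = lroundf(frame.size.width);
frame.size.height = lroundf(frame.size.height);
drawingView.frame = frame;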
Happy coding.

UIImage masking with gesture

I'm trying to implement a selective-color feature in iOS. My idea is to first draw a shape using a finger gesture and convert it into a mask, but it should also work in real time, as I move my finger across the grayscale image. Can anyone point me in the right direction?
Sample app : https://itunes.apple.com/us/app/color-splash/id304871603?mt=8
Thanks.
You can position two UIImageViews over each other, the color version in the background and the black&white version in the foreground.
Then you can use the touchesBegan, touchesMoved, and related events to track user input. In touchesMoved you can "erase" along the path the user's finger moved, like this (self.foregroundDrawView is the black&white UIImageView):
UIGraphicsBeginImageContext(self.foregroundDrawView.frame.size);
[self.foregroundDrawView.image drawInRect:CGRectMake(0, 0, self.foregroundDrawView.frame.size.width, self.foregroundDrawView.frame.size.height)];
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetBlendMode(context, kCGBlendModeClear);
CGContextSetAllowsAntialiasing(context, TRUE);
CGContextSetLineWidth(context, 85);
CGContextSetLineCap(context, kCGLineCapRound);
CGContextSetRGBStrokeColor(context, 1, 0, 0, 1.0);
// Soft edge ... 5.0 works ok, but we try some more
CGContextSetShadowWithColor(context, CGSizeMake(0.0, 0.0), 13.0, [UIColor redColor].CGColor);
CGContextBeginPath(context);
CGContextMoveToPoint(context, touchLocation.x, touchLocation.y);
CGContextAddLineToPoint(context, currentLocation.x, currentLocation.y);
CGContextStrokePath(context);
self.foregroundDrawView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
The important part is CGContextSetBlendMode(context, kCGBlendModeClear);. This erases the traced part from the image, afterwards the image is set back as the image of the foreground image view.
When the user is done you should be able to combine the two images or use the black&white image as a mask.
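Combining them at the end could look like this minimal sketch (assuming both image views have the same size; self.backgroundImageView, holding the color image, is an assumed name):

// Flatten the untouched color image and the partially erased
// black&white image into a single result image.
UIGraphicsBeginImageContextWithOptions(self.foregroundDrawView.frame.size, NO, 0.0);
[self.backgroundImageView.image drawInRect:self.foregroundDrawView.bounds];
[self.foregroundDrawView.image drawInRect:self.foregroundDrawView.bounds];
UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();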

CGContext: how do I erase pixels (e.g. kCGBlendModeClear) outside of a bitmap context?

I'm trying to build an eraser tool using Core Graphics, and I'm finding it incredibly difficult to make a performant eraser - it all comes down to:
CGContextSetBlendMode(context, kCGBlendModeClear)
If you google around for how to "erase" with Core Graphics, almost every answer comes back with that snippet. The problem is that it only (apparently) works in a bitmap context. If you're trying to implement interactive erasing, I don't see how kCGBlendModeClear helps you - as far as I can tell, you're more or less locked into erasing an off-screen UIImage/CGImage and drawing that image in the famously non-performant [UIView drawRect].
Here's the best I've been able to do:
-(void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
            CGContextRef context = UIGraphicsGetCurrentContext();
            [eraseImage drawAtPoint:CGPointZero];
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextStrokePath(context);
            curImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [curImage drawAtPoint:CGPointZero];
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}
Drawing a normal line (!eraseModeOn) is acceptably performant; I'm blitting my off-screen drawing buffer (curImage, which contains all previously drawn strokes) to the current CGContext, and I'm rendering the line (path) being currently drawn. It's not perfect, but hey, it works, and it's reasonably performant.
However, because kCGBlendModeClear apparently does not work outside of a bitmap context, I'm forced to:
1. Create a bitmap context (UIGraphicsBeginImageContextWithOptions).
2. Draw my offscreen buffer (eraseImage, which is actually derived from curImage when the eraser tool is turned on - so really pretty much the same as curImage for argument's sake).
3. Render the "erase line" (path) currently being drawn to the bitmap context (using kCGBlendModeClear to clear pixels).
4. Extract the entire image into the offscreen buffer (curImage = UIGraphicsGetImageFromCurrentImageContext();).
5. Finally, blit the offscreen buffer to the view's CGContext.
That's horrible, performance-wise. Using Instruments' Time Profiler, it's painfully obvious where the problems with this method are:
- UIGraphicsBeginImageContextWithOptions is expensive.
- Drawing the offscreen buffer twice is expensive.
- Extracting the entire image into an offscreen buffer is expensive.
So naturally, the code performs horribly on a real iPad.
I'm not really sure what to do here. I've been trying to figure out how to clear pixels in a non-bitmap context, but as far as I can tell, relying on kCGBlendModeClear is a dead-end.
Any thoughts or suggestions? How do other iOS drawing apps handle erase?
Additional Info
I've been playing around with a CGLayer approach, since a bit of googling suggests that CGContextSetBlendMode(context, kCGBlendModeClear) will work in a CGLayer.
However, I'm not super hopeful that this approach will pan out. Drawing the layer in drawRect (even using setNeedsDisplayInRect) is hugely non-performant; Core Graphics chokes while rendering each path in the layer in CGContextDrawLayerAtPoint (according to Instruments). As far as I can tell, using a bitmap context is definitely preferable here in terms of performance - the only problem, of course, being the above question (kCGBlendModeClear not working after I blit the bitmap context to the main CGContext in drawRect).
I've managed to get good results by using the following code:
- (void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextBeginTransparencyLayer(context, NULL);
            [eraseImage drawAtPoint:CGPointZero];
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextSetStrokeColorWithColor(context, [[UIColor clearColor] CGColor]);
            CGContextStrokePath(context);
            CGContextEndTransparencyLayer(context);
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, self.lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, self.lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
    self.empty = NO;
}
The trick was to wrap the following into CGContextBeginTransparencyLayer / CGContextEndTransparencyLayer calls:
- Blitting the erase background image to the context.
- Drawing the "erase" path on top of the erase background image, using kCGBlendModeClear.
Since both the erase background image's pixel data and the erase path are in the same layer, it has the effect of clearing the pixels.
2D graphics APIs follow a painting paradigm. When you are painting, it's hard to remove paint you've already put on the canvas, but super easy to add more paint on top. The blend modes of a bitmap context give you a way to do something hard (scrape paint off the canvas) in a few lines of code. But a few lines of code do not make it a cheap operation, which is why it performs slowly.
The easiest way to fake clearing out pixels without having to do the offscreen bitmap buffering is to paint the background of your view over the image.
-(void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        CGColorRef lineCgColor = lineColor.CGColor;
        if (eraseModeOn) {
            // Use a concrete background color to display erasing. You could use the
            // backgroundColor property of the view, or define a color here.
            lineCgColor = [[self backgroundColor] CGColor];
        }
        [curImage drawAtPoint:CGPointZero];
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextAddPath(context, currentPath);
        CGContextSetLineCap(context, kCGLineCapRound);
        CGContextSetLineWidth(context, lineWidth);
        CGContextSetBlendMode(context, kCGBlendModeNormal);
        CGContextSetStrokeColorWithColor(context, lineCgColor);
        CGContextStrokePath(context);
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}
The more difficult (but more correct) way is to do the image editing on a background serial queue in response to an editing event. When you get a new action, you do the bitmap rendering in the background to an image buffer. When the buffered image is ready, you call setNeedsDisplay to allow the view to be redrawn during the next update cycle. This is more correct as drawRect: should be displaying the content of your view as quickly as possible, not processing the editing action.
@interface ImageEditor : UIView
@property (nonatomic, strong) UIImage *imageBuffer;
@property (nonatomic, strong) dispatch_queue_t serialQueue;
@end

@implementation ImageEditor

- (dispatch_queue_t)serialQueue
{
    if (_serialQueue == nil)
    {
        _serialQueue = dispatch_queue_create("com.example.com.imagebuffer", DISPATCH_QUEUE_SERIAL);
    }
    return _serialQueue;
}

- (void)editingAction
{
    dispatch_async(self.serialQueue, ^{
        CGSize bufferSize = [self.imageBuffer size];
        UIGraphicsBeginImageContext(bufferSize);
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextDrawImage(context, CGRectMake(0, 0, bufferSize.width, bufferSize.height), [self.imageBuffer CGImage]);
        // Do the editing action: draw a clear line, solid line, etc.
        self.imageBuffer = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setNeedsDisplay];
        });
    });
}

- (void)drawRect:(CGRect)rect
{
    [self.imageBuffer drawAtPoint:CGPointZero];
}

@end
The key is CGContextBeginTransparencyLayer: stroke with clearColor and set CGContextSetBlendMode(context, kCGBlendModeClear);.

iOS line on an image - invert color possible?

I am drawing a line on top of a UIImage in my app. I want the line to be a color distinct from the background it is drawn on: if the image has a white area in it and the line is drawn on top of that area, the line should not be white. Is this easily achievable?
Right now I am using a white color for the line. Code below:
[myImage drawInRect:CGRectMake(0, 0, 200, 200)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 1.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1, 1, 1, 1.0); // RGB all 1 = white
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), pt1.x, pt1.y); // pt1 - start point
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), pt2.x, pt2.y); // pt2 - end point
CGContextStrokePath(UIGraphicsGetCurrentContext()); // stroke the path
As you can see, I have used 1,1,1 to get a white line on top of the image from point pt1 to pt2. When the line is drawn on a white area of myImage it becomes invisible. How can I make it visible on top of white as well?
I'm not sure whether there is already an answer to this on this forum, but I couldn't find anything like this.
Thanks in advance for your help.
EDIT
@rob mayoff's answer worked perfectly for me.
Here is my final code, for the benefit of anyone looking here:
[myImage drawInRect:CGRectMake(0, 0, 200, 200)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), 1.0);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), 1, 1, 1, 1.0); // RGB all 1 = white
// Black drop shadow makes the white line visible over white areas.
CGContextSetShadowWithColor(UIGraphicsGetCurrentContext(), CGSizeMake(2.0f, 2.0f), 1.0f,
                            [[UIColor colorWithRed:0.0 green:0.0 blue:0.0 alpha:.6] CGColor]);
//CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeExclusion);
//CGContextSetBlendMode(UIGraphicsGetCurrentContext(), kCGBlendModeDifference);
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), pt1.x, pt1.y); // pt1 - start point
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), pt2.x, pt2.y); // pt2 - end point
CGContextStrokePath(UIGraphicsGetCurrentContext()); // stroke the path
Here are two suggestions:
1. Use a black shadow around your white line. Take a look at CGContextSetShadowWithColor. You will also probably want to use CGContextSaveGState and CGContextRestoreGState if you want to draw more things after drawing the line.
2. Set the blend mode to kCGBlendModeDifference or kCGBlendModeExclusion. See CGContextSetBlendMode. (A sketch of this approach follows the list.)
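A minimal sketch of the second suggestion, using the same pt1/pt2 setup as in the question:

CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSaveGState(context);
// Difference blending inverts the stroke against whatever is underneath,
// so the line stays visible over both white and black areas.
CGContextSetBlendMode(context, kCGBlendModeDifference);
CGContextSetRGBStrokeColor(context, 1, 1, 1, 1.0);
CGContextMoveToPoint(context, pt1.x, pt1.y);
CGContextAddLineToPoint(context, pt2.x, pt2.y);
CGContextStrokePath(context);
CGContextRestoreGState(context);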

iOS: draw an image in a view

I have this code to paint in a view:
UITouch *touch = [touches anyObject];
CGPoint currentPoint = [touch locationInView:drawImage];
UIGraphicsBeginImageContext(drawImage.frame.size);
[drawImage.image drawInRect:CGRectMake(0, 0, drawImage.frame.size.width, drawImage.frame.size.height)];
CGContextSetLineCap(UIGraphicsGetCurrentContext(), kCGLineCapRound);
CGContextSetLineWidth(UIGraphicsGetCurrentContext(), size);
CGContextSetRGBStrokeColor(UIGraphicsGetCurrentContext(), r, g, b, a);
CGContextBeginPath(UIGraphicsGetCurrentContext());
CGContextMoveToPoint(UIGraphicsGetCurrentContext(), lastPoint.x, lastPoint.y);
CGContextAddLineToPoint(UIGraphicsGetCurrentContext(), currentPoint.x, currentPoint.y);
CGContextStrokePath(UIGraphicsGetCurrentContext());
drawImage.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
My problem is that I don't want to paint with a simple point; I want to paint with a particular image that is repeated (a PNG image).
Is that possible?
It's easy to load a single UIImage and draw it:
UIImage *brushImage = [UIImage imageNamed:@"brush.png"];
[brushImage drawAtPoint:CGPointMake(currentPoint.x - brushImage.size.width / 2, currentPoint.y - brushImage.size.height / 2)];
This will draw the image just once per cycle, not a continuous line. If you want solid lines of your brush picture, see Objective C: Using UIImage for Stroking.
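The usual workaround, roughly sketched below, is to stamp the brush image at small intervals along the segment from the previous touch point to the current one (the step size here is an arbitrary choice):

// Stamp the brush repeatedly between lastPoint and currentPoint so the
// stroke appears continuous instead of one stamp per touch event.
CGFloat dx = currentPoint.x - lastPoint.x;
CGFloat dy = currentPoint.y - lastPoint.y;
CGFloat distance = sqrtf(dx * dx + dy * dy);
if (distance > 0) {
    CGFloat step = MAX(1.0f, brushImage.size.width / 4.0f); // overlapping stamps
    for (CGFloat t = 0; t <= distance; t += step) {
        CGPoint p = CGPointMake(lastPoint.x + dx * (t / distance),
                                lastPoint.y + dy * (t / distance));
        [brushImage drawAtPoint:CGPointMake(p.x - brushImage.size.width / 2,
                                            p.y - brushImage.size.height / 2)];
    }
}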
This could end up loading the image file every time this method is called, making it slow. While the results of [UIImage imageNamed:] are often cached, my code above could be improved by storing the brush for later reuse.
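For example, the lookup can be hoisted out of the touch handler; a minimal sketch:

// Load the brush once and reuse it across touch events.
static UIImage *brushImage = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
    brushImage = [UIImage imageNamed:@"brush.png"];
});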
Speaking of performance, test this on older devices. As written, it worked on my second generation iPod touch, but any additions could make it stutter on second and third generation devices. Apple's GLPaint example, recommended by @Armaan, uses OpenGL for fast drawing, but much more code is involved.
Also, you seem to be doing your interaction (touchesBegan:, touchesMoved:, touchesEnded:) in a view that then draws its contents to drawImage, presumably a UIImageView. It is possible to store the painting image, then set this view's layer contents to be that image. This will not alter performance, but the code will be cleaner. If you continue using drawImage, access it through a property instead of using its ivar.
You can start with Apple's sample code: GLPaint.
