I'm drawing a line on a UIView. When I call -setNeedsDisplay, my view is cleared and only the new line is drawn. How can I continue the current line? And how can I animate the drawing? Thanks.
- (void)drawRect:(CGRect)rect {
    [super drawRect:rect];
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetStrokeColorWithColor(context, [_color CGColor]);
    CGContextSetLineWidth(context, 10.0);
    CGContextMoveToPoint(context, _startPointX, 0);
    CGContextAddLineToPoint(context, _endPointX, 0);
    CGContextStrokePath(context);
}
setNeedsDisplay completely redraws the view, so you need to store all of your drawing somewhere and reapply it on every redraw.
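A minimal sketch of that idea, assuming the view keeps its finished strokes in a hypothetical paths array and uses the same _color as above:

// Sketch: store every finished stroke and replay all of them on each redraw.
@property (nonatomic, strong) NSMutableArray<UIBezierPath *> *paths; // hypothetical storage

- (void)drawRect:(CGRect)rect {
    [super drawRect:rect];
    [_color setStroke];                     // assumes _color is a UIColor, as in the question
    for (UIBezierPath *path in self.paths) {
        path.lineWidth = 10.0;
        [path stroke];                      // reapply every stored stroke
    }
}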
What do you mean by animating the drawing? Animated drawing is just drawing itself: redraw the view after every small change and it will look like the line is being drawn naturally.
For animation, you can create a CAShapeLayer with these lines, then create a CABasicAnimation with the key path @"strokeEnd" and a toValue of @1 to animate your lines from invisible to fully drawn.
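A hedged sketch of that approach, assuming the start/end points and stroke color from the question are available as local variables and that this runs inside the view:

// Sketch: animate a line being drawn by animating strokeEnd from 0 to 1.
UIBezierPath *linePath = [UIBezierPath bezierPath];
[linePath moveToPoint:CGPointMake(startPointX, 0)];    // placeholder endpoints
[linePath addLineToPoint:CGPointMake(endPointX, 0)];

CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = linePath.CGPath;
shapeLayer.strokeColor = color.CGColor;                // placeholder color
shapeLayer.fillColor = nil;
shapeLayer.lineWidth = 10.0;
[self.layer addSublayer:shapeLayer];

CABasicAnimation *draw = [CABasicAnimation animationWithKeyPath:@"strokeEnd"];
draw.fromValue = @0;
draw.toValue = @1;
draw.duration = 1.0;
[shapeLayer addAnimation:draw forKey:@"drawLine"];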
Related
I'm new to the context concept in Apple iOS programming; so far I've been thinking in terms of views. I want to draw an arc using the easiest method possible, and I found this question and answer, but the context concept confuses me. I thought that to draw something, I would call a drawing function and pass it the view I want it to draw into. Instead there is UIGraphicsGetCurrentContext(). What is that "current context"? Is it UIScreen.mainScreen()? And if I want the "current context" to be a view inside a UICollectionViewCell, so that all pixels drawn are relative to that view, can I do that?
The way you draw in Cocoa Touch is to subclass UIView to create a custom view and implement the drawRect:(CGRect)rect method. Inside drawRect:, the context returned by UIGraphicsGetCurrentContext() is the one UIKit has set up for that view, so coordinates are relative to the view. Put your drawing code inside drawRect:. Something like:
- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGPathRef path = CGPathCreateWithRect(rect, NULL);
    [[UIColor blueColor] setStroke];
    CGContextSetLineWidth(context, 4.0);
    CGContextAddPath(context, path);
    CGContextDrawPath(context, kCGPathFillStroke);
    CGPathRelease(path);
}
You can put your custom UIView inside of a UICollectionViewCell if you want.
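Since the original goal was an arc, here is a minimal sketch of the same pattern using CGContextAddArc; the center, radius, and angles are placeholder values:

- (void)drawRect:(CGRect)rect {
    // The "current context" is the one UIKit prepares for this view before
    // calling drawRect:, so all coordinates are relative to the view's bounds.
    CGContextRef context = UIGraphicsGetCurrentContext();
    [[UIColor blueColor] setStroke];
    CGContextSetLineWidth(context, 4.0);
    // Placeholder geometry: a quarter arc centered in the view.
    CGPoint center = CGPointMake(CGRectGetMidX(self.bounds), CGRectGetMidY(self.bounds));
    CGContextAddArc(context, center.x, center.y, 50.0, 0, M_PI_2, 0);
    CGContextStrokePath(context);
}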
I have the following code to show markers in a UIView. The markers show up well, but once we pinch-zoom and scale the UIView with a transform, the first drawing remains as it is, even after calling setNeedsDisplay.
My custom UIView subclass has the following code:
- (void)drawRect:(CGRect)rect
{
    // Drawing code
    CGFloat w = 20.0f;
    CGFloat h = 8.0f;
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(context, [UIColor blueColor].CGColor);
    CGContextClearRect(context, self.bounds);
    CGContextSetStrokeColorWithColor(context, [UIColor redColor].CGColor);
    CGContextSetLineCap(context, kCGLineCapSquare);

    // Build the top-left marker as an L-shaped path.
    CGMutablePathRef leftMarker = CGPathCreateMutable();
    CGPathMoveToPoint(leftMarker, NULL, 0, 0);
    CGPathAddLineToPoint(leftMarker, NULL, w, 0);
    CGPathAddLineToPoint(leftMarker, NULL, w, h);
    CGPathAddLineToPoint(leftMarker, NULL, h, h);
    CGPathAddLineToPoint(leftMarker, NULL, h, w);
    CGPathAddLineToPoint(leftMarker, NULL, 0, w);
    CGPathAddLineToPoint(leftMarker, NULL, 0, 0);
    CGContextAddPath(context, leftMarker);
    CGContextDrawPath(context, kCGPathFill);

    // The other three markers are rotated/translated copies of leftMarker.
    // CGAffineTransformMakeRotateTranslate and DEGREES_TO_RADIANS are custom helpers defined elsewhere.
    const CGAffineTransform rightMarkerTransform = CGAffineTransformMakeRotateTranslate(DEGREES_TO_RADIANS(90), self.frame.size.width, 0);
    CGPathRef rightMarker = CGPathCreateCopyByTransformingPath(leftMarker, &rightMarkerTransform);
    CGContextAddPath(context, rightMarker);
    CGContextDrawPath(context, kCGPathFill);

    const CGAffineTransform leftMarkerBottomTransform = CGAffineTransformMakeRotateTranslate(DEGREES_TO_RADIANS(270), 0, self.frame.size.height);
    CGPathRef leftMarkerBottom = CGPathCreateCopyByTransformingPath(leftMarker, &leftMarkerBottomTransform);
    CGContextAddPath(context, leftMarkerBottom);
    CGContextDrawPath(context, kCGPathFill);

    const CGAffineTransform rightMarkerBottomTransform = CGAffineTransformMakeRotateTranslate(DEGREES_TO_RADIANS(180), self.frame.size.width, self.frame.size.height);
    CGPathRef rightMarkerBottom = CGPathCreateCopyByTransformingPath(leftMarker, &rightMarkerBottomTransform);
    CGContextAddPath(context, rightMarkerBottom);
    CGContextDrawPath(context, kCGPathFill);

    CGPathRelease(rightMarker);
    CGPathRelease(leftMarkerBottom);
    CGPathRelease(rightMarkerBottom);
    CGPathRelease(leftMarker);
}
The pinch zoom code is listed below
CGFloat lastScale;

- (void)handlepinchGesture:(UIPinchGestureRecognizer *)gesture {
    UIView *gestureV = gesture.view;
    CGFloat scale = gesture.scale;
    switch (gesture.state) {
        case UIGestureRecognizerStateBegan:
            if (lastScale < 1.0) {
                lastScale = 1.0;
            }
            scale = lastScale;
            break;
        default:
            break;
    }
    if (scale < 1.0) {
        scale = 1.0;
    }
    lastScale = scale;
    gestureV.transform = CGAffineTransformMakeScale(scale, scale);
    //Even this does not work ….[gestureV setNeedsDisplay];
    gesture.scale = scale;
}
Make sure you have this set (it should default to YES anyway):
self.clearsContextBeforeDrawing = YES;
From Apple's documentation for UIView:

When set to YES, the drawing buffer is automatically cleared to transparent black before the drawRect: method is called. This behavior ensures that there are no visual artifacts left over when the view's contents are redrawn. If the view's opaque property is also set to YES, the backgroundColor property of the view must not be nil or drawing errors may occur. The default value of this property is YES.

If you set the value of this property to NO, you are responsible for ensuring the contents of the view are drawn properly in your drawRect: method. If your drawing code is already heavily optimized, setting this property to NO can improve performance, especially during scrolling when only a portion of the view might need to be redrawn.
You need to set the background color to something other than nil.
From DBD's answer (which is taken from the docs):
If the view’s opaque property is also set to YES, the backgroundColor property of the view must not be nil or drawing errors may occur.
You also need to make sure self.clearsContextBeforeDrawing is set to YES.
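For example, in the custom view's initializer (a minimal sketch, assuming initWithFrame: is where you configure the view):

- (instancetype)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Ensure the drawing buffer is wiped before each drawRect: pass.
        self.clearsContextBeforeDrawing = YES;
        // An opaque view must have a non-nil backgroundColor, or drawing
        // artifacts (like the stale first rendering) can appear.
        self.backgroundColor = [UIColor whiteColor];
    }
    return self;
}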
I'm trying to build an eraser tool using Core Graphics, and I'm finding it incredibly difficult to make a performant eraser - it all comes down to:
CGContextSetBlendMode(context, kCGBlendModeClear)
If you google around for how to "erase" with Core Graphics, almost every answer comes back with that snippet. The problem is it only (apparently) works in a bitmap context. If you're trying to implement interactive erasing, I don't see how kCGBlendModeClear helps you - as far as I can tell, you're more or less locked into erasing on and off-screen UIImage/CGImage and drawing that image in the famously non-performant [UIView drawRect].
Here's the best I've been able to do:
- (void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0.0);
            CGContextRef context = UIGraphicsGetCurrentContext();
            [eraseImage drawAtPoint:CGPointZero];
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextStrokePath(context);
            curImage = UIGraphicsGetImageFromCurrentImageContext();
            UIGraphicsEndImageContext();
            [curImage drawAtPoint:CGPointZero];
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}
Drawing a normal line (!eraseModeOn) is acceptably performant; I'm blitting my off-screen drawing buffer (curImage, which contains all previously drawn strokes) to the current CGContext, and I'm rendering the line (path) being currently drawn. It's not perfect, but hey, it works, and it's reasonably performant.
However, because kCGBlendModeClear apparently does not work outside of a bitmap context, I'm forced to:
1. Create a bitmap context (UIGraphicsBeginImageContextWithOptions).
2. Draw my offscreen buffer (eraseImage, which is derived from curImage when the eraser tool is turned on, so effectively the same as curImage for argument's sake).
3. Render the "erase line" (path) currently being drawn into the bitmap context, using kCGBlendModeClear to clear pixels.
4. Extract the entire image into the offscreen buffer (curImage = UIGraphicsGetImageFromCurrentImageContext();).
5. Finally, blit the offscreen buffer to the view's CGContext.
That's horrible, performance-wise. Using Instruments' Time Profiler, it's painfully obvious where the problems with this method are:

- UIGraphicsBeginImageContextWithOptions is expensive.
- Drawing the offscreen buffer twice is expensive.
- Extracting the entire image into an offscreen buffer is expensive.
So naturally, the code performs horribly on a real iPad.
I'm not really sure what to do here. I've been trying to figure out how to clear pixels in a non-bitmap context, but as far as I can tell, relying on kCGBlendModeClear is a dead-end.
Any thoughts or suggestions? How do other iOS drawing apps handle erase?
Additional Info
I've been playing around with a CGLayer approach, as it does appear that CGContextSetBlendMode(context, kCGBlendModeClear) will work in a CGLayer based on a bit of googling I've done.
However, I'm not super hopeful that this approach will pan out. Drawing the layer in drawRect (even using setNeedsDisplayInRect) is hugely non-performant; Core Graphics chokes while rendering each path in the layer in CGContextDrawLayerAtPoint (according to Instruments). As far as I can tell, using a bitmap context is definitely preferable here in terms of performance; the only problem, of course, is the above question (kCGBlendModeClear not working after I blit the bitmap context to the main CGContext in drawRect).
I've managed to get good results by using the following code:
- (void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        if (eraseModeOn) {
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextBeginTransparencyLayer(context, NULL);
            [eraseImage drawAtPoint:CGPointZero];
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, ERASE_WIDTH);
            CGContextSetBlendMode(context, kCGBlendModeClear);
            CGContextSetStrokeColorWithColor(context, [[UIColor clearColor] CGColor]);
            CGContextStrokePath(context);
            CGContextEndTransparencyLayer(context);
        } else {
            [curImage drawAtPoint:CGPointZero];
            CGContextRef context = UIGraphicsGetCurrentContext();
            CGContextAddPath(context, currentPath);
            CGContextSetLineCap(context, kCGLineCapRound);
            CGContextSetLineWidth(context, self.lineWidth);
            CGContextSetBlendMode(context, kCGBlendModeNormal);
            CGContextSetStrokeColorWithColor(context, self.lineColor.CGColor);
            CGContextStrokePath(context);
        }
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
    self.empty = NO;
}
The trick was to wrap the following steps in CGContextBeginTransparencyLayer / CGContextEndTransparencyLayer calls:

- Blitting the erase background image into the context
- Drawing the "erase" path on top of the erase background image, using kCGBlendModeClear
Since both the erase background image's pixel data and the erase path are in the same layer, it has the effect of clearing the pixels.
2D graphics follows a painting paradigm. When you are painting, it's hard to remove paint you've already put on the canvas, but easy to add more paint on top. The blend modes in a bitmap context give you a way to do something hard (scrape paint off the canvas) in a few lines of code. Those few lines do not make it a cheap operation, which is why it performs slowly.
The easiest way to fake clearing pixels, without having to do the offscreen bitmap buffering, is to paint the background of your view over the image:
- (void)drawRect:(CGRect)rect
{
    if (drawingStroke) {
        CGColorRef lineCgColor = lineColor.CGColor;
        if (eraseModeOn) {
            // Use a concrete background color to display erasing. You could use the
            // backgroundColor property of the view, or define a color here.
            lineCgColor = [[self backgroundColor] CGColor];
        }
        [curImage drawAtPoint:CGPointZero];
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextAddPath(context, currentPath);
        CGContextSetLineCap(context, kCGLineCapRound);
        CGContextSetLineWidth(context, lineWidth);
        CGContextSetBlendMode(context, kCGBlendModeNormal);
        CGContextSetStrokeColorWithColor(context, lineCgColor);
        CGContextStrokePath(context);
    } else {
        [curImage drawAtPoint:CGPointZero];
    }
}
The more difficult (but more correct) way is to do the image editing on a background serial queue in response to an editing event. When you get a new action, you do the bitmap rendering in the background to an image buffer. When the buffered image is ready, you call setNeedsDisplay to allow the view to be redrawn during the next update cycle. This is more correct as drawRect: should be displaying the content of your view as quickly as possible, not processing the editing action.
@interface ImageEditor : UIView

@property (nonatomic, strong) UIImage *imageBuffer;
@property (nonatomic, strong) dispatch_queue_t serialQueue;

@end

@implementation ImageEditor

- (dispatch_queue_t)serialQueue
{
    if (_serialQueue == nil)
    {
        _serialQueue = dispatch_queue_create("com.example.com.imagebuffer", DISPATCH_QUEUE_SERIAL);
    }
    return _serialQueue;
}

- (void)editingAction
{
    dispatch_async(self.serialQueue, ^{
        // Render the edit into an offscreen image buffer on a background queue.
        CGSize bufferSize = [self.imageBuffer size];
        UIGraphicsBeginImageContext(bufferSize);
        CGContextRef context = UIGraphicsGetCurrentContext();
        CGContextDrawImage(context, CGRectMake(0, 0, bufferSize.width, bufferSize.height), [self.imageBuffer CGImage]);

        // Do the editing action here: draw a clear line, a solid line, etc.

        self.imageBuffer = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        // Ask for a redraw on the main queue once the buffer is ready.
        dispatch_async(dispatch_get_main_queue(), ^{
            [self setNeedsDisplay];
        });
    });
}

- (void)drawRect:(CGRect)rect
{
    [self.imageBuffer drawAtPoint:CGPointZero];
}

@end
The key is CGContextBeginTransparencyLayer: use clearColor for the stroke and set CGContextSetBlendMode(context, kCGBlendModeClear).
In my app I need to change the color of one pixel (of a view) to black. I need to do this in the touchesMoved:withEvent: of a custom gesture recognizer applied to the view (I am making a pen-like tool).
My question is: what is the simplest way to draw a line behind the gesture recognizer, so that the line stays after the gesture recognizer is moved?
Let me know if you need any more information.
Drawing in a view is done in drawRect: (see http://developer.apple.com/library/ios/documentation/UIKit/Reference/UIView_Class/UIView/UIView.html#//apple_ref/doc/uid/TP40006816-CH3-BBCDGJHF).
You'll need to set an instance variable or property in touchesMoved: to the point you want to paint, then call [self setNeedsDisplay], and drawRect: will be invoked. In drawRect: you draw a one-pixel rectangle.
Something like this; modify it to suit your needs:
- (void)drawRect:(CGRect)rect {
    [super drawRect:rect];
    CGRect rectangle = CGRectMake(self.cachedPoint.x, self.cachedPoint.y, 1, 1);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetRGBFillColor(context, 1.0, 0.0, 0.0, 1.0);
    CGContextSetRGBStrokeColor(context, 1.0, 0.0, 0.0, 1.0);
    CGContextFillRect(context, rectangle);
}
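A matching touch handler might look like this (a sketch that assumes the cachedPoint property used above and that the view handles the touches itself; adapt it if the point comes from your gesture recognizer instead):

- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
    UITouch *touch = [touches anyObject];
    // Remember the point to paint, then ask UIKit to redraw the view.
    self.cachedPoint = [touch locationInView:self];
    [self setNeedsDisplay];
}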
I'm currently drawing on the screen. I get smooth lines and I can change the color of my drawings, but I can't figure out how to apply a shadow to the line.
To draw it, I use:
[path strokeWithBlendMode:[path blendMode] alpha:1.0];
I saw that I could use CGContextSetShadowWithColor(), but even so, I'm not sure how to use it, because the UIBezierPath reference says this about the strokeWithBlendMode:alpha: method:

This method automatically saves the current graphics state prior to drawing and restores that state when it is done, so you do not have to save the graphics state yourself.
So I don't really know where to put CGContextSetShadowWithColor(), or whether I can use it at all.
Regards
If you want to use CGContextSetShadowWithColor(), then you will need to change the way you draw your Bezier path to the view, so that you draw its CGPath representation into the CGContext. An example is below:
UIBezierPath *path; // this is your path as before
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextAddPath(context, path.CGPath);
CGContextSetLineWidth(context, 2.0);
CGContextSetBlendMode(context, path.blendMode);
CGContextSetShadowWithColor(context, CGSizeMake(1.0, 1.0), 2.0, [UIColor blackColor].CGColor);
CGContextStrokePath(context);
Another way you could do this is to create a new CAShapeLayer and draw your path into it by setting it as the layer's path property. This easily lets you add a shadow that shadows only your path.
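A minimal sketch of that CAShapeLayer approach (the stroke width, colors, and shadow values are placeholders, and "path" is assumed to be your existing UIBezierPath):

// Sketch: put the path in its own CAShapeLayer and shadow just that layer.
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = path.CGPath;                         // your existing UIBezierPath
shapeLayer.strokeColor = [UIColor blackColor].CGColor;
shapeLayer.fillColor = nil;
shapeLayer.lineWidth = 2.0;
shapeLayer.shadowColor = [UIColor blackColor].CGColor;
shapeLayer.shadowOffset = CGSizeMake(1.0, 1.0);
shapeLayer.shadowRadius = 2.0;
shapeLayer.shadowOpacity = 1.0;
[self.layer addSublayer:shapeLayer];                   // assumes this runs inside the view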