Core Graphics CGPathRef drawn at a specified point - ios

I have a "palette" of paths that I will draw many times; perhaps 100.
I'd like to draw these at a specified location like this:
CGPathRef foo = ...
CGPathRef bar = ...
// do this dozens of times at differing points
[self draw:context path:foo atX:100 andY:50];
[self draw:context path:bar atX:200 andY:50];
What I'm doing now is translating. It works, but I'm not sure that this is the most performant solution. Something like this:
- (CGRect)draw:(CGContextRef)context path:(CGPathRef)path atX:(CGFloat)x andY:(CGFloat)y
{
    CGContextSaveGState(context);
    CGContextTranslateCTM(context, x, y);
    CGRect pathBoundingRect = CGPathGetBoundingBox(path);
    CGContextSetFillColorWithColor(context, drawColor);
    CGContextAddPath(context, path);
    CGContextDrawPath(context, kCGPathFill);
    CGContextRestoreGState(context);
    return pathBoundingRect;
}
Do you have any suggestions for improvement?

If they move, it would probably be much faster to draw each one in its own UIView (so in the beginning, they would all be identical), and position the view itself.
That way, the translation (of the views) would automatically be done on the GPU instead of the CPU, and drawRect: would only need to be called once for each path object.
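A minimal sketch of that approach, assuming a hypothetical PathView subclass that stores one CGPathRef and fills it in its own drawRect: (the class name, path property, and fill color are placeholders, not from the question):

// Hypothetical view that draws a single path in its own coordinate space.
@interface PathView : UIView
@property (nonatomic) CGPathRef path;
@end

@implementation PathView
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(ctx, [UIColor blackColor].CGColor);
    CGContextAddPath(ctx, self.path);
    CGContextFillPath(ctx);
}
@end

// Positioning becomes a matter of setting each view's frame origin; moving the
// view later is compositing work on the GPU and never calls drawRect: again.
PathView *fooView = [[PathView alloc] initWithFrame:CGRectMake(100, 50, 40, 40)];
fooView.backgroundColor = [UIColor clearColor];
fooView.path = foo;
[parentView addSubview:fooView]; // parentView is whatever view owns the palette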

Related

How to LIGHTLY erase drawn path in cgcontext?

I managed to implement erasing drawings on a CGContext:
UIImageView *maskImgView = (UIImageView *)[self.view viewWithTag:K_MASKIMG];
UIGraphicsBeginImageContext(maskImgView.image.size);
[maskImgView.image drawAtPoint:CGPointMake(0,0)];
float alp = 0.5;
UIImage *oriBrush = [UIImage imageNamed:_brushName];
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGFloat eraseSize = oriBrush.size.width * _brushSize / _zoomCurrentFactor;
// set the style for the endpoints of lines drawn in this graphics context
CGContextSetLineCap(ctx, kCGLineCapRound);
CGContextSetLineJoin(ctx, kCGLineJoinRound);
CGContextSetLineWidth(ctx,eraseSize);
CGContextSetRGBStrokeColor(ctx, 1, 1, 1, alp);
CGContextSetBlendMode(ctx, kCGBlendModeClear);
CGContextBeginPath(ctx);
CGContextMoveToPoint(ctx, lastPoint.x,lastPoint.y);
CGPoint vector = CGPointMake(currentPoint.x - lastPoint.x, currentPoint.y - lastPoint.y);
CGFloat distance = hypotf(vector.x, vector.y);
vector.x /= distance;
vector.y /= distance;
for (CGFloat i = 0; i < distance; i += 1.0f) {
    CGPoint p = CGPointMake(lastPoint.x + i * vector.x, lastPoint.y + i * vector.y);
    CGContextAddLineToPoint(ctx, p.x, p.y);
}
CGContextStrokePath(ctx);
maskImgView.image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
lastPoint = currentPoint;
The problem is, this TOTALLY erases everything. The alpha set in this function (CGContextSetRGBStrokeColor(ctx, 1, 1, 1, alp);) seems to be ignored completely.
I want to erase only lightly, so that repeated erasing gradually removes the drawing completely.
Any ideas?
EDIT: As requested, here are more details about this code:
alp = _brushAlpha is a property declared in this ViewController class. It ranges from 0.1 to 1.0; for testing I set it to 0.5. This drawing code is triggered by a pan gesture recognizer (in its changed state). It basically follows the finger (draw/erase by finger).
You've set the blending mode to clear, which ignores the stroke color. You should play with the various modes a bit, but I suspect you want something like sourceAtop or maybe screen. See the CGBlendMode docs for full details.
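As a hedged addition that is not part of the original answer: one mode that does honor the stroke alpha for gradual erasing is kCGBlendModeDestinationOut, which removes destination pixels in proportion to the source alpha, so an alp of 0.5 only half-erases per pass and repeated strokes erase further. Only these two lines of the code above would change:

// Swap the clear blend mode for one that respects partial alpha.
// With kCGBlendModeDestinationOut the stroke color is irrelevant; only its alpha matters.
CGContextSetRGBStrokeColor(ctx, 0, 0, 0, alp);
CGContextSetBlendMode(ctx, kCGBlendModeDestinationOut);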
UIView has a flag named clearsContextBeforeDrawing. If you set it to YES, the view's buffer is cleared before every draw.
According to the documentation:
A Boolean value that determines whether the view’s bounds should be automatically cleared before drawing.
When set to YES, the drawing buffer is automatically cleared to transparent black before the drawRect: method is called. This behavior ensures that there are no visual artifacts left over when the view’s contents are redrawn. If the view’s opaque property is also set to YES, the backgroundColor property of the view must not be nil or drawing errors may occur. The default value of this property is YES.
If you set the value of this property to NO, you are responsible for ensuring the contents of the view are drawn properly in your drawRect: method. If your drawing code is already heavily optimized, setting this property to NO can improve performance, especially during scrolling when only a portion of the view might need to be redrawn.
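For reference, the flag is just a UIView property; a one-line sketch (myView stands in for whichever view does the drawing):

// Preserve the previous buffer contents between drawRect: calls.
myView.clearsContextBeforeDrawing = NO;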

Apply a shadow at both sides of a UIBezierPath

I'm currently drawing on the screen. I get smooth lines and I can change the color of my drawings, but I can't find out how to apply a shadow to that line.
To draw it, I use :
[path strokeWithBlendMode:[path blendMode] alpha:1.0];
I saw that I could use CGContextSetShadowWithColor(), but I'm not sure how to use it, since here's what's said in the UIBezierPath reference for strokeWithBlendMode:
This method automatically saves the current graphics state prior to
drawing and restores that state when it is done, so you do not have to
save the graphics state yourself.
So I don't really know where to put that CGContextSetShadowWithColor() or anything else if I can use it.
Regards
If you want to use CGContextSetShadowWithColor(), then you will need to change the way you draw your bezier path to the view, so that you draw its CGPath representation into the CGContext. An example is below:
UIBezierPath *path; // this is your path as before
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextAddPath(context, path.CGPath);
CGContextSetLineWidth(context, 2.0);
CGContextSetBlendMode(context, path.blendMode);
CGContextSetShadowWithColor(context, CGSizeMake(1.0, 1.0), 2.0, [UIColor blackColor].CGColor);
CGContextStrokePath(context);
Another way you could do this is to create a new CAShapeLayer and draw your path into it by setting it as the layer's path property. This easily lets you add a shadow that shadows only your path.
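A minimal sketch of that CAShapeLayer route, assuming it runs in a view controller (the variable names are illustrative and QuartzCore must be linked):

#import <QuartzCore/QuartzCore.h>

CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path        = path.CGPath;                  // same UIBezierPath as above
shapeLayer.fillColor   = [UIColor clearColor].CGColor;
shapeLayer.strokeColor = [UIColor blackColor].CGColor;
shapeLayer.lineWidth   = 2.0;

// Shadow properties come from CALayer and follow the rendered (stroked) path.
shapeLayer.shadowColor   = [UIColor blackColor].CGColor;
shapeLayer.shadowOffset  = CGSizeMake(1.0, 1.0);
shapeLayer.shadowRadius  = 2.0;
shapeLayer.shadowOpacity = 0.8;

[self.view.layer addSublayer:shapeLayer];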

Efficient way to draw a graph line by line in CALayer

I need to draw a line chart from values that arrive every half second. I've come up with a custom CALayer for this graph which stores all the previous lines and, every two seconds, redraws all the previous lines and adds one new line. I find this solution non-optimal because only one additional line needs to be drawn on the layer; there's no reason to redraw potentially thousands of previous lines.
What do you think would be the best solution in this case?
Use your own bitmap context (or a UIImage) as a backing store. Whenever new data comes in, draw into this context and set your layer's contents property to the context's image.
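A rough sketch of that idea using a UIImage as the backing store; graphLayer, backingImage, lastY, and yForValue: are assumed properties/helpers, not part of the answer:

// Called once per incoming sample; draws only the new segment onto the
// previous image and hands the bitmap to the layer. No drawRect: needed.
- (void)appendSample:(CGFloat)value
{
    const CGFloat step = 10.0;                       // horizontal pixels per sample (assumed)
    CGSize size = self.graphLayer.bounds.size;

    UIGraphicsBeginImageContextWithOptions(size, NO, 0);
    // Re-blit the existing plot, shifted left to make room for the new segment.
    [self.backingImage drawAtPoint:CGPointMake(-step, 0)];

    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextSetLineWidth(ctx, 2.0);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextMoveToPoint(ctx, size.width - step, self.lastY);
    CGContextAddLineToPoint(ctx, size.width, [self yForValue:value]);
    CGContextStrokePath(ctx);

    self.backingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // The layer simply displays the bitmap; nothing older is ever redrawn.
    self.graphLayer.contents = (__bridge id)self.backingImage.CGImage;
    self.lastY = [self yForValue:value];
}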
I am looking at an identical implementation. Graph updates every 500 ms. Similarly I felt uncomfortable drawing the entire graph each iteration. I implemented a solution 'similar' to what Nikolai Ruhe proposed as follows:
First some declarations:
#define TIME_INCREMENT 10
#property (nonatomic) UIImage *lastSnapshotOfPlot;
and then the drawLayer:inContext method of my CALayer delegate
- (void)drawLayer:(CALayer *)layer inContext:(CGContextRef)ctx
{
    // Restore the image of the layer from the last time through, if it exists
    if (self.lastSnapshotOfPlot)
    {
        // For some reason the image is being redrawn upside down!
        // This block of code adjusts the context to correct it.
        CGContextSaveGState(ctx);
        CGContextTranslateCTM(ctx, 0, layer.bounds.size.height);
        CGContextScaleCTM(ctx, 1.0, -1.0);
        // Now we can redraw the image right side up but shifted over a little bit
        // to allow space for the new data
        CGRect r = CGRectMake(-TIME_INCREMENT, 0, layer.bounds.size.width, layer.bounds.size.height);
        CGContextDrawImage(ctx, r, self.lastSnapshotOfPlot.CGImage);
        // And finally put the context back the way it was
        CGContextRestoreGState(ctx);
    }

    CGContextSetLineWidth(ctx, 2.0);
    CGContextSetStrokeColorWithColor(ctx, [UIColor blueColor].CGColor);
    CGContextBeginPath(ctx);
    // This next section is where I draw the line segment on the extreme right end
    // which matches up with the stored graph on the image. This part of the code
    // is application specific and I have only left it here for
    // conceptual reference. Basically I draw a tiny line segment
    // from the last value to the new value at the extreme right end of the graph.
    CGFloat ppy = layer.bounds.size.height - _lastValue / _displayRange * layer.bounds.size.height;
    CGFloat cpy = layer.bounds.size.height - self.sensorData.currentvalue / _displayRange * layer.bounds.size.height;
    CGContextMoveToPoint(ctx, layer.bounds.size.width - TIME_INCREMENT, ppy); // Move to the previous point
    CGContextAddLineToPoint(ctx, layer.bounds.size.width, cpy);               // Draw to the latest point
    CGContextStrokePath(ctx);

    // Finally save the entire current layer to an image. This will include our latest
    // drawn line segment
    UIGraphicsBeginImageContext(layer.bounds.size);
    [layer renderInContext:UIGraphicsGetCurrentContext()];
    self.lastSnapshotOfPlot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
}
Is this the most efficient way? I have not been programming in Objective-C long enough to know, so all suggestions/improvements are welcome.

How to clip in drawRect when drawing (and clipping) based upon CGMutablePathRefs

In my custom control I have defined a few CGMutablePathRefs with the needed lines and arcs to draw my control; one draws the overall fill shape and others provide specular highlights. I have also defined two CGMutablePathRefs which contain the paths needed as clipping masks for the active and inactive state of the control.
What I'm struggling with is applying the clipping paths. I have previously used clipping paths for applying gradients to an image, but those drawing commands were of the CGContext... variety, not the CGPath... variety.
For testing purposes I have removed the specular highlight drawing aspects, just trying to get a large path clipped to a smaller path. This is what I had been testing with:
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    CGContextBeginPath(ctx);
    CGContextAddPath(ctx, inactiveClip);
    CGContextClosePath(ctx);
    CGContextClip(ctx);

    CGContextAddPath(ctx, frontFace);
    CGContextSetLineWidth(ctx, 1.0);
    CGContextSetFillColorWithColor(ctx, [[UIColor blackColor] CGColor]);
    CGContextFillPath(ctx);
}
By putting the clipping command before any drawing, I thought I was saying to CoreGraphics, "Here's the region you should actually draw into."
Alas, nothing is drawn.
So assuming I had that ordering backwards I tried:
- (void)drawRect:(CGRect)rect
{
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    CGContextAddPath(ctx, frontFace);
    CGContextSetLineWidth(ctx, 1.0);
    CGContextSetFillColorWithColor(ctx, [[UIColor blackColor] CGColor]);
    CGContextFillPath(ctx);

    CGContextBeginPath(ctx);
    CGContextAddPath(ctx, inactiveClip);
    CGContextClosePath(ctx);
    CGContextClip(ctx);
}
This was to say to CoreGraphics, "Okay before you actually color bits, check them against this clipping region."
Alas, nothing is clipped.
Since my clipping path uses some of the same points and control points, in the same order, as the fill path, I have also replaced the call to CGContextClip with a call to CGContextEOClip to see if I was really struggling with the even-odd rule, but that doesn't seem to have had any visual effect.
I don't really know if I needed to bracket the CGContextAddPath call with CGContextBeginPath/CGContextClosePath calls, but I was trying to minimize the differences between Apple's example code and mine. In their examples they make their CGContext... drawing calls between begin/close calls, so I did too.
What am I misunderstanding here?

CGContextDrawRadialGradient For Non-Circular Glow

I'm able to use CGContextDrawRadialGradient to make a sphere that does an alpha fade to UIColor.clearColor and it works.
However, I'm trying to do this type of thing:
While placing some strategic spheres around makes for an interesting effect (similar to LED backlights in strategic places), I would love to get a true glow. How can I draw a glow around a rounded rectangle in drawRect?
You can create a glow effect around any path using CGContextSetShadowWithColor, but you don't get precise control over the appearance. In particular, the default shadow is fairly light:
And the only way I know of to make it darker is to draw it again over itself:
Not optimal, but it approximates what you want pretty well.
Those images were generated by the following drawRect:
- (void)drawRect:(CGRect)rect
{
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGMutablePathRef path = CGPathCreateMutable();
    int padding = 20;
    CGPathMoveToPoint(path, NULL, padding, padding);
    CGPathAddLineToPoint(path, NULL, rect.size.width - padding, rect.size.height / 2);
    CGPathAddLineToPoint(path, NULL, padding, rect.size.height - padding);
    CGContextSetShadowWithColor(context, CGSizeZero, 20, UIColor.redColor.CGColor);
    CGContextSetFillColorWithColor(context, UIColor.blueColor.CGColor);
    CGContextAddPath(context, path);
    CGContextFillPath(context);
    // CGContextAddPath(context, path);
    // CGContextFillPath(context);
    CGPathRelease(path);
}
One thing to bear in mind is that rendering fuzzy shadows is fairly expensive, which may or may not be a problem depending on how often your views are redrawn. If the shadows don't need to animate, consider rendering them to a UIImage once and just displaying the result in a UIImageView.
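A hedged sketch of that caching idea (glowPath and the target rect are placeholders for whatever your view actually uses):

// Render the glow once into an image...
CGRect glowRect = CGRectMake(0, 0, 200, 120);
UIGraphicsBeginImageContextWithOptions(glowRect.size, NO, 0);
CGContextRef ctx = UIGraphicsGetCurrentContext();
CGContextSetShadowWithColor(ctx, CGSizeZero, 20, UIColor.redColor.CGColor);
CGContextSetFillColorWithColor(ctx, UIColor.blueColor.CGColor);
CGContextAddPath(ctx, glowPath);                    // a CGPathRef like the one above
CGContextFillPath(ctx);
UIImage *glowImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// ...then display the cached image; the expensive blur is never re-rendered.
UIImageView *glowView = [[UIImageView alloc] initWithImage:glowImage];
glowView.frame = glowRect;
[self.view addSubview:glowView];                    // assuming a view controller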
