iOS CALayer and TapGestureRecognizer

I'm developing an app on iOS 6.1 for iPad.
I have a problem with CALayer and a UITapGestureRecognizer.
I have 7 CALayers forming a rainbow (each layer is one colour).
Each layer is a CAShapeLayer generated from a CGMutablePathRef. Everything works fine: all the layers are drawn on screen and I can see a beautiful rainbow.
Now I want to detect a tap on a single colour/layer. I tried this:
- (void)tap:(UITapGestureRecognizer *)recognizer
{
    // I added the tapGestureRecognizer to rainbowView (which is a UIView) in viewDidLoad
    CALayer *tappedLayer = [rainbowView.layer.presentationLayer hitTest:[recognizer locationInView:rainbowView]];
    if (tappedLayer == purpleLayer) // for example
        NSLog(@"PURPLE!");
}
I don't understand why this code won't work! I've already read other topics on here: they all suggest the hitTest: method for solving problems like this. But in my case I can't obtain the desired result.
Can anyone help me? Thanks!!
EDIT:
Here's the code for the creation of paths and layers:
- (void)viewDidLoad
{
    // Other layers
    ...
    ...

    // Purple arc
    purplePath = CGPathCreateMutable();
    CGPathMoveToPoint(purplePath, NULL, 150, 400);
    CGPathAddCurveToPoint(purplePath, NULL, 150, 162, 550, 162, 550, 400);

    purpletrack = [CAShapeLayer layer];
    purpletrack.path = purplePath;
    purpletrack.strokeColor = [UIColor colorWithRed:134.0/255.0f green:50.0/255.0f blue:140.0/255.0f alpha:1.0].CGColor;
    purpletrack.fillColor = nil;
    purpletrack.lineWidth = 25.0;

    [rainbowView.layer insertSublayer:purpletrack above:bluetrack];
}
This was my first approach to the problem, and the touch detection didn't work.
I also tried creating a RainbowView class where the rainbow is drawn in the drawRect: method using UIBezierPaths.
Then I followed the "Doing Hit-Detection on a Path" section in http://developer.apple.com/library/ios/#documentation/2ddrawing/conceptual/drawingprintingios/BezierPaths/BezierPaths.html
In that case the problem was the path variable passed to the method: I tried to compare the UIBezierPath passed in with the paths stored in RainbowView, but with no results.
I could try to create curves instead of closed paths. In that case there might not be a filled part of the figure and the touchable area would be limited to the stroke. But then how can I recognize a touch on a curve?
I'm so confused about all of this stuff!!! :D

The problem you are facing is that when hit testing you are checking against the frame/bounds of the layer, not against the path of the shape layer.
If your paths are filled you should instead use CGPathContainsPoint() to determine whether the tap was inside the path. If your paths aren't filled but stroked, I refer you to Ole Begemann's article about CGPath hit testing.
To make your code cleaner you could do the hit testing in your own subclass. Also, unless the layer is animating while you hit test, there is no point in using the presentationLayer.
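For the stroked arcs in the question, one way to combine both points is to turn each stroke into a fillable outline with CGPathCreateCopyByStrokingPath() and then test it with CGPathContainsPoint(). A minimal sketch, assuming the seven shape layers are collected in an array called tracks (not in the original code) and share rainbowView's coordinate space:
- (void)tap:(UITapGestureRecognizer *)recognizer
{
    CGPoint point = [recognizer locationInView:rainbowView];

    for (CAShapeLayer *track in tracks) {
        // Build the filled outline of the 25 pt stroke, then test the tap against it.
        CGPathRef strokedPath = CGPathCreateCopyByStrokingPath(track.path, NULL,
                                                               track.lineWidth,
                                                               kCGLineCapButt,
                                                               kCGLineJoinMiter, 10.0);
        BOOL hit = CGPathContainsPoint(strokedPath, NULL, point, NO);
        CGPathRelease(strokedPath);

        if (hit) {
            if (track == purpletrack) NSLog(@"PURPLE!");
            break;
        }
    }
}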

Related

Obtain Bezier Path of CALayer

CALayer objects have a property accessibilityPath which, as stated in the docs, supposedly
Returns the path of the element in screen coordinates.
Of course as expected this does not return the path of the layer.
Is there a way to access the physical path of a given CALayer that has already been created? For instance, how would you grab the path of a UIButton's layer property once the button has been initialized?
EDIT
For reference, I am trying to detect whether a rotated button contains a point. The difficulty is that the buttons are drawn in a curved view...
My initial approach was to create bezier slices and pass each one as a property to its button, to check whether the path contains the point. For whatever reason, there seems to be an ugly offset between the path and the button.
They are both added to the same view and use the same coordinates/values to determine their frames, but the registered path seems to be offset to the left of the shape actually drawn from the path. Below is an image of the shapes I have drawn. The green outline is where the path is drawn (and displayed...), while the red is approximately the area which registers as inside the path...
I'm having a hard time understanding how the registered area can be different.
If anyone has any ideas on why this offset is occurring, it would be most appreciated.
UPDATE
Here is a snippet of me adding the shapes. self in this case is simply a UIView added to a controller. Its frame is the full size of the controller, which is {0, height_of_device - controllerHeight, width_of_device, controllerHeight}.
UIBezierPath *slicePath = UIBezierPath.new;
[slicePath moveToPoint:self.archedCenterRef];
[slicePath addArcWithCenter:self.archedCenterRef radius:outerShapeDiameter/2 startAngle:shapeStartAngle endAngle:shapeEndAngle clockwise:clockwise];
[slicePath addArcWithCenter:self.archedCenterRef radius:(outerShapeDiameter/2 - self.rowHeight) startAngle:shapeEndAngle endAngle:shapeStartAngle clockwise:!clockwise];
[slicePath closePath];
CAShapeLayer *sliceShape = CAShapeLayer.new;
sliceShape.path = slicePath.CGPath;
sliceShape.fillColor = [UIColor colorWithWhite:0 alpha:.4].CGColor;
[self.layer addSublayer:sliceShape];
...
...
button.hitTestPath = slicePath;
In a separate method in my button subclass I detect whether it contains the point or not (self here is the button, of course):
...
if ([self.hitTestPath containsPoint:touchPosition]) {
    if (key.alpha > 0 && !key.isHidden) return YES;
    else return NO;
}
else return NO;
You completely misunderstood the property; it is meant for assistive technology. Excerpt from the docs:
"The default value of this property is nil. If no path is set, the accessibility frame rectangle is used to highlight the element.
When you specify a value for this property, the assistive technology uses the path object you specify (in addition to the accessibility frame, and not in place of it) to highlight the element."
You can only get the path from a CAShapeLayer; all other CALayers don't need to be drawn from a path at all.
Update to your update:
I think the offset is due to a missing
UIView convert(_ point: CGPoint, to view: UIView?)
call. The point needs to be converted to the button's coordinate system.
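In Objective-C terms, that means bringing the touch point and the hit-test path into the same coordinate space before calling containsPoint:. A hedged sketch, assuming slicePath was built in the curved container view's coordinate space and that containerView (a name not in the original code) refers to that view:
// Inside the button subclass; touchPosition is assumed to be in the button's
// own coordinate space, while hitTestPath was built in containerView's space.
CGPoint pointInPathSpace = [self convertPoint:touchPosition toView:containerView];
if ([self.hitTestPath containsPoint:pointInPathSpace]) {
    // The touch landed inside this slice.
}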

How to clear and redraw contents of a UIView (present inside a view controller) with swift?

I have a UIView inside a view controller in which I'm drawing a few lines as required by my app. After a certain point in time, I want some of those lines to disappear and a few others to appear in the same view. The approach I'm using as of now is to clear the UIView and redraw all the lines I want in the updated view.
Can somebody tell me the right way to go about it? I've gone through various questions that sound similar, but they haven't helped much. So far I've tried things like:
outletView.setNeedsDisplay()
and
let context = UIGraphicsGetCurrentContext()
context?.clear(outletView.frame)
None of these seem to make any difference.
If I call viewDidLoad() again, the new lines to be drawn come up, but the ones that were supposed to disappear don't go away. The line variables are updated correctly, since other logic that checks their values works fine after the update is supposed to happen. The only problem is the redraw part. In fact, if I understand this correctly, the problem is only with clearing the old UIView contents; if the clearing happened properly, redrawing via viewDidLoad would show the correct lines.
P.S. I know that calling viewDidLoad() explicitly isn't good practice. I hope to find a solution to this problem without having to call viewDidLoad again.
Maybe you could draw your lines in different layers of the view, delete the layer containing the lines that need to disappear and create a new layer for the new lines. You can draw in layoutSubviews() and use self.setNeedsLayout() when you need to update the view.
remove:
guard let sublayers = yourView.layer.sublayers else { return }
for layer in sublayers {
layer.removeFromSuperlayer()
}
add:
let linesPath = UIBezierPath()
let linesLayer = CAShapeLayer()
linesPath.move(to: CGPoint(x: 0, y: 0))
linesPath.addLine(to: CGPoint(x: 50, y: 100))
linesLayer.path = linesPath.cgPath
linesLayer.lineWidth = 1.0
linesLayer.strokeColor = UIColor.black.cgColor
yourView.layer.addSublayer(linesLayer)

A long CGPath does not render

I work with the SpriteKit engine and am faced with the following problem. I need to draw a route on a map; for this I make an array of points and create a path from the array with the CGPathAddLines method. Everything worked fine until I tried to build a route between two distant points on the map: the path of that route is not rendered.
I started to investigate the problem. I get the bounding box of my path with CGPathGetBoundingBox, and I noticed that the route is not drawn whenever the width of that rect is more than 2005 px. I know it sounds strange, but in my case it really is so. Below is the part of the code used to create and display the path:
var pathToDraw = CGPathCreateMutable()
let pathPoints = generatePathPoints()
CGPathAddLines(pathToDraw, nil, pathPoints, pathPoints.count)
var shapeNode = SKShapeNode()
shapeNode.path = pathToDraw
shapeNode.lineWidth = 10
shapeNode.strokeColor = UIColor.blueColor()
var effectNode = SKEffectNode()
effectNode.addChild(shapeNode)
mapNode.addChild(effectNode)
generatePathPoints is a function that returns [CGPoint]
mapNode is an object of type SKSpriteNode
Maybe I'm doing something wrong, or is this a restriction of CGPath that I don't know about?
If it works well for short paths and then stops working for long ones, I'll assume (as @matt did) that it's because of a limitation of SKShapeNode.
Have you tried creating more than one SKShapeNode with shorter paths and putting all of them inside another SKNode? With this solution you could work with the parent SKNode (full of smaller SKShapeNodes) as if it were your bigger SKShapeNode, but (I hope) all the shorter paths will render correctly...
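A rough Objective-C sketch of the idea, splitting the points into shorter runs and parenting each run's SKShapeNode under one SKNode; chunkSize is an arbitrary illustrative value (not a known SKShapeNode limit), and points/pointCount stand in for your generated point array:
const NSUInteger chunkSize = 50;
SKNode *routeNode = [SKNode node];

NSUInteger i = 0;
while (i + 1 < pointCount) {
    NSUInteger len = MIN(chunkSize, pointCount - i);
    CGMutablePathRef subPath = CGPathCreateMutable();
    CGPathAddLines(subPath, NULL, &points[i], len);

    SKShapeNode *segment = [SKShapeNode node];
    segment.path = subPath;
    segment.lineWidth = 10;
    segment.strokeColor = [SKColor blueColor];
    [routeNode addChild:segment];
    CGPathRelease(subPath);

    // Overlap by one point so consecutive segments join without a gap.
    i += len - 1;
}
[mapNode addChild:routeNode];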

How to get a CAEmitterLayer from Canvas

I am trying to follow the tutorial about iOS particle systems here: http://www.raywenderlich.com/6063/uikit-particle-systems-in-ios-5-tutorial
I am having trouble casting the self.canvas.layer in C4Workspace.m to a CAEmitterLayer. The code compiles just fine but fails at runtime.
I tried this:
particlesystem = (CAEmitterLayer *)self.canvas.layer;
But I receive this error every time.
-[C4Layer setEmitterPosition:]: unrecognized selector sent to instance 0xa183830
It seems that I am not casting or exposing methods properly. How might I do this?
You cannot simply cast one layer as another. In order for a view to have a non-standard layer, you need to subclass it and define the +layerClass method:
@implementation MyViewSubclass
+ (Class)layerClass {
return [CAEmitterLayer class];
}
...
Unfortunately for your case, the view you're working with has already set up a custom layer, C4Layer, which can be seen on GitHub. This layer is doing a lot and you don't want to try replacing it.
What you can do is insert your own sublayer into your canvas:
CAEmitterLayer *myLayer = [CAEmitterLayer layer];
myLayer.frame = self.canvas.bounds;
[self.canvas.layer addSublayer:myLayer];
This emitter layer will now overlay your layer and you can add any effects you want. If you want the emitter below other layers, you can use insertSublayer:myLayer atIndex:0.
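To actually see particles on that layer you also need at least one CAEmitterCell and an emitter position. A minimal sketch, assuming an image named spark.png exists in the bundle; the particular cell values are illustrative, not taken from the tutorial:
CAEmitterCell *cell = [CAEmitterCell emitterCell];
cell.contents = (__bridge id)[UIImage imageNamed:@"spark.png"].CGImage;
cell.birthRate = 20;
cell.lifetime = 3.0;
cell.velocity = 60;
cell.emissionRange = 2.0 * M_PI;

// Emit from the centre of the canvas.
myLayer.emitterPosition = CGPointMake(CGRectGetMidX(self.canvas.bounds),
                                      CGRectGetMidY(self.canvas.bounds));
myLayer.emitterCells = @[cell];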

Does MKOverlayPathView need drawMapRect?

I'm having some inconsistencies modifying the Breadcrumb example to have CrumbPathView subclass MKOverlayPathView (as it's supposed to) rather than MKOverlayView.
Trouble is, the docs say little about the difference in how these two should be implemented. For a subclass of MKOverlayPathView it's advised to use:
- createPath
- applyStrokePropertiesToContext:atZoomScale:
- strokePath:inContext:
But is this in place of drawMapRect:, or in addition to it? There doesn't seem to be much point if it's in addition, because both would be used for similar implementations. But using these instead of drawMapRect: leaves the line choppy and broken.
I'm also struggling to find any real-world examples of subclassing MKOverlayPathView... is there any point?
UPDATE - modified code from drawMapRect, to what should work:
- (void)createPath
{
    CrumbPath *crumbs = (CrumbPath *)(self.overlay);
    CGMutablePathRef newPath = [self createPathForPoints:crumbs.points
                                               pointCount:crumbs.pointCount];
    if (newPath != nil) {
        CGPathAddPath(newPath, NULL, self.path);
        [self setPath:newPath];
    }
    CGPathRelease(newPath);
}

- (void)applyStrokePropertiesToContext:(CGContextRef)context atZoomScale:(MKZoomScale)zoomScale
{
    CGContextSetStrokeColorWithColor(context, [[UIColor greenColor] CGColor]);
    CGFloat lineWidth = MKRoadWidthAtZoomScale(zoomScale);
    CGContextSetLineWidth(context, lineWidth);
    CGContextSetLineJoin(context, kCGLineJoinRound);
    CGContextSetLineCap(context, kCGLineCapRound);
}

- (void)strokePath:(CGPathRef)path inContext:(CGContextRef)context
{
    CGContextAddPath(context, path);
    CGContextStrokePath(context);
    [self setPath:path];
}
This draws an initial line, but fails to continue it... it doesn't add to the path. I've confirmed that applyStrokePropertiesToContext:atZoomScale: and strokePath:inContext: are getting called on every new location.
Here's a screenshot of the broken line that results (it draws for createPath, but not after that):
Here's a screenshot of the "choppy" path that happens when drawMapRect is included with createPath:
Without having seen more of your code I'm guessing, but here goes.
I suspect the path is being broken into segments A->B, C->D, E->F rather than being one path with points A, B, C, D, E and F. To be sure of that we'd need to see what is happening to self.overlay and whether it is being reset at any point.
In strokePath:inContext: you set self.path to be the one that is being stroked. I doubt that is a good idea, since the stroking could happen at any time, just like viewForAnnotation:.
As for the choppiness, it may be a side effect of a poor bounds calculation on Apple's part. If your line ends near the boundary of one of the tiles Apple uses to cover the map, it would probably only prompt the map to draw the tile the line is within, but your stroke width extends into a neighbouring tile that hasn't been drawn. I'm guessing again, but you could test this by moving the point that is just north of the W in "Queen St W" a fraction south, or by increasing the stroke width, and seeing if the cut-off line stays in the same place geographically.
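If the segmenting theory is right, one way to test it is to rebuild the whole path from every crumb point on each createPath call instead of appending the previous self.path. A hedged sketch, assuming crumbs.points is a C array of MKMapPoint as in Apple's Breadcrumb sample:
- (void)createPath
{
    CrumbPath *crumbs = (CrumbPath *)self.overlay;
    CGMutablePathRef newPath = CGPathCreateMutable();

    for (NSUInteger i = 0; i < crumbs.pointCount; i++) {
        // pointForMapPoint: converts a map point into this view's drawing space.
        CGPoint p = [self pointForMapPoint:crumbs.points[i]];
        if (i == 0) {
            CGPathMoveToPoint(newPath, NULL, p.x, p.y);
        } else {
            CGPathAddLineToPoint(newPath, NULL, p.x, p.y);
        }
    }

    [self setPath:newPath];
    CGPathRelease(newPath);
}
Combined with removing the [self setPath:path] line from strokePath:inContext:, this keeps the overlay's path a single continuous polyline.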
