iOS hitTest withEvent

I'm attempting to get a reference to a UIImageView that is underneath a masked UIImageView using hitTest:withEvent:. Here is what I have that is not working:
UIView A contains 3 UIImageViews as subviews: panelOne, panelTwo, and panelThree. panelThree takes up the entire frame but is masked into a triangle, revealing parts of panels one and two. So I need to detect when a user taps outside of that triangle and send the touch to the appropriate UIImageView.
Code: (CollagePanel is a subclass of UIImageView)
-(void)triangleInASquare
{
CGSize size = self.frame.size;
CollagePanel *panelOne = [[CollagePanel alloc] initWithFrame:CGRectMake(0,0, size.width/2, size.height)];
panelOne.panelScale = panelOne.frame.size.width/self.frame.size.width;
panelOne.backgroundColor = [UIColor greenColor];
CollagePanel *panelTwo = [[CollagePanel alloc] initWithFrame:CGRectMake(size.width/2,0, size.width/2, size.height)];
panelTwo.panelScale = panelTwo.frame.size.width/self.frame.size.width;
panelTwo.backgroundColor = [UIColor purpleColor];
CollagePanel *panelThree = [[CollagePanel alloc] initWithFrame:CGRectMake(0,0, size.width, size.height)];
panelThree.backgroundColor = [UIColor orangeColor];
UIBezierPath* trianglePath = [UIBezierPath bezierPath];
[trianglePath moveToPoint:CGPointMake(0, panelThree.frame.size.height)];
[trianglePath addLineToPoint:CGPointMake(panelThree.frame.size.width/2,0)];
[trianglePath addLineToPoint:CGPointMake(panelThree.frame.size.width, panelTwo.frame.size.height)];
[trianglePath closePath];
// Mask the panels's layer to triangle.
CAShapeLayer *triangleMaskLayer = [CAShapeLayer layer];
[triangleMaskLayer setPath:trianglePath.CGPath];
triangleMaskLayer.strokeColor = [[UIColor whiteColor] CGColor];
panelThree.layer.mask = triangleMaskLayer;
//Add border
CAShapeLayer *borderLayer = [CAShapeLayer layer];
borderLayer.strokeColor = [[UIColor whiteColor] CGColor];
borderLayer.fillColor = [[UIColor clearColor] CGColor];
borderLayer.lineWidth = 6;
[borderLayer setPath:trianglePath.CGPath];
[panelThree.layer addSublayer:borderLayer];
NSMutableArray *tempArray = [[NSMutableArray alloc] init];
[tempArray addObject:panelOne];
[tempArray addObject:panelTwo];
[tempArray addObject:panelThree];
[self addGestureRecognizersToPanelsInArray:tempArray];
[self addPanelsFromArray:tempArray];
self.panelArray = tempArray;
}
-(void)handleTap: (UITapGestureRecognizer*) recognizer //coming from panel.imageView
{
CGPoint tapPoint = [recognizer locationInView:self];
NSLog(#"Location in self: %#", NSStringFromCGPoint(tapPoint));
NSLog(#"self.subviews: %#", self.subviews);
UIView *bottomView = [self hitTest:tapPoint withEvent:nil];
NSLog(#"Bottom View: %#", bottomView);
}
The NSLog of bottomView is always panelThree (the topmost panel). From what I understand the hit test should be returning the "bottom most" subview.

You understand wrong: it will return the view that recognizes itself as touched and is nearest to your finger, i.e. nearest to the top of the view hierarchy.
If a view should not recognize itself as touched for a certain point, you need to override
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
for that view.
I think Ole Begemann's Shaped Button is a great example of how to do so.
In your project this method could determine whether a point lies within the mask's path, using CGPathContainsPoint.
Your pointInside:withEvent: might look like this:
#import "CollagePanel.h"
#import <QuartzCore/QuartzCore.h>
@implementation CollagePanel
//
// ...
//
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
CGPoint p = [self convertPoint:point toView:[self superview]];
if(self.layer.mask){
if (CGPathContainsPoint([(CAShapeLayer *)self.layer.mask path], NULL, p, YES) )
return YES;
}else {
if(CGRectContainsPoint(self.layer.frame, p))
return YES;
}
return NO;
}
@end
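If you would rather not touch CollagePanel at all, the same mask check can live in the container view instead. Here is a minimal sketch (the method name is illustrative; it assumes self.panelArray is ordered bottom-to-top, the way triangleInASquare builds it) that resolves the tapped panel by hand:
- (CollagePanel *)panelForTapAtPoint:(CGPoint)tapPoint
{
    // Walk the panels from topmost to bottommost and return the first one
    // whose mask path (or, failing that, whose bounds) contains the tap.
    for (CollagePanel *panel in [self.panelArray reverseObjectEnumerator]) {
        CGPoint p = [self convertPoint:tapPoint toView:panel];
        CAShapeLayer *mask = (CAShapeLayer *)panel.layer.mask;
        if (mask) {
            if (CGPathContainsPoint(mask.path, NULL, p, false)) {
                return panel;
            }
        } else if ([panel pointInside:p withEvent:nil]) {
            return panel;
        }
    }
    return nil;
}
You could then call this from handleTap: instead of relying on hitTest:withEvent:.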

Related

Drawing / Editing a line in iOS

I am trying to draw and edit multiple lines on iOS. Each line has a UIView at each end acting as a handle, so once the line is drawn a user can drag each end and the line will redraw.
I'm currently using a UIBezierPath to draw a CAShapeLayer on the view. The issue I have is working out the best way to then draw another one, so that whichever line is tapped is the one the user can edit.
Does anyone have any ideas about the best way to do this? Is CAShapeLayer the best option?
Video Link that might show better what I'm trying to achieve.
https://www.dropbox.com/s/8rpt2azrs3uk6vr/Line%20Example.mov?dl=0
An example of the code I have done so far is below:
//Drawing a Line
Here I create a path from two touch points and then draw a CAShapeLayer on the view. I also create a custom object 'Line' to store the path and shape layer.
-(void)DrawLineFrom:(CGPoint)pointA to:(CGPoint)pointB
{
NSLog(#"Drawing line X:%f Y:%f - X:%f Y:%f", pointA.x, pointA.y, pointB.x, pointB.y);
UIBezierPath* path = [[UIBezierPath alloc]init];
[path moveToPoint:pointA];
[path addLineToPoint:pointB];
[path addLineToPoint:CGPointMake(pointB.x, pointB.y+2)];
[path addLineToPoint:CGPointMake(pointA.x, pointA.y+2)];
[path addLineToPoint:pointA];
[path closePath];
currentLine.bPath = path;
if (!shapeLayer)
{
shapeLayer = [LineLayer layer];
[shapeLayer setFrame:self.view.frame];
shapeLayer.path = path.CGPath;
shapeLayer.strokeColor = [UIColor redColor].CGColor; //etc...
shapeLayer.lineWidth = 2.0; //etc...
shapeLayer.parent = currentLine;
currentLine.shapeLayer = shapeLayer;
[self.view.layer addSublayer:shapeLayer];
}
else
{
shapeLayer.path = path.CGPath;
}
[self ExitDrawMode];
}
//Creating a Handle
This creates the two views at either end and adds the Long Press Gesture.
-(HandleView *)MakeLineHandleForPoint:(int)point atLocation:(CGPoint)loc
{
HandleView *pointView = [[HandleView alloc]initWithFrame:CGRectMake(loc.x-10, loc.y-10, 20, 20)];
pointView.layer.cornerRadius = 10;
pointView.backgroundColor = [UIColor redColor];
UILongPressGestureRecognizer *LP = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLp:)];
[pointView addGestureRecognizer:LP];
LP.delegate = self;
LP.minimumPressDuration = 0.0;
pointView.userInteractionEnabled = YES;
pointView.tag = point;
pointView.lineParent = currentLine;
return pointView;
}
Finally, the gesture is handled here. This works fine; however, it will always move the last line drawn, even if I have selected the first one.
-(void)handleLp:(UILongPressGestureRecognizer *)sender
{
CGPoint loc = [sender locationInView:self.view];
[sender view].center = loc;
HandleView *handleView = [sender view];
if ([sender view].tag == 0) {
currentLine.pointA = loc;
[self DrawLineFrom:loc to:currentLine.pointB];
}
if ([sender view].tag == 1) {
currentLine.pointB = loc;
[self DrawLineFrom:currentLine.pointA to:loc];
}
}
Any help much appreciated. Thanks in advance.
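One likely cause, judging only from the code shown (so treat this as a hedged sketch rather than a confirmed fix): handleLp: always redraws via the shared currentLine ivar, rather than the line that owns the touched handle. Since each HandleView is already given a lineParent, the handler could switch to that line first:
-(void)handleLp:(UILongPressGestureRecognizer *)sender
{
    CGPoint loc = [sender locationInView:self.view];
    HandleView *handleView = (HandleView *)[sender view];
    handleView.center = loc;
    // Edit the line that owns this handle, not the last line drawn.
    currentLine = handleView.lineParent;
    shapeLayer = currentLine.shapeLayer;
    if (handleView.tag == 0) {
        currentLine.pointA = loc;
        [self DrawLineFrom:loc to:currentLine.pointB];
    }
    if (handleView.tag == 1) {
        currentLine.pointB = loc;
        [self DrawLineFrom:currentLine.pointA to:loc];
    }
}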

iOS Clear UIView and redraw drawRect:

I want my custom view to clear everything (subviews, sublayers, drawings, etc.) in it and call drawRect: again when pressing a button, so it will be like getting a clean slate and painting on it again.
Right now, I've tried using setNeedsDisplay and it just draws over the whole thing without cleaning the previous work.
A snippet of code in my drawRect:
if (_displayYAxisAverageLine) {
// Check if y axis data are numbers
for (id yValue in yAxisValues) {
if (![yValue isKindOfClass:[NSNumber class]]) {
return;
}
}
NSNumber *average = [yAxisValues valueForKeyPath:@"@avg.self"];
CGFloat scale = [self scaleForYValue:average];
CGFloat avgLineY = roundf(graphHeight * scale)+_graphInset;
UIBezierPath *path = [UIBezierPath bezierPath];
[path moveToPoint:CGPointMake(0, avgLineY)];
[path addLineToPoint:CGPointMake(contentWidth, avgLineY)];
CAShapeLayer *avgLine = [CAShapeLayer layer];
avgLine.path = path.CGPath;
avgLine.fillColor = [UIColor clearColor].CGColor;
avgLine.strokeColor = _borderColor.CGColor;
avgLine.lineWidth = 1.f;
avgLine.lineDashPattern = @[@2, @2];
[self.contentView.layer addSublayer:avgLine];
}
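Since this snippet adds a fresh CAShapeLayer to contentView's layer on every drawRect: pass, setNeedsDisplay alone will never remove the earlier ones. A hedged sketch of a reset action wired to the button (the method name is only illustrative, and it assumes everything drawn so far was added as a subview or sublayer of contentView):
- (void)clearAndRedraw
{
    // Drop the accumulated subviews and sublayers, then ask for a fresh draw.
    for (UIView *subview in [self.contentView.subviews copy]) {
        [subview removeFromSuperview];
    }
    for (CALayer *sublayer in [self.contentView.layer.sublayers copy]) {
        [sublayer removeFromSuperlayer];
    }
    [self setNeedsDisplay];
}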

Handling touches in CALayer

I'm trying to detect touches in my view.
My code:
- (CALayer *)layerForTouch:(UITouch *)touch {
UIView *view = self;
CGPoint location = [touch locationInView:view];
location = [view convertPoint:location toView:nil];
CALayer *hitPresentationLayer = [view.layer.presentationLayer hitTest:location];
if (hitPresentationLayer) {
return hitPresentationLayer.modelLayer;
}
return nil;
}
But I found a problem: it detects only the first (topmost) layer in the tree.
Every figure on the screenshot is one layer. I'm drawing them from the biggest to the smallest, as a tree.
If your topmost layer covers the whole screen (or the other layers below it), then yes, you will get the topmost layer from the hitTest check. You can do a containsPoint check on all your layers if you want to check further down. Find some sample code below.
@interface ViewController ()
@property (nonatomic, strong) CALayer *layer1;
@property (nonatomic, strong) CALayer *layer2;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
CALayer *layer = [[CALayer alloc] init];
layer.frame = (CGRect) { 10, 10, 100, 100 };
[self.view.layer addSublayer:layer];
layer.backgroundColor = [UIColor redColor].CGColor;
self.layer1 = layer;
layer = [[CALayer alloc] init];
layer.frame = (CGRect) { 60, 60, 100, 100 };
[self.view.layer addSublayer:layer];
layer.backgroundColor = [UIColor blueColor].CGColor;
self.layer2 = layer;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touchPoint = [(UITouch*)[touches anyObject] locationInView:self.view];
CALayer *touchedLayer = [self.view.layer.presentationLayer hitTest:touchPoint];
CALayer *originalLayer = [touchedLayer modelLayer];
if (originalLayer == self.layer1) { NSLog(@"layer 1 touched"); }
if (originalLayer == self.layer2) { NSLog(@"layer 2 touched"); }
for ( CALayer *layer in @[self.layer1, self.layer2])
{
CGPoint point = [layer convertPoint:touchPoint fromLayer:self.view.layer];
NSLog(#"layer %# containsPoint ? %#",layer, [layer containsPoint:point]? #"YES":#"NO");
NSLog(#"original point : %# | converted point : %#",NSStringFromCGPoint(touchPoint),NSStringFromCGPoint(point));
}
}
@end
Or you can try using CAShapeLayers and matching against the shape layer's CGPath to see if it's touched, as in the example below.
@interface ViewController ()
@property (nonatomic, strong) CALayer *layer1;
@property (nonatomic, strong) CALayer *layer2;
@end
@implementation ViewController
- (void)viewDidLoad
{
[super viewDidLoad];
CAShapeLayer *shapeLayer = [[CAShapeLayer alloc] init];
shapeLayer.path = [UIBezierPath bezierPathWithRect:(CGRect){ 0,0,100,10 }].CGPath;
shapeLayer.backgroundColor = [UIColor redColor].CGColor;
shapeLayer.frame = (CGRect){20, 20, 100, 100};
[self.view.layer addSublayer:shapeLayer];
self.layer1 = shapeLayer;
shapeLayer = [[CAShapeLayer alloc] init];
shapeLayer.path = [UIBezierPath bezierPathWithRect:(CGRect){ 0,0,10,100 }].CGPath;
shapeLayer.backgroundColor = [[UIColor blueColor] colorWithAlphaComponent:0.6].CGColor;
shapeLayer.frame = (CGRect){60, 60, 100, 100};
[self.view.layer addSublayer:shapeLayer];
self.layer2 = shapeLayer;
}
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
{
CGPoint touchPoint = [(UITouch*)[touches anyObject] locationInView:self.view];
CALayer *touchedLayer = [self.view.layer.presentationLayer hitTest:touchPoint];
CALayer *originalLayer = [touchedLayer modelLayer];
for ( CAShapeLayer *layer in @[self.layer1, self.layer2])
{
if (![layer isKindOfClass:[CAShapeLayer class]]) { continue; }
CGPoint point = [layer convertPoint:touchPoint fromLayer:self.view.layer];
if (CGPathContainsPoint(layer.path, 0, point, 0))
{
NSLog(#"match found");
if (originalLayer == self.layer1) { NSLog(#"layer 1 touched"); }
if (originalLayer == self.layer2) { NSLog(#"layer 2 touched"); }
return;
}
}
}
@end

Cut Out Shape with Animation

I want to do something similar to the following:
How to mask an image in IOS sdk?
I want to cover the entire screen with translucent black. Then, I want to cut a circle out of the translucent black covering so that you can see through clearly. I'm doing this to highlight parts of the screen for a tutorial.
I then want to animate the cut-out circle to other parts of the screen. I also want to be able to stretch the cut-out circle horizontally & vertically, as you would do with a generic button background image.
(UPDATE: Please see also my other answer which describes how to set up multiple independent, overlapping holes.)
Let's use a plain old UIView with a backgroundColor of translucent black, and give its layer a mask that cuts a hole out of the middle. We'll need an instance variable to reference the hole view:
@implementation ViewController {
UIView *holeView;
}
After loading the main view, we want to add the hole view as a subview:
- (void)viewDidLoad {
[super viewDidLoad];
[self addHoleSubview];
}
Since we want to move the hole around, it will be convenient to make the hole view be very large, so that it covers the rest of the content regardless of where it's positioned. We'll make it 10000x10000. (This doesn't take up any more memory because iOS doesn't automatically allocate a bitmap for the view.)
- (void)addHoleSubview {
holeView = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 10000, 10000)];
holeView.backgroundColor = [UIColor colorWithWhite:0.0 alpha:0.5];
holeView.autoresizingMask = 0;
[self.view addSubview:holeView];
[self addMaskToHoleView];
}
Now we need to add the mask that cuts a hole out of the hole view. We'll do this by creating a compound path consisting of a huge rectangle with a smaller circle at its center. We'll fill the path with black, leaving the circle unfilled and therefore transparent. The black part has alpha=1.0 and so it makes the hole view's background color show. The transparent part has alpha=0.0, so that part of the hole view is also transparent.
- (void)addMaskToHoleView {
CGRect bounds = holeView.bounds;
CAShapeLayer *maskLayer = [CAShapeLayer layer];
maskLayer.frame = bounds;
maskLayer.fillColor = [UIColor blackColor].CGColor;
static CGFloat const kRadius = 100;
CGRect const circleRect = CGRectMake(CGRectGetMidX(bounds) - kRadius,
CGRectGetMidY(bounds) - kRadius,
2 * kRadius, 2 * kRadius);
UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:circleRect];
[path appendPath:[UIBezierPath bezierPathWithRect:bounds]];
maskLayer.path = path.CGPath;
maskLayer.fillRule = kCAFillRuleEvenOdd;
holeView.layer.mask = maskLayer;
}
Notice that I've put the circle at the center of the 10000x10000 view. This means that we can just set holeView.center to set the center of the circle relative to the other content. So, for example, we can easily animate it up and down over the main view:
- (void)viewDidLayoutSubviews {
CGRect const bounds = self.view.bounds;
holeView.center = CGPointMake(CGRectGetMidX(bounds), 0);
// Defer this because `viewDidLayoutSubviews` can happen inside an
// autorotation animation block, which overrides the duration I set.
dispatch_async(dispatch_get_main_queue(), ^{
[UIView animateWithDuration:2 delay:0
options:UIViewAnimationOptionRepeat
| UIViewAnimationOptionAutoreverse
animations:^{
holeView.center = CGPointMake(CGRectGetMidX(bounds),
CGRectGetMaxY(bounds));
} completion:nil];
});
}
Here's what it looks like:
But it's smoother in real life.
You can find a complete working test project in this github repository.
This is not a simple one. I can get you a good bit of the way there. It's the animating that is tricky. Here's the output of some code I threw together:
The code is like this:
- (void)viewDidLoad
{
[super viewDidLoad];
// Create a containing layer and set it contents with an image
CALayer *containerLayer = [CALayer layer];
[containerLayer setBounds:CGRectMake(0.0f, 0.0f, 500.0f, 320.0f)];
[containerLayer setPosition:[[self view] center]];
UIImage *image = [UIImage imageNamed:@"cool"];
[containerLayer setContents:(id)[image CGImage]];
// Create your translucent black layer and set its opacity
CALayer *translucentBlackLayer = [CALayer layer];
[translucentBlackLayer setBounds:[containerLayer bounds]];
[translucentBlackLayer setPosition:
CGPointMake([containerLayer bounds].size.width/2.0f,
[containerLayer bounds].size.height/2.0f)];
[translucentBlackLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[translucentBlackLayer setOpacity:0.45];
[containerLayer addSublayer:translucentBlackLayer];
// Create a mask layer with a shape layer that has a circle path
CAShapeLayer *maskLayer = [CAShapeLayer layer];
[maskLayer setBorderColor:[[UIColor purpleColor] CGColor]];
[maskLayer setBorderWidth:5.0f];
[maskLayer setBounds:[containerLayer bounds]];
// When you create a path, remember that origin is in upper left hand
// corner, so you have to treat it as if it has an anchor point of 0.0,
// 0.0
UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:
CGRectMake([translucentBlackLayer bounds].size.width/2.0f - 100.0f,
[translucentBlackLayer bounds].size.height/2.0f - 100.0f,
200.0f, 200.0f)];
// Append a rectangular path around the mask layer so that
// we can use the even/odd fill rule to invert the mask
[path appendPath:[UIBezierPath bezierPathWithRect:[maskLayer bounds]]];
// Set the path's fill color since layer masks depend on alpha
[maskLayer setFillColor:[[UIColor blackColor] CGColor]];
[maskLayer setPath:[path CGPath]];
// Center the mask layer in the translucent black layer
[maskLayer setPosition:
CGPointMake([translucentBlackLayer bounds].size.width/2.0f,
[translucentBlackLayer bounds].size.height/2.0f)];
// Set the fill rule to even odd
[maskLayer setFillRule:kCAFillRuleEvenOdd];
// Set the translucent black layer's mask property
[translucentBlackLayer setMask:maskLayer];
// Add the container layer to the view so we can see it
[[[self view] layer] addSublayer:containerLayer];
}
You would have to animate the mask layer which you could build up based on user input, but it will be a bit challenging. Notice the lines where I append a rectangular path to the circle path and then set the fill rule a few lines later on the shape layer. These are what make the inverted mask possible. If you leave those out you will instead show the translucent black in the center of the circle and then nothing on the outer part (if that makes sense).
Maybe try to play with this code a bit and see if you can get it animating. I'll play with it some more as I have time, but this is a pretty interesting problem. Would love to see a complete solution.
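For instance, because the hole is just a compound CAShapeLayer path, one hedged sketch of animating it (reusing the maskLayer variable from the code above; the frame values are placeholders) is to rebuild the path with the circle somewhere else and animate the layer's path property, which interpolates cleanly when both paths have the same structure:
// New compound path: moved circle plus the same bounding rectangle.
UIBezierPath *newPath = [UIBezierPath bezierPathWithOvalInRect:
    CGRectMake(50.0f, 50.0f, 200.0f, 200.0f)];
[newPath appendPath:[UIBezierPath bezierPathWithRect:[maskLayer bounds]]];
CABasicAnimation *pathAnimation = [CABasicAnimation animationWithKeyPath:@"path"];
[pathAnimation setFromValue:(id)[maskLayer path]];
[pathAnimation setToValue:(id)[newPath CGPath]];
[pathAnimation setDuration:1.0];
[maskLayer addAnimation:pathAnimation forKey:@"path"];
[maskLayer setPath:[newPath CGPath]]; // keep the model value in sync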
UPDATE: So here's another stab at it. The trouble here is that this one makes the translucent mask look white instead of black, but the upside is that circle can be animated pretty easily.
This one builds up a composite layer with the translucent layer and the circle layer being siblings inside of a parent layer that gets used as the mask.
I added a basic animation to this one so we could see the circle layer animate.
- (void)viewDidLoad
{
[super viewDidLoad];
CGRect baseRect = CGRectMake(0.0f, 0.0f, 500.0f, 320.0f);
CALayer *containerLayer = [CALayer layer];
[containerLayer setBounds:baseRect];
[containerLayer setPosition:[[self view] center]];
UIImage *image = [UIImage imageNamed:@"cool"];
[containerLayer setContents:(id)[image CGImage]];
CALayer *compositeMaskLayer = [CALayer layer];
[compositeMaskLayer setBounds:baseRect];
[compositeMaskLayer setPosition:CGPointMake([containerLayer bounds].size.width/2.0f, [containerLayer bounds].size.height/2.0f)];
CALayer *translucentLayer = [CALayer layer];
[translucentLayer setBounds:baseRect];
[translucentLayer setBackgroundColor:[[UIColor blackColor] CGColor]];
[translucentLayer setPosition:CGPointMake([containerLayer bounds].size.width/2.0f, [containerLayer bounds].size.height/2.0f)];
[translucentLayer setOpacity:0.35];
[compositeMaskLayer addSublayer:translucentLayer];
CAShapeLayer *circleLayer = [CAShapeLayer layer];
UIBezierPath *circlePath = [UIBezierPath bezierPathWithOvalInRect:CGRectMake(0.0f, 0.0f, 200.0f, 200.0f)];
[circleLayer setBounds:CGRectMake(0.0f, 0.0f, 200.0f, 200.0f)];
[circleLayer setPosition:CGPointMake([containerLayer bounds].size.width/2.0f, [containerLayer bounds].size.height/2.0f)];
[circleLayer setPath:[circlePath CGPath]];
[circleLayer setFillColor:[[UIColor blackColor] CGColor]];
[compositeMaskLayer addSublayer:circleLayer];
[containerLayer setMask:compositeMaskLayer];
[[[self view] layer] addSublayer:containerLayer];
CABasicAnimation *posAnimation = [CABasicAnimation animationWithKeyPath:@"position"];
[posAnimation setFromValue:[NSValue valueWithCGPoint:[circleLayer position]]];
[posAnimation setToValue:[NSValue valueWithCGPoint:CGPointMake([circleLayer position].x + 100.0f, [circleLayer position].y + 100)]];
[posAnimation setDuration:1.0f];
[posAnimation setRepeatCount:INFINITY];
[posAnimation setAutoreverses:YES];
[circleLayer addAnimation:posAnimation forKey:@"position"];
}
Here's an answer that works with multiple independent, possibly overlapping spotlights.
I'll set up my view hierarchy like this:
SpotlightsView with black background
UIImageView with `alpha`=.5 (“dim view”)
UIImageView with shape layer mask (“bright view”)
The dim view will appear dimmed because its alpha mixes its image with the black of the top-level view.
The bright view is not dimmed, but it only shows where its mask lets it. So I just set the mask to contain the spotlight areas and nowhere else.
Here's what it looks like:
I'll implement it as a subclass of UIView with this interface:
// SpotlightsView.h
#import <UIKit/UIKit.h>
@interface SpotlightsView : UIView
@property (nonatomic, strong) UIImage *image;
- (void)addDraggableSpotlightWithCenter:(CGPoint)center radius:(CGFloat)radius;
@end
I'll need QuartzCore (also called Core Animation) and the Objective-C runtime to implement it:
// SpotlightsView.m
#import "SpotlightsView.h"
#import <QuartzCore/QuartzCore.h>
#import <objc/runtime.h>
I'll need instance variables for the subviews, the mask layer, and an array of individual spotlight paths:
@implementation SpotlightsView {
UIImageView *_dimImageView;
UIImageView *_brightImageView;
CAShapeLayer *_mask;
NSMutableArray *_spotlightPaths;
}
To implement the image property, I just pass it through to your image subviews:
#pragma mark - Public API
- (void)setImage:(UIImage *)image {
_dimImageView.image = image;
_brightImageView.image = image;
}
- (UIImage *)image {
return _dimImageView.image;
}
To add a draggable spotlight, I create a path outlining the spotlight, add it to the array, and flag myself as needing layout:
- (void)addDraggableSpotlightWithCenter:(CGPoint)center radius:(CGFloat)radius {
UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:CGRectMake(center.x - radius, center.y - radius, 2 * radius, 2 * radius)];
[_spotlightPaths addObject:path];
[self setNeedsLayout];
}
I need to override some methods of UIView to handle initialization and layout. I'll handle being created either programmatically or in a xib or storyboard by delegating the common initialization code to a private method:
#pragma mark - UIView overrides
- (instancetype)initWithFrame:(CGRect)frame
{
if (self = [super initWithFrame:frame]) {
[self commonInit];
}
return self;
}
- (instancetype)initWithCoder:(NSCoder *)aDecoder {
if (self = [super initWithCoder:aDecoder]) {
[self commonInit];
}
return self;
}
I'll handle layout in separate helper methods for each subview:
- (void)layoutSubviews {
[super layoutSubviews];
[self layoutDimImageView];
[self layoutBrightImageView];
}
To drag the spotlights when they are touched, I need to override some UIResponder methods. I want to handle each touch separately, so I just loop over the updated touches, passing each one to a helper method:
#pragma mark - UIResponder overrides
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches){
[self touchBegan:touch];
}
}
- (void)touchesMoved:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches){
[self touchMoved:touch];
}
}
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
[self touchEnded:touch];
}
}
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event {
for (UITouch *touch in touches) {
[self touchEnded:touch];
}
}
Now I'll implement the private appearance and layout methods.
#pragma mark - Implementation details - appearance/layout
First I'll do the common initialization code. I want to set my background color to black, since that is part of making the dimmed image view dim, and I want to support multiple touches:
- (void)commonInit {
self.backgroundColor = [UIColor blackColor];
self.multipleTouchEnabled = YES;
[self initDimImageView];
[self initBrightImageView];
_spotlightPaths = [NSMutableArray array];
}
My two image subviews will be configured mostly the same way, so I'll call another private method to create the dim image view, then tweak it to actually be dim:
- (void)initDimImageView {
_dimImageView = [self newImageSubview];
_dimImageView.alpha = 0.5;
}
I'll call the same helper method to create the bright view, then add its mask sublayer:
- (void)initBrightImageView {
_brightImageView = [self newImageSubview];
_mask = [CAShapeLayer layer];
_brightImageView.layer.mask = _mask;
}
The helper method that creates both image views sets the content mode and adds the new view as a subview:
- (UIImageView *)newImageSubview {
UIImageView *subview = [[UIImageView alloc] init];
subview.contentMode = UIViewContentModeScaleAspectFill;
[self addSubview:subview];
return subview;
}
To lay out the dim image view, I just need to set its frame to my bounds:
- (void)layoutDimImageView {
_dimImageView.frame = self.bounds;
}
To lay out the bright image view, I need to set its frame to my bounds, and I need to update its mask layer's path to be the union of the individual spotlight paths:
- (void)layoutBrightImageView {
_brightImageView.frame = self.bounds;
UIBezierPath *unionPath = [UIBezierPath bezierPath];
for (UIBezierPath *path in _spotlightPaths) {
[unionPath appendPath:path];
}
_mask.path = unionPath.CGPath;
}
Note that this isn't a true union that encloses each point only once. It relies on the fill rule (the default, kCAFillRuleNonZero) to ensure that repeatedly-enclosed points are included in the mask.
Next up, touch handling.
#pragma mark - Implementation details - touch handling
When UIKit sends me a new touch, I'll find the individual spotlight path containing the touch, and attach the path to the touch as an associated object. That means I need an associated object key, which just needs to be some private thing I can take the address of:
static char kSpotlightPathAssociatedObjectKey;
Here I actually find the path and attach it to the touch. If the touch is outside any of my spotlight paths, I ignore it:
- (void)touchBegan:(UITouch *)touch {
UIBezierPath *path = [self firstSpotlightPathContainingTouch:touch];
if (path == nil)
return;
objc_setAssociatedObject(touch, &kSpotlightPathAssociatedObjectKey,
path, OBJC_ASSOCIATION_RETAIN_NONATOMIC);
}
When UIKit tells me a touch has moved, I see if the touch has a path attached. If so, I translate (slide) the path by the amount that the touch has moved since I last saw it. Then I flag myself for layout:
- (void)touchMoved:(UITouch *)touch {
UIBezierPath *path = objc_getAssociatedObject(touch,
&kSpotlightPathAssociatedObjectKey);
if (path == nil)
return;
CGPoint point = [touch locationInView:self];
CGPoint priorPoint = [touch previousLocationInView:self];
[path applyTransform:CGAffineTransformMakeTranslation(
point.x - priorPoint.x, point.y - priorPoint.y)];
[self setNeedsLayout];
}
I don't actually need to do anything when the touch ends or is cancelled. The Objective-C runtime will disassociate the attached path (if there is one) automatically:
- (void)touchEnded:(UITouch *)touch {
// Nothing to do
}
To find the path that contains a touch, I just loop over the spotlight paths, asking each one if it contains the touch:
- (UIBezierPath *)firstSpotlightPathContainingTouch:(UITouch *)touch {
CGPoint point = [touch locationInView:self];
for (UIBezierPath *path in _spotlightPaths) {
if ([path containsPoint:point])
return path;
}
return nil;
}
@end
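For completeness, a minimal usage sketch of the class above (the frame, radii, and asset name are placeholders):
SpotlightsView *spotlightsView = [[SpotlightsView alloc] initWithFrame:self.view.bounds];
spotlightsView.image = [UIImage imageNamed:@"tutorial-background"]; // hypothetical asset
[spotlightsView addDraggableSpotlightWithCenter:CGPointMake(100, 200) radius:60];
[spotlightsView addDraggableSpotlightWithCenter:CGPointMake(220, 400) radius:80];
[self.view addSubview:spotlightsView];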
I have uploaded a full demo to github.
I've been struggling with this same problem and found some great help here on SO so I thought I'd share my solution combining a few different ideas I found online. One additional feature I added was for the cut-out to have a gradient effect. The added benefit to this solution is that it works with any UIView and not just with images.
First subclass UIView to black out everything except the frames you want cut out:
// BlackOutView.h
@interface BlackOutView : UIView
@property (nonatomic, retain) UIColor *fillColor;
@property (nonatomic, retain) NSArray *framesToCutOut;
@end
// BlackOutView.m
@implementation BlackOutView
- (void)drawRect:(CGRect)rect
{
CGContextRef context = UIGraphicsGetCurrentContext();
CGContextSetBlendMode(context, kCGBlendModeDestinationOut);
for (NSValue *value in self.framesToCutOut) {
CGRect pathRect = [value CGRectValue];
UIBezierPath *path = [UIBezierPath bezierPathWithRect:pathRect];
// change to this path for a circular cutout if you don't want a gradient
// UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:pathRect];
[path fill];
}
CGContextSetBlendMode(context, kCGBlendModeNormal);
}
@end
If you don't want the blur effect, then you can swap paths to the oval one and skip the blur mask below. Otherwise, the cutout will be square and filled with a circular gradient.
Create a gradient shape with the center transparent and slowly fading in black:
// BlurFilterMask.h
@interface BlurFilterMask : CAShapeLayer
@property (assign) CGPoint origin;
@property (assign) CGFloat diameter;
@property (assign) CGFloat gradient;
@end
// BlurFilterMask.m
@implementation BlurFilterMask
- (void)drawInContext:(CGContextRef)context
{
CGFloat gradientWidth = self.diameter * 0.5f;
CGFloat clearRegionRadius = self.diameter * 0.25f;
CGFloat blurRegionRadius = clearRegionRadius + gradientWidth;
CGColorSpaceRef baseColorSpace = CGColorSpaceCreateDeviceRGB();
CGFloat colors[8] = { 0.0f, 0.0f, 0.0f, 0.0f, // Clear region colour.
0.0f, 0.0f, 0.0f, self.gradient }; // Blur region colour.
CGFloat colorLocations[2] = { 0.0f, 0.4f };
CGGradientRef gradient = CGGradientCreateWithColorComponents (baseColorSpace, colors, colorLocations, 2);
CGContextDrawRadialGradient(context, gradient, self.origin, clearRegionRadius, self.origin, blurRegionRadius, kCGGradientDrawsAfterEndLocation);
CGColorSpaceRelease(baseColorSpace);
CGGradientRelease(gradient);
}
@end
Now you just need to call these two together and pass in the UIViews that you want cut out:
- (void)viewWillAppear:(BOOL)animated
{
[super viewWillAppear:animated];
[self addMaskInViews:@[self.viewCutout1, self.viewCutout2]];
}
- (void) addMaskInViews:(NSArray *)viewsToCutOut
{
NSMutableArray *frames = [NSMutableArray new];
for (UIView *view in viewsToCutOut) {
view.hidden = YES; // hide the view since we only use their bounds
[frames addObject:[NSValue valueWithCGRect:view.frame]];
}
// Create the overlay passing in the frames we want to cut out
BlackOutView *overlay = [[BlackOutView alloc] initWithFrame:self.view.frame];
overlay.backgroundColor = [UIColor colorWithWhite:0.0 alpha:0.8];
overlay.framesToCutOut = frames;
[self.view insertSubview:overlay atIndex:0];
// add a circular gradient inside each view
for (UIView *maskView in viewsToCutOut)
{
BlurFilterMask *blurFilterMask = [BlurFilterMask layer];
blurFilterMask.frame = maskView.frame;
blurFilterMask.gradient = 0.8f;
blurFilterMask.diameter = MIN(maskView.frame.size.width, maskView.frame.size.height);
blurFilterMask.origin = CGPointMake(maskView.frame.size.width / 2, maskView.frame.size.height / 2);
[self.view.layer addSublayer:blurFilterMask];
[blurFilterMask setNeedsDisplay];
}
}
If you just want something that is plug and play, I added a library to CocoaPods that allows you to create overlays with rectangular/circular holes, allowing the user to interact with views behind the overlay. It is a Swift implementation of similar strategies used in other answers. I used it to create this tutorial for one of our apps:
The library is called TAOverlayView, and is open source under Apache 2.0.
Note: I haven't implemented moving holes yet (unless you move the entire overlay as in other answers).

How to bring UIBezierPath to the back of a MKAnnotation object?

In my app, the user draws a shape on the map and I draw that path using UIBezierPath. Then, based on the coordinates of the path, I display only the results that are in that area. Everything works great except that when annotations drop onto the map view, the pins look like they are behind the path, i.e. the path appears in front.
I am using this code to display the annotations and the path:
-(void)clearAnnotationAndPath:(id)sender {
[_mapView removeAnnotations:_mapView.annotations];
path = [UIBezierPath bezierPath];
[shapeLayer removeFromSuperlayer];
}
- (void)handleGesture:(UIPanGestureRecognizer *)gesture
{
CGPoint location = [gesture locationInView:_pathOverlay];
if (gesture.state == UIGestureRecognizerStateBegan)
{
shapeLayer = [[CAShapeLayer alloc] init];
shapeLayer.fillColor = [[UIColor clearColor] CGColor];
shapeLayer.strokeColor = [[UIColor greenColor] CGColor];
shapeLayer.lineWidth = 5.0;
//[_mapView.layer addSublayer:shapeLayer];
[pathOverlay.layer addSublayer:shapeLayer];
path = [UIBezierPath bezierPath];
[path moveToPoint:location];
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
[path addLineToPoint:location];
shapeLayer.path = [path CGPath];
}
else if (gesture.state == UIGestureRecognizerStateEnded)
{
// MKMapView *mapView = (MKMapView *)gesture.view;
[path addLineToPoint:location];
[path closePath];
allStations = [RoadmapData sharedInstance].data;
for (int i=0; i<[allStations count]; i++) {
NSDictionary * itemNo = [allStations objectAtIndex:i];
NSString * fullAddress = [NSString stringWithFormat:@"%@,%@,%@,%@",[itemNo objectForKey:@"address"],[itemNo objectForKey:@"city"],[itemNo objectForKey:@"state"],[itemNo objectForKey:@"zip"]];
CLGeocoder * geoCoder = [[CLGeocoder alloc]init];
[geoCoder geocodeAddressString:fullAddress completionHandler:^(NSArray *placemarks, NSError *error) {
if (error) {
NSLog(#"Geocode failed with error: %#", error);
return;
}
if(placemarks && placemarks.count > 0)
{
CLPlacemark *placemark = placemarks[0];
CLLocation *location = placemark.location;
CLLocationCoordinate2D coords = location.coordinate;
CGPoint loc = [_mapView convertCoordinate:coords toPointToView:_pathOverlay];
if ([path containsPoint:loc])
{
NSString * name = [itemNo objectForKey:@"name"];
stationAnn = [[LocationAnnotation alloc]initWithCoordinate:coords Title:name subTitle:@"Wells Fargo Offer" annIndex:i];
stationAnn.tag = i;
[_mapView addAnnotation:stationAnn];
}
else{
NSLog(#"Out of boundary");
}
}
}];
[self turnOffGesture:gesture];
}
}
}
- (void)mapView:(MKMapView *)aMapView didAddAnnotationViews:(NSArray *)views{
if (views.count > 0) {
UIView* firstAnnotation = [views objectAtIndex:0];
UIView* parentView = [firstAnnotation superview];
if (_pathOverlay == nil){
// create a transparent view to add bezier paths to
pathOverlay = [[UIView alloc] initWithFrame: parentView.frame];
pathOverlay.opaque = NO;
pathOverlay.backgroundColor = [UIColor clearColor];
[parentView addSubview:pathOverlay];
}
// make sure annotations stay above pathOverlay
for (UIView* view in views) {
[parentView bringSubviewToFront:view];
}
}
}
Also, once I go back from this view and come to it again, it doesn't even draw the path anymore.
Please help.
Thanks,
Apparently, when you add your bezier path to the map via:
[_mapView.layer addSublayer:shapeLayer];
it is getting added above some internal layer that MKMapView uses to draw the annotations. If you take a look at this somewhat related question, you'll see that you can implement the MKMapViewDelegate protocol and get callbacks when new station annotations are added. When this happens, you basically inspect the view hierarchy of the newly added annotation views and insert a new, transparent UIView underneath them. You take care to bring all the annotations in front of this transparent UIView.
// always remember to assign the delegate to get callbacks!
_mapView.delegate = self;
...
#pragma mark - MKMapViewDelegate
- (void)mapView:(MKMapView *)aMapView didAddAnnotationViews:(NSArray *)views{
if (views.count > 0) {
UIView* firstAnnotation = [views objectAtIndex:0];
UIView* parentView = [firstAnnotation superview];
// NOTE: could perform this initialization in viewDidLoad, too
if (self.pathOverlay == nil){
// create a transparent view to add bezier paths to
pathOverlay = [[UIView alloc] initWithFrame: parentView.frame];
pathOverlay.opaque = NO;
pathOverlay.backgroundColor = [UIColor clearColor];
[parentView addSubview:pathOverlay];
}
// make sure annotations stay above pathOverlay
for (UIView* view in views) {
[parentView bringSubviewToFront:view];
}
}
}
Then, instead of adding your shape layer to _mapView.layer, you add it to your transparent view layer, also using this new layer in the coordinate conversion:
- (void)handleGesture:(UIPanGestureRecognizer*)gesture
{
CGPoint location = [gesture locationInView: self.pathOverlay];
if (gesture.state == UIGestureRecognizerStateBegan)
{
if (!shapeLayer)
{
shapeLayer = [[CAShapeLayer alloc] init];
shapeLayer.fillColor = [[UIColor clearColor] CGColor];
shapeLayer.strokeColor = [[UIColor greenColor] CGColor];
shapeLayer.lineWidth = 5.0;
[pathOverlay.layer addSublayer:shapeLayer]; // <- change here !!!
}
self.path = [[UIBezierPath alloc] init];
[path moveToPoint:location];
}
else if (gesture.state == UIGestureRecognizerStateChanged)
{
[path addLineToPoint:location];
shapeLayer.path = [path CGPath];
}
else if (gesture.state == UIGestureRecognizerStateEnded)
{
/*
* This code is the same as what you already have ...
*/
// But replace this next line with the following line ...
//CGPoint loc = [_mapView convertCoordinate:coords toPointToView:self];
CGPoint loc = [_mapView convertCoordinate:coords toPointToView: self.pathOverlay];
/*
* And again use the rest of your original code
*/
}
}
where I also added an ivar (and property) for the new transparent layer:
UIView* pathOverlay;
I tested this with a bogus grid of stations and got the following results:
P.S. I'd also recommend getting rid of your static variables. Just make them ivars/properties of your class.
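For example, the statics could become class-extension properties along these lines (a sketch; the controller name is a placeholder and the property names mirror the variables used above):
@interface MyMapViewController () <MKMapViewDelegate>
@property (nonatomic, strong) UIView *pathOverlay;
@property (nonatomic, strong) CAShapeLayer *shapeLayer;
@property (nonatomic, strong) UIBezierPath *path;
@end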
