I'm trying to modify the path of my CAShapeLayer and then redraw it using setNeedsDisplay,
but for some reason it won't redraw. The only way I can get it to redraw is by calling viewDidLoad again, which I'm sure is not the correct way.
Here is the code for initialisation:
self.shape.lineWidth = 2.0;
[self.shape setFillColor:[[UIColor clearColor] CGColor]];
self.shape.strokeColor = [[UIColor blackColor] CGColor];
[self.shape setPath:[self.shapePath CGPath]];
self.shape.fillRule = kCAFillRuleEvenOdd; // the fill-rule constants are kCAFillRuleNonZero / kCAFillRuleEvenOdd
[self.cuttingView.layer addSublayer:self.shape];
And the gesture recogniser for the drag gesture:
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    static CGPoint lastLocation;
    CGPoint location = [gesture locationInView:self.cuttingView];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        if (![self.shapePath containsPoint:location]) {
            gesture.enabled = NO;
        } else {
            lastLocation = location;
        }
    }

    if (gesture.state == UIGestureRecognizerStateChanged) {
        CGAffineTransform translation = CGAffineTransformMakeTranslation((location.x - lastLocation.x), (location.y - lastLocation.y));
        self.shapePath.CGPath = CGPathCreateCopyByTransformingPath(self.shapePath.CGPath, &translation);
        [self.shape setNeedsDisplay];
    }

    gesture.enabled = YES;
}
I am printing the locations, and they do change, but the screen shows nothing.
How could this be fixed?
Thanks!
You are changing self.shapePath.CGPath, but you are not changing the path in the CAShapeLayer.
Instead of [self.shape setNeedsDisplay], assign the path to the layer again:
[self.shape setPath:[self.shapePath CGPath]];
You never assign the updated path back to the layer. I'd remove the shapePath variable, as it has no real use here (AFAICS).
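For example, the changed-state branch could look something like this (just a sketch; it assigns the transformed path straight to the layer, and it also updates lastLocation so each delta is applied incrementally rather than compounding):
if (gesture.state == UIGestureRecognizerStateChanged) {
    CGAffineTransform translation = CGAffineTransformMakeTranslation(location.x - lastLocation.x, location.y - lastLocation.y);
    CGPathRef movedPath = CGPathCreateCopyByTransformingPath(self.shape.path, &translation);
    self.shape.path = movedPath; // setting the layer's path is what triggers the redraw
    CGPathRelease(movedPath);
    lastLocation = location;     // measure the next delta from here
}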
I am trying to draw and edit multiple lines on iOS. Each line has a UIView at each end acting as a handle, so once the line is drawn the user can drag either end and the line will redraw.
I'm currently using a UIBezierPath to draw a CAShapeLayer on the view. The issue I have is working out the best way to then draw another line and let the user edit whichever line is tapped.
Does anyone have any ideas about the best way to do this? Is CAShapeLayer the best option?
Here is a video link that might show better what I'm trying to achieve:
https://www.dropbox.com/s/8rpt2azrs3uk6vr/Line%20Example.mov?dl=0
An example of the code I have done so far is below:
//Drawing a Line
Here I create a path from two touch points and then draw a CAShapeLayer on the view. I also create a custom 'Line' object to store the path and the shape layer.
-(void)DrawLineFrom:(CGPoint)pointA to:(CGPoint)pointB
{
    NSLog(@"Drawing line X:%f Y:%f - X:%f Y:%f", pointA.x, pointA.y, pointB.x, pointB.y);

    UIBezierPath *path = [[UIBezierPath alloc] init];
    [path moveToPoint:pointA];
    [path addLineToPoint:pointB];
    [path addLineToPoint:CGPointMake(pointB.x, pointB.y + 2)];
    [path addLineToPoint:CGPointMake(pointA.x, pointA.y + 2)];
    [path addLineToPoint:pointA];
    [path closePath];

    currentLine.bPath = path;

    if (!shapeLayer)
    {
        shapeLayer = [LineLayer layer];
        [shapeLayer setFrame:self.view.frame];
        shapeLayer.path = path.CGPath;
        shapeLayer.strokeColor = [UIColor redColor].CGColor; //etc...
        shapeLayer.lineWidth = 2.0; //etc...
        shapeLayer.parent = currentLine;
        currentLine.shapeLayer = shapeLayer;
        [self.view.layer addSublayer:shapeLayer];
    }
    else
    {
        shapeLayer.path = path.CGPath;
    }

    [self ExitDrawMode];
}
//Creating a Handle
This creates the handle views at either end of the line and adds the long-press gesture.
-(HandleView *)MakeLineHandleForPoint:(int)point atLocation:(CGPoint)loc
{
    HandleView *pointView = [[HandleView alloc] initWithFrame:CGRectMake(loc.x - 10, loc.y - 10, 20, 20)];
    pointView.layer.cornerRadius = 10;
    pointView.backgroundColor = [UIColor redColor];

    UILongPressGestureRecognizer *LP = [[UILongPressGestureRecognizer alloc] initWithTarget:self action:@selector(handleLp:)];
    [pointView addGestureRecognizer:LP];
    LP.delegate = self;
    LP.minimumPressDuration = 0.0;

    pointView.userInteractionEnabled = YES;
    pointView.tag = point;
    pointView.lineParent = currentLine;
    return pointView;
}
Finally, the gesture is handled here. This works fine, but it always moves the last line drawn, even if I have selected the first one.
-(void)handleLp:(UILongPressGestureRecognizer *)sender
{
    CGPoint loc = [sender locationInView:self.view];
    [sender view].center = loc;
    HandleView *handleView = (HandleView *)[sender view];

    if (handleView.tag == 0) {
        currentLine.pointA = loc;
        [self DrawLineFrom:loc to:currentLine.pointB];
    }
    if (handleView.tag == 1) {
        currentLine.pointB = loc;
        [self DrawLineFrom:currentLine.pointA to:loc];
    }
}
Any help is much appreciated. Thanks in advance.
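One approach (just a sketch, assuming the custom object is a class named Line that stores pointA, pointB, bPath and shapeLayer as shown above, and that HandleView exposes the lineParent property set in MakeLineHandleForPoint:) is to resolve the line from the tapped handle instead of always using currentLine. For brevity the path here is a plain two-point stroke rather than the slightly thickened closed shape built in DrawLineFrom:.
-(void)handleLp:(UILongPressGestureRecognizer *)sender
{
    CGPoint loc = [sender locationInView:self.view];
    HandleView *handleView = (HandleView *)[sender view];
    handleView.center = loc;

    // Work on the line that owns this handle, not on the last line drawn.
    Line *line = handleView.lineParent;
    if (handleView.tag == 0) {
        line.pointA = loc;
    } else if (handleView.tag == 1) {
        line.pointB = loc;
    }

    // Rebuild this line's path and hand it to its own layer.
    UIBezierPath *path = [UIBezierPath bezierPath];
    [path moveToPoint:line.pointA];
    [path addLineToPoint:line.pointB];
    line.bPath = path;
    line.shapeLayer.path = path.CGPath;
}
Keeping the per-line state on the Line object itself means the controller no longer depends on currentLine pointing at the line being edited.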
I am masking a UIView (240 x 240) into a triangular shape using a UIBezierPath as follows:
path = [UIBezierPath new];
[path moveToPoint:(CGPoint){0, 240}];
[path addLineToPoint:(CGPoint){120,0}];
[path addLineToPoint:(CGPoint){240,240}];
[path addLineToPoint:(CGPoint){0,240}];
[path closePath];
CAShapeLayer *mask = [CAShapeLayer new];
mask.frame = self.viewShape.bounds;
mask.path = path.CGPath;
self.viewShape.layer.mask = mask;
In the image above, the triangular area is the mask. Now I have a "Coca-Cola" image that should move only within the triangular mask. For that I have applied a UIPanGestureRecognizer to the UIImageView and restricted its frame in the following way.
- (void)handlePanGesture:(UIPanGestureRecognizer *)gestureRecognizer
{
    CGPoint touchLocation = [gestureRecognizer locationInView:self.viewShape];
    CGRect boundsRect;
    BOOL isInside = [path containsPoint:CGPointMake(self.innerView.center.x, self.innerView.center.y)];
    NSLog(@"value:%d", isInside);
    if (isInside) {
        NSLog(@"inside");
        self.innerView.center = touchLocation;
    } else {
        NSLog(@"outside");
    }
}
My if condition executes successfully, but once control goes into the else condition I am not able to drag my image view back inside the mask.
So my question is: when the else block (outside) is hit, I should still be able to drag the image view back inside the mask's frame.
How can I achieve this?
Saving a reference to the last valid center of the image view is one way to achieve this.
In your custom view:
CGPoint lastValidCenter; // initially this is the center of the image view
and in the gesture handler:
NSLog(@"value:%d", isInside);
if (isInside) {
    NSLog(@"inside");
    self.innerView.center = touchLocation;
    lastValidCenter = self.innerView.center;
} else {
    NSLog(@"outside");
    self.innerView.center = lastValidCenter;
}
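The initial value just needs to be seeded somewhere before the first pan event, for example right after the mask is set up (a sketch, assuming the image view starts out inside the triangle):
lastValidCenter = self.innerView.center; // starting position, used as the fallback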
There is a problem in your calculation. What you need to do is check the top-left and top-right corners of the UIImageView against the UIBezierPath instance with containsPoint:.
This is because when the image moves, these two corners are the ones that will try to go outside first. So just put in that check and you'll get your desired output.
// Test the top corners of innerView's frame (the frame is in the same coordinate space as path).
BOOL isInside = [path containsPoint:CGPointMake(CGRectGetMinX(self.innerView.frame), CGRectGetMinY(self.innerView.frame))];
isInside = isInside || [path containsPoint:CGPointMake(CGRectGetMaxX(self.innerView.frame), CGRectGetMinY(self.innerView.frame))];
if (isInside) {
    // Move your UIImageView
}
else {
    // Don't move
}
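If the image view can also drift out through the bottom edge, the same idea extends to all four corners of the frame (a sketch; requiring every corner to stay inside a convex path keeps the whole view inside the mask):
CGRect f = self.innerView.frame;
CGPoint corners[4] = {
    CGPointMake(CGRectGetMinX(f), CGRectGetMinY(f)), // top-left
    CGPointMake(CGRectGetMaxX(f), CGRectGetMinY(f)), // top-right
    CGPointMake(CGRectGetMinX(f), CGRectGetMaxY(f)), // bottom-left
    CGPointMake(CGRectGetMaxX(f), CGRectGetMaxY(f))  // bottom-right
};
BOOL isInside = YES;
for (int i = 0; i < 4; i++) {
    isInside = isInside && [path containsPoint:corners[i]];
}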
I modified some portion of Meth's code. The code is as follows:
self.innerView.center = touchLocation;
BOOL isInside = [path containsPoint:CGPointMake(self.innerView.center.x, self.innerView.center.y)];
if (isInside) {
    NSLog(@"inside");
    self.innerView.center = touchLocation;
    lastValidCenter = self.innerView.center;
} else {
    NSLog(@"outside");
    self.innerView.center = lastValidCenter;
}
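Checking the destination (the freshly assigned center) rather than the previous center means the view is never left stranded outside the triangle: if the touch wanders out, the view immediately falls back to lastValidCenter, and as soon as the touch comes back inside it follows the finger again.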
I'm using some code I came across that draws a circle shape over an image. Pan and pinch gestures are used on the circle shape being drawn. At the moment the shape can be moved across the edges of the UIImageView, onto the rest of the UI that is not displaying the image. I've tried to incorporate methods from other examples where pan and pinch restrictions were needed, in order to restrict the circle's movement to the bounds of the UIImageView, but I could not get any of them to work. I understand that I need to control the center and radius values during the pan and pinch so that they don't cross outside the UIImageView's border, but I haven't the slightest clue how to do that. Attached is the code I'm using. Thanks for your help!
-(void)setUpCropTool:(id)sender{
    // create layer mask for the image
    CAShapeLayer *maskLayer = [CAShapeLayer layer];
    self.imageToBeCropped.layer.mask = maskLayer;
    self.maskLayer = maskLayer;

    // create shape layer for circle we'll draw on top of image (the boundary of the circle)
    CAShapeLayer *circleLayer = [CAShapeLayer layer];
    circleLayer.lineWidth = 50;
    circleLayer.fillColor = [[UIColor clearColor] CGColor];
    circleLayer.strokeColor = [[UIColor redColor] CGColor];
    [self.imageToBeCropped.layer addSublayer:circleLayer];
    self.circleLayer = circleLayer;

    // create circle path
    [self updateCirclePathAtLocation:CGPointMake(self.imageToBeCropped.bounds.size.width / 2.0, self.imageToBeCropped.bounds.size.height / 2.0) radius:self.imageToBeCropped.bounds.size.width / 2.5];

    // create pan gesture
    UIPanGestureRecognizer *pan = [[UIPanGestureRecognizer alloc] initWithTarget:self action:@selector(handlePan:)];
    pan.delegate = self;
    [self.imageToBeCropped addGestureRecognizer:pan];
    self.imageToBeCropped.userInteractionEnabled = YES;
    self.pan = pan;

    // create pinch gesture
    UIPinchGestureRecognizer *pinch = [[UIPinchGestureRecognizer alloc] initWithTarget:self action:@selector(handlePinch:)];
    pinch.delegate = self;
    [self.imageToBeCropped addGestureRecognizer:pinch];
    self.pinch = pinch;
}
- (void)updateCirclePathAtLocation:(CGPoint)location radius:(CGFloat)radius
{
    self.circleCenter = location;
    self.circleRadius = radius;

    UIBezierPath *path = [UIBezierPath bezierPath];
    [path addArcWithCenter:self.circleCenter
                    radius:self.circleRadius
                startAngle:0.0
                  endAngle:M_PI * 2.0
                 clockwise:YES];

    self.maskLayer.path = [path CGPath];
    self.circleLayer.path = [path CGPath];
}
#pragma mark - Gesture recognizers
- (void)handlePan:(UIPanGestureRecognizer *)gesture
{
    static CGPoint oldCenter;
    CGPoint translation = [gesture translationInView:gesture.view];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        oldCenter = self.circleCenter;
    }

    CGPoint newCenter = CGPointMake(oldCenter.x + translation.x, oldCenter.y + translation.y);
    [self updateCirclePathAtLocation:newCenter radius:self.circleRadius];
}
- (void)handlePinch:(UIPinchGestureRecognizer *)gesture
{
    static CGFloat oldRadius;
    CGFloat scale = [gesture scale];

    if (gesture.state == UIGestureRecognizerStateBegan)
    {
        oldRadius = self.circleRadius;
    }

    CGFloat newRadius = oldRadius * scale;
    [self updateCirclePathAtLocation:self.circleCenter radius:newRadius];
}
#pragma mark - UIGestureRecognizerDelegate
- (BOOL)gestureRecognizer:(UIGestureRecognizer *)gestureRecognizer shouldRecognizeSimultaneouslyWithGestureRecognizer:(UIGestureRecognizer *)otherGestureRecognizer
{
    if ((gestureRecognizer == self.pan && otherGestureRecognizer == self.pinch) ||
        (gestureRecognizer == self.pinch && otherGestureRecognizer == self.pan))
    {
        return YES;
    }
    return NO;
}
Try adding some additional constraints to the pan handling code. Maybe something like:
if (oldCenter.x + translation.x >= superView.frame.size.width - self.circleRadius
    || oldCenter.x + translation.x <= self.circleRadius)
{
    newCenter.x = oldCenter.x;
}
else
{
    newCenter.x = oldCenter.x + translation.x;
}
This is a pretty simple example, hope it helps.
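The same clamping idea applies to the vertical axis, and to the pinch handler's radius (again just a sketch, assuming the circle should stay inside self.imageToBeCropped):
// Vertical clamp, mirroring the x-axis check above.
if (oldCenter.y + translation.y >= self.imageToBeCropped.bounds.size.height - self.circleRadius
    || oldCenter.y + translation.y <= self.circleRadius)
{
    newCenter.y = oldCenter.y;
}
else
{
    newCenter.y = oldCenter.y + translation.y;
}

// In handlePinch:, cap the radius so the circle never outgrows the image view.
CGFloat maxRadius = MIN(self.imageToBeCropped.bounds.size.width, self.imageToBeCropped.bounds.size.height) / 2.0;
CGFloat newRadius = MIN(oldRadius * scale, maxRadius);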
I'm attempting to get a reference to a UIImageView that is underneath a masked UIImageView, using hitTest:withEvent:. Here is what I have that is not working:
I have a UIView A that contains 3 UIImageViews as subviews: panelOne, panelTwo, and panelThree. panelThree takes up the entire frame but is masked into a triangle, revealing parts of panels one and two. So I need to detect when a user taps outside of that triangle and send the touch to the appropriate UIImageView.
Code: (CollagePanel is a subclass of UIImageView)
-(void)triangleInASquare
{
    CGSize size = self.frame.size;

    CollagePanel *panelOne = [[CollagePanel alloc] initWithFrame:CGRectMake(0, 0, size.width / 2, size.height)];
    panelOne.panelScale = panelOne.frame.size.width / self.frame.size.width;
    panelOne.backgroundColor = [UIColor greenColor];

    CollagePanel *panelTwo = [[CollagePanel alloc] initWithFrame:CGRectMake(size.width / 2, 0, size.width / 2, size.height)];
    panelTwo.panelScale = panelOne.frame.size.width / self.frame.size.width;
    panelTwo.backgroundColor = [UIColor purpleColor];

    CollagePanel *panelThree = [[CollagePanel alloc] initWithFrame:CGRectMake(0, 0, size.width, size.height)];
    panelThree.backgroundColor = [UIColor orangeColor];

    UIBezierPath *trianglePath = [UIBezierPath bezierPath];
    [trianglePath moveToPoint:CGPointMake(0, panelThree.frame.size.height)];
    [trianglePath addLineToPoint:CGPointMake(panelThree.frame.size.width / 2, 0)];
    [trianglePath addLineToPoint:CGPointMake(panelThree.frame.size.width, panelTwo.frame.size.height)];
    [trianglePath closePath];

    // Mask the panel's layer to the triangle.
    CAShapeLayer *triangleMaskLayer = [CAShapeLayer layer];
    [triangleMaskLayer setPath:trianglePath.CGPath];
    triangleMaskLayer.strokeColor = [[UIColor whiteColor] CGColor];
    panelThree.layer.mask = triangleMaskLayer;

    // Add border
    CAShapeLayer *borderLayer = [CAShapeLayer layer];
    borderLayer.strokeColor = [[UIColor whiteColor] CGColor];
    borderLayer.fillColor = [[UIColor clearColor] CGColor];
    borderLayer.lineWidth = 6;
    [borderLayer setPath:trianglePath.CGPath];
    [panelThree.layer addSublayer:borderLayer];

    NSMutableArray *tempArray = [[NSMutableArray alloc] init];
    [tempArray addObject:panelOne];
    [tempArray addObject:panelTwo];
    [tempArray addObject:panelThree];

    [self addGestureRecognizersToPanelsInArray:tempArray];
    [self addPanelsFromArray:tempArray];
    self.panelArray = tempArray;
}
-(void)handleTap:(UITapGestureRecognizer *)recognizer // coming from panel.imageView
{
    CGPoint tapPoint = [recognizer locationInView:self];
    NSLog(@"Location in self: %@", NSStringFromCGPoint(tapPoint));
    NSLog(@"self.subviews: %@", self.subviews);

    UIView *bottomView = [self hitTest:tapPoint withEvent:nil];
    NSLog(@"Bottom View: %@", bottomView);
}
The NSLog of bottomView is always panelThree (the topmost panel). From what I understand the hit test should be returning the "bottom most" subview.
You understand it wrong: it returns the view that recognizes itself as touched and is nearest to your finger, i.e. nearest to the top of the view hierarchy.
If a view should not recognize itself as touched for a certain point, you need to override
- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
for that view.
I think Ole Begemann's Shaped Button is a great example of how to do so.
In your project, this method could use CGPathContainsPoint to determine whether a point lies within the path.
Your pointInside:withEvent: might look like this:
#import "CollagePanel.h"
#import <QuartzCore/QuartzCore.h>

@implementation CollagePanel

//
// ...
//

- (BOOL)pointInside:(CGPoint)point withEvent:(UIEvent *)event
{
    CGPoint p = [self convertPoint:point toView:[self superview]];
    if (self.layer.mask) {
        if (CGPathContainsPoint([(CAShapeLayer *)self.layer.mask path], NULL, p, YES))
            return YES;
    } else {
        if (CGRectContainsPoint(self.layer.frame, p))
            return YES;
    }
    return NO;
}

@end
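With this override in place, the hitTest:withEvent: call in handleTap: should fall through the transparent corners of panelThree to panelOne or panelTwo, because panelThree no longer claims points outside its mask path.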
I'm trying to figure out a way to detect which MKOverlayView (actually MKPolygonView) was tapped and then change its color.
I got it running with this code:
- (void)mapTapped:(UITapGestureRecognizer *)recognizer {
    MKMapView *mapView = (MKMapView *)recognizer.view;
    MKPolygonView *tappedOverlay = nil;

    for (id<MKOverlay> overlay in mapView.overlays)
    {
        MKPolygonView *view = (MKPolygonView *)[mapView viewForOverlay:overlay];
        if (view) {
            // Get view frame rect in the mapView's coordinate system
            CGRect viewFrameInMapView = [view.superview convertRect:view.frame toView:mapView];
            // Get touch point in the mapView's coordinate system
            CGPoint point = [recognizer locationInView:mapView];
            // Check if the touch is within the view bounds
            if (CGRectContainsPoint(viewFrameInMapView, point))
            {
                tappedOverlay = view;
                break;
            }
        }
    }

    if ([[tappedOverlay fillColor] isEqual:[[UIColor cyanColor] colorWithAlphaComponent:0.2]]) {
        [listOverlays addObject:tappedOverlay];
        tappedOverlay.fillColor = [[UIColor redColor] colorWithAlphaComponent:0.2];
    }
    else {
        [listOverlays removeObject:tappedOverlay];
        tappedOverlay.fillColor = [[UIColor cyanColor] colorWithAlphaComponent:0.2];
    }
    //tappedOverlay.strokeColor = [[UIColor blueColor] colorWithAlphaComponent:0.7];
}
This works, but sometimes, depending on where I tap, it picks the wrong MKPolygonView. I suppose that's because CGRectContainsPoint doesn't describe the tappable area properly, since the overlay is a polygon, not a rectangle.
What other methods are there to do this? I tried CGPathContainsPoint but got worse results.
Thanks to @Ana Karenina, who pointed out the right way: this is how you have to convert the tap location so that CGPathContainsPoint works correctly.
- (void)mapTapped:(UITapGestureRecognizer *)recognizer {
    MKMapView *mapView = (MKMapView *)recognizer.view;
    MKPolygonView *tappedOverlay = nil;
    int i = 0;

    for (id<MKOverlay> overlay in mapView.overlays)
    {
        MKPolygonView *view = (MKPolygonView *)[mapView viewForOverlay:overlay];
        if (view) {
            CGPoint touchPoint = [recognizer locationInView:mapView];
            CLLocationCoordinate2D touchMapCoordinate =
                [mapView convertPoint:touchPoint toCoordinateFromView:mapView];
            MKMapPoint mapPoint = MKMapPointForCoordinate(touchMapCoordinate);
            CGPoint polygonViewPoint = [view pointForMapPoint:mapPoint];

            if (CGPathContainsPoint(view.path, NULL, polygonViewPoint, NO)) {
                tappedOverlay = view;
                tappedOverlay.tag = i;
                break;
            }
        }
        i++;
    }

    if ([[tappedOverlay fillColor] isEqual:[[UIColor cyanColor] colorWithAlphaComponent:0.2]]) {
        [listOverlays addObject:tappedOverlay];
        tappedOverlay.fillColor = [[UIColor redColor] colorWithAlphaComponent:0.2];
    }
    else {
        [listOverlays removeObject:tappedOverlay];
        tappedOverlay.fillColor = [[UIColor cyanColor] colorWithAlphaComponent:0.2];
    }
    //tappedOverlay.strokeColor = [[UIColor blueColor] colorWithAlphaComponent:0.7];
}
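One small guard worth adding (an assumption about intent): if the tap lands outside every polygon, tappedOverlay is still nil when the loop ends, and the color-toggling code then runs against a nil overlay. Returning early right after the loop avoids that:
if (!tappedOverlay) {
    return; // the tap did not hit any polygon
}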