I have a custom UIView with a UIImageView as its subview to display an image, and a CAShapeLayer as a sublayer to draw on it. The view is zoomable; I use a UIPinchGestureRecognizer to zoom in and out. I can draw straight lines on the view using a CGPathRef. The problem is that when I pinch to zoom, the lines I draw are scaled along with everything else and become thick. How can I zoom the view while keeping the lines thin?
Before zooming (green lines are the ones I draw):
And when I zoom:
What I want is, the lines to be as thin as they were before I zoomed in.
I tried this:
CGMutablePathRef path = CGPathCreateMutable();
//line drawing code here
//scale path to 1.0?
CGAffineTransform transform = CGAffineTransformMakeScale(1.0, 1.0);
CGPathRef transformedPath = CGPathCreateCopyByTransformingPath(path, &transform);
self.drawingLayer.path = transformedPath;
CGPathRelease(path);
CGPathRelease(transformedPath);
How do I prevent the line width of the path from getting thicker when zooming?
You need to divide the line width by the zoom factor. By default, the line width is 1.0.
So keep a property that tracks the current zoom, and whenever the pinch gesture changes the zoom, divide the line width by it.
For example, if the zoom is 4.0, set the line width to 1.0 / 4.0; if the zoom is 2.0, use 1.0 / 2.0.
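As a rough sketch of the idea (the names DrawingCanvasView, drawingLayer, and currentZoom are my own, not from the question; adjust to your setup):

import UIKit

final class DrawingCanvasView: UIView {
    let drawingLayer = CAShapeLayer()       // the layer the lines are drawn into
    private var currentZoom: CGFloat = 1.0

    @objc func handlePinch(_ gesture: UIPinchGestureRecognizer) {
        currentZoom *= gesture.scale
        gesture.scale = 1.0                 // keep the gesture's scale incremental

        // Zoom the whole view as before.
        transform = CGAffineTransform(scaleX: currentZoom, y: currentZoom)

        // Compensate: the transform scales the stroke too, so divide it back down
        // to keep the lines visually 1 pt thick.
        drawingLayer.lineWidth = 1.0 / currentZoom
    }
}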
Hope it is helpful
Related
I am building a custom UIView that you can rotate and resize. I can resize the UIView by dragging the corners of the UIView. I calculate how much I have dragged, then change the frame of the UIView accordingly.
However, I am running into problems once I added a rotation gesture recognizer to the view. If I rotate or apply a transform to the view, I no longer know how to calculate drag distance and change the frame of the view. How could I calculate the width and height change between my new view and the original view when things are put at an added angle or if they have some other transform, like a translation transform?
I thought about setting the view's transform back to .identity, changing the size of the view, and then re-applying its transform, but I'm not sure how to actually implement this.
Once a transform has been applied, you cannot rely on the view's frame.
You have two options:
1) Calculate everything using the center (and bounds) of your view, or
2) As you suggested, reset the transform to identity, change the frame, then re-apply the transform.
For option 2, here is an example that might be helpful:
let transform = imageView.transform                 // save the current transform
imageView.transform = CGAffineTransform.identity    // reset so the frame is meaningful
var rect: CGRect = imageView.frame
// ... change rect here ...
imageView.frame = rect                              // assign the resized frame
imageView.transform = transform                     // re-apply the saved transform
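A minimal sketch of how this could be wrapped up for a corner-drag resize (the helper name resize(_:by:) and the delta parameter are my own, not from the question):

import UIKit

// Hypothetical helper: grows a (possibly rotated) view by a drag delta.
func resize(_ view: UIView, by delta: CGSize) {
    let savedTransform = view.transform
    view.transform = .identity             // the frame is only reliable without a transform

    var rect = view.frame
    rect.size.width += delta.width
    rect.size.height += delta.height
    view.frame = rect

    view.transform = savedTransform        // restore the rotation / translation
}

You would call this from the pan gesture handler on each corner, passing in the drag distance since the last callback.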
I am developing a program which identifies a rectangle in an image and draws a path along the border of the identified rectangle. Now I want to be able to reposition that path when it is not in exactly the right place. For an example, look at this image:
In cases like this I need to drag the corners of the path and reposition it so that it fits the rectangle.
To draw the path I used CAShapeLayer and UIBezierPath. Here is the code I used to draw the path.
// imgView is the UIImageView which contains the image with the rectangle
let line: CAShapeLayer = CAShapeLayer();
line.frame = imgView.bounds;
let linePath: UIBezierPath = UIBezierPath();
linePath.moveToPoint(CGPointMake(x1, y1));
linePath.addLineToPoint(CGPointMake(x2, y2));
linePath.addLineToPoint(CGPointMake(x3, y3));
linePath.addLineToPoint(CGPointMake(x4, y4));
linePath.addLineToPoint(CGPointMake(x1, y1));
linePath.closePath();
line.lineWidth = 5.0;
line.path = linePath.CGPath;
line.fillColor = UIColor.clearColor().CGColor;
line.strokeColor = UIColor.blueColor().CGColor;
imgView.layer.addSublayer(line);
The thing is, I tried to attach a gesture to the UIBezierPath, but as far as I can tell there is no such thing, and I couldn't find anything about it. So can someone please let me know a way to get this done? Any help would be highly appreciated.
You are right that there is no way to attach a gesture recognizer to a UIBezierPath. Gesture recognizers attach to UIView objects, and a UIBezierPath is not a view object.
There is no built-in mechanism to do this; you need to build it yourself. I would suggest creating a small family of classes to handle it: a rectangle view class that uses a bezier path internally, plus 4 corner point views placed on the vertices, each with its own pan gesture recognizer.
Note that Cocoa rectangles (CGRects) can't be rotated. You'll need to use a series of connected line segments and write logic that forces it to stay square.
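A rough sketch of that approach (all of the names here, such as QuadrilateralView and dragHandle, are hypothetical and not from the question):

import UIKit

// Hypothetical quadrilateral editor: 4 draggable corner handles drive a CAShapeLayer path.
final class QuadrilateralView: UIView {
    private let shapeLayer = CAShapeLayer()
    private var handles: [UIView] = []

    init(corners: [CGPoint], frame: CGRect) {
        super.init(frame: frame)
        shapeLayer.fillColor = UIColor.clear.cgColor
        shapeLayer.strokeColor = UIColor.blue.cgColor
        shapeLayer.lineWidth = 2
        layer.addSublayer(shapeLayer)

        // One small draggable view per corner, each with its own pan recognizer.
        for corner in corners {
            let handle = UIView(frame: CGRect(x: 0, y: 0, width: 24, height: 24))
            handle.center = corner
            handle.backgroundColor = UIColor.blue.withAlphaComponent(0.4)
            handle.addGestureRecognizer(UIPanGestureRecognizer(target: self, action: #selector(dragHandle(_:))))
            addSubview(handle)
            handles.append(handle)
        }
        updatePath()
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    @objc private func dragHandle(_ gesture: UIPanGestureRecognizer) {
        guard let handle = gesture.view else { return }
        let translation = gesture.translation(in: self)
        handle.center = CGPoint(x: handle.center.x + translation.x, y: handle.center.y + translation.y)
        gesture.setTranslation(.zero, in: self)
        updatePath()
    }

    private func updatePath() {
        // Rebuild the closed path through the current handle positions.
        guard let first = handles.first else { return }
        let path = UIBezierPath()
        path.move(to: first.center)
        for handle in handles.dropFirst() { path.addLine(to: handle.center) }
        path.close()
        shapeLayer.path = path.cgPath
    }
}

You would lay an instance of this view over the image view and pass in the four detected corners.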
I am testing a UIView using a UISlider as in the example images below:
I have a custom UIView with a yellow background that draws the gray square; its drawRect method looks like this:
-(void)drawRect:(CGRect)rect{
NSLog(@"Draw rect called");
UIBezierPath* squarePath = [UIBezierPath bezierPathWithRect: CGRectMake(10, 10, 100, 100)];
[UIColor.grayColor setFill];
[squarePath fill];
}
And here is the method called when my slider changes value:
- (IBAction)changeValue:(id)sender {
CGAffineTransform transform = CGAffineTransformScale(CGAffineTransformIdentity, self.slider.value, self.slider.value);
self.tableView.transform = transform;
[self.tableView setNeedsDisplay];
}
I don't understand why the square is getting larger. I've noticed that drawRect is called every time the slider is moved. If that's happening, why is the square's size changing? Shouldn't it stay the same size, with just the frame growing and the square sitting in the top left corner?
My second question is, how would I change the code so that just the frame grows and the drawing size stays the same? I ask because I actually want the drawing size to change dynamically using my own code in drawRect.
Any pointers would be really appreciated! thanks!
The size of the square changes because you've transformed it. Transformations don't just affect the frame of a view; they affect its content. The square is drawn into its context at its constant size (100x100), and the transform then stretches it before it is rendered.
It isn't expanding to the right and down because, by default, a transform is applied about the center of the bounds, so it scales from the center outwards. From the documentation:
The origin of the transform is the value of the center property ...
Transformations aren't intended to be used to simply scale the width and height of your frame. The frame property is for that. Simply store the view's frame in a variable, change its width and height, then set it back. In your drawRect: code you can check the dimensions of the rectangle that's given to you and make your square's width/height a percentage of that.
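A minimal sketch of that approach (in Swift; the class names, the 120-pt base size, and the 0.8 fraction are my own assumptions, not from the question):

import UIKit

// Hypothetical view: sizes the square relative to the current bounds instead of a fixed 100x100.
final class SquareView: UIView {
    override func draw(_ rect: CGRect) {
        let side = min(rect.width, rect.height) * 0.8
        UIColor.gray.setFill()
        UIBezierPath(rect: CGRect(x: 10, y: 10, width: side, height: side)).fill()
    }
}

// Hypothetical controller: grow the frame itself instead of applying a transform.
final class SliderViewController: UIViewController {
    let squareView = SquareView(frame: CGRect(x: 0, y: 0, width: 120, height: 120))

    override func viewDidLoad() {
        super.viewDidLoad()
        squareView.backgroundColor = .yellow
        view.addSubview(squareView)
    }

    // Assumed to be wired to the slider's valueChanged event.
    @objc func changeValue(_ slider: UISlider) {
        var frame = squareView.frame
        frame.size.width = 120 * CGFloat(slider.value)
        frame.size.height = 120 * CGFloat(slider.value)
        squareView.frame = frame
        squareView.setNeedsDisplay()    // force drawRect to run for the new size
    }
}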
I am trying to draw a hexagon using UIBezierPath and ZEPolygon. It works great, but my hexagon is flat on top. I have tried everything to get it to draw with a point at the top, including a 180-degree transform on the path, which works but then everything else breaks.
This is how it looks now
This is how I would like it to look:
My code is below
UIImageView *maskedImageView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"image.png"]];
UIBezierPath *nonagon = [UIBezierPath bezierPathWithPolygonInRect:maskedImageView.frame numberOfSides:6];
CAShapeLayer *shapeLayer = [CAShapeLayer layer];
shapeLayer.path = nonagon.CGPath;
maskedImageView.layer.mask = shapeLayer;
[self.view addSubview:maskedImageView];
This is the library I used for drawing the bezier path.
Thanks for any help
When you rotate a UIBezierPath with a CGAffineTransform, it is rotated around the point (0,0), and for your path (0,0) is the top left corner of the shape. This is why the offset is incorrect when you just rotate by 90 degrees without doing anything else: it's rotating around the wrong point.
So before you rotate, you need to center the path to the point (0,0), then rotate it, then move it back so that (0,0) is at its top left.
The following code will rotate the polygon 90 degrees:
// get the size of the image, we'll need this for our path and for later too
CGRect boundsForPoly = maskedImageView.frame;
// create our path inside the rect
UIBezierPath *nonagon = [UIBezierPath bezierPathWithPolygonInRect:boundsForPoly numberOfSides:6];
// center the polygon on (0,0)
[nonagon applyTransform:CGAffineTransformMakeTranslation(-boundsForPoly.size.width/2, -boundsForPoly.size.height/2)];
// rotate it 90 degrees
[nonagon applyTransform:CGAffineTransformMakeRotation(M_PI/2)];
// now move it back so that the top left of its bounding box is (0,0)
[nonagon applyTransform:CGAffineTransformMakeTranslation(nonagon.bounds.size.width/2, nonagon.bounds.size.height/2)];
This will rotate the polygon 90 degrees and keep its top left corner at (0,0).
The blue outline is your path before, and the green outline is after the rotation:
There is a problem with the answer from adam.wulf: the line
[nonagon applyTransform:CGAffineTransformMakeTranslation(nonagon.bounds.size.width/2, nonagon.bounds.size.height/2)];
doesn't center the polygon in the frame. It should instead be:
//Centered version
[nonagon applyTransform:CGAffineTransformMakeTranslation(boundsForPoly.size.width/2, boundsForPoly.size.height/2/2)];
Thus the full code from adam.wulf's answer should look like the following:
// get the size of the image, we'll need this for our path and for later too
CGRect boundsForPoly = maskedImageView.frame;
// create our path inside the rect
UIBezierPath *nonagon = [UIBezierPath bezierPathWithPolygonInRect:boundsForPoly numberOfSides:6];
// center the polygon on (0,0)
[nonagon applyTransform:CGAffineTransformMakeTranslation(-boundsForPoly.size.width/2, -boundsForPoly.size.height/2)];
// rotate it 90 degrees
[nonagon applyTransform:CGAffineTransformMakeRotation(M_PI/2)];
// now move it back so that the top left of its bounding box is (0,0)
[nonagon applyTransform:CGAffineTransformMakeTranslation(boundsForPoly.size.width/2, boundsForPoly.size.height/2/2)];
I have a UIImageView added to a UIView as a subview. When I apply a transformation to the UIView's layer, the UIImageView gets blurry. Why is that, and how can the problem be resolved?
view.layer.position = newPosition;
I apply only this transformation.
Edit:
I've tested it and if I apply other transformations like these:
view.layer.transform = newTransform;
view.layer.zPosition = newZPosition;
then the blurriness doesn't appear; it only happens when I change the layer's position.
Use
blurryDialog.frame = CGRectIntegral(blurryDialog.frame);
to set the frame coordinates to integer values.
I found the answer. When you set the position of a layer or a view, its frame origin can end up at fractional values, and that is what causes the blurriness.
So the solution is to also fix up the frame after you set the position: round its origin to integer values. The change isn't visible in the UI, but the image will no longer be blurry.
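A small sketch of the idea (the helper name move(_:to:) is my own, not from the question):

import UIKit

// Hypothetical helper: move a view's layer, then snap the frame to whole-point
// coordinates so the content is not rendered on fractional pixel boundaries.
func move(_ view: UIView, to newPosition: CGPoint) {
    view.layer.position = newPosition
    view.frame = view.frame.integral    // equivalent to CGRectIntegral(); removes the blur
}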