How to correctly set a circle imageView with Swift? - ios

I'm trying to make a circular profile image, like the profile picture in Instagram/WhatsApp. Right now my code seems to work, but I did it in two different ways and both work, so I want to know which one is best.
First way:
profileImageView.layer.cornerRadius = profileImageView.frame.width / 2
profileImageView.clipsToBounds = true
Second way
profileImageView.layer.cornerRadius = profileImageView.frame.width / 2
profileImageView.layer.masksToBounds = true
Also, I would like someone to explain what clipsToBounds and masksToBounds do. Thanks!

clipsToBounds is a boolean value that determines whether subviews are confined to the bounds of the view.
Setting this value to YES causes subviews to be clipped to the bounds of the receiver. If set to NO, subviews whose frames extend beyond the visible bounds of the receiver are not clipped. The default value is NO.
Basically, clipsToBounds works on the view itself.
masksToBounds, by contrast, is a Boolean indicating whether sublayers are clipped to the layer's bounds, so it works on the view's layer.
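In practice the two correspond: clipsToBounds is the UIView-level switch and masksToBounds is the CALayer-level one behind it, so for rounding an image either will do. A minimal sketch (the "profile" asset name is just an assumption):
import UIKit

let avatar = UIImageView(frame: CGRect(x: 0, y: 0, width: 80, height: 80))
avatar.image = UIImage(named: "profile")             // hypothetical asset name
avatar.layer.cornerRadius = avatar.frame.width / 2

// Either line produces the same circular clipping; one sets it on the view,
// the other on the view's backing layer.
avatar.clipsToBounds = true
// avatar.layer.masksToBounds = true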

The way I always do it, specifically when I want to show a profile picture in my application, is with this code:
profileImage.layer.cornerRadius = self.profileImage.frame.size.width / 2
profileImage.clipsToBounds = true
This is what I would recommend doing!
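One caveat worth adding: frame.width is only meaningful after layout, so a sketch like the following (assuming a profileImageView outlet on a view controller, which is my own example, not the poster's code) applies the rounding once the view has been laid out:
import UIKit

class ProfileViewController: UIViewController {
    @IBOutlet weak var profileImageView: UIImageView!   // hypothetical outlet

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // The frame is final at this point, so half the width gives a true circle.
        profileImageView.layer.cornerRadius = profileImageView.frame.width / 2
        profileImageView.clipsToBounds = true
    }
}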

Related

Obtain Bezier Path of CALayer

CALayer objects have an accessibilityPath property which, as stated in the docs, supposedly
Returns the path of the element in screen coordinates.
Of course as expected this does not return the path of the layer.
Is there a way to access the physical path of a given CALayer that has already been created? For instance, how would you grab the path of a UIButton's layer property once the button has been initialized?
EDIT
For reference, I am trying to detect whether a rotated button contains a point. The difficulty here is that the buttons are drawn in a curved view...
My initial approach was to create bezier slices and pass them as a property to the button to check whether the path contains the point. For whatever reason, there seems to be an ugly offset between the path and the button.
They are both added to the same view and use the same coordinates / values to determine their frame, but the registered path seems to be offset to the left of the shape actually drawn from the path. Below is an image of the shapes I have drawn. The green outline is where the path is drawn (and displayed), whereas the red is approximately the area which registers as inside the path.
I'm having a hard time understanding how the registered area is different.
Any ideas on why this offset is occurring would be most appreciated.
UPDATE
Here is a snippet of me adding the shapes. self in this case is simply a UIView added to a controller. Its frame is the full size of the controller, which is `{0, height_of_device - controllerHeight, width_of_device, controllerHeight}`.
UIBezierPath *slicePath = UIBezierPath.new;
[slicePath moveToPoint:self.archedCenterRef];
[slicePath addArcWithCenter:self.archedCenterRef radius:outerShapeDiameter/2 startAngle:shapeStartAngle endAngle:shapeEndAngle clockwise:clockwise];
[slicePath addArcWithCenter:self.archedCenterRef radius:(outerShapeDiameter/2 - self.rowHeight) startAngle:shapeEndAngle endAngle:shapeStartAngle clockwise:!clockwise];
[slicePath closePath];
CAShapeLayer *sliceShape = CAShapeLayer.new;
sliceShape.path = slicePath.CGPath;
sliceShape.fillColor = [UIColor colorWithWhite:0 alpha:.4].CGColor;
[self.layer addSublayer:sliceShape];
...
...
button.hitTestPath = slicePath;
In a separate method in my button subclass to detect if it contains the point or not: (self here is the button of course)
...
if ([self.hitTestPath containsPoint:touchPosition]) {
    if (key.alpha > 0 && !key.isHidden) return YES;
    else return NO;
}
else return NO;
You have completely misunderstood the property; it is meant for assistive technology. From the docs:
Excerpt:
"The default value of this property is nil. If no path is set, the accessibility frame rectangle is used to highlight the element.
When you specify a value for this property, the assistive technology uses the path object you specify (in addition to the accessibility frame, and not in place of it) to highlight the element."
You can only get the path from a CAShapeLayer; all other CALayers don't need to be drawn with a path at all.
Update to your update:
I think the offset is due to a missing
UIView convert(_ point: CGPoint, to view: UIView?)
The point needs to be converted to the button's coordinate system.
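A minimal Swift sketch of that kind of conversion (the direction of the conversion depends on which coordinate space the path was built in; here I assume it was built in the superview's space, as in the snippet above, and that the hit test runs inside a button subclass):
import UIKit

class SliceButton: UIButton {
    var hitTestPath: UIBezierPath?   // set by the owning view, as in the snippet above

    override func point(inside point: CGPoint, with event: UIEvent?) -> Bool {
        guard let path = hitTestPath, let superview = superview else {
            return super.point(inside: point, with: event)
        }
        // The touch arrives in the button's own coordinate space, but the path
        // was created in the superview's space, so convert before testing.
        let convertedPoint = convert(point, to: superview)
        return path.contains(convertedPoint)
    }
}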

Observe UIView frame while animating [duplicate]

I want to observe changes to the x coordinate of my UIView's origin while it is being animated using animateWithDuration:delay:options:animations:completion:. I want to track changes in the x coordinate at a granular level because I want to change the interaction with another view that the animated view may make contact with, and I want to make that change at the exact point of contact. At a higher level, I want to understand the best way to do something like this:
-- Should I use animateWithDuration:... in the completion callback at the point of contact? In other words, the first animation runs until it hits that x coordinate, and the rest of the animation takes place in the completion callback?
-- Should I use NSNotification observers and observe changes to the frame property? How accurate / granular is this? Can I track every change to x? Should I do this in a separate thread?
Any other suggestions would be welcome. I'm looking for a best practice.
Use CADisplayLink since it is specifically built for this purpose. In the documentation, it says:
Once the display link is associated with a run loop, the selector on the target is called when the screen’s contents need to be updated.
In my case I had a bar that fills up, and as it passed a certain mark I had to change the colors of the view above that mark.
This is what I did:
let displayLink = CADisplayLink(target: self, selector: #selector(animationDidUpdate))
displayLink.frameInterval = 3
displayLink.addToRunLoop(NSRunLoop.mainRunLoop(), forMode: NSDefaultRunLoopMode)

UIView.animateWithDuration(1.2, delay: 0.0, options: [.CurveEaseInOut], animations: {
    self.viewGaugeGraph.frame.size.width = self.graphWidth
    self.imageViewGraphCoin.center.x = self.graphWidth
}, completion: { (_) in
    displayLink.invalidate()
})
func animationDidUpdate(displayLink: CADisplayLink) {
    let presentationLayer = self.viewGaugeGraph.layer.presentationLayer() as! CALayer
    let newWidth = presentationLayer.bounds.width
    // `width` below is assumed to be the gauge's full width (self.graphWidth above).
    switch newWidth {
    case 0 ..< width * 0.3:
        break
    case width * 0.3 ..< width * 0.6:
        // Color first mark
        break
    case width * 0.6 ..< width * 0.9:
        // Color second mark
        break
    case width * 0.9 ... width:
        // Color third mark
        break
    default:
        fatalError("Invalid value observed. \(newWidth) cannot be bigger than \(width).")
    }
}
In the example I set the frameInterval property to 3 since I didn't need to update rigorously. The default is 1, which means it fires on every frame, but that takes a toll on performance.
Create an NSTimer with some delay and run a particular selector after each time lapse. In that method, check the frame of the animating view and compare it with your colliding view.
And make sure you use the presentationLayer frame, because if you access view.frame while animating it gives the destination frame, which is constant throughout the animation.
CGRect animationViewFrame = [[animationView.layer presentationLayer] frame];
If you don't want to create a timer, write a selector which calls itself after some delay. Have the delay be around 0.01 seconds.
CLARIFICATION ->
Let's say you have a view whose position you are animating from (0,0) to (100,100) over a duration of 5 seconds, and assume you have implemented KVO on the frame of this view.
When you call the animateWithDuration block, the position of the view changes directly to (100,100), the final value, even though on screen the view moves through intermediate positions.
So your KVO will fire only once, at the instant the animation starts.
This is because layers have a layer tree and a presentation tree: the layer tree stores only destination values, while the presentation layer stores the intermediate values.
When you access view.frame, it always gives you the frame from the layer tree, not the intermediate frames it takes.
So you have to use the presentation layer's frame to get the intermediate frames.
Hope this helps.
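A minimal sketch of that polling approach in current Swift syntax, assuming animatingView and collidingView are placeholder names for the two views involved:
import UIKit

/// Polls the animating view's presentation layer and reacts the moment it
/// touches `collidingView`. Both view parameters are placeholders.
func watchForContact(between animatingView: UIView, and collidingView: UIView) -> Timer {
    return Timer.scheduledTimer(withTimeInterval: 0.01, repeats: true) { timer in
        // The presentation layer reflects the in-flight animated frame;
        // animatingView.frame would only ever show the destination value.
        guard let presentation = animatingView.layer.presentation() else { return }
        if presentation.frame.intersects(collidingView.frame) {
            // Point of contact reached: react here, then stop polling.
            timer.invalidate()
        }
    }
}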
UIDynamics and collision behaviours would be worth investigating here. You can set a delegate which is called when a collision occurs.
See the collision behaviour documentation for more details.
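A rough sketch of that UIDynamics route, assuming both views share a superview and that something moves one of them (gravity here is purely a stand-in driver); the animator must be kept alive, e.g. in a property:
import UIKit

final class CollisionWatcher: NSObject, UICollisionBehaviorDelegate {
    private var animator: UIDynamicAnimator?

    func start(movingView: UIView, obstacle: UIView, in container: UIView) {
        let animator = UIDynamicAnimator(referenceView: container)

        // Let the two views collide and have the delegate told about it.
        let collision = UICollisionBehavior(items: [movingView, obstacle])
        collision.collisionDelegate = self
        animator.addBehavior(collision)

        // Something has to move the view; gravity is just a placeholder here.
        animator.addBehavior(UIGravityBehavior(items: [movingView]))

        self.animator = animator
    }

    func collisionBehavior(_ behavior: UICollisionBehavior,
                           beganContactFor item1: UIDynamicItem,
                           with item2: UIDynamicItem,
                           at p: CGPoint) {
        // Called at the exact point of contact.
        print("Contact at \(p)")
    }
}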

ios swift 4 Google Maps - How do I remove the fade in/fade out animation on a GMS Marker while dragging?

I'm trying to achieve an effect on iOS that was pretty easy to do on Android. On Android, Google Maps markers have a visibility attribute (boolean), so it was easy, but the closest thing I've found in the iOS SDK is the opacity field.
Whenever I set my opacity to zero, there's a fade-out effect that isn't what I want.
Is there any way to simply remove the fade animation on a Marker?
Thank you for any insights
-T
Try using an explicit animation on the marker's layer with the "opacity" key path; it works for me:
CATransaction.begin()
let markerLayer = marker.layer
let fadeOutAnimation = CABasicAnimation()
fadeOutAnimation.keyPath = "opacity"
fadeOutAnimation.fromValue = 1
fadeOutAnimation.toValue = 0
fadeOutAnimation.duration = 0.35
CATransaction.setCompletionBlock {
    marker.map = nil
}
markerLayer.add(fadeOutAnimation, forKey: "fade")
CATransaction.commit()
In order to animate (or not) a GMSMarker you need to control its tracksViewChanges property:
Controls whether the icon for this marker should be redrawn every frame.
Note that when this changes from NO to YES, the icon is guaranteed to be redrawn next frame.
Defaults to YES.
Once you create an instance of GMSMarker, set its tracksViewChanges to false; this should solve your issue, since the iconView property won't get redrawn again.
Also, you can animate its iconView property by:
Set its tracksViewChanges to true
Use the iconView reference to animate whatever you need
When you're done, set its tracksViewChanges to false
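Put together, a minimal sketch of those steps (the function names, the icon-view animation, and the 0.35 s duration are all just placeholders):
import UIKit
import GoogleMaps

// Hide a marker outright, without the implicit fade.
func hideWithoutFade(_ marker: GMSMarker) {
    marker.tracksViewChanges = false
    marker.opacity = 0
}

// Animate the marker's iconView yourself, toggling tracksViewChanges around it.
func animateIconView(of marker: GMSMarker) {
    guard let iconView = marker.iconView else { return }

    marker.tracksViewChanges = true                 // 1. allow redraws
    UIView.animate(withDuration: 0.35, animations: {
        iconView.alpha = 0                          // 2. animate whatever you need
    }, completion: { _ in
        marker.tracksViewChanges = false            // 3. stop tracking when done
    })
}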

UIButton - text moved down

I am using this code
self.layer.cornerRadius = self.bounds.size.width / 2
self.layer.masksToBounds = true
//
self.titleLabel?.adjustsFontSizeToFitWidth = true
self.titleLabel?.minimumScaleFactor = 0.5
self.titleLabel?.baselineAdjustment = .AlignCenters
self.titleLabel?.numberOfLines = 1
//
self.showsTouchWhenHighlighted = true
to autoshrink the label/text that is part of a UIButton. For smaller devices I got a result like this:
Any idea?
EDIT: I did it again from scratch and the result is a bit different. Since almost everything I use is the same, my guess is that Xcode 8 with Swift 2.3 can produce strange things.
Basically, what I did was keep the button resizing with a 1:1 aspect ratio and set constraints with a width of 414 in IB.
Smaller devices with a width of 320 actually display the text a bit below centre; how far down depends on the size, the bigger the further down. Strange. I didn't touch any insets, by the way; the default is 10 left and 10 right.
Swapping in this line actually works, strangely enough:
self.titleLabel?.baselineAdjustment = .None
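For reference, a sketch of the whole configuration with the working baselineAdjustment, written as a small UIButton subclass in later Swift syntax (the subclass and the layoutSubviews placement are my additions, not the poster's code):
import UIKit

final class RoundAutoshrinkButton: UIButton {
    override func layoutSubviews() {
        super.layoutSubviews()

        // Round the button once its final size is known.
        layer.cornerRadius = bounds.size.width / 2
        layer.masksToBounds = true

        titleLabel?.adjustsFontSizeToFitWidth = true
        titleLabel?.minimumScaleFactor = 0.5
        // .none avoids the off-centre text seen with .alignCenters when the title shrinks.
        titleLabel?.baselineAdjustment = .none
        titleLabel?.numberOfLines = 1

        showsTouchWhenHighlighted = true
    }
}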

Setting UIView.transform to arbitrary translate CGAffineTransform does nothing

I have a UIView called container that I want to move (offset) using affine transfrom. This view contains UIImageView and is a subview of UICollectionViewCell.
So it should be simple:
container.transform = CGAffineTransformMakeTranslation(100, 200) //render container 100 points right and 200 points down
Instead it is very hard, because that code does not do anything. The view is rendered in exactly the same place as if I had deleted that line. So I added a print to verify what affine translation was set:
container.transform = CGAffineTransformMakeTranslation(100, 200)
print(container.transform) //prints: CGAffineTransform(a: 1.0, b: 0.0, c: 0.0, d: 1.0, tx: 100.0, ty: 200.0)
That seems all right. So I tried rotating the container view instead with CGAffineTransformMakeRotation, and it rotates the view, just not around its center as it should according to the documentation. I tried different combinations of translate, rotation and scale transforms, only to find that the affine transformation matrices set are OK, but the tx and ty attributes seem to be ignored, and a, b, c and d seem to use a different anchor point than the centre of the view (I cannot say what that point is).
Any ideas on what can be causing this and how to fix it?
There must be something like auto layout messing things up for you. In the absence of outside influence, setting a view's affine transform to CGAffineTransformMakeTranslation(100, 200) will shift it right 100 points and down 200. I verified this by making a new Single View Project in Xcode and changing the viewDidLoad method in the ViewController.swift class to:
override func viewDidLoad()
{
    super.viewDidLoad()
    view.backgroundColor = UIColor.blueColor();
    let container = UIView(frame: CGRectMake(0,0,100,100));
    container.backgroundColor = UIColor.greenColor();
    container.transform = CGAffineTransformMakeTranslation(100, 200);
    view.addSubview(container);
}
As expected this makes the green container view appear 100 points to the right and 200 points down, even though its frame is (0,0,100,100).
So please check for auto layout and other such things that might influence the placement of this view, and if you can't find anything please post more code. Also, if your container view doesn't have a background color, please give it one so that you can see its position directly, instead of deducing its position by looking at the image view.
n.b. Setting a view's transform doesn't actually move the view itself, it just changes how/where it draws its content.
