How to resize a UIImage to fit a view with the correct aspect ratio in iOS Swift? - ios

I am trying to resize a UIImage while keeping its aspect ratio. Right now the image comes out as a square, not in the correct aspect ratio. If I change the UIImageView's width and height to pickImage.frame.size.width and pickImage.frame.size.height, the UIImage looks too large.
If I set pickImage?.contentMode = .scaleAspectFit, the image is positioned correctly, but the drop shadow is drawn around the whole image view, not just around the picked image.
Here are the screenshots of the result I got.
Also, the close button should be in the top-left corner of the image, but when I pick an image from the image picker the close button ends up in some other position instead of being placed correctly for landscape or portrait images.
Here is the code I used:
func addImage(url: URL) {
    let tag = Int(arc4random_uniform(6))

    // Image view for the picked image
    pickImage = UIImageView()
    pickImage?.sd_setImage(with: url)
    pickImage?.sd_setShowActivityIndicatorView(true)
    pickImage?.backgroundColor = UIColor.lightGray
    pickImage?.sd_setIndicatorStyle(.gray)
    pickImage?.frame = CGRect(x: randomNumber(inRange: 200...Int(touchDrawview.frame.width - 200)),
                              y: Int(getYValue(maxYValue: Int(touchDrawview.frame.height - 200))),
                              width: 200,
                              height: 200)
    pickImage?.autoresizingMask = [.flexibleTopMargin, .flexibleHeight, .flexibleRightMargin,
                                   .flexibleLeftMargin, .flexibleWidth]
    pickImage?.contentMode = .scaleAspectFit
    pickImage?.tag = tag
    pickImage?.isUserInteractionEnabled = true

    // Close button shown at the top-left corner of the image
    let imageclose = UIImage(named: "imageclose")
    closeImage = UIImageView(image: imageclose)
    closeImage?.frame = CGRect(x: 10, y: 10, width: 30, height: 30)
    closeImage?.tag = tag
    closeImage?.isHidden = true
    closeImage?.isUserInteractionEnabled = true

    // Drop shadow on the image view
    pickImage?.layer.shadowColor = UIColor.white.cgColor
    pickImage?.layer.shadowOffset = CGSize(width: 0, height: 3)
    pickImage?.layer.shadowOpacity = 1
    pickImage?.layer.shadowRadius = 1.0
    pickImage?.clipsToBounds = false

    // Gestures: long press, pan, tap (remove), tap (select), rotate, pinch
    let longGuetureImage = UILongPressGestureRecognizer(target: self, action: #selector(longPressImage(sender:)))
    longGuetureImage.minimumPressDuration = 0.1
    longGuetureImage.delegate = self
    pickImage?.addGestureRecognizer(longGuetureImage)

    let panGesture = UIPanGestureRecognizer(target: self, action: #selector(handlePanImage(recognizer:)))
    panGesture.delegate = self
    pickImage?.addGestureRecognizer(panGesture)

    let tapGuetureImage = UITapGestureRecognizer(target: self, action: #selector(removeImage(sender:)))
    tapGuetureImage.delegate = self
    closeImage?.addGestureRecognizer(tapGuetureImage)

    let tapGueturemainImage = UITapGestureRecognizer(target: self, action: #selector(selectdragImageTap(_:)))
    tapGueturemainImage.delegate = self
    pickImage?.addGestureRecognizer(tapGueturemainImage)

    let rotate = UIRotationGestureRecognizer(target: self, action: #selector(handlerotateImage(recognizer:)))
    rotate.delegate = self
    pickImage?.addGestureRecognizer(rotate)

    let pinch = UIPinchGestureRecognizer(target: self, action: #selector(handlePinchImage(sender:)))
    pinch.delegate = self
    pickImage?.addGestureRecognizer(pinch)

    pickImage?.dropShadowOff()
    addPickedImage(image: pickImage!, closeimage: closeImage!, imageType: PickedType.image.rawValue, imageData: url.absoluteString)

    pickImage = nil
    closeImage = nil
}

Try This:
pickImage?.contentMode = .scaleAspectFill
pickImage?.clipsToBounds = true
Alternatively, create a container UIView, apply the shadow to that container, and add your pickImage and close button as subviews of it, keeping clipsToBounds = true on the image view. Setting clipsToBounds = true on a view prevents it from drawing its own drop shadow, which is why the shadow disappears when you clip the image view directly.
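A minimal sketch of that container approach, assuming a hypothetical containerView wrapping the image view (the names containerView, imageView, and closeButton are illustrative, not from the original code):
// Container carries the shadow; it must NOT clip, or the shadow is cut off.
let containerView = UIView(frame: CGRect(x: 0, y: 0, width: 200, height: 200))
containerView.layer.shadowColor = UIColor.white.cgColor
containerView.layer.shadowOffset = CGSize(width: 0, height: 3)
containerView.layer.shadowOpacity = 1
containerView.layer.shadowRadius = 1.0
containerView.clipsToBounds = false

// Image view fills the container and clips the scaled image to its bounds.
let imageView = UIImageView(frame: containerView.bounds)
imageView.contentMode = .scaleAspectFill
imageView.clipsToBounds = true
imageView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
containerView.addSubview(imageView)

// Close button sits above the image inside the same container.
let closeButton = UIImageView(image: UIImage(named: "imageclose"))
closeButton.frame = CGRect(x: 10, y: 10, width: 30, height: 30)
closeButton.isUserInteractionEnabled = true
containerView.addSubview(closeButton)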

When the image is resized, its aspect ratio may not match that of the view. There are two approaches to what you are trying to achieve.
Approach 1:
Resize the image itself to fit the view. The image's aspect ratio may differ from the view's, so the image may get distorted. To resize the image, refer to the following:
Refer:
https://aurvan.github.io/atkit-ios-release/index.html
Class Reference:
https://aurvan.github.io/atkit-ios-release/helpbook/Extensions/UIImage.html
Code:
import ATKit
let anImage :UIImage = UIImage(named: "DefaultAvatar")!
let aResizedImage :UIImage? = anImage.resize(size: CGSize(width: 100.0, height: 200.0), scaleMode: UIImageScaleMode.aspectFit)
Approach 2: Calculate the displayed image size manually and position the close button based on where the image actually sits, centered in the view. I have not tried the code, but something like the following should work:
anImageX = (anImageViewWidth - anImageWidth) / 2.0
anImageY = (anImageViewHeight - anImageHeight) / 2.0
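If you keep .scaleAspectFit, the rectangle the image actually occupies inside the image view can be computed with AVMakeRect(aspectRatio:insideRect:) from AVFoundation; a sketch along the lines of Approach 2, using the question's pickImage and closeImage names (the 10-point offsets and 30-point size mirror the question's closeImage frame, and this assumes the image has already loaded):
import AVFoundation

if let imageView = pickImage, let image = imageView.image {
    // Rect the image occupies inside the image view under .scaleAspectFit.
    let fittedRect = AVMakeRect(aspectRatio: image.size, insideRect: imageView.bounds)

    // Pin the close button to the top-left corner of the visible image,
    // not the top-left corner of the (possibly larger) image view.
    closeImage?.frame = CGRect(x: fittedRect.minX + 10,
                               y: fittedRect.minY + 10,
                               width: 30,
                               height: 30)
}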

Related

Animate image crop from right to left

I want to animate a change in the width of an ImageView in Swift, in a way it will be appeared as if the image is being cropped from right to left. I mean that I want the image to always stick to the left edge of its superView and only its right edge will be changed during the animation, without changing the image scale.
I've managed to animate the change in the width of the image while preserving its scale, but the image is being cropped from both sides towards its centerX and not from right to left only.
imageView.contentMode = .scaleAspectFill
imageView.clipsToBounds = true
let animation = CABasicAnimation(keyPath: "bounds.size.width")
animation.fromValue = 250
animation.toValue = 50
imageView.layer.add(animation, forKey: nil)
According to this SO thread one should change the anchorPoint of imageView.layer to have x=0, but doing so moves the left edge of the image to the center of the view, and also moves the image when animating so that when it is in its smaller width, the image's centerX point will be visible at the center of the screen.
I would suggest using an additional view (a panel) that defines the size and placing your image view on that panel. The panel can then clip the image view to create the desired effect.
I created a short example all in code just to demonstrate the approach:
class ViewController: UIViewController {

    private var isImageExtended: Bool = true
    private var imagePanel: UIView?

    override func viewDidLoad() {
        super.viewDidLoad()

        let panel = UIView(frame: CGRect(x: 0.0, y: 0.0, width: 200.0, height: 100.0))
        panel.backgroundColor = .lightGray

        let imageView = UIImageView(image: UIImage(named: "test_image"))
        imageView.frame = CGRect(x: 0.0, y: 0.0, width: panel.bounds.width, height: panel.bounds.height)
        imageView.contentMode = .scaleAspectFill
        imageView.translatesAutoresizingMaskIntoConstraints = false
        panel.addSubview(imageView)
        imageView.backgroundColor = .red

        view.addSubview(panel)
        panel.center = view.center
        panel.clipsToBounds = true
        imagePanel = panel

        view.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(onTap)))
    }

    @objc private func onTap() {
        guard let imagePanel else { return }
        isImageExtended = !isImageExtended
        let panelWidth: CGFloat = isImageExtended ? 200 : 50
        UIView.animate(withDuration: 0.5) {
            imagePanel.frame.size.width = panelWidth
        }
    }
}
I could easily achieve the same result using storyboard and constraints with significantly less code:
class ViewController: UIViewController {

    @IBOutlet private var panelWidthConstrain: NSLayoutConstraint?

    private var isImageExtended: Bool = true

    override func viewDidLoad() {
        super.viewDidLoad()
        view.addGestureRecognizer(UITapGestureRecognizer(target: self, action: #selector(onTap)))
    }

    @objc private func onTap() {
        isImageExtended = !isImageExtended
        let panelWidth: CGFloat = isImageExtended ? 200 : 50
        UIView.animate(withDuration: 0.5) {
            self.panelWidthConstrain?.constant = panelWidth
            self.view.layoutIfNeeded()
        }
    }
}
I hope the code speaks enough for itself.

Programmatically center UIImage inside parent view vertically

I am on Swift 5.
The goal is to center a UIImageView vertically inside a view. Currently it looks like this:
Note all the image bubbles are running off of the cell.
This is the code that lead to this:
let imageView = UIImageView()

let width = self.frame.width
let height = self.frame.height

let img_width = height //* 0.8
let img_height = height

let y = (height - img_height)/2
let x = width*0.05

imageView.frame = CGRect(x: x, y: CGFloat(y), width: img_width, height: img_height)

let rounded = imageView
    .makeRounded()
    .border(width: 1.0, color: Color.white.cgColor)

self.addSubview(rounded)
The imageView extension functions are:
func makeRounded() -> UIImageView {
    self.layer.borderWidth = 0.5
    self.layer.masksToBounds = false
    self.layer.borderColor = Color.white.cgColor
    self.layer.cornerRadius = self.frame.width/2
    self.clipsToBounds = true
    // see https://developer.apple.com/documentation/uikit/uiview/contentmode
    self.contentMode = .scaleAspectFill
    return self
}

func border(width: CGFloat, color: CGColor) -> UIImageView {
    self.layer.borderWidth = width
    self.layer.borderColor = color
    return self
}
Which is very vanilla.
This is odd because I laid out the textview vertically in the exact same way, that is: (parentHeight - childHeight)/2, and it is centered. You can see it in the blue text boxes in cell two and three.
____ EDIT _______
This is how I laid out the cell
let data = dataSource[ row - self._data_source_off_set ]
let cell = tableView.dequeueReusableCell(withIdentifier: "OneUserCell", for: indexPath) as! OneUserCell
// give uuid and set delegate
cell.uuid = data.uuid
cell.delegate = self
// render style: this must be set
cell.hasFooter = false //true
cell.imageSource = data
cell.headerTextSource = data
cell.footerTextSource = data
// color schemes
cell.backgroundColor = Color.offWhiteLight
cell.selectionColor = Color.graySecondary
Add these constraints to your imageView and remove the frame and its calculations:
self.contentView.addSubview(rounded)
self.mimageView.translatesAutoresizingMaskIntoConstraints = false
self.mimageView.leadingAnchor.constraint(equalTo: contentView.leadingAnchor,constant: 20).isActive = true
self.mimageView.centerYAnchor.constraint(equalTo: contentView.centerYAnchor).isActive = true
self.mimageView.heightAnchor.constraint(equalTo: contentView.heightAnchor).isActive = true
self.mimageView.widthAnchor.constraint(equalTo: contentView.heightAnchor).isActive = true

Give inset to UIImageView before setting Its frame

I have a UITextField component which has an icon on its right side. The icon is just a UIImageView. I'm trying to give an inset to the UIImageView with the code below:
imageView.image = UIImage(named: "example")?.withAlignmentRectInsets(UIEdgeInsets(top: -4, left: 0, bottom: -4, right: 0))
However, it doesn't work because its frame hasn't been set yet at the point where I set its image. I set its frame with the code below:
iconImageView.frame = CGRect(
    x: bounds.size.width - iconWidth - iconMarginLeft,
    y: bounds.size.height - textHeight() - iconMarginBottom,
    width: iconWidth,
    height: textHeight()
)
and here is how I set up the UIImageView:
fileprivate func createIconImageView() {
    let iconImageView = UIImageView()
    iconImageView.backgroundColor = .clear
    iconImageView.contentMode = .scaleAspectFill
    iconImageView.autoresizingMask = [.flexibleTopMargin, .flexibleRightMargin]
    if (hasToolTip) {
        let tapGesture = UITapGestureRecognizer(target: self, action: #selector(toolTipTapped(_:)))
        iconImageView.addGestureRecognizer(tapGesture)
        iconImageView.isUserInteractionEnabled = true
        let image = UIImage(named: "infotooltip")
        image?.accessibilityIdentifier = "tooltipImage"
        iconImageView.image = image!
        changeIconAlphaToOne()
        self.iconImageView = iconImageView
    } else {
        self.iconImageView = iconImageView
        changeIconAlphaToZero()
    }
    addSubview(iconImageView)
}
My problem is that I need to apply a UIEdgeInsets to the image, but because I'm applying it before the frame is set, it doesn't work. How can I achieve this?
I also tried to resize the image by following the post below, but that doesn't work for me either. I also do NOT want to put it inside a content view, which would make my component class more complicated.
The simplest way to resize an UIImage?
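For reference, one frame-independent way to get the same visual effect as the alignment-rect insets is to bake the padding into the image itself before assigning it. A minimal sketch, using a hypothetical padded(by:) helper that is not part of the question's code:
import UIKit

extension UIImage {
    /// Returns a copy of the image drawn inside a larger canvas, adding
    /// transparent padding on each side. Works regardless of the image view's frame.
    func padded(by insets: UIEdgeInsets) -> UIImage {
        let paddedSize = CGSize(width: size.width + insets.left + insets.right,
                                height: size.height + insets.top + insets.bottom)
        let renderer = UIGraphicsImageRenderer(size: paddedSize)
        return renderer.image { _ in
            draw(in: CGRect(origin: CGPoint(x: insets.left, y: insets.top), size: size))
        }
    }
}

// Usage: pad 4 points above and below, then assign as usual.
// imageView.image = UIImage(named: "example")?.padded(by: UIEdgeInsets(top: 4, left: 0, bottom: 4, right: 0))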

iOS swift how to place an imageview in a uiview exactly

I have an issue: I'm creating an imageView programmatically and then adding it to a center view, which is kind of working. But the problem is that it doesn't take up the whole space of the center view; it appears in the UIView but doesn't cover it completely and always sits a bit low. Any help?
The code:
//let backgroundImage = UIImageView(frame: centerView.frame)
let backgroundImage: UIImageView = UIImageView(frame: CGRect(x: 0, y: 0, width: self.centerView.bounds.size.width, height: self.centerView.bounds.size.height))
print("backgorundImage coordinates: \(backgroundImage.frame)")
backgroundImage.image = drawOverImage
backgroundImage.sizeThatFits(CGSizeMake(centerView.bounds.width, self.centerView.bounds.height))
//check this the image is being drawn bottom because is the fame for the previous 0.0
//backgroundImage.autoPinEdgeToSuperviewMargin(ALEdge.Top, relation: NSLayoutRelation.Equal)
//backgroundImage.contentMode = UIViewContentMode.ScaleAspectFill //too big
//backgroundImage.contentMode = UIViewContentMode.ScaleAspectFit //sama
backgroundImage.contentMode = UIViewContentMode.ScaleAspectFill
backgroundImage.clipsToBounds = true
//imageView.image = background
backgroundImage.center = view.center
let coordinatesForImage: CGRect = self.view.convertRect(backgroundImage.frame, toView: centerView)
let pointOfImage: CGPoint = backgroundImage.convertPoint(self.centerView.frame.origin, toView: backgroundImage)
print("coordinates test: \(coordinatesForImage)")
print("point x: \(pointOfImage.x)")
print("point y: \(pointOfImage.y)")
//backgroundImage.contentMode = UIViewContentMode.ScaleToFill
self.centerView.insertSubview(backgroundImage, atIndex: 0)
let pointOfImageToSuperView: CGPoint = (backgroundImage.superview?.convertPoint(backgroundImage.center, toView: self.centerView))!
print("superview imagepoint: \(pointOfImageToSuperView)")
The commented-out lines are all the things I've tried.
EDIT:
This is what is happening.
I'm missing a little bit at the bottom. I don't know if it's the size of the image or something else; could I change the size of the UIView to match the image?
Simply try:
let backgroundImage: UIImageView = UIImageView(frame: self.centerView.bounds)
backgroundImage.clipsToBounds = true
backgroundImage.contentMode = .ScaleAspectFill
self.centerView.addSubview(backgroundImage)
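If centerView can still change size after this runs (rotation or a later layout pass), one possible addition, written in the same Swift 2-era syntax as the answer and based on an assumption about the setup rather than the original answer, is to let the image view track its superview's size:
// Keep the image view matched to centerView if centerView is resized later.
backgroundImage.autoresizingMask = [.FlexibleWidth, .FlexibleHeight]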

Distance between screen border and bar button item

Is there a way to get the distance between the outer screen border and a bar button item? I was thinking of something similar to the way you can get the status bar height.
(I am asking this, because it is different depending on the device.)
This is some trial code:
let navigationBar = UINavigationBar(frame: CGRectMake(0, 0, view.frame.size.width, 64))
navigationBar.backgroundColor = UIColor.redColor()
let barButtonItem = UIBarButtonItem(barButtonSystemItem: .Camera, target: self, action: nil)
navigationItem.leftBarButtonItem = barButtonItem
navigationBar.items = [navigationItem]
let buttonItemView = barButtonItem.valueForKey("view") as! UIView
let frame = buttonItemView.superview!.convertRect(buttonItemView.frame, toView: UIApplication.sharedApplication().keyWindow?.rootViewController?.view)
let xCoordinateMin = CGRectGetMinX(frame)
let yCoordinateMax = CGRectGetMaxY(frame)
let yCoordinateMin = CGRectGetMinY(frame)
let label = UILabel(frame: CGRectMake(CGFloat(xCoordinateMin), CGFloat(yCoordinateMin), 100, yCoordinateMax - yCoordinateMin))
label.backgroundColor = UIColor.greenColor()
But it still gives me 0 as xMin, although it seems to work for the y-coordinate.
By outer screen, I assume you are talking about the main window.
You can try converting the bar button's frame into the window's coordinate system in order to find the distance in each direction:
let buttonItemView = barButtonItem.valueForKey("view") as! UIView
let frame = buttonItemView.superview!.convertRect(buttonItemView.frame, toView: nil)
print("Distance to left: ", CGRectGetMinX(frame))
print("Distance to top: ", CGRectGetMinY(frame))
However, the main window receives rotation events and passes them on to the controllers, which means it doesn't change its own frame size, so the above solution will not work properly in landscape mode.
You can try converting the button's rect to the root view controller's coordinate system instead, but then you have to take the status bar height into consideration:
let frame = buttonItemView.superview!.convertRect(buttonItemView.frame, toView: UIApplication.sharedApplication().keyWindow?.rootViewController?.view)
Solved with this code:
let navigationBar = UINavigationBar(frame: CGRectMake(0, 0, view.frame.size.width, 124))
navigationBar.backgroundColor = UIColor.redColor()
let barButtonItem = UIBarButtonItem(barButtonSystemItem: .Camera, target: self, action: nil)
navigationItem.leftBarButtonItem = barButtonItem
navigationBar.items = [navigationItem]
let buttonItemView = barButtonItem.valueForKey("view") as! UIView
let frame = buttonItemView.superview!.convertRect(buttonItemView.frame, toView: nil)
let xCoordinateMin: CGFloat = CGRectGetMinX(frame)
let xCoordinateMax: CGFloat = CGRectGetMaxX(frame)
let yCoordinateMax: CGFloat = CGRectGetMaxY(frame)
let yCoordinateMin = CGRectGetMinY(frame)
let label = UILabel(frame: CGRectMake(abs(xCoordinateMin), yCoordinateMin, xCoordinateMax - xCoordinateMin, yCoordinateMax - yCoordinateMin))
label.backgroundColor = UIColor.greenColor()
The trick was to declare xCoordinateMin explicitly as CGFloat, because only then did it come out as a negative value. Then, when creating the label, just use abs() to get the absolute value. You now have a label that is exactly the same size as your bar button item!
UIApplication.sharedApplication().statusBarFrame.height