How to make a UIImageView have a square size based on a given bounds/container - iOS

I am trying to give a simple UIImageView a square frame/size (a CGSize) based on a given bounds. For example, if the bounds of the container are the width and height of the screen, the function should resize the UIImageView to a perfect square that fits within those bounds on the screen.
Code:
let myImageView = UIImageView()
myImageView.frame.origin.y = (self.view?.frame.height)! * 0.0
myImageView.frame.origin.x = (self.view?.frame.width)! * 0.0
myImageView.backgroundColor = UIColor.blue
self.view?.insertSubview(myImageView, at: 0)
// ("self.view" is the ViewController's view, which is the same size as the device's screen)
MakeSquare(view: myImageView, boundsOf: self.view!)

func MakeSquare(view passedview: UIImageView, boundsOf container: UIView) {
    let ratio = container.frame.size.width / container.frame.size.height
    if container.frame.width > container.frame.height {
        let newHeight = container.frame.width / ratio
        passedview.frame.size = CGSize(width: container.frame.width, height: newHeight)
    } else {
        let newWidth = container.frame.height * ratio
        passedview.frame.size = CGSize(width: newWidth, height: container.frame.height)
    }
}
The problem is that it gives me back the same bounds/size as the container, unchanged.
Note: I really have no idea how to pull this off, but I wanted to see if it's possible. My function comes from a question here that takes a UIImage and resizes its parent view to make the picture square.

This should do it (and will centre the image view in the containing view):
func makeSquare(view passedView: UIImageView, boundsOf container: UIView) {
    let minSize = min(container.bounds.maxX, container.bounds.maxY)
    passedView.bounds = CGRect(x: container.bounds.midX - minSize / 2.0,
                               y: container.bounds.midY - minSize / 2.0,
                               width: minSize, height: minSize)
}
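If the image view's center is not already at the container's center (in the question's code the view sits at the origin with a zero frame), a frame-based variant makes both the square size and the centering explicit. This is only a sketch of one possible adjustment, not part of the accepted answer:

func makeSquareAndCenter(view passedView: UIImageView, in container: UIView) {
    // Use the shorter side of the container as the square's side length.
    let side = min(container.bounds.width, container.bounds.height)
    // Setting the frame positions the view within its superview's coordinate space.
    passedView.frame = CGRect(x: (container.bounds.width - side) / 2.0,
                              y: (container.bounds.height - side) / 2.0,
                              width: side, height: side)
}

// Usage (assuming myImageView has already been added to self.view):
// makeSquareAndCenter(view: myImageView, in: self.view!)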

Related

How to wait for subview layout to settle down after the initial viewDidLoad cycle?

I have a UIViewController, and I made a zoomable image view by embedding a UIImageView inside a UIScrollView.
class ZoomableImageView: UIScrollView {
    // public so that the delegate can access it
    public let imageView: UIImageView = {
        let _imageView = UIImageView()
        _imageView.translatesAutoresizingMaskIntoConstraints = false
        return _imageView
    }()

    // this method will be called multiple times to display different images
    public func setImage(image: UIImage) {
        imageView.image = image
        imageView.frame = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
        self.contentSize = image.size
        // gw: not working here, too early
        setZoomScale()
    }

    func setZoomScale() {
        let imageViewSize = imageView.bounds.size
        let scrollViewSize = self.bounds.size
        let widthScale = scrollViewSize.width / imageViewSize.width
        let heightScale = scrollViewSize.height / imageViewSize.height
        print("gw: imageViewSize: \(imageViewSize), scrollViewSize: \(scrollViewSize)")
        self.minimumZoomScale = min(widthScale, heightScale)
        self.maximumZoomScale = 1.2 // allow a maximum of 120% of the original image size
        // set initial zoom to fit the longer side (longer side ==> smaller scale)
        zoomScale = minimumZoomScale
    }
}
Each time I change the UIImage of the image view, I want to wait for the UIImageView's bounds size to settle down before I use it to calculate a scale factor for zooming in the UIScrollView.
Question: What is the appropriate place to put setZoomScale()? I put it right before exiting the setImage method, but imageView.bounds.size is not correct in my print statement. Note that it needs to be triggered each time the image changes, not just at the initial view loading stage.
I also tried putting setZoomScale in the ViewController's viewWillLayoutSubviews, but I have an additional question here: is viewWillLayoutSubviews only called once, at the view initialization stage? Can I force-trigger it using setNeedsLayout? (I tried that, but it did not re-trigger viewWillLayoutSubviews.)
Since any change to UI elements is dispatched on the main queue, you can put your code in DispatchQueue.main.async {} to make sure it runs after the UI change is done.
public func setImage(image: UIImage) {
    imageView.image = image
    imageView.frame = CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height)
    self.contentSize = image.size
    // gw: not working here, too early
    DispatchQueue.main.async {
        self.setZoomScale()
    }
}
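For zooming to actually work, the scroll view also needs a delegate that returns the image view from viewForZooming(in:). A minimal usage sketch, assuming ZoomableImageView is created inside a view controller (the controller name and asset name are illustrative, not from the question):

class PhotoViewController: UIViewController, UIScrollViewDelegate {
    let zoomableImageView = ZoomableImageView()

    override func viewDidLoad() {
        super.viewDidLoad()
        zoomableImageView.frame = view.bounds
        zoomableImageView.delegate = self
        view.addSubview(zoomableImageView)
        // Assumed asset name; setImage defers setZoomScale to the next main-queue cycle.
        zoomableImageView.setImage(image: UIImage(named: "sample")!)
    }

    // UIScrollViewDelegate: tell the scroll view which subview to scale when zooming.
    func viewForZooming(in scrollView: UIScrollView) -> UIView? {
        return zoomableImageView.imageView
    }
}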

Swift iOS - How to extract a separate view or image from within its own UIImageView's bounds? [duplicate]

I'm trying to crop a sub-image of an image view using an overlay UIView that can be positioned anywhere in the UIImageView. I'm borrowing a solution from a similar post on how to solve this when the UIImageView content mode is 'Aspect Fit'. That proposed solution is:
func computeCropRect(for sourceFrame: CGRect) -> CGRect {
    let widthScale = bounds.size.width / image!.size.width
    let heightScale = bounds.size.height / image!.size.height
    var x: CGFloat = 0
    var y: CGFloat = 0
    var width: CGFloat = 0
    var height: CGFloat = 0
    var offSet: CGFloat = 0
    if widthScale < heightScale {
        offSet = (bounds.size.height - (image!.size.height * widthScale)) / 2
        x = sourceFrame.origin.x / widthScale
        y = (sourceFrame.origin.y - offSet) / widthScale
        width = sourceFrame.size.width / widthScale
        height = sourceFrame.size.height / widthScale
    } else {
        offSet = (bounds.size.width - (image!.size.width * heightScale)) / 2
        x = (sourceFrame.origin.x - offSet) / heightScale
        y = sourceFrame.origin.y / heightScale
        width = sourceFrame.size.width / heightScale
        height = sourceFrame.size.height / heightScale
    }
    return CGRect(x: x, y: y, width: width, height: height)
}
The problem is that using this solution when the image view is Aspect Fill causes the cropped segment to not line up exactly with where the overlay UIView was positioned. I'm not quite sure how to adapt this code to accommodate Aspect Fill, or how to reposition my overlay UIView so that it lines up 1:1 with the segment I'm trying to crop.
UPDATE Solved using Matt's answer below
class ViewController: UIViewController {
    @IBOutlet weak var catImageView: UIImageView!
    private var cropView: CropView!

    override func viewDidLoad() {
        super.viewDidLoad()
        cropView = CropView(frame: CGRect(x: 0, y: 0, width: 45, height: 45))
        catImageView.image = UIImage(named: "cat")
        catImageView.clipsToBounds = true
        catImageView.layer.borderColor = UIColor.purple.cgColor
        catImageView.layer.borderWidth = 2.0
        catImageView.backgroundColor = UIColor.yellow
        catImageView.addSubview(cropView)
        let imageSize = catImageView.image!.size
        let imageViewSize = catImageView.bounds.size
        var scale: CGFloat = imageViewSize.width / imageSize.width
        if imageSize.height * scale < imageViewSize.height {
            scale = imageViewSize.height / imageSize.height
        }
        let croppedImageSize = CGSize(width: imageViewSize.width / scale, height: imageViewSize.height / scale)
        let croppedImrect =
            CGRect(origin: CGPoint(x: (imageSize.width - croppedImageSize.width) / 2.0,
                                   y: (imageSize.height - croppedImageSize.height) / 2.0),
                   size: croppedImageSize)
        let renderer = UIGraphicsImageRenderer(size: croppedImageSize)
        let _ = renderer.image { _ in
            catImageView.image!.draw(at: CGPoint(x: -croppedImrect.origin.x, y: -croppedImrect.origin.y))
        }
    }

    @IBAction func performCrop(_ sender: Any) {
        let cropFrame = catImageView.computeCropRect(for: cropView.frame)
        if let imageRef = catImageView.image?.cgImage?.cropping(to: cropFrame) {
            catImageView.image = UIImage(cgImage: imageRef)
        }
    }

    @IBAction func resetCrop(_ sender: Any) {
        catImageView.image = UIImage(named: "cat")
    }
}
The Final Result
Let's divide the problem into two parts:
1. Given the size of a UIImageView and the size of its UIImage, if the UIImageView's content mode is Aspect Fill, what is the part of the UIImage that fits into the UIImageView? We need, in effect, to crop the original image to match what the UIImageView is actually displaying.
2. Given an arbitrary rect within the UIImageView, what part of the cropped image (derived in part 1) does it correspond to?
The first part is the interesting part, so let's try it. (The second part will then turn out to be trivial.)
Here's the original image I'll use:
https://static1.squarespace.com/static/54e8ba93e4b07c3f655b452e/t/56c2a04520c64707756f4267/1455596221531/
That image is 1000x611. Here's what it looks like scaled down (but keep in mind that I'm going to be using the original image throughout):
My image view, however, will be 139x182, and is set to Aspect Fill. When it displays the image, it looks like this:
The problem we want to solve is: what part of the original image is being displayed in my image view, if my image view is set to Aspect Fill?
Here we go. Assume that iv is the image view:
let imsize = iv.image!.size
let ivsize = iv.bounds.size

var scale: CGFloat = ivsize.width / imsize.width
if imsize.height * scale < ivsize.height {
    scale = ivsize.height / imsize.height
}

let croppedImsize = CGSize(width: ivsize.width / scale, height: ivsize.height / scale)
let croppedImrect =
    CGRect(origin: CGPoint(x: (imsize.width - croppedImsize.width) / 2.0,
                           y: (imsize.height - croppedImsize.height) / 2.0),
           size: croppedImsize)
So now we have solved the problem: croppedImrect is the region of the original image that is showing in the image view. Let's proceed to use our knowledge, by actually cropping the image to a new image matching what is shown in the image view:
let r = UIGraphicsImageRenderer(size: croppedImsize)
let croppedIm = r.image { _ in
    iv.image!.draw(at: CGPoint(x: -croppedImrect.origin.x, y: -croppedImrect.origin.y))
}
The result is this image (ignore the gray border):
But lo and behold, that is the correct answer! I have extracted from the original image exactly the region portrayed in the interior of the image view.
So now you have all the information you need. croppedIm is the UIImage actually displayed in the clipped area of the image view. scale is the scale between the image view and that image. Therefore, you can easily solve the problem you originally proposed! Given any rectangle imposed upon the image view, in the image view's bounds coordinates, you simply apply the scale (i.e. divide all four of its attributes by scale) — and now you have the same rectangle as a portion of croppedIm.
(Observe that we didn't really need to crop the original image to get croppedIm; it was sufficient, in reality, to know how to perform that crop. The important information is the scale along with the origin of croppedImRect; given that information, you can take the rectangle imposed upon the image view, scale it, and offset it to get the desired rectangle of the original image.)
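To make that last mapping concrete, here is a minimal sketch (the helper name is mine, not from the answer) that converts a rect given in the image view's bounds coordinates into the corresponding rect of the original image, using the scale and croppedImrect computed above:

// Assumes `scale` and `croppedImrect` have been computed as in the snippet above.
func originalImageRect(for viewRect: CGRect, scale: CGFloat, croppedImrect: CGRect) -> CGRect {
    // Divide by the scale to go from view points to image pixels,
    // then offset by the visible region's origin within the original image.
    return CGRect(x: viewRect.origin.x / scale + croppedImrect.origin.x,
                  y: viewRect.origin.y / scale + croppedImrect.origin.y,
                  width: viewRect.size.width / scale,
                  height: viewRect.size.height / scale)
}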
EDIT I added a little screencast just to show that my approach works as a proof of concept:
EDIT Also created a downloadable example project here:
https://github.com/mattneub/Programming-iOS-Book-Examples/blob/39cc800d18aa484d17c26ffcbab8bbe51c614573/bk2ch02p058cropImageView/Cropper/ViewController.swift
But note that I can't guarantee that URL will last forever, so please read the discussion above to understand the approach used.
Matt answered the question perfectly. I was creating a full-screen camera and needed to make the final output match the full-screen preview. Here is a compact extension based on Matt's overall answer, in Swift 5, for easy use by others. I recommend reading Matt's answer, as it explains things very well.
extension UIImage {
    func cropToRect(rect: CGRect) -> UIImage? {
        var scale = rect.width / self.size.width
        scale = self.size.height * scale < rect.height ? rect.height / self.size.height : scale
        let croppedImsize = CGSize(width: rect.width / scale, height: rect.height / scale)
        let croppedImrect = CGRect(origin: CGPoint(x: (self.size.width - croppedImsize.width) / 2.0,
                                                   y: (self.size.height - croppedImsize.height) / 2.0),
                                   size: croppedImsize)
        UIGraphicsBeginImageContextWithOptions(croppedImsize, true, 0)
        self.draw(at: CGPoint(x: -croppedImrect.origin.x, y: -croppedImrect.origin.y))
        let croppedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return croppedImage
    }
}
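A possible usage in the spirit of the full-screen camera case described above (the image name is an assumption for illustration): pass the preview's bounds, and the extension returns the portion of the photo that an aspect-fill preview of that size would show.

// Assumed name: `capturedPhoto` is a full-resolution UIImage; the preview fills the screen.
let previewBounds = UIScreen.main.bounds
let outputImage = capturedPhoto.cropToRect(rect: previewBounds)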

Limit image size to ScrollView, Swift

I have an image that is set inside a scroll view. Although I have set the frame of the scrollView to a fixed height and width as shown below, the image goes beyond the bounds (see the picture below).
How can I limit the picture so it fits inside the scrollView?
imageScrollView.frame = CGRect(x: 0, y: 0, width: viewWidth, height: viewHeight - 50)
imageScrollView.clipsToBounds = true // Has no effect on the image
Do you have a reference to the UIImageView? If so, then set its content mode to aspect fit. Like this:
theImageView.contentMode = .scaleAspectFit
The clipsToBounds you set only covers up any parts of child views that are sticking out of the bounds of the parent view, so that's why it doesn't do anything for you.
OR if you're using Interface Builder, set this option:
So, what if you don't have the reference to the UIImageView?...
You could iterate through the subviews of your scroll view, and whenever you find a UIImageView, set its content mode like that. Something like:
//This is off the top of my head, so my filtering may not be right...
//This is also a one and done solution if you've got a lot of images in your scroll view
for anImgVw in imageScrollView.subviews.filter({ $0.isKind(of: UIImageView.self) }) {
    anImgVw.contentMode = .scaleAspectFit
}
Otherwise, I'm not sure if it's possible without a reference to the UIImageView.
The library you are using is coded to match the scaling to the device orientation. So, if the image orientation doesn't match the view orientation, you end up with the image not quite fitting in your scroll view.
You'll need to edit the ImageScrollView.swift source file. Assuming you're using the same version that is currently at the link you provided ( https://github.com/huynguyencong/ImageScrollView ), change the setMaxMinZoomScalesForCurrentBounds() function as follows:
fileprivate func setMaxMinZoomScalesForCurrentBounds() {
    // calculate min/max zoom scale
    let xScale = bounds.width / imageSize.width   // the scale needed to perfectly fit the image width-wise
    let yScale = bounds.height / imageSize.height // the scale needed to perfectly fit the image height-wise

    // fill width if the image and phone are both portrait or both landscape; otherwise take the smaller scale
    //let imagePortrait = imageSize.height > imageSize.width
    //let phonePortrait = bounds.height >= bounds.width
    //var minScale = (imagePortrait == phonePortrait) ? xScale : min(xScale, yScale)
    //
    // just take the min scale, so the image will completely fit regardless of orientation
    var minScale = min(xScale, yScale)

    let maxScale = maxScaleFromMinScale * minScale

    // don't let minScale exceed maxScale. (If the image is smaller than the screen, we don't want to force it to be zoomed.)
    if minScale > maxScale {
        minScale = maxScale
    }

    maximumZoomScale = maxScale
    minimumZoomScale = minScale * 0.999 // slightly below minScale so the user can still scroll the page when this control is used in a UIPageViewController
}
You can use the screenHeight rather than the viewHeight:
let screenSize: CGRect = UIScreen.main.bounds // (UIScreen.mainScreen().bounds in Swift 2)
let screenWidth = screenSize.width
let screenHeight = screenSize.height
imageScrollView.frame = CGRect(x: 0, y: 0, width: viewWidth, height: screenHeight - 50)

