Image doesn't fit the screen - iOS

I have an image on screen:
image = SKSpriteNode(imageNamed: "image.png")
image.name = "leftside1"
image.position = CGPoint(x: 0, y: 0)
image.zPosition = 3
self.addChild(image)
The image is 250×1920 (w×h). When the scene is shown, the image doesn't fit the screen; it extends past the top and bottom, even though I set aspectFit when presenting the scene:
nextScene!.scaleMode = .aspectFit
Trying .aspectFill doesn't solve my problem either. Any advice? Thanks!

Try setting the image size to the size of the scene, like this:
if let size = nextScene?.size {
    image.size = size
}
Here is an example of how you can dynamically calculate the width:
if let size = nextScene?.size {
    // Calculate the scene's aspect ratio
    let ratio = size.width / size.height
    // Multiply the image width by the ratio
    let width = image.size.width * ratio
    // Create a new CGSize
    let newSize = CGSize(width: width, height: size.height)
    image.size = newSize
}
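If the goal is instead to keep the sprite's own 250:1920 aspect ratio while filling the scene's height, a uniform scale factor avoids distorting the image. A minimal sketch, assuming `nextScene` and `image` are set up as in the question:

```swift
// Scale the sprite uniformly so its height matches the scene's height.
// setScale applies the same factor to both axes, preserving the aspect ratio.
if let sceneSize = nextScene?.size {
    let scale = sceneSize.height / image.size.height
    image.setScale(scale)
    image.position = CGPoint(x: 0, y: 0)
}
```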

Related

Swift: Overlay NxN grid of buttons over an aspect-fit image

Tried searching for a while but couldn't find a way to do this.
I currently have a UIImage that is aspect fit. I'm trying to overlay an NxN grid of buttons over the image.
My thought process was to find the boundaries of the image within the image view, and then lay out the buttons based on those boundaries. I usually use storyboards, but I'm taking a stab at this programmatically, since I couldn't figure out the aspect-fit part in storyboards.
I found this code for finding the boundaries of my image after it has been fit, and added some print statements to verify that it finds the correct boundaries, which I believe it does.
extension UIImageView {
    var contentClippingRect: CGRect {
        guard let image = image else { return bounds }
        guard contentMode == .scaleAspectFit else { return bounds }
        guard image.size.width > 0 && image.size.height > 0 else { return bounds }

        let scale: CGFloat
        if image.size.width > image.size.height {
            scale = bounds.width / image.size.width
        } else {
            scale = bounds.height / image.size.height
        }

        let size = CGSize(width: image.size.width * scale, height: image.size.height * scale)
        let x = (bounds.width - size.width) / 2.0
        let y = (bounds.height - size.height) / 2.0
        return CGRect(x: x, y: y, width: size.width, height: size.height)
    }
}
I'm currently trying to initialize the stack view by passing the image view's contentClippingRect to it. My understanding was that this would create a frame from the x, y, width, and height, and I would be good to go. However, it places the stack view in the top left of the image view, not over the actual image. Any guidance would be greatly appreciated.
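One thing worth checking: contentClippingRect is expressed in the image view's own coordinate system, so the stack view must either be added as a subview of the image view or have the rect converted into the target superview's coordinates first. A minimal sketch, assuming an image view named imageView and a grid stack view named gridStack (both names illustrative):

```swift
// The rect is in imageView's bounds coordinates.
let imageRect = imageView.contentClippingRect

// Option 1: add the grid directly to the image view.
let gridStack = UIStackView(frame: imageRect)
imageView.isUserInteractionEnabled = true   // image views ignore touches by default
imageView.addSubview(gridStack)

// Option 2: add it to the image view's superview, converting coordinates first.
// gridStack.frame = imageView.convert(imageRect, to: imageView.superview)
```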

How to get x and y position of UIImage in UIImageView?

I want to get the original x and y position of a UIImage when it is set in a UIImageView with scaleAspectFill.
As we know, with scaleAspectFill some portion of the image is clipped. So, per my requirement, I want to get the x and y values (they may be negative, I don't know).
Here is the original image from the gallery.
Now I am setting this image in my app's view.
In this situation, I want to get the hidden x, y position of the clipped parts of the image.
Can anyone tell me how to get it?
Use the following extension:
extension UIImageView {
    var imageRect: CGRect {
        guard let imageSize = self.image?.size else { return self.frame }

        let scale = UIScreen.main.scale
        let imageWidth = (imageSize.width / scale).rounded()
        let frameWidth = self.frame.width.rounded()
        let imageHeight = (imageSize.height / scale).rounded()
        let frameHeight = self.frame.height.rounded()

        let ratio = max(frameWidth / imageWidth, frameHeight / imageHeight)
        let newSize = CGSize(width: imageWidth * ratio, height: imageHeight * ratio)
        let newOrigin = CGPoint(x: self.center.x - (newSize.width / 2),
                                y: self.center.y - (newSize.height / 2))
        return CGRect(origin: newOrigin, size: newSize)
    }
}
Usage
let rect = imageView.imageRect
print(rect)
UI Test
let testView = UIView(frame: rect)
testView.backgroundColor = UIColor.red.withAlphaComponent(0.5)
imageView.superview?.addSubview(testView)
Use the extension below to find the exact frame of the image inside the image view.
extension UIImageView {
    var contentRect: CGRect {
        guard let image = image else { return bounds }
        guard contentMode == .scaleAspectFit else { return bounds }
        guard image.size.width > 0 && image.size.height > 0 else { return bounds }

        let scale: CGFloat
        if image.size.width > image.size.height {
            scale = bounds.width / image.size.width
        } else {
            scale = bounds.height / image.size.height
        }

        let size = CGSize(width: image.size.width * scale, height: image.size.height * scale)
        let x = (bounds.width - size.width) / 2.0
        let y = (bounds.height - size.height) / 2.0
        return CGRect(x: x, y: y, width: size.width, height: size.height)
    }
}
How to test
let rect = imgTest.contentRect
print("Image rect:", rect)
Reference: https://www.hackingwithswift.com/example-code/uikit/how-to-find-an-aspect-fit-images-size-inside-an-image-view
If you want to show the image the way the gallery does, you can use the constraints
"H:|[v0]|" and "V:|[v0]|" and set the image view's content mode to .scaleAspectFit.
And if you want the image size, you can use imageView.image!.size and calculate how much of the image is being cut off. With aspectFill the width is matched to the screen width and the height grows accordingly, so you can work out how much of the image is clipped.
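A minimal sketch of that calculation, assuming the width is the matching dimension (function and parameter names are illustrative):

```swift
// How much of the image is clipped vertically under .scaleAspectFill,
// assuming the image is scaled so its width fills the view's width.
func verticalClipAmount(imageSize: CGSize, viewSize: CGSize) -> CGFloat {
    let scale = viewSize.width / imageSize.width      // scale that fills the width
    let scaledHeight = imageSize.height * scale       // image height after scaling
    return max(0, scaledHeight - viewSize.height)     // total points clipped top + bottom
}

// Example: a 1000×2000 image in a 375×500 view gives scale 0.375 and a
// scaled height of 750, so 250 points are clipped (125 top, 125 bottom).
```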
Try the library ImageCoordinateSpace.
I am not sure whether it works for your case, but it has a feature to convert a CGPoint from image coordinates to any view's coordinates, and vice versa.

Get the position of imageview dynamically

I need to get the dynamic x, y position of the image view.
I have tried this:
self.imgView.bounds.origin.y
and
self.imgView.frame.origin.x
I am getting 0.0 every time.
How can I get this?
Thanks
The bounds.origin of any view should be (0, 0) at all times, since bounds is the view's own coordinate system (relative to its own origin).
The frame.origin is in the coordinate system of the view's superview, so if it's (0, 0), the view's origin coincides with its superview's origin.
Maybe you want the origin with respect to the screen? If so:
let posInWindow = v.convert(v.bounds.origin, to: nil)
Please use the code below to get the rect of the image in a UIImageView.
This code is in Swift 3, and the image view's content mode should be .scaleAspectFit.
func calculateRectOfImageInImageView(imageView: UIImageView) -> CGRect {
    let imageViewSize = imageView.frame.size
    guard let imageSize = imageView.image?.size else {
        return CGRect.zero
    }

    let scaleWidth = imageViewSize.width / imageSize.width
    let scaleHeight = imageViewSize.height / imageSize.height
    let aspect = fmin(scaleWidth, scaleHeight)

    var imageRect = CGRect(x: 0, y: 0, width: imageSize.width * aspect, height: imageSize.height * aspect)
    // Center the image
    imageRect.origin.x = (imageViewSize.width - imageRect.size.width) / 2
    imageRect.origin.y = (imageViewSize.height - imageRect.size.height) / 2
    // Add the imageView's offset
    imageRect.origin.x += imageView.frame.origin.x
    imageRect.origin.y += imageView.frame.origin.y
    return imageRect
}
bounds will always return an x and y of 0; you want to use frame. In your example those values may actually be correct. Try adding your imageView in the centre of the view and logging:
self.imgView.frame.origin.x
self.imgView.frame.origin.y

UIImageView cropping square

First of all, my app asks the user to pick a photo from a collection view or from camera pictures, then navigates to a cropping editor, so that the final image is square according to the position of the cropping area. The problem is that there is very little source material about this. How is it done without UIImagePicker?
I also tried taking a square picture, with no success.
I implemented a UIPinchGestureRecognizer on the image view so the user can zoom in and out, but there is no cropping square on the image view. I have to add a cropping area.
This is the UIImage cropping function:
func croppImageByRect() -> UIImage {
    let ratio: CGFloat = 1 // square
    let delta: CGFloat
    let offSet: CGPoint

    // make a new square size, that is the resized image's width
    let newSize = CGSizeMake(size.width, (size.width - size.width / 8))

    // figure out if the picture is landscape or portrait, then
    // calculate the scale factor and offset
    if (size.width > size.height) {
        delta = (ratio * size.width - ratio * size.height)
        offSet = CGPointMake(delta / 2, 0)
    } else {
        delta = (ratio * size.height - ratio * size.width)
        offSet = CGPointMake(0, delta / 2)
    }

    // make the final clipping rect based on the calculated values
    let clipRect = CGRectMake(-offSet.x, -offSet.y, (ratio * size.width) + delta, (ratio * size.height) + delta)

    // start a new context with scale factor 0.0 so retina displays get
    // a high-quality image
    UIGraphicsBeginImageContextWithOptions(newSize, true, 0.0)
    UIRectClip(clipRect)
    drawInRect(clipRect)
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
Well, to start, this line:
// make a new square size, that is the resized image's width
let newSize = CGSizeMake(size.width, (size.width - size.width / 8))
will not result in a square size, since the width and height are not the same.
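For comparison, a centered square crop can be written much more simply in current Swift. A minimal sketch (the extension name is illustrative; it crops in pixel space, so it accounts for the image's scale, but for simplicity it does not normalize non-up image orientations first):

```swift
import UIKit

extension UIImage {
    /// Returns a centered square crop of the image, or nil if cropping fails.
    func centerSquareCropped() -> UIImage? {
        let side = min(size.width, size.height) * scale   // square side in pixels
        let x = (size.width * scale - side) / 2
        let y = (size.height * scale - side) / 2
        let cropRect = CGRect(x: x, y: y, width: side, height: side)

        guard let cropped = cgImage?.cropping(to: cropRect) else { return nil }
        return UIImage(cgImage: cropped, scale: scale, orientation: imageOrientation)
    }
}
```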

How to improve sharpness when resizing UIImage?

In my app I need to upload photos to a server, so before that I want to resize and compress them to an acceptable size. I tried resizing them in two ways; the first way is:
// image is an instance of original UIImage that I want to resize
let width : Int = 640
let height : Int = 640
let bitsPerComponent = CGImageGetBitsPerComponent(image.CGImage)
let bytesPerRow = CGImageGetBytesPerRow(image.CGImage)
let colorSpace = CGImageGetColorSpace(image.CGImage)
let bitmapInfo = CGImageGetBitmapInfo(image.CGImage)
let context = CGBitmapContextCreate(nil, width, height, bitsPerComponent, bytesPerRow, colorSpace, bitmapInfo)
CGContextSetInterpolationQuality(context, kCGInterpolationHigh)
CGContextDrawImage(context, CGRect(origin: CGPointZero, size: CGSize(width: CGFloat(width), height: CGFloat(height))), image.CGImage)
image = UIImage(CGImage: CGBitmapContextCreateImage(context))
The other way:
image = RBResizeImage(image, targetSize: CGSizeMake(640, 640))
func RBResizeImage(image: UIImage?, targetSize: CGSize) -> UIImage? {
    if let image = image {
        let size = image.size

        let widthRatio = targetSize.width / image.size.width
        let heightRatio = targetSize.height / image.size.height

        // Figure out what our orientation is, and use that to form the rectangle
        var newSize: CGSize
        if (widthRatio > heightRatio) {
            newSize = CGSizeMake(size.width * heightRatio, size.height * heightRatio)
        } else {
            newSize = CGSizeMake(size.width * widthRatio, size.height * widthRatio)
        }

        // This is the rect that we've calculated out and this is what is actually used below
        let rect = CGRectMake(0, 0, newSize.width, newSize.height)

        // Actually do the resizing to the rect using the ImageContext stuff
        UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
        image.drawInRect(rect)
        let newImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return newImage
    } else {
        return nil
    }
}
After that, I use UIImageJPEGRepresentation to compress the UIImage, but even with compressionQuality set to 1, the photo is still blurry (it's mostly visible on object edges; maybe it's not a big deal, but the photo is also three to five times larger than a comparable photo from Instagram, without the same sharpness). At 0.5 it's even worse, of course, and the photo is still larger (in KB) than a comparable Instagram photo.
Photo from my app, compressionQuality is 1, edges are blurry, and size is 341 KB
Photo from Instagram, edges are sharp, and size is 136 KB
EDIT:
OK, but I'm a little confused right now and not sure how to maintain the aspect ratio. This is how I crop the image (the scrollView contains a UIImageView, so I can pan and zoom the image, and at the end I crop the visible part of the scrollView, which is square). Anyway, the image above was originally 2048x2048, but it's still blurry.
var scale = 1/scrollView.zoomScale
var visibleRect : CGRect = CGRect()
visibleRect.origin.x = scrollView.contentOffset.x * scale
visibleRect.origin.y = scrollView.contentOffset.y * scale
visibleRect.size.width = scrollView.bounds.size.width * scale
visibleRect.size.height = scrollView.bounds.size.height * scale
image = crop(image!, rect: visibleRect)
func crop(srcImage : UIImage, rect : CGRect) -> UIImage? {
    let imageRef = CGImageCreateWithImageInRect(srcImage.CGImage, rect)
    let cropped = UIImage(CGImage: imageRef)
    return cropped
}
Your code is right, but the problem is that you don't maintain the image's aspect ratio.
In your code you create a new rect as
let rect = CGRectMake(0, 0, newSize.width, newSize.height)
If your image has the same height and width, this gives a smooth resized image, but if the height and width differ, the image comes out blurry. So try to maintain the aspect ratio.
Reply to the edited question:
Make either the height or the width of the cropped image constant.
For example, if you keep the width constant, use the following code:
visibleRect.size.height = orignalImg.size.height * visibleRect.size.width / orignalImg.size.width
image = crop(image!, rect: visibleRect)
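For reference, an aspect-ratio-preserving resize can also be written with UIGraphicsImageRenderer in current Swift. A minimal sketch (the function name is illustrative; the 1.0 scale mirrors the original code's intent of producing exact pixel dimensions rather than scaling by the screen's scale factor):

```swift
import UIKit

/// Resizes an image to fit within `target`, preserving its aspect ratio.
func resizedToFit(_ image: UIImage, target: CGSize) -> UIImage {
    let ratio = min(target.width / image.size.width,
                    target.height / image.size.height)
    let newSize = CGSize(width: image.size.width * ratio,
                         height: image.size.height * ratio)

    let format = UIGraphicsImageRendererFormat()
    format.scale = 1.0   // render at the exact pixel size
    let renderer = UIGraphicsImageRenderer(size: newSize, format: format)
    return renderer.image { _ in
        image.draw(in: CGRect(origin: .zero, size: newSize))
    }
}
```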
