Photo resizing and compressing - iOS

I am resizing and compressing my photos and getting an unusual result.
When I choose an image from the photo album, the image compresses and resizes fine. However, if I do it on an image that was passed from the camera, the image becomes oddly small (and unviewable). As a test, I assigned the compression and resizing functions to a button that takes an image from either the camera or the photo album. Below are my code and console output.
@IBAction func testBtnPressed(sender: AnyObject) {
    let img = selectedImageView.image!
    print("before resize image \(img.dataLengh_kb)kb size \(img.size)")
    let resizedImg = img.resizeWithWidth(1080)
    print("1080 After resize image \(resizedImg!.dataLengh_kb)kb size \(resizedImg!.size)")
    let compressedImageData = resizedImg!.mediumQualityJPEGNSData
    print("Compress to medium quality = \(compressedImageData.length / 1024)kb")
}
extension UIImage {
    var mediumQualityJPEGNSData: NSData { return UIImageJPEGRepresentation(self, 0.5)! }

    func resizeWithWidth(width: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: width, height: CGFloat(ceil(width/size.width * size.height)))))
        imageView.contentMode = .ScaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.renderInContext(context)
        guard let result = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        UIGraphicsEndImageContext()
        return result
    }
}
When the photo was selected from the photo album:
    before resize image 5004kb size (3024.0, 3024.0)
    1080 After resize image 1023kb size (1080.0, 1080.0)
    Compress to medium quality = 119kb
When the photo was passed by the camera:
    before resize image 4653kb size (24385.536, 24385.536)
    1080 After resize image 25kb size (1080.576, 1080.576)
    Compress to medium quality = 4kb

I replaced the image resizing function with the following one, and it worked a lot better:
func resizeImage(newHeight: CGFloat) -> UIImage {
    let scale = newHeight / self.size.height
    let newWidth = self.size.width * scale
    UIGraphicsBeginImageContext(CGSizeMake(newWidth, newHeight))
    self.drawInRect(CGRectMake(0, 0, newWidth, newHeight))
    let newImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return newImage
}
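A likely explanation for the difference: `resizeWithWidth` passes the image's own `scale` to `UIGraphicsBeginImageContextWithOptions`, so a camera image carrying an unusual scale (note the 24385.536-point size in the console output above) is rendered with far fewer pixels than the point size suggests, while `UIGraphicsBeginImageContext` in the replacement always creates a 1.0-scale context. A minimal sketch of the point/pixel arithmetic; the helper names are mine, not from the question:

```swift
import Foundation

// UIImage.size is reported in points: pixels / scale. A bitmap context created
// at scale s backs a (w, h)-point canvas with (w*s, h*s) pixels, so passing the
// image's own scale couples the output pixel count to whatever the camera set.
func pointSize(pixels: CGSize, scale: CGFloat) -> CGSize {
    return CGSize(width: pixels.width / scale, height: pixels.height / scale)
}

func pixelSize(points: CGSize, scale: CGFloat) -> CGSize {
    return CGSize(width: points.width * scale, height: points.height * scale)
}
```

Rendering a 1080-point canvas at scale 1.0 yields a 1080-pixel image regardless of what the source image's scale was, which is why the replacement behaves consistently for both sources.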


iOS Swift UIImage resizing issue affecting performance; unexpected dimensions upon resizing [duplicate]

This question already has an answer here:
How to set the scale when using UIGraphicsImageRenderer
(1 answer)
Closed 3 years ago.
I’m having a performance issue with images in a tableView (very jerky) and several view controllers (slow loading). In order to speed things up and smooth things out, I’m trying to resize the incoming images to the exact size of their respective imageViews upon initial loading instead of on the fly every time they are loaded from Core Data. The resized images will be stored as separate attributes on their parent Entity. All images are square, i.e., 1:1 aspect ratio.
Here is the resizing code:
extension UIImage {
    func resize(targetSize: CGSize) -> UIImage {
        print("Inside resize function")
        print("targetSize is \(targetSize)")
        return UIGraphicsImageRenderer(size: targetSize).image { _ in
            self.draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}
And here is the function from which the resize function is called and to which it returns:
func difAndAssignPics() {
    guard let appDelegate =
        UIApplication.shared.delegate as? AppDelegate else {
        return
    }
    // This is a copy of the original image, from which the smaller ones will be created
    print("bigToLittlePic size is \(bigToLittlePic?.size as Any)")
    // These are the desired sizes of the three images I'm trying to create to facilitate performance
    let littlePicSize = CGSize(width: 110, height: 110)
    let mediumPicSize = CGSize(width: 160, height: 160)
    let displayPicSize = CGSize(width: 340, height: 340)
    // This is the call to the resize function for each target size
    littlePic = bigToLittlePic?.resize(targetSize: littlePicSize)
    mediumPic = bigToLittlePic?.resize(targetSize: mediumPicSize)
    displayPic = bigToLittlePic?.resize(targetSize: displayPicSize)
    // This code differentiates between front view and rear view of the item and prints out the size of each
    switch switchTag {
    case 1:
        newCoin.obversePic = (displayPic)!.pngData()! as NSData
        newCoin.obversePicThumb = (littlePic)!.pngData()! as NSData
        newCoin.obversePicMedium = (mediumPic)!.pngData()! as NSData
        selectIndicatorLabel.text = "Now select Reverse photo"
        appDelegate.saveContext()
        print("Inside switch case 1, difAndAssignPics")
        if let imageData = newCoin.obversePicThumb {
            let image = UIImage(data: imageData as Data)
            print("obversePicThumb size is \(image?.size as Any)")
        }
        if let imageData = newCoin.obversePicMedium {
            let image = UIImage(data: imageData as Data)
            print("obversePicMedium size is \(image?.size as Any)")
        }
        if let imageData = newCoin.obversePic {
            let image = UIImage(data: imageData as Data)
            print("obversePicBig size is \(image?.size as Any)")
        }
    case 2:
        newCoin.reversePic = (displayPic)!.pngData()! as NSData
        newCoin.reversePicThumb = (littlePic)!.pngData()! as NSData
        newCoin.reversePicMedium = (mediumPic)!.pngData()! as NSData
        selectIndicatorLabel.text = "Now select Obverse photo"
        appDelegate.saveContext()
        print("Inside switch case 2, difAndAssignPics")
        if let imageData = newCoin.reversePicThumb {
            let image = UIImage(data: imageData as Data)
            print("reversePicThumb size is \(image?.size as Any)")
        }
        if let imageData = newCoin.reversePicMedium {
            let image = UIImage(data: imageData as Data)
            print("reversePicMedium size is \(image?.size as Any)")
        }
        if let imageData = newCoin.reversePic {
            let image = UIImage(data: imageData as Data)
            print("reversePicBig size is \(image?.size as Any)")
        }
    default: break
    }
    reactivateDataButtons()
}
Here’s what is being printed in the console at the introduction of each new image:
    bigToLittlePic size is Optional((2320.0, 2320.0))
    Inside resize function
    targetSize is (110.0, 110.0)
    Inside resize function
    targetSize is (160.0, 160.0)
    Inside resize function
    targetSize is (340.0, 340.0)
Ok, so far so good. However, when the image gets back to difAndAssignPics, this is the printout:
    reversePicThumb size is Optional((330.0, 330.0))
    reversePicMedium size is Optional((480.0, 480.0))
    reversePicBig size is Optional((1020.0, 1020.0))
I included just the printout for the reverse images for brevity. Obverse gives identical numbers.
As you can see, somehow the size of each resized image has ballooned by a factor of 3. The pictures load, and the quality is high, but the performance is still noticeably suboptimal. I don’t know how to reconcile these numbers.
Anybody have any suggestions?
Many thanks in advance!
Edit
I printed out the size of the images that are being squeezed into the 110 x 110 imageViews in my custom cell. The numbers confirm that something in the resizing function is snafued. Here are the numbers in cellForRow:
    obversePicThumb.size == Optional((330.0, 330.0))
    reversePicThumb.size == Optional((330.0, 330.0))
Edit #2
This post answered my question. Many thanks to those who took the time to look!
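For future readers: the factor of 3 is almost certainly the renderer's default scale, which matches the device's screen scale (3x on Plus/X-class iPhones), so a 110-point render comes back as a 330-pixel bitmap. The fix from the linked post is to force a 1x format. A sketch, assuming UIKit; this is my paraphrase of the linked fix, not the poster's code:

```swift
import UIKit

extension UIImage {
    // Force scale 1 so the rendered bitmap has exactly targetSize pixels
    // instead of targetSize multiplied by the screen scale.
    func resizedExactly(to targetSize: CGSize) -> UIImage {
        let format = UIGraphicsImageRendererFormat.default()
        format.scale = 1
        return UIGraphicsImageRenderer(size: targetSize, format: format).image { _ in
            self.draw(in: CGRect(origin: .zero, size: targetSize))
        }
    }
}
```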
I don't know if this will make any difference, but here's my extension to accomplish this:
extension UIImage {
    public func resizeToBoundingSquare(_ boundingSquareSideLength: CGFloat) -> UIImage {
        let imgScale = self.size.width > self.size.height ? boundingSquareSideLength / self.size.width : boundingSquareSideLength / self.size.height
        let newWidth = self.size.width * imgScale
        let newHeight = self.size.height * imgScale
        let newSize = CGSize(width: newWidth, height: newHeight)
        UIGraphicsBeginImageContext(newSize)
        self.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
        let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return resizedImage!
    }
}
My usage isn't with a UITableView, nor does it use CoreData. (I'd recommend doing things asynchronously for something like that.) But if your issue is with resizing, this should work with acceptable performance.
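Note that this extension sidesteps the scale problem discussed above, because `UIGraphicsBeginImageContext` always creates a 1x context. Its aspect-fit arithmetic can also be isolated and checked on its own; the helper name below is mine:

```swift
import Foundation

// Scale the longer side down to the bounding length, preserving aspect ratio.
// This is the same arithmetic resizeToBoundingSquare performs before drawing.
func boundingSquareSize(for size: CGSize, side: CGFloat) -> CGSize {
    let scale = size.width > size.height ? side / size.width : side / size.height
    return CGSize(width: size.width * scale, height: size.height * scale)
}
```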

Reduce image size to max 100 kb with fixed dimensions 512x512 in iOS

I am trying to create a sticker image from an image captured with the default camera.
I need to resize this image to exactly 512x512 and to a maximum file size of 100 kb, but I am not able to find any solution.
I have tried to resize the image to 512x512 with the code below.
var image = imgFront?.resizeWithWidth(width: 512)

extension UIImage {
    func resizeWithWidth(width: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: width, height: width)))
        imageView.contentMode = .scaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(CGSize(width: width, height: width), false, scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.render(in: context)
        guard let result = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        UIGraphicsEndImageContext()
        return result
    }
}
This produces a 512x512 image properly, but when I try to reduce the image below 100 kb with the method below, it changes the dimensions of the image.
extension UIImage {
    func resized(withPercentage percentage: CGFloat) -> UIImage? {
        let canvasSize = CGSize(width: 512, height: 512)
        UIGraphicsBeginImageContextWithOptions(canvasSize, false, scale)
        defer { UIGraphicsEndImageContext() }
        draw(in: CGRect(origin: .zero, size: canvasSize))
        return UIGraphicsGetImageFromCurrentImageContext()
    }

    func resizedTo100KB() -> UIImage? {
        guard self.pngData() != nil else { return nil }
        let megaByte = 800.0
        var resizingImage = self
        var imageSizeKB = 990.0 // ! Or divide by 1024 if you need KB rather than kB
        while imageSizeKB > megaByte { // ! Or use 1024 if you need KB rather than kB
            guard let resizedImage = resizingImage.resized(withPercentage: 0.1),
                let imageData = resizedImage.pngData() else { return nil }
            resizingImage = resizedImage
            imageSizeKB = Double(imageData.count) / megaByte // ! Or divide by 1024 if you need KB rather than kB
        }
        return resizingImage
    }
}
Can anyone suggest a good solution to achieve this?
Constraints:
Image dimensions must be 512x512.
Image file size must be below 100 kb.
You should use JPEG data instead of PNG data to get a smaller byte size.
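Expanding on that: JPEG lets you trade quality for bytes at fixed pixel dimensions, so you can render once at 512x512 and then search the compression-quality parameter for the largest value that fits the 100 kb budget. The search itself is plain logic; in a real app the `encode` closure would wrap `UIImageJPEGRepresentation` (or `jpegData(compressionQuality:)`). The names here are illustrative, not from the question:

```swift
import Foundation

// Binary-search the JPEG quality for the largest value whose encoded byte
// count fits the budget. `encode` maps a quality in 0...1 to a byte count
// and must be monotonically non-decreasing in quality.
func qualityFitting(budget: Int, encode: (CGFloat) -> Int) -> CGFloat {
    var lo: CGFloat = 0, hi: CGFloat = 1
    for _ in 0..<8 {              // 8 halvings gives ~1/256 precision
        let mid = (lo + hi) / 2
        if encode(mid) > budget { hi = mid } else { lo = mid }
    }
    return lo
}
```

On device this means roughly eight JPEG encodes per image, which is usually acceptable for a one-off sticker export.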

UIImageJpgRepresentation doubles image resolution

I am trying to save an image coming from the iPhone camera to a file. I use the following code:
    try UIImageJPEGRepresentation(toWrite, 0.8)?.write(to: tempURL, options: NSData.WritingOptions.atomicWrite)
This results in a file double the resolution of the toWrite UIImage. I confirmed in the watch expressions that creating a new UIImage from UIImageJPEGRepresentation doubles its resolution:
    -> toWrite.size CGSize (width = 3264, height = 2448)
    -> UIImage(data: UIImageJPEGRepresentation(toWrite, 0.8)).size CGSize? (width = 6528, height = 4896)
Any idea why this would happen, and how to avoid it?
Thanks
Your initial image has a scale factor of 2, but when you init your image from data you get an image with a scale factor of 1. The way to solve it is to control the scale and init the image with the scale property:
    @available(iOS 6.0, *)
    public init?(data: Data, scale: CGFloat)
Playground code that demonstrates setting the scale:
extension UIImage {
    class func with(color: UIColor, size: CGSize) -> UIImage? {
        let rect = CGRect(origin: .zero, size: size)
        UIGraphicsBeginImageContextWithOptions(size, true, 2.0)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        context.setFillColor(color.cgColor)
        context.fill(rect)
        let image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return image
    }
}

let image = UIImage.with(color: UIColor.orange, size: CGSize(width: 100, height: 100))
if let image = image {
    let scale = image.scale
    if let data = UIImageJPEGRepresentation(image, 0.8) {
        if let newImage = UIImage(data: data, scale: scale) {
            debugPrint(newImage.size)
        }
    }
}
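The arithmetic behind this answer, separated out: JPEG data stores only pixels, and `UIImage(data:scale:)` divides by the scale you pass (1 by default), which reproduces the question's numbers exactly. The function name below is mine, for illustration:

```swift
import Foundation

// Decoding image data yields pixels; the point size UIKit reports is
// pixels / scale. With the default scale of 1, a 2x photo's point size doubles.
func decodedPointSize(pixels: CGSize, scale: CGFloat = 1) -> CGSize {
    return CGSize(width: pixels.width / scale, height: pixels.height / scale)
}
```

A 3264x2448-point image at scale 2 holds 6528x4896 pixels; decode that data at scale 1 and you read back 6528x4896 points, decode it at scale 2 and you get the original 3264x2448.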

swift ios reduce image size before upload [duplicate]

This question already has answers here:
How do I resize the UIImage to reduce upload image size
(21 answers)
Closed 5 years ago.
I am trying to reduce the file size of my images as much as possible before I upload them. Right now I am doing this:
    if let firstImageData = UIImageJPEGRepresentation(pickedImage, 0.1) {
        self.imgArray.append(firstImageData)
    }
This takes any image coming from the camera or photo album, converts it to JPEG, and reduces its size.
I have set the quality to 0.1, but when I upload my images the size still ends up around 300-350 kb. Is there any way I could shrink them even more? I am aiming for 50-70 kb.
You can resize your image to a smaller size first using these extensions, either by percentage or by width:
extension UIImage {
    func resizeWithPercent(percentage: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: size.width * percentage, height: size.height * percentage)))
        imageView.contentMode = .ScaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.renderInContext(context)
        guard let result = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        UIGraphicsEndImageContext()
        return result
    }

    func resizeWithWidth(width: CGFloat) -> UIImage? {
        let imageView = UIImageView(frame: CGRect(origin: .zero, size: CGSize(width: width, height: CGFloat(ceil(width/size.width * size.height)))))
        imageView.contentMode = .ScaleAspectFit
        imageView.image = self
        UIGraphicsBeginImageContextWithOptions(imageView.bounds.size, false, scale)
        guard let context = UIGraphicsGetCurrentContext() else { return nil }
        imageView.layer.renderInContext(context)
        guard let result = UIGraphicsGetImageFromCurrentImageContext() else { return nil }
        UIGraphicsEndImageContext()
        return result
    }
}
To use it, just call it like this:
    myImage = myImage.resizeWithWidth(700)!
Then you can still compress it using a compression ratio of your choice:
    let compressData = UIImageJPEGRepresentation(myImage, 0.5) // max value is 1.0 and minimum is 0.0
    let compressedImage = UIImage(data: compressData!)
You can only change the size of the image (smaller size = less data) and compress it using an image compression algorithm like JPEG; there is no other way (a better algorithm = smaller size at the same quality).
I heard Google recently improved on JPEG compression using neural networks (Google's TensorFlow).
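Putting the two steps together, one possible shape for an upload helper. This is a sketch only: it assumes the `resizeWithWidth` extension above and a `UIImageJPEGRepresentation` returning `Data`; the 700-point width and the 70 kb target come from the question, while the function name and the step size are mine:

```swift
import UIKit

// Downscale once, then step the JPEG quality down until the encoded data
// fits the upload budget (or quality bottoms out).
func uploadData(for image: UIImage, maxBytes: Int = 70 * 1024) -> Data? {
    guard let resized = image.resizeWithWidth(700) else { return nil }
    var quality: CGFloat = 0.7
    var data = UIImageJPEGRepresentation(resized, quality)
    while let d = data, d.count > maxBytes, quality > 0.1 {
        quality -= 0.1
        data = UIImageJPEGRepresentation(resized, quality)
    }
    return data
}
```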

After cropping images in Swift I'm getting results tilted with 90 degrees - why?

I'm using a nice GitHub plugin for Swift, https://github.com/budidino/ShittyImageCrop, which is responsible for cropping the image.
I need a 4:3 aspect ratio, so I call this controller like this:
    let shittyVC = ShittyImageCropVC(frame: (self.navigationController?.view.frame)!, image: image!, aspectWidth: 3, aspectHeight: 4)
    self.navigationController?.present(shittyVC, animated: true, completion: nil)
Now, when I provide a horizontal image (wider than tall), the cropped result is fine - I see a photo with a 4:3 aspect ratio as output.
But when I provide a vertical image and try to crop it, I see tilted output. So, for example, where a normal photo looks one way,
the vertical - and tilted - one comes out rotated (sorry for the low res here). Why does it get shifted to one side?
I suspect the problem might be somewhere in the logic of the crop-button:
func tappedCrop() {
    print("tapped crop")
    var imgX: CGFloat = 0
    if scrollView.contentOffset.x > 0 {
        imgX = scrollView.contentOffset.x / scrollView.zoomScale
    }
    let gapToTheHole = view.frame.height/2 - holeRect.height/2
    var imgY: CGFloat = 0
    if scrollView.contentOffset.y + gapToTheHole > 0 {
        imgY = (scrollView.contentOffset.y + gapToTheHole) / scrollView.zoomScale
    }
    let imgW = holeRect.width / scrollView.zoomScale
    let imgH = holeRect.height / scrollView.zoomScale
    print("IMG x: \(imgX) y: \(imgY) w: \(imgW) h: \(imgH)")
    let cropRect = CGRect(x: imgX, y: imgY, width: imgW, height: imgH)
    let imageRef = img.cgImage!.cropping(to: cropRect)
    let croppedImage = UIImage(cgImage: imageRef!)
    let path: String = NSTemporaryDirectory() + "tempFile.jpeg"
    if let data = UIImageJPEGRepresentation(croppedImage, 0.95) { // 0.4 - compression quality
        //print("low compression is here")
        try? data.write(to: URL(fileURLWithPath: path), options: [.atomic])
    }
    self.dismiss(animated: true, completion: nil)
}
ShittyImageCrop saves cropped images directly to your album, and I couldn't replicate your issue using vertical images.
I see you used UIImageJPEGRepresentation as opposed to UIImageWriteToSavedPhotosAlbum from ShittyImageCrop, and it seems other people also have problems with image rotation after using UIImageJPEGRepresentation.
Look up "iOS UIImagePickerController result image orientation after upload" and "iOS JPEG images rotated 90 degrees".
EDIT
try implementing fixOrientation() from https://stackoverflow.com/a/27775741/611879
add fixOrientation():
func fixOrientation(img: UIImage) -> UIImage {
    if img.imageOrientation == UIImageOrientation.Up {
        return img
    }
    UIGraphicsBeginImageContextWithOptions(img.size, false, img.scale)
    let rect = CGRect(x: 0, y: 0, width: img.size.width, height: img.size.height)
    img.drawInRect(rect)
    let normalizedImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return normalizedImage
}
and then do it before using UIImageJPEGRepresentation:
    if let data = UIImageJPEGRepresentation(fixOrientation(croppedImage), 0.95) {
        try? data.write(to: URL(fileURLWithPath: path), options: [.atomic])
    }
EDIT 2
please edit the init method of ShittyImageCrop by replacing img = image with:
    if image.imageOrientation != .up {
        UIGraphicsBeginImageContextWithOptions(image.size, false, image.scale)
        var rect = CGRect.zero
        rect.size = image.size
        image.draw(in: rect)
        img = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
    } else {
        img = image
    }
