Confused about UIImage's init(data:scale:) method in Swift 3.0 - ios

Here is my purpose: I have an image with a width of 750 and I want to scale it down to a width of 128.
Then I found an init method of UIImage called
init(data:scale:)
Here is my code:
func scaleImage(image: UIImage, ToSpecificWidth targetWidth: Int) -> UIImage {
    var scale = Double(targetWidth) / Double(image.size.width)
    let scaledImage = UIImage(data: UIImagePNGRepresentation(image)! as Data, scale: CGFloat(scale))
    return scaledImage!
}
print("originImageBeforeWidth: \(portrait?.size.width)") // output: 750
let newImage = Tools.scaleImage(image: portrait!, ToSpecificWidth: 128) // the scale is about 0.17
print("newImageWidth: \(newImage.size.width)") // output: 4394.53125
Apparently the new width is far from my intention:
I'm looking for 750 * 0.17 = 128
but I get 750 / 0.17 = 4394.
So I changed my scaleImage func. Here is the updated code:
func scaleImage(image: UIImage, ToSpecificWidth targetWidth: Int) -> UIImage {
    var scale = Double(targetWidth) / Double(image.size.width)
    scale = 1/scale // new added
    let scaledImage = UIImage(data: UIImagePNGRepresentation(image)! as Data, scale: CGFloat(scale))
    return scaledImage!
}
print("originImageBeforeWidth: \(portrait?.size.width)") // output: 750
let newImage = Tools.scaleImage(image: portrait!, ToSpecificWidth: 128) // the scale is about 5.88
print("newImageWidth: \(newImage.size.width)") // output: 128
which is exactly what I want, but the line scale = 1/scale doesn't make any sense to me.
What is going on here?

The init method you are trying to use is not meant for resizing an image, and the scale parameter is not a resizing factor. It's the 1x, 2x, 3x asset scale; essentially, the only valid values for scale are currently 1.0, 2.0, or 3.0.
While setting the scale to the inverse of what you expected gives you a size property that returns your desired result, it is not at all what you should be doing.
There are proper ways to resize an image, such as those in How to Resize image in Swift?, as well as others.

According to Apple's documentation:
UIImage.scale
The scale factor to assume when interpreting the image data. Applying a scale factor of 1.0 results in an image whose size matches the pixel-based dimensions of the image. Applying a different scale factor changes the size of the image as reported by the size property.
UIImage.size
This value reflects the logical size of the image and takes the image’s current orientation into account. Multiply the size values by the value in the scale property to get the pixel dimensions of the image.
So:
real pixel size = size * scale
That's why you need to set 1/scale.
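Plugging the question's numbers into that relationship shows exactly where 4394.53125 comes from (plain arithmetic, no new API):
let pixelWidth = 750.0
// With scale = 128/750 ≈ 0.17, the reported point width is pixels / scale:
print(pixelWidth / (128.0 / 750.0)) // 4394.53125
// With the inverted scale 750/128 ≈ 5.86:
print(pixelWidth / (750.0 / 128.0)) // 128.0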
By the way, scale only affects the value of UIImage's size property.
That means it only affects the size used for display on screen; it does not change the pixel dimensions.
If you want to actually resize the image, you can draw it at the scaled size and use
UIGraphicsGetImageFromCurrentImageContext()
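A minimal sketch of that approach, keeping the question's signature (aspect ratio preserved; scale fixed at 1.0 so the resulting point size equals the pixel size):
func scaleImage(image: UIImage, ToSpecificWidth targetWidth: Int) -> UIImage {
    let ratio = CGFloat(targetWidth) / image.size.width
    let newSize = CGSize(width: image.size.width * ratio, height: image.size.height * ratio)
    // Draw into a bitmap context of the target size; scale 1.0 means 1 point == 1 pixel.
    UIGraphicsBeginImageContextWithOptions(newSize, false, 1.0)
    image.draw(in: CGRect(origin: .zero, size: newSize))
    let resized = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return resized ?? image
}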

Related

How to scale to desired size in pixels using af_imageAspectScaled

I'm using AlamofireImage to crop a user profile picture before sending it to the server. Our server has some restrictions and we can't send images larger than 640x640.
I'm using the af_imageAspectScaled UIImage extension function like so:
let croppedImage = image.af_imageAspectScaled(
    toFill: CGSize(
        width: 320,
        height: 320
    )
)
I was expecting this to crop the image to 320px by 320px. However, I found out that the output image is being saved as a 640x640px image with scale 2.0. The following XCTest shows this:
class UIImageTests: XCTestCase {
    func testAfImageAspectScaled() {
        if let image = UIImage(
            named: "ipad_mini2_photo_1.JPG",
            in: Bundle(for: type(of: self)),
            compatibleWith: nil
        ) {
            print(image.scale) // prints 1.0
            print(image.size) // prints (1280.0, 960.0)
            let croppedImage = image.af_imageAspectScaled(
                toFill: CGSize(
                    width: 320,
                    height: 320
                )
            )
            print(croppedImage.scale) // prints 2.0
            print(croppedImage.size) // prints (320.0, 320.0)
        }
    }
}
I'm running this on the iPhone Xr simulator on Xcode 10.2.
The original image is 1280 by 960 points, with scale 1, which would be equivalent to 1280 by 960 pixels. The cropped image is 320 by 320 points, with scale 2, which would be equivalent to 640 by 640 pixels.
Why is the scale set to 2? Can I change that? How can I generate a 320 by 320 pixels image independent on the scale and device?
Well, checking the source code for the af_imageAspectScaled method I found the following code for generating the actual scaled image:
UIGraphicsBeginImageContextWithOptions(size, af_isOpaque, 0.0)
draw(in: CGRect(origin: origin, size: scaledSize))
let scaledImage = UIGraphicsGetImageFromCurrentImageContext() ?? self
UIGraphicsEndImageContext()
The parameter with value 0.0 on UIGraphicsBeginImageContextWithOptions tells the method to use the main screen scale factor for defining the image size.
I tried setting this to 1.0 and, when running my testcase, af_imageAspectScaled generated an image with the correct dimensions I wanted.
Here is a table showing the resolutions of all iOS devices. My app was sending appropriately sized images for all devices where the scale factor was 2.0; however, several devices have a scale factor of 3.0, and for those the app wasn't working.
Well, unfortunately it seems that if I want to use af_imageAspectScaled I have to divide the final size I want by the device's scale when setting the scaled size like so:
let scale = UIScreen.main.scale
let croppedImage = image.af_imageAspectScaled(
    toFill: CGSize(
        width: 320/scale,
        height: 320/scale
    )
)
I've sent a pull request to AlamofireImage proposing the addition of a parameter scale to the functions af_imageAspectScaled(toFill:), af_imageAspectScaled(toFit:) and af_imageScaled(to:). If they accept it, the above code should become:
// this is not valid with Alamofire 4.0.0 yet! waiting for my pull request to
// be accepted
let croppedImage = image.af_imageAspectScaled(
    toFill: CGSize(
        width: 320,
        height: 320
    ),
    scale: 1.0
)
// croppedImage would be a 320px by 320px image, regardless of the device type.
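Until such a parameter exists, another workaround is to redraw the scaled image yourself with an explicit scale of 1, so the pixel size no longer depends on the device. A sketch (the resizedToExactPixels helper is hypothetical, not part of AlamofireImage):
import UIKit

extension UIImage {
    // Redraws the image into a bitmap whose pixel dimensions equal pixelSize,
    // regardless of the device's screen scale.
    func resizedToExactPixels(_ pixelSize: CGSize) -> UIImage {
        let format = UIGraphicsImageRendererFormat()
        format.scale = 1 // 1 point == 1 pixel in the output image
        let renderer = UIGraphicsImageRenderer(size: pixelSize, format: format)
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: pixelSize))
        }
    }
}

// Usage: let the AlamofireImage call do the aspect fill, then pin the pixel size.
// let squared = image.af_imageAspectScaled(toFill: CGSize(width: 320, height: 320))
// let upload = squared.resizedToExactPixels(CGSize(width: 320, height: 320))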

Image size changes when converting it from Data in Swift 3

I want to save an image in a database, so I convert it to Data. However, during these steps the width and height of the image change: the image increases in size.
// Original Image Size
print("Original Image Size : \(capturedImage.size)") // Displays (320.0, 427.0)
// Convert to Data
var imageData: Data?
imageData = UIImagePNGRepresentation(capturedImage)
// Store imageData into Db.
// Convert it back
m_CarImgVw.image = UIImage(data: damageImage!.imageData!, scale: 1.0)
print("m_CarImgVw Image Size : \(m_CarImgVw.image.size)") // Displays (640.0, 854.0)
I do not want the image size to increase!
If it's originally an image from your assets, it's probably @2x, which means the size in pixels (the real size) is double the size in points (the displayed size). So the image size isn't actually increasing: it was 640x854 before and after the transform. It's just that before, the OS automatically applied a scale of 2.0 because it was named @2x.
To use the original image scale you can replace 1.0 with capturedImage.scale.
Your problem is in this line:
m_CarImgVw.image = UIImage(data: damageImage!.imageData!, scale: 1.0)
Can you see it?
Hint: It's in scale: 1.0.
It looks like your original image was Retina (or @2x), so it had scale 2.0.
So you should either put your original image's scale (damageImage.scale) there, or, if you're presenting the image on screen, use UIScreen's scale.
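A minimal sketch of that fix, using the question's names (capturedImage, m_CarImgVw) and the Swift 3 API:
// Rebuild the image with the original scale so the reported point size stays (320.0, 427.0).
if let data = UIImagePNGRepresentation(capturedImage),
    let restored = UIImage(data: data, scale: capturedImage.scale) {
    m_CarImgVw.image = restored
    print(restored.size) // (320.0, 427.0): pixels (640x854) divided by scale 2.0
}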

Unable to take screenshot - screen size and return image issue

I have a screen rect equal to the size of the SceneView I'm looking to capture:
let screen = CGRect(x: 0, y: 0, width: sceneView.frame.size.width, height: sceneView.frame.size.height - 60)
When I pass it in below, I am greeted with a "cannot invoke initializer for type CGFloat with an argument list of type (CGRect)" error. How would I go about fixing this?
func Screenshot()
{
    UIGraphicsBeginImageContextWithOptions(UIScreen.main.bounds.size, false, CGFloat(screen))
    view.layer.render(in: UIGraphicsGetCurrentContext()!)
    let image = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    UIImageWriteToSavedPhotosAlbum(image!, nil, nil, nil)
}
You have passed the wrong type of argument to UIGraphicsBeginImageContextWithOptions. The last argument is of type CGFloat and accepts a float value, but you've passed a value of type CGRect (screen).
UIGraphicsBeginImageContextWithOptions has the following parameters:
void UIGraphicsBeginImageContextWithOptions(CGSize size, BOOL opaque, CGFloat scale)
Parameters
size
The size (measured in points) of the new bitmap context. This represents the size of the image returned by the UIGraphicsGetImageFromCurrentImageContext function. To get the size of the bitmap in pixels, you must multiply the width and height values by the value in the scale parameter.
opaque
A Boolean flag indicating whether the bitmap is opaque. If you know the bitmap is fully opaque, specify YES to ignore the alpha channel and optimize the bitmap’s storage. Specifying NO means that the bitmap must include an alpha channel to handle any partially transparent pixels.
scale
The scale factor to apply to the bitmap. If you specify a value of 0.0, the scale factor is set to the scale factor of the device’s main screen.
Here is Apple's documentation for this function: UIGraphicsBeginImageContextWithOptions
Solution (hint): pass the value 0.0 or 1.0 instead of screen:
UIGraphicsBeginImageContextWithOptions(UIScreen.main.bounds.size, false, CGFloat(1.0))
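Putting it together, a corrected sketch of the function (renamed to lowercase per Swift convention; 0.0 lets the bitmap pick up the device's screen scale):
func screenshot() {
    UIGraphicsBeginImageContextWithOptions(UIScreen.main.bounds.size, false, 0.0)
    defer { UIGraphicsEndImageContext() }
    guard let context = UIGraphicsGetCurrentContext() else { return }
    view.layer.render(in: context)
    if let image = UIGraphicsGetImageFromCurrentImageContext() {
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil)
    }
}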

CIImage extent in pixels or points?

I'm working with a CIImage, and while I understand it's not a linear image, it does hold some data.
My question is whether or not a CIImage's extent property returns pixels or points? According to the documentation, which says very little, it's working space coordinates. Does this mean there's no way to get the pixels / points from a CIImage and I must convert to a UIImage to use the .size property to get the points?
I have a UIImage with a certain size, and when I create a CIImage using the UIImage, the extent is shown in points. But if I run a CIImage through a CIFilter that scales it, I sometimes get the extent returned in pixel values.
I'll answer the best I can.
If your source is a UIImage, its size will be the same as the extent. But note that this isn't a UIImageView (whose size is in points). And we're just talking about the source image.
Running something through a CIFilter means you are manipulating things. If all you are doing is manipulating color, its size/extent shouldn't change (the same as creating your own CIColorKernel - it works pixel-by-pixel).
But, depending on the CIFilter, you may well be changing the size/extent. Certain filters create a mask or a tile; these may actually have an extent that is infinite! Others (blurs are a great example) sample surrounding pixels, so their extent actually increases because they sample "pixels" beyond the source image's size. (In custom-kernel terms, these are a CIWarpKernel.)
So yes, quite a bit depends on the filter. Taking this to a bottom line:
What is the filter doing? Does it need to simply check a pixel's RGB and do something? Then the UIImage size should be the output CIImage extent.
Does the filter produce something that depends on the pixel's surrounding pixels? Then the output CIImage extent is slightly larger. How much may depend on the filter.
There are filters that produce something with no regard to an input. Most of these may have no true extent, as they can be infinite.
Points are what UIKit and CoreGraphics always work with. Pixels? At some point CoreImage does, but it's low-level enough that (unless you want to write your own kernel) you shouldn't care. Extents can usually - but keep the above in mind - be equated to a UIImage's size.
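To make the blur case concrete, a small sketch (Swift 3 naming; the exact inset depends on the radius):
import UIKit
import CoreImage

// A 100x100 solid-color CIImage; after a Gaussian blur the extent grows
// because the filter samples pixels beyond the source rectangle.
let input = CIImage(color: CIColor(red: 1, green: 0, blue: 0))
    .cropping(to: CGRect(x: 0, y: 0, width: 100, height: 100))
let blur = CIFilter(name: "CIGaussianBlur")!
blur.setValue(input, forKey: kCIInputImageKey)
blur.setValue(10, forKey: kCIInputRadiusKey)
print(blur.outputImage!.extent) // larger than (0, 0, 100, 100)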
EDIT
Many images (particularly RAW ones) can be so large as to affect performance. I have an extension for UIImage that resizes an image to fit within a bounding square, to help maintain consistent CoreImage performance.
extension UIImage {
    public func resizeToBoundingSquare(_ boundingSquareSideLength: CGFloat) -> UIImage {
        let imgScale = self.size.width > self.size.height ? boundingSquareSideLength / self.size.width : boundingSquareSideLength / self.size.height
        let newWidth = self.size.width * imgScale
        let newHeight = self.size.height * imgScale
        let newSize = CGSize(width: newWidth, height: newHeight)
        UIGraphicsBeginImageContext(newSize)
        self.draw(in: CGRect(x: 0, y: 0, width: newWidth, height: newHeight))
        let resizedImage = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()
        return resizedImage!
    }
}
Usage:
image = image.resizeToBoundingSquare(640)
In this example, an image of size 3200x2000 would be reduced to 640x400, and an image of size 320x200 would be enlarged to 640x400. I do this to an image before rendering it and before creating a CIImage to use in a CIFilter.
I suggest you think of them as points. There is no scale and no screen (a CIImage is not something that is drawn), so there are no pixels.
A UIImage backed by a CGImage is the basis for drawing, and in addition to the CGImage it has a scale; together with the screen resolution, that gives us our translation from points to pixels.
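A tiny illustration of that translation (a sketch; "photo" stands in for any @2x asset in your bundle):
if let image = UIImage(named: "photo"), let cg = image.cgImage {
    print(image.size)          // point size, e.g. (375.0, 250.0)
    print(image.scale)         // e.g. 2.0 for a @2x asset
    print(cg.width, cg.height) // pixel size: points multiplied by scale, e.g. 750 x 500
}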

Swift: UIGraphicsBeginImageContextWithOptions scale factor set to 0 but not applied

I used to resize an image with the following code, and it used to work just fine with regard to the scale factor. Now with Swift 3 I can't figure out why the scale factor is not taken into account. The image is resized, but the scale factor is not applied. Do you know why?
let layer = self.imageview.layer
UIGraphicsBeginImageContextWithOptions(layer.bounds.size, true, 0)
layer.render(in: UIGraphicsGetCurrentContext()!)
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
print("SCALED IMAGE SIZE IS \(scaledImage!.size)")
print(scaledImage!.scale)
For example, if I take a screenshot on an iPhone 5, the image size will be 320x568. I used to get 640x1136 with the exact same code. What can cause the scale factor not to be applied?
When I print the scale of the image, it prints 1, 2, or 3 based on the device resolution, but it is not applied to the image taken from the context.
scaledImage!.size does not return the image size in pixels.
The underlying CGImage's width and height (CGImageGetWidth / CGImageGetHeight in Objective-C) give the size in pixels,
that is, image.size * image.scale.
If you want to test it out, first import CoreGraphics:
let imageSize = scaledImage!.size // (320.0, 568.0)
let imageWidthInPixel = scaledImage!.cgImage!.width // 640
let imageHeightInPixel = scaledImage!.cgImage!.height // 1136
