Scale an Image in ImageView (Scale x, y) - iOS

I usually scale the image of an ImageView in Android with the scaleX and scaleY properties.
How can I do the same in iOS? I'm really new to iOS. I'm using Swift 3 and I haven't found anything analogous.

If you need to scale the image proportionally, the following code might work for you:
if let cgImage = imageView.image?.cgImage, let orientation = imageView.image?.imageOrientation {
    imageView.image = UIImage(cgImage: cgImage, scale: newScale, orientation: orientation)
}
However, if you only want to display it at a size different from the original image's, you should control the displayed size through the UIImageView that presents it. To make sure the image scales correctly with the size of the view, take a look at the UIImageView.contentMode property: https://developer.apple.com/reference/uikit/uiview/1622619-contentmode
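For instance, here is a minimal sketch (assuming an outlet named imageView, as in the snippet above) that lets the view, rather than the image, handle the scaling:
imageView.contentMode = .scaleAspectFit // scale the image to fit inside the view, keeping its aspect ratio
imageView.clipsToBounds = true          // crop anything that would spill outside the view's frame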

Related

UIImage size doubled when converted to CGImage?

I am trying to crop part of an image taken with the iPhone's camera via the cropping(to:) method on a CGImage, but I am running into a strange phenomenon where my UIImage's dimensions are doubled when converted with .cgImage, which, obviously, prevents me from doing what I want.
The flow is:
1. A picture is taken with the camera and goes into a full-screen imageContainerView.
2. A "screenshot" of this imageContainerView is made with a UIView extension, effectively resizing the image to the container's dimensions.
3. imageContainerView's .image is set to now be the "screenshot".
let croppedImage = imageContainerView.renderToImage()
imageContainerView.image = croppedImage
print(imageContainerView.image!.size) //yields (320.0, 568.0)
print(imageContainerView.image!.cgImage!.width, imageContainerView.image!.cgImage!.height) //yields (640, 1136) ??
extension UIView {
    func renderToImage(afterScreenUpdates: Bool = false) -> UIImage {
        let rendererFormat = UIGraphicsImageRendererFormat.default()
        rendererFormat.opaque = isOpaque
        let renderer = UIGraphicsImageRenderer(size: bounds.size, format: rendererFormat)
        let snapshotImage = renderer.image { _ in
            drawHierarchy(in: bounds, afterScreenUpdates: afterScreenUpdates)
        }
        return snapshotImage
    }
}
I have been searching around here with no success so far and would greatly appreciate a pointer or a suggestion on how/why the image size is suddenly doubled.
Thanks in advance.
This is because print(imageContainerView.image!.size) prints the size of the image object in points, while print(imageContainerView.image!.cgImage!.width, imageContainerView.image!.cgImage!.height) prints the size of the underlying bitmap in pixels.
On the iPhone you are using, there are 2 pixels for every point in both the horizontal and vertical directions. The UIImage.scale property gives you this factor, which in your case is 2.
See this link: iPhone Resolutions
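To see the relationship in code, here is a small sketch (the function and image names are hypothetical) showing how UIImage.scale links the point size to the pixel size:
import UIKit

func logDimensions(of image: UIImage) {
    let pointSize = image.size                       // e.g. (320.0, 568.0)
    let scale = image.scale                          // 2.0 on an @2x device
    let pixelWidth = Int(pointSize.width * scale)    // 640
    let pixelHeight = Int(pointSize.height * scale)  // 1136
    print("points:", pointSize, "pixels:", pixelWidth, "x", pixelHeight)
}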

bounds of imageView are always 240

I'm probably making a simple mistake, and it's driving me crazy.
I'm working with a UIImageView within a UIScrollView. To fit the image in the view, I want to get the width of the imageView to adjust the zoom scale.
But the code
imageView.bounds.width
always returns 240.0, no matter what size the actual image is.
In Interface Builder the imageView is horizontally and vertically centered in the view, Clip Subviews is enabled, and the mode is Aspect Fit.
Any ideas?
The size of the UIImageView is not related to the size of the image it contains. The UIImageView is probably sized to 240.0 in the storyboard or wherever else you generate it. The image will scale down or up to fit the view based on the content mode. To get the size of the actual image, try the following code:
if let image = UIImage(named: "my_image_file") {
    let imageHeight = image.size.height
    let imageWidth = image.size.width
}
With the size of the image now known, you can set the size of the view appropriately.
I had the same problem. Now I check the bounds on the main queue and everything works fine.
dispatch_async(dispatch_get_main_queue(), {
    print(self.image.bounds.width)
})
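If the goal is to fit the image inside the scroll view, one approach is to derive the zoom scale from the image's own size instead of from the view's storyboard bounds. A minimal sketch (the function name is hypothetical):
import UIKit

// The zoom scale that makes the whole image fit inside the scroll view.
func fitZoomScale(for image: UIImage, in scrollView: UIScrollView) -> CGFloat {
    return min(scrollView.bounds.width / image.size.width,
               scrollView.bounds.height / image.size.height)
}

// Usage, after layout so the bounds are final (e.g. in viewDidLayoutSubviews):
// scrollView.minimumZoomScale = fitZoomScale(for: image, in: scrollView)
// scrollView.zoomScale = scrollView.minimumZoomScale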

Get an image from two imageViews in Swift

I'm developing an app where I have an area with an image in the background, and another image on top that I can move around, like a sticker.
My goal is to create and save an image composed of the background image with the "sticker" on top, using Swift. Here's the function I'm using (my background view is called "imageView", my sticker image view is called "jacques"):
let newSize = CGSizeMake(imageView.image!.size.width, imageView.image!.size.height)
UIGraphicsBeginImageContextWithOptions(newSize, false, 0.0)
imageView.image!.drawInRect(CGRectMake(0, 0, newSize.width, newSize.height))
let jacquesX = ((jacques.frame.origin.x - imageView.frame.origin.x) * (imageView.image?.size.width)!) / UIScreen.mainScreen().bounds.width
let jacquesY = ((jacques.frame.origin.y - imageView.frame.origin.y) * (imageView.image?.size.height)!) / UIScreen.mainScreen().bounds.height
let jacquesWidth: CGFloat = jacques.image!.size.width
let jacquesHeight: CGFloat = jacques.image!.size.height
jacques.image!.drawInRect(CGRectMake(jacquesX, jacquesY, jacquesWidth, jacquesHeight))
let newImage: UIImage = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()
UIImageWriteToSavedPhotosAlbum(newImage, self, "image:didFinishSavingWithError:contextInfo:", nil)
Unfortunately, my sticker isn't the right size. I think it's placed correctly in the image, but its size is way too small. I don't know if my solution is the best one; I'm open to all suggestions. And I'm new, so if you have good practices to share, I'm all ears :)
You could do this in the storyboard. First, size the picture however you want. To create the sticker effect, add an image view sized to the background, put jacques on top of it in the storyboard, and then resize them as needed.
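If you want to fix it in code instead, the likely culprit in the snippet above is that jacques is drawn at its image's native size rather than at its on-screen size mapped into the background image's coordinates. A sketch of that mapping using the modern UIGraphicsImageRenderer (assuming the background view uses scale-to-fill and that both views share a superview; the function name is hypothetical):
import UIKit

// Map the sticker's on-screen frame into the background image's coordinate
// space before drawing, so the rendered size matches what the user sees.
func composite(background: UIImageView, sticker: UIImageView) -> UIImage? {
    guard let bgImage = background.image, let stickerImage = sticker.image else { return nil }
    let scaleX = bgImage.size.width / background.frame.width
    let scaleY = bgImage.size.height / background.frame.height
    // Sticker frame relative to the background view, scaled into image coordinates.
    let stickerRect = CGRect(x: (sticker.frame.origin.x - background.frame.origin.x) * scaleX,
                             y: (sticker.frame.origin.y - background.frame.origin.y) * scaleY,
                             width: sticker.frame.width * scaleX,
                             height: sticker.frame.height * scaleY)
    let renderer = UIGraphicsImageRenderer(size: bgImage.size)
    return renderer.image { _ in
        bgImage.draw(in: CGRect(origin: .zero, size: bgImage.size))
        stickerImage.draw(in: stickerRect)
    }
}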

Get picture from square AVCaptureVideoPreviewLayer

So what I am doing is creating a custom image picker, and I have a 320 x 320 AVCaptureVideoPreviewLayer that I am using. When I take a picture, I want to get a UIImage of what is actually seen in the preview layer, but what I get from captureStillImageAsynchronouslyFromConnection:completionHandler: is a normal image with size 2448 x 3264. What would be the best way to make this image into a 320 x 320 square image, like what is seen in the preview layer, without distorting it? Is there a Right Way™ to do this? Also, I am using AVLayerVideoGravityResizeAspectFill for the videoGravity property of the AVCaptureVideoPreviewLayer, if that is relevant.
Have you tried transforming the image?
Try using CGAffineTransformMakeScale(<#sx: CGFloat#>, <#sy: CGFloat#>) to scale the image down. Transforms can do magic! If you have ever taken linear algebra, you should recall the standard transformation matrices. I have not used this for images, so I am not sure how well it works at the pixel level.
You could also try:
// grab the original image
UIImage *originalImage = [UIImage imageNamed:@"myImage.png"];
// a scale of 2.0 makes the image 1/2 the size
UIImage *scaledImage =
    [UIImage imageWithCGImage:[originalImage CGImage]
                        scale:(originalImage.scale * 2.0)
                  orientation:(originalImage.imageOrientation)];
where you can change the scale factor.
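Since the preview uses AVLayerVideoGravityResizeAspectFill, it shows the centered square of the capture, so one approach is to crop that square out of the full-resolution image. A minimal sketch (the function name is hypothetical; orientation handling is deliberately omitted):
import UIKit

// Crop the captured image to the centered square that an aspect-fill preview
// layer would show. Note: cropping(to:) works on the raw bitmap, so a rotated
// (portrait) capture may need its width/height treated as swapped.
func centerSquareCrop(_ image: UIImage) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let side = min(cgImage.width, cgImage.height)
    let cropRect = CGRect(x: (cgImage.width - side) / 2,
                          y: (cgImage.height - side) / 2,
                          width: side,
                          height: side)
    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    return UIImage(cgImage: cropped, scale: image.scale, orientation: image.imageOrientation)
}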

Crop UIImage from a transformed UIImageView

I am letting the user capture an image from the camera or picking one from the library.
I display this image in a UIImageView.
The user can now scale and position the image within a bounding box, exactly like you would do using the UIImagePickerController when allowsEditing is set to YES.
When the user is satisfied with the result and taps Done I would like to produce a cropped UIImage.
The problem arises when using CGImageCreateWithImageInRect, as this does not take the scaling into account. The transform is applied to the imageView via a gesture recognizer, like this:
CGAffineTransform transform = CGAffineTransformScale(self.imageView.transform, newScale, newScale);
[self.imageView setTransform:transform];
I assume what is happening is: the UIImageView is scaled and moved, it then applies UIViewContentModeScaleAspectFit to the UIImage it holds, and when I ask it to crop the image, it does exactly that, with no regard to the scaling or positioning. The reason I think this is that if I don't scale or move the image but just tap Done straight away, the cropping works.
I crop the image like this:
- (UIImage *)cropImage:(UIImage *)img toRect:(CGRect)rect {
    CGFloat scale = [[UIScreen mainScreen] scale];
    if (scale > 1.0) {
        rect = CGRectMake(rect.origin.x * scale, rect.origin.y * scale, rect.size.width * scale, rect.size.height * scale);
    }
    CGImageRef imageRef = CGImageCreateWithImageInRect([img CGImage], rect);
    UIImage *result = [UIImage imageWithCGImage:imageRef scale:self.imageView.image.scale orientation:self.imageView.image.imageOrientation];
    // UIImage *result = [UIImage imageWithCGImage:imageRef];
    CGImageRelease(imageRef);
    return result;
}
I pass in a cropRect from a view that is a subview of my main view (the square overlay box, like in UIImagePickerController). The main UIView has a UIImageView that gets scaled and a UIView that displays the crop rectangle.
How can I get "what you see is what you get" cropping, and which factors must I take into account? Or perhaps there are suggestions on whether I should implement the hierarchy or the scaling differently.
Try a simple trick. Apple has samples on its site showing how to zoom into a photo in code. Once the zooming is done, use a graphics context: take the frame size of the bounding view and render the image with that. For example, a UIView contains a scroll view holding the zoomed image; the scroll view zooms, and so does your image. Now take the frame of your bounding UIView, create an image context from it, and save that as a new image. Tell me if that makes sense.
Cheers :)
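In Swift, the answer's idea might look like this sketch, using the modern UIGraphicsImageRenderer (boundingView is a hypothetical name for the clipping overlay view that frames the crop):
import UIKit

// Render whatever is currently visible inside the bounding view (the
// zoomed/panned image clipped to the crop rectangle) into a bitmap.
func snapshotVisibleCrop(of boundingView: UIView) -> UIImage {
    let renderer = UIGraphicsImageRenderer(bounds: boundingView.bounds)
    return renderer.image { _ in
        boundingView.drawHierarchy(in: boundingView.bounds, afterScreenUpdates: true)
    }
}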
