iOS - Get frame of visible part of UIImage from UIImageView

I am trying to make a transition like the one in the Tinder app.
Detail:
In screen one there is a vertical, rectangular UIImageView with contentMode = Aspect Fill, so it hides some portion of the image to preserve the aspect ratio.
In screen two (the detail screen) the same image has to be passed after the transition, but the image view in the second screen is a square one.
I want to make a morphing kind of transition in which the user feels that the same image view from screen one becomes the square one in screen two, without stretching the image. So what should I do?
Currently I am trying to get the frame of the UIImage that lies in the visible area of the UIImageView, so that I can do some logic to achieve this. Can anyone help me get the frame of the visible portion of the UIImage?
EDIT
Please see the attached image for clarification.

I think there's a little ambiguity in the question: a frame must be specified in a coordinate system. But I think you're looking for a rect relative to the original, unclipped image.
If that's right, then the rect can be computed as follows. Say the image is called image, and the image view is imageView. The size of the rect is the size of the image view:
imageView.bounds.size
And, since aspect fill will center the oversized dimension, its origin is:
CGPointMake((image.size.width - imageView.bounds.size.width) / 2.0, 0.0);
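If you also want to fold the aspect-fill scale factor into the calculation, so the rect comes back in the image's own coordinate space rather than assuming the view and image share the same size, something along these lines should work (the helper name visibleImageRectForImageView: is just illustrative):
- (CGRect)visibleImageRectForImageView:(UIImageView *)imageView {
    UIImage *image = imageView.image;
    if (image == nil) {
        return CGRectZero;
    }
    CGSize viewSize = imageView.bounds.size;
    CGSize imageSize = image.size;
    // Aspect fill scales by whichever factor is larger, so the image covers the view.
    CGFloat scale = MAX(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    // The portion of the image that is actually visible, in image coordinates.
    CGSize visibleSize = CGSizeMake(viewSize.width / scale,
                                    viewSize.height / scale);
    // Aspect fill centers the image, so the clipped overflow is split evenly.
    CGPoint origin = CGPointMake((imageSize.width - visibleSize.width) / 2.0,
                                 (imageSize.height - visibleSize.height) / 2.0);
    return CGRectMake(origin.x, origin.y, visibleSize.width, visibleSize.height);
}
You can then pass that rect to the detail screen and crop or animate from it for the morphing transition.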

Related

Trying to clip edges from UIImage to fit to ImageView

I have added an icon to my assets to be used in an image view. However, when I put it in an image view, the outer edges seem to be larger than the image itself. Is there any way I can make the image fully fill the view? Even with the Scale To Fill, Aspect Fill, or Aspect Fit options there is still a margin left in between.
In other words, I want my circular image to be tangent to the image view's rectangle.
Here is an example of what I am trying to do:

How to determine the scale of an Image

I have an image that I'm using which is 960x1280. I have coordinates for a rectangle of x:391 y:772, x:574 y:772, x:574 y:870, x:391 y:870, which allows me to put the rectangle in the proper spot IF the image is still 960x1280. Of course, when I'm in Xcode, the screen size is 375x667.
When drawing the rectangle with the above coordinates, the rectangle is no longer visible. If I just use the screen scale of 3 (UIScreen.main.scale), it's not accurate either.
I create a UIImageView that has constraints of 0 for all four sides and uses aspect .fit or .fill. How do I now know the proper scale of the image so the rectangle is drawn in the right spot?
Thanks
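One way to map a rect from image coordinates into the image view's coordinates is to compute the aspect-fit scale and letterbox offset yourself; the helper below is an illustrative sketch (for aspect fill, swap MIN for MAX):
- (CGRect)viewRectForImageRect:(CGRect)imageRect inImageView:(UIImageView *)imageView {
    CGSize imageSize = imageView.image.size;  // e.g. 960 x 1280
    CGSize viewSize = imageView.bounds.size;  // e.g. 375 x 667
    // Aspect fit scales by whichever factor is smaller, so the whole image is visible.
    CGFloat scale = MIN(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    // The scaled image is centered, so compute the letterbox offset on each axis.
    CGFloat offsetX = (viewSize.width - imageSize.width * scale) / 2.0;
    CGFloat offsetY = (viewSize.height - imageSize.height * scale) / 2.0;
    return CGRectMake(offsetX + imageRect.origin.x * scale,
                      offsetY + imageRect.origin.y * scale,
                      imageRect.size.width * scale,
                      imageRect.size.height * scale);
}
With that, the rectangle from the question (origin x:391 y:772, width 183, height 98) should land in the right place regardless of the device's screen size.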

Xcode 6 UIImageView will not scale correctly

Please see the screenshot of the flower setup below. The flower image has been correctly loaded from an asset catalog, and when the app is run on various simulators the correct pixel resolution is assigned to each device. My problem is how to get the flower image to be scaled (equally sized to fit) the same way on each device.
I have learnt how to position the image using constraints and frames, but the image never scales correctly - please see the first pic.
The following image is a mock-up of what I want to be able to do (flower image scaled correctly on each device).
Judging by your mockups, it looks like you want the image to fill half the width and keep its square aspect ratio to determine its height. One way to approach this would be to use Auto Layout to make a left UIImageView and a right placeholder (blank) view. Pin the left view to the left edge of the parent and the right view to the right edge, then set them to be 0 points from each other. Then set an equal-widths constraint on them. Finally, control-drag the image view to itself and select Aspect Ratio; assuming the width and height are the same in IB, it will keep the view square. Adding an equal-heights constraint will give the other view the height it needs to match, in case you need that.
This gives you a left image view that is 50% of the width, and with your content mode set to Aspect Fit or Aspect Fill, it should give you the results in your mockups. In case you have an image that isn't square, make sure to check Clip Subviews for your UIImageView to prevent showing the overflow.
The real problem is that your goals are not well defined. Scaled with respect to what? The screen has a height and a width. You can't scale with respect to both, because that would distort the image (because different devices have different overall aspect ratios). Thus you have to pick one dimension that you will scale to.
Judging from your mockup, I'm going to guess that what you want is to scale with respect to height. If so, then give the image view a height constraint that is a fixed fraction of its superview's height. It looks to be about 0.25 in your mockups but you will have to eyeball what you think is a good fraction.
Now, if the content mode of the image view is Aspect Fit (or Aspect Fill), the image will change its height along with the image view without distorting the aspect ratio.
However, it would be best if you would make the other dimension (here, the width) of the image view a fixed fraction of the height, so that the aspect ratio of the image view matches the aspect ratio of the image. Otherwise the image might end up centered in the image view in such a way that it no longer touches the top or left, even though the image view itself does.
// Alternatively, set the frame directly: a square image view half the screen width.
CGFloat screen_width = [[UIScreen mainScreen] bounds].size.width;
your_imageview.frame = CGRectMake(0, 0, screen_width / 2, screen_width / 2);
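For reference, here is a programmatic sketch of the constraint-based approach described above (a height constraint that is a fixed fraction of the superview's height plus an aspect-ratio constraint). The 0.25 multiplier is just the eyeballed fraction mentioned earlier, the asset name "flower" is assumed, and the top/leading position constraints are added as usual:
UIImageView *flowerView = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"flower"]];
flowerView.translatesAutoresizingMaskIntoConstraints = NO;
flowerView.contentMode = UIViewContentModeScaleAspectFit;
[self.view addSubview:flowerView];
// Height is a fixed fraction (here 0.25) of the superview's height.
[self.view addConstraint:[NSLayoutConstraint constraintWithItem:flowerView
                                                       attribute:NSLayoutAttributeHeight
                                                       relatedBy:NSLayoutRelationEqual
                                                          toItem:self.view
                                                       attribute:NSLayoutAttributeHeight
                                                      multiplier:0.25
                                                        constant:0]];
// Keep the image view square so a square image touches its edges.
[flowerView addConstraint:[NSLayoutConstraint constraintWithItem:flowerView
                                                        attribute:NSLayoutAttributeWidth
                                                        relatedBy:NSLayoutRelationEqual
                                                           toItem:flowerView
                                                        attribute:NSLayoutAttributeHeight
                                                       multiplier:1.0
                                                         constant:0]];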

UIImage distorted when using it for UIImageView

I have taken a photo, and then I'm initializing a UIImageView object with this photo. The only problem is, when I take the photo, the photo is being taken using the full iPhone screen (portrait).
The UIImageView that is being initialized with this photo is only set to take up the top 50% of the iPhone's screen. So you can imagine the image looks distorted.
I have been able to make it look a lot better by using the following code:
UIImageView *halfView = [[UIImageView alloc]initWithImage:image];
[self.view addSubview:halfView];
halfView.frame = CGRectMake(0, 0, self.view.bounds.size.width, self.view.frame.size.height/2);
halfView.contentMode = UIViewContentModeScaleAspectFill;
The only problem is, the final UIImageView called "halfView" is still slightly distorted.
I have a feeling that this is impossible to fix, because the original photo is being taken with the full iPhone screen and can never perfectly scale to fit a UIImageView that only takes up the top 50% of the iPhone screen.
I was basically trying to copy the Frontback app. Here is what it looks like when you are taking the original image in their app:
This is what my app's screen looks like when you are taking the picture:
And then right after you take the picture, my app's screen changes to look like the frontback screen and takes the picture you just took and places it in the top half and tries to scale it.
I hope that makes sense. I know it is a long question, but I just really wanted to let the user use the full screen while taking the photo and then just scale it to half the screen.
Am I going about this all wrong? Am I crazy to think I could ever properly scale the image to half the screen when it was originally captured as a "full screen" image?
Thanks for the help.
For the sake of argument let's say your captured image size is 640x1136 (twice the size of an iPhone 5 screen) and you are trying to display it in a UIImageView of size 320x284 (half the size of an iPhone 5 screen).
As you can already see from these dimensions, the captured image's width is smaller than its height, whereas the UIImageView's width is larger than its height - the proportions are different.
Therefore, scaling the captured image to fit the UIImageView's width (scale by 0.5) means the captured image will be of size 320x568 - its height is larger than the UIImageView's height.
Scaling the captured image to fit the UIImageView's height (scale by 0.25) means the captured image will be of size 160x284 - its width is smaller than the UIImageView's width.
The image can't scale exactly like you want it to scale. However, you can use UIViewContentModeScaleAspectFill to fill the entire UIImageView but lose some of the image (image's height is too big to fit). You can also choose to use UIViewContentModeScaleAspectFit which will show the entire image but will leave some space on the sides (image's width is too small).
Another option you have is to actually capture the image in the proportions of your UIImageView in the first place but that means you won't be able to capture a full screen image.
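To make the arithmetic concrete, here is a small sketch of how the two candidate scale factors could be computed (the sizes are the iPhone 5 example from above):
CGSize imageSize = CGSizeMake(640, 1136);  // the captured photo
CGSize viewSize = CGSizeMake(320, 284);    // the top half of the screen
CGFloat widthScale = viewSize.width / imageSize.width;     // 0.5
CGFloat heightScale = viewSize.height / imageSize.height;  // 0.25
// Aspect fill uses the larger factor (0.5): the image becomes 320x568 and its height is cropped.
CGFloat fillScale = MAX(widthScale, heightScale);
// Aspect fit uses the smaller factor (0.25): the image becomes 160x284 and is padded on the sides.
CGFloat fitScale = MIN(widthScale, heightScale);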
Try this function: pass your UIImage to it along with the new size, and it will return a UIImage with the size you specified.
// Redraws the image at newSize. Note that this does not preserve the aspect
// ratio, so the result will be distorted if the proportions differ.
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
I guess this is what you want.
Hope this helps.
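For example, to fill the top half of the screen (the target size here is only illustrative):
CGSize halfScreenSize = CGSizeMake(self.view.bounds.size.width,
                                   self.view.bounds.size.height / 2);
halfView.image = [self imageWithImage:image scaledToSize:halfScreenSize];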
You mention the image takes up the full size of the screen. If it's to be displayed in a UIImageView taking up half the screen, then you'll need to add this code to clip the frame:
halfView.clipsToBounds = YES;
Despite making the size of the image view half the screen, the actual image will show outside the boundaries of the image view if its original size is bigger and the aspect-fill content mode is used. clipsToBounds will fix this.
I hope this is what you're looking for. Thanks, Jim.

Again edit of edited UIImage in iOS app

In my iOS app, I am putting several UIImages on one back UIImage and then saving the overall back image, with all the subimages added onto it, by taking a screenshot programmatically.
Now I want to change the positions of the subview UIImages in that saved image. So I want to know how to detect the subview images' positions, since I have captured the whole image as a screenshot.
Record their frames as converted to window coordinates. The pixels of the image should be the same as the frame origin (for non-retina screens) or double it (for retina). The screenshot is of the whole screen, so its dimensions are equivalent to the window frame. UIView has some convenience methods to convert arbitrary view frames to another view's (or the window's) coordinates.
EDIT: to deal with an Aspect Fit content mode, you have to do the math yourself. You know the frame of the image view, and you can ask the image for its size. Knowing the aspect ratio of each will let you determine in which dimension the image completely fills the view, and then you can compute the other dimension (which will be a value less than the corresponding image view dimension). Divide the difference between the view dimension and the image dimension by two, and that gives you the offset of the image inside the view. Now you can save the frame of the image as it is displayed in the view.
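A sketch of that math, assuming UIViewContentModeScaleAspectFit (the method name is just illustrative):
// Returns the frame of the displayed image, in window coordinates,
// for an image view whose contentMode is UIViewContentModeScaleAspectFit.
- (CGRect)displayedImageFrameInWindowForImageView:(UIImageView *)imageView {
    CGSize imageSize = imageView.image.size;
    CGSize viewSize = imageView.bounds.size;
    // Aspect fit scales by the smaller factor so the whole image fits in the view.
    CGFloat scale = MIN(viewSize.width / imageSize.width,
                        viewSize.height / imageSize.height);
    CGSize displayedSize = CGSizeMake(imageSize.width * scale,
                                      imageSize.height * scale);
    // The image is centered, so half of the leftover space sits on each side.
    CGRect imageFrameInView = CGRectMake((viewSize.width - displayedSize.width) / 2.0,
                                         (viewSize.height - displayedSize.height) / 2.0,
                                         displayedSize.width,
                                         displayedSize.height);
    // Convert from the image view's coordinate space to window coordinates.
    return [imageView convertRect:imageFrameInView toView:nil];
}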
