iOS: Cropping Image from Rotated ImageView

How can I crop a rectangle (the red square in the screenshot) out of a UIImage that has been rotated and zoomed using a UIScrollView?
The edges of the UIImageView are hidden because of the rotation (a UIImageView transform). Please help.

Well, you can do all the complicated Core Graphics work, or you can take a simple UIView screenshot. I vote for the easy solution: create a new view whose frame matches the small rect, add the whole image view to that small view while converting its frame so it still looks the same on screen, then take a screenshot of the small view. When you are done, simply put the image view back the way it was and remove the small view.
As this is still easier said than done, here is some code to chew on (I did NOT test this, so please correct any bugs once you get it working).
- (UIImage *)getScreenshotInRect:(CGRect)frame {
    UIImageView *theImageView;                         // your original image view
    UIView *backupSuperView = theImageView.superview;  // back up the original superview
    CGRect backupFrame = theImageView.frame;           // back up the original frame

    UIView *frameView = [[UIView alloc] initWithFrame:frame]; // new view covering the area the image should be taken from
    frameView.clipsToBounds = YES; // not strictly necessary, but useful for cases like a corner radius
    [self addSubview:frameView];

    theImageView.frame = [theImageView.superview convertRect:theImageView.frame toView:frameView]; // convert the image view's frame into the new view's coordinates
    [frameView addSubview:theImageView];

    UIImage *toReturn = [self imageFromView:frameView]; // take the screenshot

    theImageView.frame = backupFrame;          // restore the image view's frame
    [backupSuperView addSubview:theImageView]; // restore the image view's superview
    [frameView removeFromSuperview];
    frameView = nil;
    return toReturn;
}
- (UIImage *)imageFromView:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0f);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
I do hope this doesn't break because of your rotation. If it does, I suggest you put the rotated image view inside another (untransformed) container view and move that container into the small view instead.
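If you do end up needing that wrapper, the cropping stays essentially the same. Here is a rough, untested sketch assuming the rotated image view already sits inside an untransformed container view (containerView is my placeholder name, not yours), reusing the imageFromView: helper above:
- (UIImage *)screenshotOfContainer:(UIView *)containerView inRect:(CGRect)frame {
    UIView *backupSuperView = containerView.superview; // back up the original superview
    CGRect backupFrame = containerView.frame;          // well defined, since the container itself is not transformed

    UIView *frameView = [[UIView alloc] initWithFrame:frame];
    frameView.clipsToBounds = YES;
    [self addSubview:frameView];

    // Move the container (and with it the rotated image view) into the capture view.
    containerView.frame = [backupSuperView convertRect:backupFrame toView:frameView];
    [frameView addSubview:containerView];

    UIImage *result = [self imageFromView:frameView]; // take the screenshot

    // Put everything back and clean up.
    containerView.frame = backupFrame;
    [backupSuperView addSubview:containerView];
    [frameView removeFromSuperview];
    return result;
}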

Related

Fit image in UIImageView using UIViewContentModeScaleAspectFit

I'm facing a really weird problem with UIImageView. I'm trying to set an image, created by taking a screenshot of the current view, on an image view whose content mode is UIViewContentModeScaleAspectFit.
It works fine when I set the image in Interface Builder in the xib file, or when I set an image created with [UIImage imageNamed:]; both fit correctly with UIViewContentModeScaleAspectFit.
But when I take a snapshot of a view and set that image on the image view, the image does not fit the UIImageView. I've tried all the solutions I found on here, like clipsToBounds = YES, but they didn't work at all. I'm really confused by now.
Here's the code when I take the screen shot and create the UIImage:
- (UIImage *)screenshotWithRect:(CGRect)captureRect
{
    CGFloat scale = [[UIScreen mainScreen] scale];
    UIImage *screenshot;
    UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, scale);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureRect);
    if (UIGraphicsGetCurrentContext() == nil)
    {
        NSLog(@"UIGraphicsGetCurrentContext is nil. You may have a UIView (%@) with no real frame (%@)", [self class], NSStringFromCGRect(self.frame));
    }
    else
    {
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
        screenshot = UIGraphicsGetImageFromCurrentImageContext();
    }
    UIGraphicsEndImageContext();
    return screenshot;
}
And here is where I set the image on the image view:
UIImage* snap = [[UIImage alloc] init];
// start snap shot
UIView* superView = [self.view superview];
CGRect cutRect = [superView convertRect:self.cutView.frame fromView:_viewToCut];
snap = [superView screenshotWithRect:cutRect];
[self.view addSubview:self.editCutFrameView];
// end snap shot -> show edit view
[self.editCutFrameView setImage:snap];
Here's a picture comparing the two results:
Many thanks for your help.
UPDATE: Since @Saheb Roy mentioned the size, I checked: the snapshot image is about 400x500 px and thumbnail.png is 512x512 px, so I don't think it is about the size of the image.
This is because, in the second case, the snapshot image is itself exactly that size, as you can see, so the image is not being stretched or fitted at all.
The earlier images fit the screen because they were larger than the image view (with the same or a different aspect ratio), so aspect-fit had to scale them down.
But in the case where it does not fit the image view, the image itself is already that small, i.e. smaller than the image view, so it is NOT being scaled up to the bounds.
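If you want the snapshot itself to match the capture rect, one option (a rough, untested sketch, not the original poster's code, assuming it lives in the same UIView category as screenshotWithRect:) is to make the context exactly captureRect.size and translate it, the same trick the captureScreen: method further down this page uses:
- (UIImage *)screenshotCroppedToRect:(CGRect)captureRect
{
    // The context is only as big as the area we want, so the result matches captureRect exactly.
    UIGraphicsBeginImageContextWithOptions(captureRect.size, NO, [UIScreen mainScreen].scale);
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    // Shift the drawing so that captureRect's origin maps to (0, 0) in the context.
    CGContextTranslateCTM(ctx, -captureRect.origin.x, -captureRect.origin.y);
    [self.layer renderInContext:ctx];
    UIImage *screenshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return screenshot;
}
An image produced this way has the same aspect ratio as the capture frame, so UIViewContentModeScaleAspectFit has something sensible to work with.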

UIImageView image not being removed when set to nil in UICollectionViewCell

Basically what I am doing is taking an image of the view, applying a blur to it, and then using that as a blurred UIView overlay in reusable collection view cells in a simple UICollectionView.
// Capture screen for blur.
- (UIImage *)captureScreen:(CGRect)frame
{
    CGRect grabRect = frame;
    // for retina displays
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    {
        UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
    }
    else
    {
        UIGraphicsBeginImageContext(grabRect.size);
    }
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
    [self.contentView.layer renderInContext:ctx];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    viewImage = [viewImage applyBlurWithRadius:1.8f tintColor:nil saturationDeltaFactor:1.0 maskImage:viewImage atFrame:self.imageView.frame];
    return viewImage;
}
// Set blurred image as image view image.
- (void)updateBlur
{
    UIImage *infoViewImage = [self captureScreen:self.infoView.frame];
    self.infoImageView.image = infoViewImage;
}
// Prepare for reuse.
- (void)prepareForReuse
{
    [super prepareForReuse]; // remember to call super here
    self.infoImageView.image = nil;
}
Note that the UIImageView is created and added as a subview of the cell's contentView during initialization. When I scroll slowly this works fine, but if I scroll quickly the image is only sometimes removed from the image view... I am not really sure why this is happening. So far I have tried a number of solutions, even removing the whole UIImageView from its superview and re-initializing/re-adding it as a subview each time, but that has the same issue. Please help!
The problem is that reusable collection view cells are... reusable. You need to implement collectionView:cellForItemAtIndexPath: so that it fully configures every cell it is ever handed, without making any assumptions about whether you may previously have added the subview (or set the image) on it.
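For example, something along these lines (a sketch with made-up names: BlurCell is a hypothetical cell class registered under the @"BlurCell" identifier, and updateBlur is the method from the question):
// Hypothetical data source method: configure every cell completely, every time.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView
                  cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    BlurCell *cell = (BlurCell *)[collectionView dequeueReusableCellWithReuseIdentifier:@"BlurCell"
                                                                           forIndexPath:indexPath];
    // Never rely on leftover state from a previous use of the cell:
    // set (or clear) the blurred image explicitly right here.
    [cell updateBlur];
    return cell;
}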

How to get a screenshot of the invisible part in iOS?

I am working on an iOS application where I need to capture the view and send it as an MMS to a particular person. That all works fine, but I am having trouble capturing the part that is not visible (I attached an image for clarification).
I am only getting a screenshot of the part of the view that is visible. How can I solve this? Is there any approach that meets my requirement? The image I am getting is attached.
The code I use to take the screenshot is:
UIGraphicsBeginImageContext(webview_pdf.bounds.size);
[webview_pdf.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *pdfImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Good suggestions are appreciated. Thanks in advance!
I finally found the solution to this:
+ (UIImage *)imageFromWebView:(UIWebView *)view
{
    // temporary frame to restore the view's size after the image is created
    CGRect tmpFrame = view.frame;

    // set a new frame tall enough to hold the entire content
    CGRect aFrame = view.frame;
    aFrame.size.height = [view sizeThatFits:[[UIScreen mainScreen] bounds].size].height;
    view.frame = aFrame;

    // do the image magic
    UIGraphicsBeginImageContext([view sizeThatFits:[[UIScreen mainScreen] bounds].size]);
    CGContextRef resizedContext = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:resizedContext];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // reset the view's frame to its original value
    view.frame = tmpFrame;
    return image;
}
You need to create another view that is the full size of the content. You can add this view off screen and then capture it the same way you have done here. The result is cut off because the view has only rendered the part of its content that is on screen.
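A rough, untested sketch of that idea (assuming it lives in a view controller; viewToCapture and fullContentSize are placeholder names, not from the question):
- (UIImage *)imageOfView:(UIView *)viewToCapture fullContentSize:(CGSize)fullContentSize
{
    // Host the view in a container that sits off screen, sized to the full content.
    UIView *container = [[UIView alloc] initWithFrame:CGRectMake(-fullContentSize.width, 0,
                                                                 fullContentSize.width,
                                                                 fullContentSize.height)];
    CGRect backupFrame = viewToCapture.frame;
    UIView *backupSuperview = viewToCapture.superview;

    viewToCapture.frame = CGRectMake(0, 0, fullContentSize.width, fullContentSize.height);
    [container addSubview:viewToCapture];
    [self.view addSubview:container];

    // Render the container, which now shows the whole content, not just the visible part.
    UIGraphicsBeginImageContextWithOptions(fullContentSize, NO, 0.0);
    [container.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Put everything back the way it was.
    viewToCapture.frame = backupFrame;
    [backupSuperview addSubview:viewToCapture];
    [container removeFromSuperview];
    return image;
}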

How to fill the background with an image in landscape in iOS?

What I'm doing is filling a view's background with an image returned from a UIImagePickerController. The image fills fine in portrait mode; however, the image repeats when used as the background in landscape mode, and I have no idea why this is occurring. This is the private method I use to resize my image:
+ (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize landscape:(BOOL)landscape {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
When this method is called, the newSize parameter is equal to the view's bounds size (self.view.bounds.size). The size is read after the view's transformation to landscape, but the image still doesn't fill properly.
This is the code that is called right after getting an image from the UIImagePickerController:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    if (image.size.width > image.size.height) {
        self.view.transform = CGAffineTransformRotate(self.view.transform, M_PI_2);
        self.composition.landscapemode = YES;
    } else {
        self.composition.landscapemode = NO;
    }
    self.composition.image = [NewCompositionViewController imageWithImage:image
                                                             scaledToSize:self.view.bounds.size
                                                                landscape:self.composition.landscapemode];
    self.view.backgroundColor = [UIColor colorWithPatternImage:self.composition.image];
    [self dismissViewControllerAnimated:YES completion:nil];
}
[UIColor colorWithPatternImage:] is meant for tiling images, so it's behaving as it should.
I would recommend creating a UIImageView with a screen-sized frame, setting the image on it, and adding it as a subview:
UIImageView *backgroundImage = [[UIImageView alloc] initWithFrame:self.view.frame];
[backgroundImage setImage:self.composition.image];
// choose best mode that works for you
[backgroundImage setContentMode:UIViewContentModeScaleAspectFill];
[self.view insertSubview:backgroundImage atIndex:0];
//OR
[self.view addSubview:backgroundImage];
[self.view sendSubviewToBack:backgroundImage];
Once it's added, you can rotate it and experiment with autoresizing masks to make sure it's displayed properly in all orientations; the exact approach depends on whether you are using Auto Layout or not.
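If you are not using Auto Layout, a typical autoresizing setup for a full-screen background image view might look like this (just one common choice, not the only one):
// Let the background image view stretch with its superview when the orientation changes.
backgroundImage.frame = self.view.bounds;
backgroundImage.autoresizingMask = UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight;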
UIViewContentModeScaleAspectFill may be more appropriate here than UIViewContentModeScaleAspectFit since the image is filling a background view. AspectFit will maintain the image's aspect ratio and make the entire image fit in the space, which may leave portions of the view transparent. AspectFill also maintains aspect ratio, but will fill the entire view and clip any portions of the image that don't match the view bounds.
I've been able to apply an "aspect fit" UIImage to a UIView background by combining a few AVFoundation and UIKit APIs. Here's one example:
// AVMakeRectWithAspectRatioInsideRect() comes from AVFoundation, so #import <AVFoundation/AVFoundation.h>.
UIImage *image = [UIImage imageWithContentsOfFile:self.desiredBackgroundImageFilePathString];
UIGraphicsBeginImageContext(self.drawingImage.frame.size);
[image drawInRect:AVMakeRectWithAspectRatioInsideRect(image.size, self.drawingImage.bounds)];
image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
self.drawingImage.backgroundColor = [UIColor colorWithPatternImage:image];
This flows through a few simple, but important steps:
Generate a UIImage from a file (or whatever).
Define the context of the image (the desired UIView for the background) with UIGraphicsBeginImageContext().
Use drawInRect in combination with AVMakeRectWithAspectRatioInsideRect to scale the image. Provide AVMakeRect...() with the image's .size and the bounds of the target UIView.
Apply the resized image to the desired image context.
Apply your now-resized image to the .backgroundColor of the target UIView using colorWithPatternImage.
I'm able to swap out images with both landscape and portrait aspect ratios without alignment or clipping issues using this code.
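If you want to reuse that pattern, it can be wrapped in a small helper (a sketch; the method name is mine, and it still needs #import <AVFoundation/AVFoundation.h>):
// Draw `image` aspect-fitted into a canvas of `size`, ready for colorWithPatternImage:.
+ (UIImage *)image:(UIImage *)image aspectFittedToSize:(CGSize)size
{
    UIGraphicsBeginImageContextWithOptions(size, NO, 0.0);
    [image drawInRect:AVMakeRectWithAspectRatioInsideRect(image.size, CGRectMake(0, 0, size.width, size.height))];
    UIImage *fitted = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return fitted;
}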

Combining two images from UIImageViews into one UIImage with correct positions

I have a view with two UIImageViews. The first image view is a picture, which can be changed but is always at the same position. The second image view contains a UIImage of a logo; this image view can be resized, panned and rotated.
Both image views are set to Aspect Fit, by the way.
When I'm done dragging and scaling the second image view, I want to take these two UIImages and draw them into one single UIImage. In the new image they need to be positioned exactly as they were after I finished dragging and resizing the second image view.
I'm pushing this combined image over to a new UIViewController called PreviewViewController. There I want to show the image in an image view and save it if the user presses yes.
I almost got it working, but the x and y positions are confusing me.
There is another problem as well: when drawing the second image view's image onto the new image, it looks like Scale to Fill and it looks ugly.
Here's my code:
- (UIImage *)combineImages {
    UIImage *tshirt = self.tskjorteTemplateView.image;
    UIImage *logo = self.bildeView.image;
    // Note: only one image context should be opened here; calling plain
    // UIGraphicsBeginImageContext() as well would leave an extra, never-ended context on the stack.
    UIGraphicsBeginImageContextWithOptions(self.tskjorteTemplateView.frame.size, NO, 0.0);
    [tshirt drawInRect:CGRectMake(0, 0, self.tskjorteTemplateView.frame.size.width, self.tskjorteTemplateView.frame.size.height)];
    [logo drawInRect:CGRectMake(bildeView.center.x - (bildeView.frame.size.width / 2),
                                bildeView.center.y - (bildeView.frame.size.height / 2),
                                self.bildeView.frame.size.width,
                                self.bildeView.frame.size.height)];
    UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return result;
}
and it's called here:
fullPreviewImage = [self combineImages];
WantToSaveViewController *save = (WantToSaveViewController *)segue.destinationViewController;
save.delegate = [self.navigationController.viewControllers objectAtIndex:0];
save.previewImage = fullPreviewImage;
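One alternative worth a try here (just a sketch, using the same render-the-layer approach as the answers above; self.designView is a made-up property for a container view that holds both image views): instead of drawing the two UIImages by hand, render the container's layer, so the on-screen positions, content modes and transforms are preserved automatically.
- (UIImage *)combinedImageFromDesignView {
    // self.designView is assumed to contain both tskjorteTemplateView and bildeView.
    UIGraphicsBeginImageContextWithOptions(self.designView.bounds.size, NO, 0.0);
    [self.designView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *combined = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return combined;
}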
