Basically what I am doing is taking an image of the view, applying a blur to it, and then using that as a blurred UIView overlay in reusable collection view cells in a simple UICollectionView.
// Capture screen for blur.
- (UIImage *)captureScreen:(CGRect)frame
{
    CGRect grabRect = frame;
    // Account for Retina displays.
    if ([[UIScreen mainScreen] respondsToSelector:@selector(scale)])
    {
        UIGraphicsBeginImageContextWithOptions(grabRect.size, NO, [UIScreen mainScreen].scale);
    }
    else
    {
        UIGraphicsBeginImageContext(grabRect.size);
    }
    CGContextRef ctx = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(ctx, -grabRect.origin.x, -grabRect.origin.y);
    [self.contentView.layer renderInContext:ctx];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    viewImage = [viewImage applyBlurWithRadius:1.8f tintColor:nil saturationDeltaFactor:1.0 maskImage:viewImage atFrame:self.imageView.frame];
    return viewImage;
}
// Set the blurred image as the image view's image.
- (void)updateBlur
{
    UIImage *infoViewImage = [self captureScreen:self.infoView.frame];
    self.infoImageView.image = infoViewImage;
}
// Prepare for reuse.
- (void)prepareForReuse
{
    [super prepareForReuse]; // always call super in a cell subclass
    self.infoImageView.image = nil;
}
Note that the UIImageView is created and added as a subview of the cell's contentView during initialization. When I scroll slowly this works fine, but if I scroll quickly the image is only removed from the image view sometimes, and I am not sure why. So far I have tried a number of solutions, including removing the whole UIImageView from its superview and re-initializing/re-adding it as a subview each time, but that has the same issue. Please help!
The problem is that the reusable collection view cells are... reusable. You need to implement collectionView:cellForItemAtIndexPath: to deal with every cell it is ever handed, without making any assumptions about whether you may previously have added the subview to it.
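A minimal sketch of what that can look like, assuming a cell class named InfoCell that exposes the updateBlur method from the question (the identifier and property names here are illustrative):

// In the collection view's data source. Configure the cell fully on every call,
// because the dequeued cell may be a reused instance in an arbitrary state.
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    InfoCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"InfoCell" forIndexPath:indexPath];
    // Reset/refresh everything the cell displays here rather than relying on
    // state left over from a previous index path.
    [cell updateBlur];
    return cell;
}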
I am trying to take a screenshot of my UIImageView and a UILabel that is on top of that.
What I have so far grabs the UIImage in the image view and then renders the overlay on it, but the positioning of the UILabel is all wrong. I am setting the size of the capture to the actual image size (which isn't what I want).
I just want to be able to take a screenshot exactly how it appears on the screen.
CGSize imageSize = self.imageFromOtherView.size;
// define the size and grab a UIImage from it
UIGraphicsBeginImageContextWithOptions(imageSize, NO, 0);
[self.imageFromOtherView drawInRect:CGRectMake(0, 0, imageSize.width, imageSize.height)];
[self.socialLabel.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Create a UIView to hold both the UIImageView and the UILabel, then pass that container view through this code. I have my view as a property called viewForPhotoView, so when the code is called it only captures that view; you can tweak it so that it receives a view as a parameter. This will return the UIImage that you want.
- (UIImage *)imageByRenderingView
{
    UIGraphicsBeginImageContextWithOptions(self.viewForPhotoView.bounds.size, NO, 0.0);
    [self.viewForPhotoView.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resultingImage;
}
Add both of those views to a common container view, and then call - (BOOL)drawViewHierarchyInRect:(CGRect)rect afterScreenUpdates:(BOOL)afterUpdates on the container to render it into a context, as sketched below.
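A minimal sketch of that approach, assuming a container view property named containerView that holds both the image view and the label (the property name is an assumption); drawViewHierarchyInRect:afterScreenUpdates: requires iOS 7:

- (UIImage *)snapshotOfContainerView
{
    UIGraphicsBeginImageContextWithOptions(self.containerView.bounds.size, NO, 0.0);
    // Renders the view and its subviews (image view + label) as they appear on screen.
    [self.containerView drawViewHierarchyInRect:self.containerView.bounds afterScreenUpdates:YES];
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}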
I'm facing a really weird problem with UIImageView. I am trying to set an image, created by taking a screenshot of the current view, on an image view whose content mode is UIViewContentModeScaleAspectFit.
It works fine when I set the image in Interface Builder in the xib file, or when I set an image created with [UIImage imageNamed:]; both work fine with UIViewContentModeScaleAspectFit.
But when I take a snapshot of a view and set that image on the image view, the image does not fit the UIImageView. I've tried all the solutions I found on here, like clipsToBounds = YES, but they didn't work at all. I'm really confused by now.
Here's the code when I take the screen shot and create the UIImage:
- (UIImage *)screenshotWithRect:(CGRect)captureRect
{
    CGFloat scale = [[UIScreen mainScreen] scale];
    UIImage *screenshot;
    UIGraphicsBeginImageContextWithOptions(self.frame.size, NO, scale);
    CGContextClipToRect(UIGraphicsGetCurrentContext(), captureRect);
    if (UIGraphicsGetCurrentContext() == nil)
    {
        NSLog(@"UIGraphicsGetCurrentContext is nil. You may have a UIView (%@) with no real frame (%@)", [self class], NSStringFromCGRect(self.frame));
    }
    else
    {
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
        screenshot = UIGraphicsGetImageFromCurrentImageContext();
    }
    UIGraphicsEndImageContext();
    return screenshot;
}
And here is where I set the image on the image view:
// Take the snapshot.
UIView *superView = [self.view superview];
CGRect cutRect = [superView convertRect:self.cutView.frame fromView:_viewToCut];
UIImage *snap = [superView screenshotWithRect:cutRect];
// End snapshot -> show the edit view.
[self.view addSubview:self.editCutFrameView];
[self.editCutFrameView setImage:snap];
Here's a picture compare the 2 results:
Many thanks for your help.
UPDATE: As @Saheb Roy mentioned the size, I checked the image: it's about 400x500 px, while thumbnail.png is 512x512 px, so I don't think the problem is the size of the image.
This is because in the second case the snapshot image itself is exactly that size, as you can see, so the image is not being stretched or fitted at all.
The earlier images fit the screen because they were bigger than the image view (whether with the same or a different aspect ratio), so aspect fit scaled them down.
But in the case where the image does not fill the image view, the image itself is smaller than the image view, hence it is not being fitted to the image view's bounds.
How do I crop a rectangle (the red square in the screenshot) out of a UIImage that has been rotated as well as zoomed using a UIScrollView?
The edges of the UIImageView are hidden because of the rotation (a UIImageView transform). Please help.
Well, you can do all the complicated Core Graphics work, or take a simple UIView screenshot. I vote for the easy solution: create a new view whose frame matches the small rect. Then add the whole image view to that small view, converting its frame so it looks the same on screen. Then take a screenshot of the small view. When you are done, simply put the image view back the way it was and remove the small view.
As this is still easier said than done, here is some code to chew on (I did NOT test this, so please correct any bugs after you succeed).
- (UIImage *)getScreenshotInRect:(CGRect)frame {
    UIImageView *theImageView;                        // your original image view
    UIView *backupSuperView = theImageView.superview; // back up the original superview
    CGRect backupFrame = theImageView.frame;          // back up the original frame

    UIView *frameView = [[UIView alloc] initWithFrame:frame]; // new view located where the screenshot should be taken
    frameView.clipsToBounds = YES; // not strictly necessary, but useful for cases like corner radius
    [self addSubview:frameView];

    theImageView.frame = [theImageView.superview convertRect:theImageView.frame toView:frameView]; // set the new frame for the image view
    [frameView addSubview:theImageView];

    UIImage *toReturn = [self imageFromView:frameView]; // take the screenshot

    theImageView.frame = backupFrame;          // reset the image view frame
    [backupSuperView addSubview:theImageView]; // reset the image view's superview
    [frameView removeFromSuperview];
    frameView = nil;

    return toReturn;
}

- (UIImage *)imageFromView:(UIView *)view {
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, .0f);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
I do hope this doesn't break because of your rotation. If it does, I suggest you create another container view on which the rotated image view lies and add that container to the small view, as in the sketch below.
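A minimal, untested sketch of that wrapper idea, reusing the hypothetical theImageView and frameView names from the code above; the rotation transform stays on the image view while the untransformed container is what gets re-parented:

// Untransformed container positioned where the rotated image view sits.
UIView *rotationContainer = [[UIView alloc] init];
rotationContainer.bounds = theImageView.bounds; // use bounds/center, not frame, because of the transform
rotationContainer.center = theImageView.center;
[theImageView.superview addSubview:rotationContainer];

// Move the image view into the container; its transform is left untouched.
[rotationContainer addSubview:theImageView];
theImageView.center = CGPointMake(CGRectGetMidX(rotationContainer.bounds),
                                  CGRectGetMidY(rotationContainer.bounds));

// Then re-parent rotationContainer (instead of theImageView) into frameView and screenshot frameView.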
I am working on an iOS application where I need to capture a view and send it as an MMS to a particular person. That all works fine, but I am having trouble capturing the part of the view that is not visible on screen (see the attached image for clarification).
I am only getting a screenshot of the portion of the view that is visible. How do I solve this? Is there any approach that meets my requirement? The image I am getting is
The code I use to take the screenshot is:
UIGraphicsBeginImageContext(webview_pdf.bounds.size);
[webview_pdf.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *pdfImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Good suggestions are appreciated. Thanks in advance!
Finally found the solution to this
+ (UIImage *)imageFromWebView:(UIWebView *)view
{
    // Remember the original frame so the view can be restored after the image is created.
    CGRect tmpFrame = view.frame;

    // Resize the web view to the full height of its content.
    CGRect aFrame = view.frame;
    aFrame.size.height = [view sizeThatFits:[[UIScreen mainScreen] bounds].size].height;
    view.frame = aFrame;

    // Render the full-size view into an image context.
    UIGraphicsBeginImageContext([view sizeThatFits:[[UIScreen mainScreen] bounds].size]);
    CGContextRef resizedContext = UIGraphicsGetCurrentContext();
    [view.layer renderInContext:resizedContext];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    // Restore the view's original frame.
    view.frame = tmpFrame;

    return image;
}
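For example, assuming the class method lives in a helper class (the ScreenshotHelper name here is purely illustrative), usage with the web view from the question might look like this:

// Capture the entire web view content, not just the visible portion.
UIImage *pdfImage = [ScreenshotHelper imageFromWebView:webview_pdf];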
You need to create another view that is the full size of the content. You can add this view off screen and then capture it the same way you have done here. The reason your result is cut off is that the view has only rendered the visible part of the content.
I am applying blur to various sections of my app using the UIImage+ImageEffects.h sample code that was provided in one of the apps from the WWDC in 2013. It works well and I'm able to recreate the iOS7 effects for any UIImage.
However, I would like to recreate the effect of the UINavigationBar blurred transparency but using any view that I choose. Similar to the screenshot shown below.
For instance, say that I have a UITableView that takes up half of the screen. I also have a UIImageView background as a separate view behind it that occupies the entire screen. I would only like to blur the UIImageView background for just that section of the screen that's under the tableview.
Here's my question. How do I create a UIImage by taking a "screenshot" of whatever is behind a UIView that is displayed? Is this even possible?
Here is my screen hierarchy. Nothing complex. I would like the "Blurred Image View" to contain a blurred image of the section of the "Image View" that is sitting as the main UIImageView in the hierarchy.
If you are deploying only on iOS 7 you can use the new API, which is a lot faster than -renderInContext:, and then use the ImageEffects category on the image taken from the view. Add this as a category on UIView:
@interface UIView (RenderView)

- (UIImage *)imageByRenderingView;
- (UIImage *)imageByRenderingViewOpaque:(BOOL)yesOrNO;

@end

@implementation UIView (RenderView)

- (UIImage *)imageByRenderingViewOpaque:(BOOL)yesOrNO {
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, yesOrNO, 0);
    if ([self respondsToSelector:@selector(drawViewHierarchyInRect:afterScreenUpdates:)]) {
        // iOS 7 and later: much faster than -renderInContext:.
        [self drawViewHierarchyInRect:self.bounds afterScreenUpdates:NO];
    }
    else {
        [self.layer renderInContext:UIGraphicsGetCurrentContext()];
    }
    UIImage *resultingImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return resultingImage;
}

- (UIImage *)imageByRenderingView {
    return [self imageByRenderingViewOpaque:NO];
}

@end
This snippet is a UIView category that also works on systems older than iOS 7. It takes an image of a view and its subviews.
Use the below piece of code for iOS < 7
#import <QuartzCore/QuartzCore.h>

+ (UIImage *)imageWithView:(UIView *)view
{
    UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
    [view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return img;
}
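To get the "blur only what is behind the table view" effect from the question, one possible approach (a sketch, assuming hypothetical backgroundImageView, blurredImageView, and tableView properties matching the screen hierarchy described above, plus the UIImage+ImageEffects category from the WWDC 2013 sample) is to snapshot the background with the category above, crop it to the table view's frame, and blur the crop:

// Snapshot the full background image view using the UIView category above.
UIImage *background = [self.backgroundImageView imageByRenderingView];

// Crop the snapshot to the region covered by the table view.
// CGImage coordinates are in pixels, so multiply by the snapshot's scale.
CGFloat scale = background.scale;
CGRect regionBehindTable = [self.view convertRect:self.tableView.frame toView:self.backgroundImageView];
CGRect cropRect = CGRectMake(regionBehindTable.origin.x * scale,
                             regionBehindTable.origin.y * scale,
                             regionBehindTable.size.width * scale,
                             regionBehindTable.size.height * scale);
CGImageRef croppedRef = CGImageCreateWithImageInRect(background.CGImage, cropRect);
UIImage *cropped = [UIImage imageWithCGImage:croppedRef scale:scale orientation:background.imageOrientation];
CGImageRelease(croppedRef);

// Blur the cropped piece with the WWDC ImageEffects category and show it behind the table view.
self.blurredImageView.image = [cropped applyBlurWithRadius:10.0
                                                 tintColor:[UIColor colorWithWhite:1.0 alpha:0.3]
                                     saturationDeltaFactor:1.8
                                                 maskImage:nil];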