Capturing (animated) screen behind UIView - ios

I asked a similar question recently on how to add a translucent effect to a UIView and got a good response.
However it used a lot of CPU power to process so I used some of the ideas behind the answer but did the filtering using GPUImage which is much more efficient.
It works well for a static screen, but I want to change the background UIImageView's image with an animation. However, when I set the UIView to sample the background during the transition, it seems to ignore the transition and shows the new image before the animation has started.
The relevant code is as follows (ask if you need more!):
The superview containing the custom UIView and the UIImageView background:
//The Transition
[contentView setScheduled:YES]; //Starts the sampling every 0.2 seconds
//bg is the UIImageView and contentView is the 'translucent' UIView subclass
[UIView transitionWithView:bg duration:1.0 options:UIViewAnimationOptionTransitionCrossDissolve animations:^{
    [bg setImage:newImage];
} completion:^(BOOL finished){
    [contentView setScheduled:NO];
}];
Then in the UIView subclass:
- (UIImage *)snapshotOfSuperview:(UIView *)superview
{
    CGFloat scale = 0.5;
    if (([UIScreen mainScreen].scale > 1 || self.contentMode == UIViewContentModeScaleAspectFill))
    {
        CGFloat blockSize = 12.0f/5;
        scale = blockSize/MAX(blockSize * 2, floor(self.blurRadius));
    }
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, YES, scale);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, -self.frame.origin.x, -self.frame.origin.y);
    self.hidden = YES; //Don't include this view in the snapshot
    [superview.layer renderInContext:context];
    self.hidden = NO;
    UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return snapshot;
}
- (void)updateViewBG {
    UIImage *superviewImage = [self snapshotOfSuperview:self.superview];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        GPUImageGaussianBlurFilter *filter = [[GPUImageGaussianBlurFilter alloc] init];
        filter.blurSize = 0.8f;
        UIImage *newBG = [self applyTint:self.tintColour image:[filter imageByFilteringImage:superviewImage]];
        dispatch_async(dispatch_get_main_queue(), ^{
            self.layer.contents = (id)newBG.CGImage;
            self.layer.contentsScale = newBG.scale;
        });
    });
}
- (UIImage *)applyTint:(UIColor *)colour image:(UIImage *)inImage {
    UIImage *newImage;
    if (colour) {
        UIGraphicsBeginImageContext(inImage.size);
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGRect area = CGRectMake(0, 0, inImage.size.width, inImage.size.height);
        CGContextScaleCTM(ctx, 1, -1);
        CGContextTranslateCTM(ctx, 0, -area.size.height);
        CGContextSaveGState(ctx);
        CGContextClipToMask(ctx, area, inImage.CGImage);
        [colour set];
        CGContextFillRect(ctx, area);
        CGContextRestoreGState(ctx);
        CGContextSetBlendMode(ctx, kCGBlendModeLighten);
        CGContextDrawImage(ctx, area, inImage.CGImage);
        newImage = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
    } else {
        newImage = inImage;
    }
    return newImage;
}
- (void)displayLayer:(CALayer *)layer {
    [self updateViewBG];
}
How can I get the background to follow the animation?

I think the problem is that as soon as you change the image, displayLayer is called (regardless of the transition being in progress), so the new image is used for the translucency.
You should modify updateViewBG so that it does not update the layer contents before the transition has finished. For example, you could add a flag to your class, set it when you start the transition, and reset it when the transition completes. While the flag is set, do not call updateViewBG.
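The flag might be declared as a simple BOOL property on the view subclass (the name isTransitionInProgress below matches the code that follows, but it is not in the original question):

```objc
// In the UIView subclass's interface
@property (nonatomic, assign) BOOL isTransitionInProgress;
```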
[UIView transitionWithView:bg duration:1.0 options:UIViewAnimationOptionTransitionCrossDissolve animations:^{
    self.isTransitionInProgress = YES;
    [bg setImage:newImage];
} completion:^(BOOL finished){
    [contentView setScheduled:NO];
    self.isTransitionInProgress = NO;
    [contentView.layer setNeedsDisplay];
}];

- (void)displayLayer:(CALayer *)layer {
    if (!self.isTransitionInProgress)
        [self updateViewBG];
}

Related

Objective-C change image color performance

I am currently using the function below to change a PNG image's color; the color is set through a color slider. The image is recolored correspondingly as I slide, but the slider lags, and so does the image color update. I need help making the process smooth.
- (UIImage *)imageWithImage:(UIImage *)sourceImage fixedHue:(CGFloat)hue saturation:(CGFloat)saturation brightness:(CGFloat)brightness alpha:(CGFloat)alpha {
    CGSize imageSize = [sourceImage size];
    UIGraphicsBeginImageContext(imageSize);
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextTranslateCTM(context, 0, sourceImage.size.height);
    CGContextScaleCTM(context, 1.0, -1.0);
    CGRect rect = CGRectMake(0, 0, sourceImage.size.width, sourceImage.size.height);
    CGContextSetBlendMode(context, kCGBlendModeNormal);
    CGContextDrawImage(context, rect, sourceImage.CGImage);
    CGContextSetBlendMode(context, kCGBlendModeColor);
    [[UIColor colorWithHue:hue saturation:saturation brightness:brightness alpha:alpha] setFill];
    CGContextFillRect(context, rect);
    CGContextSetBlendMode(context, kCGBlendModeDestinationIn);
    CGContextDrawImage(context, rect, sourceImage.CGImage);
    CGContextFlush(context);
    UIImage *editedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return editedImage;
}
Make an async version of your function as follows...
- (void)imageWithImage:(UIImage *)sourceImage
              fixedHue:(CGFloat)hue
            saturation:(CGFloat)saturation
            brightness:(CGFloat)brightness
                 alpha:(CGFloat)alpha
            completion:(void (^)(UIImage *))completion {
    dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    dispatch_async(queue, ^{
        // Call your original function, but with these changes:
        // create the context with...
        //     UIGraphicsBeginImageContextWithOptions(imageSize, YES, 0.0);
        // don't call CGContextFillRect, call...
        //     UIRectFill(rect);
        // don't call CGContextDrawImage, call...
        //     [sourceImage drawInRect:rect];
        // don't call CGContextFlush; it needs no replacement
        UIImage *image = [self imageWithImage:sourceImage fixedHue:hue saturation:saturation brightness:brightness alpha:alpha];
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(image);
        });
    });
}
Use it like this:
- (IBAction)sliderValueChanged:(UISlider *)sender {
    [self imageWithImage:sourceImage
                fixedHue:hue
              saturation:saturation
              brightness:brightness
                   alpha:alpha
              completion:^(UIImage *image) {
                  // update the UI here with image
              }];
}

View lose cornerRadius with Animation of UIViewAnimationOptionTransitionFlipFromLeft on iOS 8

In the code, I have set:
imageView.layer.masksToBounds = YES;
imageView.layer.cornerRadius = imageView.frame.size.width/2;
And run it in animation block:
[UIView transitionWithView:imageView
                  duration:1.0
                   options:UIViewAnimationOptionTransitionFlipFromLeft
                animations:nil
                completion:^(BOOL finished) {
                    imageView.hidden = YES;
                }];
But when running on iOS 8, the imageView is back to normal, meaning there is no cornerRadius on the imageView.
Could anyone tell me why?
You need to round the corners of the UIImage itself, not the UIImageView, by rendering it through a CALayer as in the code below.
// load image on imageView programmatically
UIImage *image = [UIImage imageNamed:@"yourImage.png"];
// round the image before assigning it to the imageView
image = [self makeRoundedImage:image radius:_imageView.frame.size.width/2];
_imageView.image = image;
// method to create round image
- (UIImage *)makeRoundedImage:(UIImage *)image radius:(float)radius {
    CALayer *imageLayer = [CALayer layer];
    imageLayer.frame = CGRectMake(0, 0, image.size.width, image.size.height);
    imageLayer.contents = (id)image.CGImage;
    imageLayer.masksToBounds = YES;
    imageLayer.cornerRadius = radius;
    UIGraphicsBeginImageContext(image.size);
    [imageLayer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *roundedImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return roundedImage;
}
Now you can apply animation simply.

iOS8 can I animate hue/saturation/brightness of UIImage using CIFilter?

I'm looking at an Apple example that uses Core Image filters to adjust the hue/saturation/brightness of an image. In this case the input image has its properties adjusted, and another image is returned as the result of the filter operation. I'm interested in whether this transition can be animated (a step-by-step transition).
Is there a way for me to programmatically animate a black and white image having color slowly fading in?
I checked the Apple view programming guidelines, but don't see anything about animating images or colors.
- (void)hueChanged:(id)sender
{
    CGFloat hue = [hueSlider value];
    CGFloat saturation = [saturationSlider value];
    CGFloat brightness = [brightnessSlider value];
    // Update labels
    [hueLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Hue: %f", @"Hue label format."), hue]];
    [saturationLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Saturation: %f", @"Saturation label format."), saturation]];
    [brightnessLabel setText:[NSString stringWithFormat:NSLocalizedString(@"Brightness: %f", @"Brightness label format."), brightness]];
    // Apply effects to image
    dispatch_async(processingQueue, ^{
        if (!self.colorControlsFilter) {
            self.colorControlsFilter = [CIFilter filterWithName:@"CIColorControls"];
        }
        [self.colorControlsFilter setValue:self.baseCIImage forKey:kCIInputImageKey];
        [self.colorControlsFilter setValue:@(saturation) forKey:@"inputSaturation"];
        [self.colorControlsFilter setValue:@(brightness) forKey:@"inputBrightness"];
        CIImage *coreImageOutputImage = [self.colorControlsFilter valueForKey:kCIOutputImageKey];
        if (!self.hueAdjustFilter) {
            self.hueAdjustFilter = [CIFilter filterWithName:@"CIHueAdjust"];
        }
        [self.hueAdjustFilter setValue:coreImageOutputImage forKey:kCIInputImageKey];
        [self.hueAdjustFilter setValue:@(hue) forKey:@"inputAngle"];
        coreImageOutputImage = [self.hueAdjustFilter valueForKey:kCIOutputImageKey];
        CGRect rect = CGRectMake(0, 0, self.image.size.width, self.image.size.height);
        CGImageRef cgImage = [self.context createCGImage:coreImageOutputImage fromRect:rect];
        UIImage *image = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        dispatch_async(dispatch_get_main_queue(), ^{
            [imageView setImage:image];
        });
    });
    // imageView.center = self.view.center;
    // imageView.frame = self.view.frame;
}
For slow fade animations you can use Core Animation on iOS:
theView.alpha = 0;
[UIView animateWithDuration:1.0
                 animations:^{
                     //Apply effects to image here
                     theView.center = midCenter;
                     theView.alpha = 1;
                 }
                 completion:^(BOOL finished){
                     [UIView animateWithDuration:1.0
                                      animations:^{
                                          //Apply effects to image here
                                      }
                                      completion:^(BOOL finished){
                                      }];
                 }];
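Note that UIView block animations only animate animatable view/layer properties such as alpha; they cannot animate CIFilter inputs. To make the color itself fade in step by step, one approach (a sketch, assuming the filtering code from the question is wrapped in a hypothetical helper -imageWithSaturation: that applies CIColorControls at a given inputSaturation) is to drive the filter parameter from a timer:

```objc
// Fade a grayscale image to full colour by stepping inputSaturation
// from 0 to 1 over the given duration. -imageWithSaturation: is an
// assumed helper wrapping the CIColorControls code above; imageView
// is assumed to be the view displaying the result.
- (void)fadeInColourOverDuration:(NSTimeInterval)duration {
    NSTimeInterval interval = 1.0 / 30.0; // roughly 30 steps per second
    __block CGFloat progress = 0;
    [NSTimer scheduledTimerWithTimeInterval:interval repeats:YES block:^(NSTimer *timer) {
        progress += interval / duration;
        if (progress >= 1.0) {
            progress = 1.0;
            [timer invalidate];
        }
        imageView.image = [self imageWithSaturation:progress];
    }];
}
```

Rendering a Core Image filter per frame can be expensive, so in practice you would keep the CIFilter and CIContext around between steps, as the question's code already does.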

iOS scaling with CGAffineTransformMakeScale on drawRect content

float scaleFactor = 0.5;
self.bubble.transform = CGAffineTransformMakeScale(scaleFactor, scaleFactor);
[self.bubble setNeedsDisplay];
[UIView animateWithDuration:2.0 animations:^{
    self.frame = _rectToMoveTo;
} completion:^(BOOL finished) {
    [UIView animateWithDuration:2.0f animations:^{
        self.bubble.transform = CGAffineTransformMakeScale(1.0, 1.0);
    }];
}];
The above code performs the animation mostly correctly. As you can see, self.bubble is scaled down to ½ and then animated back to normal. The problem is that self.bubble has a drawRect: method that draws a circle, and this circle gets scaled down to ¼ from the start! When the second animation runs, the subviews of self.bubble scale back to normal, but the circle scales up to only ½. I've tried using setNeedsDisplay to redraw the circle, but it only works after the animations have completed, so it's no good. How do you fix this?
Edit: here is _bubble drawRect method.
CGContextRef c = UIGraphicsGetCurrentContext();
if (!_colour) _colour = [UIColor darkGrayColor];
[_colour setFill];
[[UIColor clearColor] setStroke];
CGContextAddEllipseInRect(c, CGRectMake(0, 0, self.frame.size.width, self.frame.size.height));
CGContextFillPath(c);
[self addSubview:_title];
[self addSubview:_icon];
self.frame is in the superview's coordinate system, and is thus affected by your transform, which you don't want.
Inside drawRect:, you should only use self.bounds, not self.frame.
Also, you should not add subviews in drawRect:.
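Applied to the drawRect: shown above, a minimal corrected sketch (with the subview setup assumed to move to the view's initializer):

```objc
// drawRect: should draw only in the view's own (bounds) coordinate space.
- (void)drawRect:(CGRect)rect {
    CGContextRef c = UIGraphicsGetCurrentContext();
    if (!_colour) _colour = [UIColor darkGrayColor];
    [_colour setFill];
    [[UIColor clearColor] setStroke];
    // bounds, not frame: frame lives in the superview's coordinate
    // system and is affected by the transform
    CGContextAddEllipseInRect(c, self.bounds);
    CGContextFillPath(c);
    // [self addSubview:_title] and [self addSubview:_icon] belong in
    // the initializer (e.g. initWithFrame:), not here
}
```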

Gaussian Blur Filter Reversal: iOS

I've got a gaussian blur which I'm doing for an app.
//Get a UIImage from the UIView
UIGraphicsBeginImageContext(self.view.bounds.size);
[self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

//Blur the UIImage
CIImage *imageToBlur = [CIImage imageWithCGImage:viewImage.CGImage];
CIFilter *gaussianBlurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[gaussianBlurFilter setValue:imageToBlur forKey:@"inputImage"];
[gaussianBlurFilter setValue:[NSNumber numberWithFloat:2] forKey:@"inputRadius"];
CIImage *resultImage = [gaussianBlurFilter valueForKey:@"outputImage"];
UIImage *endImage = [[UIImage alloc] initWithCIImage:resultImage];

//Place the UIImage in a UIImageView
newView = [[UIImageView alloc] initWithFrame:self.view.bounds];
newView.image = endImage;
[self.view addSubview:newView];
It works great, but I want to be able to reverse it and return the view to normal.
How can I do this? Simply removing the blur view from its superview didn't work, and I've also tried setting various properties to nil with no luck.
Keep a pointer to viewImage in a property
@property (nonatomic, strong) UIImage *originalImage;
then after
UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
add
self.originalImage = viewImage;
To revert your image:
newView.image = self.originalImage;
Applying the blur does not alter viewImage: you created a separate CIImage which gets blurred, and you make a new UIImage from the blurred CIImage.
The first thing I noticed in your code is that you don't apply the filter asynchronously. Blurring the image takes time, so if you don't want to freeze the main thread you should use:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    //blur the image on a background thread
    dispatch_async(dispatch_get_main_queue(), ^{
        //set the blurred image on your imageView on the main thread
    });
});
To be able to revert to the original image, put the blurred copy in another imageView that appears over the original. In my case the original isn't an imageView but the view itself, so I just add an imageView over my view for the blurred image, and set its image to nil when I want to revert.
Finally, if you want to avoid a blink when you set the blurred image, you can animate the alpha to make a soft transition:
[self.blurredImageView setAlpha:0.0]; //make the imageView invisible
[self.blurredImageView setImage:blurredImage];
//and after setting the image, make it visible slowly
[UIView animateWithDuration:0.5 delay:0.1
                    options:UIViewAnimationOptionCurveEaseInOut
                 animations:^{
                     [self.blurredImageView setAlpha:1.0];
                 }
                 completion:nil];
Here are my complete methods:
- (void)makeBlurredScreenShot {
    UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, self.view.opaque, 0.0);
    [self.view.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *imageView = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    CIContext *context = [CIContext contextWithOptions:nil];
    CIImage *sourceImage = [CIImage imageWithCGImage:imageView.CGImage];

    // Apply clamp filter:
    // this is needed because CIGaussianBlur, when applied, makes
    // a transparent border around the image
    NSString *clampFilterName = @"CIAffineClamp";
    CIFilter *clamp = [CIFilter filterWithName:clampFilterName];
    if (!clamp)
        return;
    [clamp setValue:sourceImage forKey:kCIInputImageKey];
    CIImage *clampResult = [clamp valueForKey:kCIOutputImageKey];

    // Apply Gaussian blur filter
    NSString *gaussianBlurFilterName = @"CIGaussianBlur";
    CIFilter *gaussianBlur = [CIFilter filterWithName:gaussianBlurFilterName];
    if (!gaussianBlur)
        return;
    [gaussianBlur setValue:clampResult forKey:kCIInputImageKey];
    [gaussianBlur setValue:[NSNumber numberWithFloat:8.0] forKey:@"inputRadius"];
    CIImage *gaussianBlurResult = [gaussianBlur valueForKey:kCIOutputImageKey];

    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        CGImageRef cgImage = [context createCGImage:gaussianBlurResult fromRect:[sourceImage extent]];
        UIImage *blurredImage = [UIImage imageWithCGImage:cgImage];
        CGImageRelease(cgImage);
        dispatch_async(dispatch_get_main_queue(), ^{
            [self.blurredImageView setAlpha:0.0];
            [self.blurredImageView setImage:blurredImage];
            [UIView animateWithDuration:0.5 delay:0.1
                                options:UIViewAnimationOptionCurveEaseInOut
                             animations:^{
                                 [self.blurredImageView setAlpha:1.0];
                             }
                             completion:nil];
        });
    });
}

- (void)removeBlurredScreenShot {
    [UIView animateWithDuration:0.5 delay:0.1
                        options:UIViewAnimationOptionCurveEaseInOut
                     animations:^{
                         [self.blurredImageView setAlpha:0.0];
                     }
                     completion:^(BOOL finished) {
                         [self.blurredImageView setImage:nil];
                     }];
}
Like I said in my previous comment, you could create a property/iVar that holds the image before the effect is applied and revert in case you want to undo.
_originalImage = viewImage; // UIImage is immutable, so keeping a reference is enough

// Do your blur stuff

// Now somewhere down the line in your program, if you don't like the blur and the user would like to undo:
viewImage = _originalImage;
As per your comment on @HeWas's answer, you shouldn't blur the view completely. If the view is getting blurred, there's something else you are doing wrong elsewhere in your program.