iOS animate a blur for a view

I would like to quickly animate a blur on a UIView to use as a transition in my app. I'm having trouble knowing where to start. I believe Core Image is the proper tool for the job. Can anyone point me to a sample of how to blur a UIView? I'm assuming I will need to convert the view into a single UIImage, but I don't know how to proceed from there.
Thanks in advance!

Taking a snapshot of the view and running it through Brad Larson's GPUImage (specifically GPUImageGaussianBlurFilter) got me some nice results.
To animate the transition I created a UIImageView with the blurred image and animated its alpha from 0 to 1 so the blur appears progressively.
Alternatively, I presume it's possible to increase the blurSize per frame.
#import "GPUImage.h"
...
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
...
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
...
GPUImageGaussianBlurFilter * filter = [[GPUImageGaussianBlurFilter alloc] init];
filter.blurSize = 0.5;
UIImage * blurred = [filter imageByFilteringImage:image];
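A minimal sketch of the fade-in step described above, assuming the blurred result from the snippet and a placeholder blurImageView laid over the original view (names are illustrative, not from the original answer):

// Fade the blurred snapshot in over the un-blurred view
UIImageView *blurImageView = [[UIImageView alloc] initWithImage:blurred];
blurImageView.frame = view.bounds;
blurImageView.alpha = 0.0;
[view addSubview:blurImageView];

[UIView animateWithDuration:0.3 animations:^{
    blurImageView.alpha = 1.0;   // blur appears progressively
}];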

rasterizationScale of a UIView's layer is what you need. Here is the code for adding a blur effect to a UIView:
CALayer *layer = [self.blurView layer];
[layer setRasterizationScale:0.3];
[layer setShouldRasterize:YES];
For details, refer to Apple's documentation for CALayer. This tutorial might also help. Hope that helps.

I recently did some tests with blurring a series of images at different blur settings and animating them simply with UIImageView. You might want to take a look:
AnimatedGaussianBlur

Related

how to implement this animation in iOS?

I am not sure how to present this question because I don't know the animation term that I should use.
I need to know about this tree presentation animation, as it appears from root to top.
Please take a look at the attached .gif file and let me know if anyone knows about this animation, or if you can guide me with an example.
I would really appreciate it.
Thanks for your time.
The best option for you is to split the gif into multiple images and then do the following:
NSArray *gifImagesArray = [NSArray arrayWithObjects:imageOne, imageTwo, imageThree, nil];
imageView.animationImages = gifImagesArray;
imageView.animationRepeatCount = 1;
imageView.animationDuration = 1.0f;
[imageView startAnimating];
Edit following comments:
If you can't use multiple images you have to write the animation yourself.
One suggestion is to add a circular mask to the image and animate its removal.
This link explains how to draw a circular CALayer: Circular Progress Bars in iOS
This one will show you how to create a mask on a UIView: Simply mask a UIView with a rectangle
Now all you have to do is a simple animation, e.g. the growing-mask sketch below.
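A minimal sketch of that idea, assuming the tree image sits in a placeholder UIImageView called treeImageView; it grows a circular CAShapeLayer mask from a tiny circle until the whole image is revealed (names, radii, and durations are assumptions, not from the original answer):

// Grow a circular mask over the image view so the content is revealed outward from the center
CAShapeLayer *maskLayer = [CAShapeLayer layer];
CGPoint center = CGPointMake(CGRectGetMidX(treeImageView.bounds), CGRectGetMidY(treeImageView.bounds));
CGFloat endRadius = MAX(CGRectGetWidth(treeImageView.bounds), CGRectGetHeight(treeImageView.bounds));
UIBezierPath *startPath = [UIBezierPath bezierPathWithArcCenter:center radius:1.0
                                                      startAngle:0 endAngle:2 * M_PI clockwise:YES];
UIBezierPath *endPath = [UIBezierPath bezierPathWithArcCenter:center radius:endRadius
                                                    startAngle:0 endAngle:2 * M_PI clockwise:YES];
maskLayer.path = endPath.CGPath;          // model value is the final, fully revealed state
treeImageView.layer.mask = maskLayer;

CABasicAnimation *grow = [CABasicAnimation animationWithKeyPath:@"path"];
grow.fromValue = (id)startPath.CGPath;
grow.toValue = (id)endPath.CGPath;
grow.duration = 1.0;
[maskLayer addAnimation:grow forKey:@"growMask"];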
I have a solution that will work for this problem.
We can use the gif image and convert it into a UIImage object, so the image object will play like an animation.
Thanks all for your answers.
In my opinion, I would separate this animation into 4 parts:
animation to show the full black tree from the center.
animation to show the texts.
animation to show the blue leaves.
animation to show the 2 buttons at the bottom.
You can use the framework https://github.com/facebook/pop to run the animations step by step, or even Core Animation (see the sketch below).
Please do more research; I think you will succeed in making one.
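A minimal sketch of running those four parts one after another with plain UIView animations; the view names and durations are placeholders for the pieces described above (the pop framework offers equivalent chaining):

// Run the four parts sequentially by chaining completion blocks
[UIView animateWithDuration:0.6 animations:^{
    treeView.alpha = 1.0;                       // 1. black tree appears
} completion:^(BOOL finished) {
    [UIView animateWithDuration:0.4 animations:^{
        textsView.alpha = 1.0;                  // 2. texts appear
    } completion:^(BOOL finished) {
        [UIView animateWithDuration:0.4 animations:^{
            leavesView.alpha = 1.0;             // 3. blue leaves appear
        } completion:^(BOOL finished) {
            [UIView animateWithDuration:0.3 animations:^{
                buttonsView.alpha = 1.0;        // 4. bottom buttons appear
            }];
        }];
    }];
}];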
You can achieve this animation using a simple UIImageView, which can load multiple images via the animationImages property of UIImageView.
First, create an NSArray (or NSMutableArray) of images.
Then assign those images to the image view, along with an animationDuration.
UIImageView *animationImageView = [[UIImageView alloc] initWithFrame:CGRectMake(60, 95, 86, 193)];
animationImageView.animationImages = images;
animationImageView.animationDuration = 0.5;
Add the image view to your view:
[self.view addSubview:animationImageView];
[animationImageView startAnimating];

UIVibrancyEffect On iOS 7

So I've been playing around with the iOS 8 beta and implementing the new UIVisualEffectViews in the places my app needs them. Now I've run into the issue that I still want backwards compatibility for iOS 7 but want to keep the vibrancy effect, because it really helps readability. I've used UIToolbars in the past for a blur effect, and they work great, but not for vibrancy. I thought I'd subclass UIView, add a toolbar subview, and then do some clever rendering to roughly achieve the vibrancy effect, which would look like this:
1. render the toolbar to a UIImage
2. render the vibrant content to a UIImage
3. mask the toolbar image to the vibrant content image mask
4. mess with the saturation and brightness
5. have a subview of the UIView display the final result over the toolbar
I've tried doing this in drawRect: of the UIView, but it doesn't want to redraw every frame, and setting a timer really messes with animation, even though the render time isn't very high. If anyone can point me to sample code or an open source library, it would be much appreciated.
Thanks.
So I never posted an answer, but I did figure it out.
The brute force approach I tried was to use Core Image effects. I would render the superview to a UIImage, blur it, then overlay it on a toolbar with the dark style. This looked great, but even with a GPU context on my 5S it was pretty slow, so there's no way it would work on other devices. This is the best I could get it to look; it would work great for static content, but it is not practical for real time.
I was able to achieve a real-time version, but it doesn't look quite as good. Basically what I do is render all the vibrant content to an image and use it as a mask for a view. Then I make the view barely visible (around 0.2 alpha) and put it over a toolbar. It doesn't look quite as vibrant as iOS 8, or as the original Core Image version, but it works great and performs well.
Here's a bit of code you can just copy and paste if you really want:
// maskingContents is an instance variable of this UIView subclass
- (instancetype)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self)
    {
        self.backgroundColor = [UIColor colorWithWhite:1 alpha:0.2];
        maskingContents = [[UIView alloc] initWithFrame:self.bounds];
        [self addSubview:maskingContents];
    }
    return self;
}

- (void)addSubview:(UIView *)view
{
    if (![view isEqual:maskingContents])
    {
        [maskingContents setHidden:NO];
        [maskingContents addSubview:view];
        //now we need to mask it
        UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
        [maskingContents.layer renderInContext:UIGraphicsGetCurrentContext()];
        UIImage *mask = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        //apply the mask
        CALayer *maskLayer = [CALayer layer];
        maskLayer.frame = self.bounds;
        [maskLayer setContents:(id)mask.CGImage];
        [self.layer setMask:maskLayer];
        [maskingContents setHidden:YES];
    } else [super addSubview:view];
}

- (void)forceVibrancyUpdate
{
    [maskingContents setHidden:NO];
    //now we need to mask it
    UIGraphicsBeginImageContextWithOptions(self.bounds.size, NO, 0);
    [maskingContents.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *mask = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    //apply the mask
    CALayer *maskLayer = [CALayer layer];
    maskLayer.frame = self.bounds;
    [maskLayer setContents:(id)mask.CGImage];
    [self.layer setMask:maskLayer];
    [maskingContents setHidden:YES];
}
@end
If you want to dynamically update the content inside the vibrancy view, you would call forceVibrancyUpdate, as that would re-render the mask and apply it. Hope this helped everyone.
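A hedged usage sketch of the class above; the answer never names it, so VibrancyMaskView and containerView are placeholder names. It layers the faux-vibrancy view over a dark toolbar, as described:

// Place a dark toolbar for the blur, then the masked low-alpha view on top
UIToolbar *blurToolbar = [[UIToolbar alloc] initWithFrame:containerView.bounds];
blurToolbar.barStyle = UIBarStyleBlack;
[containerView addSubview:blurToolbar];

VibrancyMaskView *vibrancyView = [[VibrancyMaskView alloc] initWithFrame:containerView.bounds];
UILabel *label = [[UILabel alloc] initWithFrame:vibrancyView.bounds];
label.text = @"Vibrant-ish text";
label.textColor = [UIColor whiteColor];
[vibrancyView addSubview:label];   // triggers the mask rendering in the overridden addSubview:
[containerView addSubview:vibrancyView];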

How to achieve this effect in iPhone SDK?

I would like to know how I can apply this effect in the iPhone SDK.
So, I have an image and a label on top of it.
I want the effect in which the bottom portion kind of blends in with the image, so that there is no clear demarcation where the image ends at the bottom of the view.
Please let me know.
An easy way to achieve this is with CAGradientLayer:
UIView *yourGradientView; // the view with that "ENTREES" label; add this view as a subview of the background view
CAGradientLayer *gradientLayer = [CAGradientLayer layer];
[gradientLayer setFrame:[yourGradientView bounds]];
[gradientLayer setColors:@[(id)[UIColor clearColor].CGColor, (id)[[UIColor whiteColor] colorWithAlphaComponent:0.7f].CGColor]];
[gradientLayer setLocations:@[[NSNumber numberWithFloat:0.50f], [NSNumber numberWithFloat:1.0f]]];
[[yourGradientView layer] insertSublayer:gradientLayer atIndex:0];
Easiest solution: add a UIImageView with a gradient PNG image as a subview between the image and the label, if the color is constant.
If you need a variable gradient color, you can instead add a subview with the gradient drawn using Core Graphics or a CALayer.
If you need the image to blend with any background, you can mask the background image's layer with a CAGradientLayer (see the sketch below).
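A minimal sketch of that last mask idea, assuming the picture lives in a placeholder UIImageView called imageView. In a layer mask only alpha matters: opaque areas stay visible and clear areas hide, so the bottom of the image dissolves into whatever is behind it:

// Fade the bottom of the image into the background using a gradient mask
CAGradientLayer *fadeMask = [CAGradientLayer layer];
fadeMask.frame = imageView.bounds;
fadeMask.colors = @[(id)[UIColor blackColor].CGColor, (id)[UIColor clearColor].CGColor];
fadeMask.locations = @[@0.6, @1.0];   // fully visible until 60% down, then fade out
imageView.layer.mask = fadeMask;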
Unless you provide more details with regard to functionality and some code, at first glance the solution seems to be the following:
Step 1
Set a UIImage as the background image. In your case it is the one shown in the question.
Step 2
Add a UILabel as a subview of the UIImageView and set the UILabel's background to transparent. Position the label as per your needs, which in your case seems to be the bottom left.
Hope this helps!

Blur screen with iOS 7's snapshot API

I believe the NDA is down, so I can ask this question. I have a UIView subclass:
BlurView *blurredView = ((BlurView *)[self.view snapshotViewAfterScreenUpdates:NO]);
blurredView.frame = self.view.frame;
[self.view addSubview:blurredView];
It does its job so far in capturing the screen, but now I want to blur that view. How exactly do I go about this? From what I've read, I need to capture the current contents of the view (the context?), convert it to a CIImage (no?), apply a CIGaussianBlur to it, and then draw it back on the view.
How exactly do I do that?
P.S. The view is not animated, so it should be OK performance wise.
EDIT: Here is what I have so far. The problem is that I can't capture the snapshot to a UIImage, I get a black screen. But if I add the view as a subview directly, I can see the snapshot is there.
// Snapshot
UIView *view = [self.view snapshotViewAfterScreenUpdates:NO];
// Convert to UIImage
UIGraphicsBeginImageContextWithOptions(view.bounds.size, view.opaque, 0.0);
[view.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *img = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
// Apply the UIImage to a UIImageView
BlurView *blurredView = [[BlurView alloc] initWithFrame:CGRectMake(0, 0, 500, 500)];
[self.view addSubview:blurredView];
blurredView.imageView.image = img;
// Black screen -.-
BlurView.m:
- (id)initWithFrame:(CGRect)frame {
    self = [super initWithFrame:frame];
    if (self) {
        // Initialization code
        self.imageView = [[UIImageView alloc] init];
        self.imageView.frame = CGRectMake(20, 20, 200, 200);
        [self addSubview:self.imageView];
    }
    return self;
}
Half of this question didn't get answered, so I thought it worth adding.
The problem with UIScreen's
- (UIView *)snapshotViewAfterScreenUpdates:(BOOL)afterUpdates
and UIView's
- (UIView *)resizableSnapshotViewFromRect:(CGRect)rect
afterScreenUpdates:(BOOL)afterUpdates
withCapInsets:(UIEdgeInsets)capInsets
is that you can't derive a UIImage from them - the 'black screen' problem.
In iOS 7, Apple provides a third piece of API for extracting UIImages, a method on UIView:
- (BOOL)drawViewHierarchyInRect:(CGRect)rect
afterScreenUpdates:(BOOL)afterUpdates
It is not as fast as snapshotView, but not bad compared to renderInContext (in the example provided by Apple it is five times faster than renderInContext and three times slower than snapshotView)
Example use:
UIGraphicsBeginImageContextWithOptions(view.bounds.size, NO, 0);
[view drawViewHierarchyInRect:view.bounds afterScreenUpdates:YES];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
Then to get a blurred version
UIImage* lightImage = [newImage applyLightEffect];
where applyLightEffect is one of those Blur category methods on Apple's UIImage+ImageEffects category mentioned in the accepted answer (the enticing link to this code sample in the accepted answer doesn't work, but this one will get you to the right page: the file you want is iOS_UIImageEffects).
The main reference is WWDC 2013 session 226, Implementing Engaging UI on iOS.
By the way, there is an intriguing note in Apple's reference docs for renderInContext that hints at the black screen problem..
Important: The OS X v10.5 implementation of this method does not support the entire Core Animation composition model. QCCompositionLayer, CAOpenGLLayer, and QTMovieLayer layers are not rendered. Additionally, layers that use 3D transforms are not rendered, nor are layers that specify backgroundFilters, filters, compositingFilter, or a mask values. Future versions of OS X may add support for rendering these layers and properties.
The note hasn't been updated since 10.5, so I guess 'future versions' may still be a while off, and we can add our new CASnapshotLayer (or whatever) to the list.
Sample code from WWDC: iOS_UIImageEffects.
There is a UIImage category named UIImage+ImageEffects.
Here is its API:
- (UIImage *)applyLightEffect;
- (UIImage *)applyExtraLightEffect;
- (UIImage *)applyDarkEffect;
- (UIImage *)applyTintEffectWithColor:(UIColor *)tintColor;
- (UIImage *)applyBlurWithRadius:(CGFloat)blurRadius
tintColor:(UIColor *)tintColor
saturationDeltaFactor:(CGFloat)saturationDeltaFactor
maskImage:(UIImage *)maskImage;
For legal reasons I can't show the implementation here; there is a demo project included. It should be pretty easy to get started with.
To summarize how to do this with foundry's sample code, use the following:
I wanted to blur the entire screen just slightly, so for my purposes I'll use the main screen bounds.
CGRect screenCaptureRect = [UIScreen mainScreen].bounds;
UIView *viewWhereYouWantToScreenCapture = [[UIApplication sharedApplication] keyWindow];
//screen capture code
UIGraphicsBeginImageContextWithOptions(screenCaptureRect.size, NO, [UIScreen mainScreen].scale);
[viewWhereYouWantToScreenCapture drawViewHierarchyInRect:screenCaptureRect afterScreenUpdates:NO];
UIImage *capturedImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
//blur code
UIColor *tintColor = [UIColor colorWithWhite:1.0 alpha:0];
UIImage *blurredImage = [capturedImage applyBlurWithRadius:1.5 tintColor:tintColor saturationDeltaFactor:1.2 maskImage:nil];
//or use [capturedImage applyLightAffect] but I thought that was too much for me
//use blurredImage in whatever way you so desire!
Notes on the screen capture part
The 2nd argument of UIGraphicsBeginImageContextWithOptions() is opaqueness. It should be NO unless nothing in your view has an alpha other than 1. If you pass YES, the screen capture will ignore transparency values, so it will go faster but will probably be wrong.
The 3rd argument of UIGraphicsBeginImageContextWithOptions() is the scale. You probably want to pass the device's scale, as I did, to differentiate between retina and non-retina. I haven't really tested this, though, and I think 0.0f also works.
For drawViewHierarchyInRect:afterScreenUpdates:, watch what you pass for the screen-updates BOOL. I tried to do this right before backgrounding, and if I didn't pass NO the app would go crazy with glitches when I returned to the foreground. You might be able to get away with YES if you're not leaving the app.
Notes on blurring
I have a very light blur here. Changing the blurRadius will make it blurrier, and you can change the tint color and alpha to make all sorts of other effects.
Also you need to add a category for the blur methods to work...
How to add the UIImage+ImageEffects category
You need to download the category UIImage+ImageEffects for the blur to work. Download it here after logging in: https://developer.apple.com/downloads/index.action?name=WWDC%202013
Search for "UIImageEffects" and you'll find it. Just pull out the 2 necessary files and add them to your project. UIImage+ImageEffects.h and UIImage+ImageEffects.m.
Also, I had to enable modules in my build settings because my project wasn't created with Xcode 5. To do this, go to your target's build settings, search for "modules", and make sure that "Enable Modules" and "Link Frameworks Automatically" are both set to Yes, or you'll get compiler errors with the new category.
Good luck blurring!
Check the WWDC 2013 sample application "Running with a Snap".
The blurring there is implemented as a category.

Implement Blur over parts of view

How can I implement the image below programmatically, meaning the digits can change at runtime or even be replaced with a movie?
Just add a blurred UIView on top of your view.
For example: make a UIImage of your desired view size, blur it using a CIFilter, and then add it to your view. It should achieve the desired effect (see the sketch below).
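A minimal sketch of that suggestion; sourceView and the blur radius are placeholders, not from the original answer:

// Snapshot the view, blur it with Core Image, and lay the result back on top
UIGraphicsBeginImageContextWithOptions(sourceView.bounds.size, NO, 0);
[sourceView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

CIContext *ciContext = [CIContext contextWithOptions:nil];
CIImage *input = [CIImage imageWithCGImage:snapshot.CGImage];
CIFilter *blurFilter = [CIFilter filterWithName:@"CIGaussianBlur"];
[blurFilter setValue:input forKey:kCIInputImageKey];
[blurFilter setValue:@8.0 forKey:kCIInputRadiusKey];
CGImageRef blurredCG = [ciContext createCGImage:blurFilter.outputImage fromRect:input.extent];
UIImageView *blurOverlay = [[UIImageView alloc] initWithImage:[UIImage imageWithCGImage:blurredCG]];
CGImageRelease(blurredCG);
blurOverlay.frame = sourceView.bounds;
[sourceView addSubview:blurOverlay];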
This is generally the same question and is answered by quite a few methods. Anyway, I would propose one more:
Get the image from UIView
+ (UIImage *)imageFromLayer:(CALayer *)layer {
UIGraphicsBeginImageContext([layer frame].size);
[layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *outputImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
return outputImage;
}
Or rather, play around a bit with this to get the desired part of the view as the image. Now create a new view and add image views to it (with the image you got from the layer). Then move the centers of the image views to approximate a Gaussian blur, take the image from this layer again, and place it back on the original view.
Moving the centers should be defined by a radius fragment (I'd start with 0.5f) and a resample count.
for(int i=1; i<resampleCount; i++) {
view1.center = CGPointMake(view1.center.x + radiusFragment*i, view1.center.y);
view2.center = CGPointMake(view2.center.x - radiusFragment*i, view2.center.y);
view3.center = CGPointMake(view3.center.x, view3.center.y + radiusFragment*i);
view4.center = CGPointMake(view4.center.x, view4.center.y - radiusFragment*i);
//add the subviews
}
//get the image from view
All the subviews need to have alpha set to 1.0f/(resampleCount*4)
This method might not be the fastest, but it is extremely easy to implement, and if you tune the radius fragment and resample count down to a minimum it should do pretty well.
Use a UIView with a white background and decrease the alpha property:
blurView.backgroundColor = [UIColor colorWithRed:1.0 green:1.0 blue:1.0 alpha:0.3];
