Interactive blurring of UIImage and UIView like iOS 8 spotlight

iOS 8 introduces some pretty snazzy interactive blurring. Most notably, there's the interactive blur when you pull down for spotlight, but there's also the animation when opening and closing Siri (though that's not interactive). I've only noticed this interactive blur in one other place: the official Twitter app when pulling down on a profile view (parallax header image zooms and blurs sometimes).
I've attempted to animate something basic with a UISlider using both CoreImage and GPUImage (based on the answer to this question, and also Apple's UIImage+ImageEffects), but nothing seems performant enough to animate the blur interactively (i.e. blurring an image to a single value works quickly once, but not at a framerate fast enough to blur continuously).
How can I implement these methods in a way that they are performant enough to both blur and unblur a UIImage (and ideally a UIView or CIContext snapshot) interactively?

There is no single, simple way of doing it, but it's definitely doable if you follow these steps (some optional):
1. Most important: downsample the image. The base resolution is not very important for a Gaussian blur. If you downsample to just half resolution, the amount of data drops to a quarter!
2. Define the end target blur radius.
3. Retrieve the device architecture with the help of C functions (sysctlbyname, for example) and use different values for the saturation-delta parameter on different architectures, according to processing power, of course.
4. Experiment with creating the blur using the Apple-provided library, stepping the radius according to the parameter you're interacting with (KVO on the contentOffset property, for example). Use dispatch_async, and don't forget to call back to the main queue with the blurred image (a sketch of steps 1 and 4 follows this list).
5. The methods above will almost certainly cover all the architectures from armv7s onwards, but you might still have some issues with armv7 (iPhone 4S).
6. If you still have issues, as with the aforementioned armv7, double the contentOffset change required to trigger the next blur with the next radius. Then, instead of changing the image property on the UIImageView, create a new UIImageView with the new blurred UIImage and fade its alpha from 0 to 1 over the period in which the next blurred image is being created.
7. You can use a number of tricks, like creating all the blurred images for the full interactive scale one after another, caching them in a collection, and using them with the method described in point 6.
8. There are also many other techniques if the animation is not interactive, but instead runs over a fixed duration.
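Here is a minimal sketch of steps 1 and 4 using Core Image off the main queue. The helper names (DownsampledImage, BlurAsync) are illustrative, not from Apple's library:

    #import <UIKit/UIKit.h>
    #import <CoreImage/CoreImage.h>

    // Step 1: render into a half-size opaque context; the blur hides the lost
    // resolution, and every later pass touches a quarter of the data.
    static UIImage *DownsampledImage(UIImage *image, CGFloat scale) {
        CGSize size = CGSizeMake(image.size.width * scale, image.size.height * scale);
        UIGraphicsBeginImageContextWithOptions(size, YES, 1.0);
        [image drawInRect:CGRectMake(0, 0, size.width, size.height)];
        UIImage *result = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return result;
    }

    // Step 4: blur on a background queue, deliver the result on the main queue.
    // In real code, reuse a single CIContext; creating one is expensive.
    static void BlurAsync(UIImage *source, CGFloat radius, void (^completion)(UIImage *)) {
        dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
            CIContext *context = [CIContext contextWithOptions:nil];
            CIImage *input = [CIImage imageWithCGImage:source.CGImage];
            CIFilter *blur = [CIFilter filterWithName:@"CIGaussianBlur"];
            [blur setValue:input forKey:kCIInputImageKey];
            [blur setValue:@(radius) forKey:kCIInputRadiusKey];
            // CIGaussianBlur extends the image's extent; crop back to the original.
            CIImage *output = [blur.outputImage imageByCroppingToRect:input.extent];
            CGImageRef cgImage = [context createCGImage:output fromRect:output.extent];
            UIImage *blurred = [UIImage imageWithCGImage:cgImage];
            CGImageRelease(cgImage);
            dispatch_async(dispatch_get_main_queue(), ^{
                completion(blurred);
            });
        });
    }

In your contentOffset observer you would map the pull distance to a radius between 0 and the end target from step 2, call BlurAsync on the downsampled image, and assign the result to the image view when the callback fires.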

Related

Darken an opaque UIView without blending

My App's background is an opaque UIImageView. Under some circumstances I would like to darken this down in an animated way from full brightness to about 50%. Currently I lower the alpha property of the view and this works well. Because nothing is behind the view, the background image just becomes dark.
However, I've been profiling using the Core Animation Instrument and when I do this, I see that the whole background shows as being blended. I'd like to avoid this if possible.
It seems to me that this should be achievable during compositing. If a view is opaque, it is possible to mix it with black without anything behind showing through. It's not necessary to blend it, just adjust the pixel values.
I wondered if this was something that UIKit's GPU compositing supports. While blending isn't great, it's probably a lot better than updating the image on the CPU, so I think a CPU approach is probably not a good substitute.
Another question asks about this, and a few ideas are suggested including setting the Alpha. No one has brought up a mechanism for avoiding blending though.
An important question here is whether you want the change to a darkened background to be animated.
Not animated
Prepare two different background images and simply swap between them. The UIImage+ImageEffects library could help with generating the darkened image, or at least give you some leads.
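For instance, here is a minimal sketch of generating the darkened copy at runtime (DarkenedImage is a hypothetical helper, not part of UIImage+ImageEffects):

    #import <UIKit/UIKit.h>

    // Pre-render a darkened copy into an opaque context so the compositor can
    // treat the resulting UIImageView as fully opaque (no per-frame blending).
    static UIImage *DarkenedImage(UIImage *image, CGFloat darkness) {
        UIGraphicsBeginImageContextWithOptions(image.size, YES /* opaque */, image.scale);
        CGRect bounds = (CGRect){CGPointZero, image.size};
        [image drawInRect:bounds];
        // Paint translucent black over the image; the blend happens once,
        // here on the CPU, instead of every frame during compositing.
        [[UIColor colorWithWhite:0.0 alpha:darkness] setFill];
        UIRectFillUsingBlendMode(bounds, kCGBlendModeNormal);
        UIImage *darkened = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return darkened;
    }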
Animated
Take a look at GPUImage, "an open source iOS framework for GPU-based image and video processing". With it you could render the background into the scene in a darkened way.
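A hedged sketch of that idea, assuming GPUImage 1.x (backgroundImage and the view bounds are stand-ins for your own setup); animating the brightness value from 0 toward -0.5 darkens the output over time:

    #import "GPUImage.h"

    // Feed the background image through a brightness filter into a GPUImageView.
    GPUImagePicture *source = [[GPUImagePicture alloc] initWithImage:backgroundImage];
    GPUImageBrightnessFilter *darken = [[GPUImageBrightnessFilter alloc] init];
    darken.brightness = -0.5; // 0.0 leaves the image unchanged, -1.0 is black

    GPUImageView *backgroundView = [[GPUImageView alloc] initWithFrame:self.view.bounds];
    [source addTarget:darken];
    [darken addTarget:backgroundView];
    [source processImage]; // re-call with a new brightness on each animation step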

What is the better way to draw a custom button?

I need a few buttons in my iPhone app (which may then be ported to iPad). I know at least two methods for making such buttons:
1. Using a regular UIButton with an image as a background, which can be drawn in any graphics editor.
2. Subclassing UIButton and implementing your own drawRect: method using CoreGraphics tools.
I don't know why, but I tend to use the second one, even though it seems to be more difficult and lower performing.
Am I right in thinking that when the button is drawn programmatically, it becomes "cross platform", so you don't need several icons for different resolutions?
If it is a really simple icon, say a Bézier curve or a circle filled with a color, will it still perform slower than an image button?
And does somebody know a tool with a graphical interface for drawing a vector image that then converts it to CoreGraphics code one can paste into a drawRect: method?
Thank you.
Don't worry about using drawRect:. Unlike Windows, iOS caches the result into a bitmap and will only redraw it when the dimensions of the view change (which is what you actually want, since at that point it needs to be drawn again at the new size).
As for a vector app, you can use Opacity. It has an option to export the icon into CoreGraphics source code: http://likethought.com/opacity/
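For what it's worth, a minimal sketch of approach 2 (CircleButton is a made-up example class):

    #import <UIKit/UIKit.h>

    // A UIButton subclass that draws a filled circle. Because the path is
    // defined in points, the same code renders crisply at 1x, 2x, and 3x
    // without separate image assets for each resolution.
    @interface CircleButton : UIButton
    @end

    @implementation CircleButton
    - (void)drawRect:(CGRect)rect {
        CGContextRef ctx = UIGraphicsGetCurrentContext();
        CGContextSetFillColorWithColor(ctx, self.tintColor.CGColor);
        // Inset slightly so the circle isn't clipped at the view's edge.
        CGContextFillEllipseInRect(ctx, CGRectInset(self.bounds, 2.0, 2.0));
    }
    @end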

Does shouldRasterize on a CALayer cause rasterization before or after the layer's transform?

I'm attempting to optimize my app. It's quite visually rich, so it has a lot of layered UIViews with large images, blending, etc.
I've been experimenting with the shouldRasterize property on CALayers. In one case in particular, I have a UIView that consists of lots of sub views including a table. As part of a transition where the entire screen scrolls, this UIView also scales and rotates (using transforms).
The content of the UIView remains static, so I thought it would make sense to set view.layer.shouldRasterize = YES. However, I didn't see an increase in performance. Could it be that it's re-rasterizing every frame at the new scale and rotation? I was hoping that it would rasterize at the beginning when it has an identity transform matrix, and then cache that as it scales and rotates during the transition?
If not, is there a way I could force it to happen? Short of adding a redundant extra super-view/layer that does nothing but scale and rotate its rasterized contents...
You can answer your own question by profiling your application with the Core Animation instrument. Note that this instrument is only available when profiling on a device.
You can enable "Color Hits Green and Misses Red". If your layer remains red, it means it is indeed being re-rasterized every frame.
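For reference, a minimal sketch of the setup being profiled; note that rasterizationScale must be set explicitly, or the cache is rendered at 1x and looks blurry on Retina screens:

    // Cache the static subtree as a bitmap before animating its transform.
    view.layer.shouldRasterize = YES;
    view.layer.rasterizationScale = [UIScreen mainScreen].scale;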

Blending vs. offscreen rendering: which is worse for Core Animation performance?

Blending and offscreen-rendering are both expensive in Core Animation.
One can see both in the Core Animation instrument in Instruments, using its Debug Options.
Here is my case:
Display 50x50 PNG images in UIImageViews. I want to round the images with a 6-point corner radius. The first method is to set the UIImageView layer's cornerRadius and masksToBounds, which causes offscreen rendering. The second method is to make PNG copies with transparent corners, which causes blending (because of the alpha channel).
I've tried both, but I can't see a significant performance difference. However, I still want to know which is worse in theory, and best practices if any.
Thanks a lot!
Well, short answer: the blending has to occur either way to correctly display the transparent corner pixels. However, this should typically only be an issue if you want the resulting view to also animate in some way (and remember, scrolling is the most common type of animation). Also, I'm able to recreate situations where "cornerRadius" will cause rendering errors on older devices (iPhone 3G in my case) when my views become complex. For situations where you do need performant animations, here are the recommendations I follow.
First, if you only need the resources with a single curve for the rounded corners (different scales are fine, as long as the desired curvature is the same), save them that way to avoid the extra calculation of "cornerRadius" at runtime (a sketch of baking in the corners follows this list of recommendations).
Second, don't use transparency anywhere you don't need it (e.g. when the background is actually a solid color), and always specify the correct value for the "opaque" property to help the system more efficiently calculate the drawing.
Third, find ways to minimize the size of transparent views. For example, for a large border view with transparent elements (e.g. rounded corners), consider splitting the view into 3 (top, middle, bottom) or 7 (4 corners, top middle, middle, bottom middle) parts, keeping the transparent portions as small as possible and marking the rectangular portions as opaque, with solid backgrounds.
Fourth, in situations where you're drawing lots of text in scroll views (e.g. a highly customized UITableViewCell), consider using the "drawRect:" method to render these portions more efficiently. Continue using subviews for image elements, in order to split the render time for the overall view between pre-drawing (subviews) and "just-in-time" drawing (drawRect:). Obviously, experimentation (frames per second while scrolling) could show that violating this rule of thumb may be optimal for your particular views.
Finally, making sure you have plenty of time to experiment using the profiling tools (especially CoreAnimation) is key. I find that it's easiest to see improvements using the slowest device you want to target, and the results look great on newer devices.
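Here is a minimal sketch of the first recommendation, baking the corner curve into the image once at load time (RoundedImage is an illustrative helper):

    #import <UIKit/UIKit.h>

    // Render the image through a rounded-rect clip once, then display the
    // result in a plain UIImageView, with no cornerRadius or masksToBounds
    // doing work at scroll time.
    static UIImage *RoundedImage(UIImage *image, CGFloat radius) {
        UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
        CGRect bounds = (CGRect){CGPointZero, image.size};
        [[UIBezierPath bezierPathWithRoundedRect:bounds cornerRadius:radius] addClip];
        [image drawInRect:bounds];
        UIImage *rounded = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        return rounded;
    }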
After watching WWDC videos and doing some experiments with Xcode and Instruments, I can say that blending is better than offscreen rendering. Blending means that the system requires some additional time to calculate the color of pixels on transparent layers. The more transparent layers you have (and the bigger these layers are), the more time blending takes.
Offscreen rendering means that the system will make more than one rendering iteration. In the first iteration, the system renders without visualization just to calculate the bounds and shape of the area that should be rendered. In subsequent iterations, the system does regular rendering (depending on the calculated shape), including blending if required.
Also, for offscreen rendering the system creates a separate graphics context and destroys it after rendering.
So you should avoid offscreen rendering; it's better to replace it with blending.

Custom UIView free rotation too slow

Programming for iOS, I have a composite custom view consisting of many UIViews. Some UIViews in this composite are responsible for drawing a shadow, and others for some custom shading. The shadow and shading need to be redrawn upon rotation recognized by a UIRotationGestureRecognizer. However, the speed of the rotation is far from satisfactory. When I commented out setNeedsDisplay, the rotation speed is fine. However, if I do call setNeedsDisplay, the rotation still lags significantly, even when I comment out everything in all the drawRect: implementations for the shadow and shading views.
Are there any recommendations to speed things up?
I can think of one possible solution: make sure the system calls drawRect: less often during the rotation. But I don't know how to do this, nor do I know whether it's the best solution. Any suggestion appreciated. Thanks.
Calling setNeedsDisplay too often, especially every frame, will always be slow: the drawRect: it triggers runs on the CPU, not the GPU. Don't redraw views during rotation and zooming. Wait until the end of the animation, then call setNeedsDisplay to render the final position.
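A minimal sketch of that approach, assuming a UIRotationGestureRecognizer drives the transform (compositeView, shadowView, and shadingView are hypothetical properties):

    - (void)handleRotation:(UIRotationGestureRecognizer *)gesture {
        // While the gesture is active, only update the transform; this is
        // composited on the GPU and never calls drawRect:.
        self.compositeView.transform =
            CGAffineTransformRotate(self.compositeView.transform, gesture.rotation);
        gesture.rotation = 0.0;

        if (gesture.state == UIGestureRecognizerStateEnded ||
            gesture.state == UIGestureRecognizerStateCancelled) {
            // Redraw the expensive shadow and shading once, at the final angle.
            [self.shadowView setNeedsDisplay];
            [self.shadingView setNeedsDisplay];
        }
    }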
Take a look at how various UIKit views handle large animations:
While MapKit is zooming, the map image scales and looks blurry. Once the zoom gesture stops, it renders a new image at that scale. (In this case the image is downloaded from the internet, but it still illustrates the concept.)
The ZoomingPDF sample code (see the Apple developer docs) shows how zooming on a PDF isn't rendered in real time, but only after the zooming finishes.
Hope this helps.
