We have an application that uses CATiledLayer to render a large and complex vector drawing to the display. Scrolling and zooming performance is quite good. We are having an issue when the image is zoomed in and the user is editing the vector items: they are able to select an item and modify it by moving it or adjusting its control points. We only update the visible area of the screen. Using Instruments, I can tell that our rendering code is properly clipping and is only called when the backing layers are updated. Memory overhead is low and our drawing routine is fast. The issue is in the screen update: we can see tearing as the CATiledLayer updates the tiles.
Does anyone know of a way to synchronize the drawing of all the onscreen tiles so they are blitted at the same time? If not, has anyone come up with any techniques to deal with this type of issue?
Thanks in advance for any advice or input!
I'm using openseadragon with the excellent svg overlay plugin.
On Chrome, the app behaves as expected: users can tap to zoom in until a table rendered in SVG is fully visible and the note on the table is legible.
Here's the link to the demo. Zoom out to see the SVG version of the table appear, overlaying the fuzzy raster version of the background.
On Safari on iOS or OS X, when zooming past a seemingly arbitrary threshold, the table and everything on it start to disappear. The point of disappearance seems to depend on other factors I don't understand, hence this question for insight. For example, an orange circle drawn with two.js will disappear when the scale transform is precisely 51201 (at 51200 the circle is there). For the more complex table SVG, elements on the table will disappear at different scale levels, between ~23000 and 50000. Sometimes they'll disappear and then reappear upon a slight zoom in. Sometimes they'll disappear on zoom and then reappear as I pan around and the objects near the edge of the viewport.
IE 11 has a very similar issue.
Has anyone dealt with this before or solved it?
That's a really slick project!
In my experience, that kind of problem with SVG disappearing has to do with extreme amounts of zoom. The good news is you should be able to work around it by changing your viewport coordinates. By default the width of the image is a viewport value of 1, but you can set your image to be width 10,000 or some such, which will look exactly the same on the screen, but it means that the SVG thinks it's zoomed out a lot at first, so when you zoom in you can go a lot further.
If you're using two.js, another possible fix would be to switch over to canvas rendering and use https://github.com/altert/OpenSeadragonCanvasOverlay.
Btw, I'd love to share your project when it's done... please file a ticket at https://github.com/openseadragon/site-build/issues when you're ready and we can add it to http://openseadragon.github.io/examples/in-the-wild/.
In session 506 of Apple's WWDC 2012 videos, they showed a painting app built for high-performance drawing (so the frame rate never drops below 30).
I tried to replicate the code but got stuck at multiple points.
What I am looking for is a basic drawing app (lines, squares, circles, Bézier paths) which performs well even after hundreds of lines have been drawn.
The basic approach is to save the drawn lines (or circles, Bézier paths, etc.) to an image after a certain number of them have been drawn, and then only refresh the new drawings, so that you don't have to redraw all the already-drawn lines.
But somehow I never achieve better performance. How do I need to implement this? Do I need multiple layers? And how do I arrange it so that not all layers in a view are redrawn, but only a certain sublayer?
If someone could provide me with a short example of a few lines drawn on a layer, then saving that layer to an image, and then drawing on top of that, I would really appreciate it.
Thank you for any help in recreating the iPaint application, which is unfortunately not available for download from Apple.
That is only half of the puzzle. The other half is to only refresh the minimum possible area of the view (via setNeedsDisplayInRect:). However, I have been through many different ways of drawing via Core Graphics. The caching is fine, but I don't use it anymore. I set the update rectangle as above, and then test each path before I stroke it (testing is fast, stroking is slow). If it is inside the update box, I stroke it, otherwise I ignore it.
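A minimal sketch of that approach, assuming the strokes live in a hypothetical `paths` array of UIBezierPath objects (and that the stroke color is set elsewhere):

```objc
// Inside a UIView subclass. `paths` is a hypothetical NSMutableArray
// property holding every stroke drawn so far, as UIBezierPath objects.
- (void)drawRect:(CGRect)rect {
    for (UIBezierPath *path in self.paths) {
        // Bounding-box tests are cheap; stroking is expensive. Skip any
        // path whose stroke-padded bounds miss the dirty rect entirely.
        CGRect strokeBounds = CGRectInset(path.bounds, -path.lineWidth, -path.lineWidth);
        if (CGRectIntersectsRect(strokeBounds, rect)) {
            [path stroke];
        }
    }
}

// When a path is added or moved, invalidate only the area it covers:
// [self setNeedsDisplayInRect:CGRectInset(path.bounds, -path.lineWidth, -path.lineWidth)];
```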
I did not look at that session, but a traditional Quartz speedup has been to use CGLayers (not CALayers). You can think of a CGLayer as a cached drawing which may or may not be a bitmap (the system decides how best to cache it). If you have a backing bitmap context, you can use that as your "image" and draw the CGLayers into that (and then discard the layers) as you see fit. Read up on CGLayer (it's in the Quartz documentation) and then see if this was what they talked about in that session.
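And a rough sketch of the CGLayer variant; the `cacheLayer` and `currentPath` properties are illustrative, not from the WWDC session:

```objc
// Composite cached strokes from a CGLayer, then draw only the live stroke.
- (void)drawRect:(CGRect)rect {
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    if (self.cacheLayer == NULL) {
        // Created once, sized to the view; Quartz decides how to cache it.
        self.cacheLayer = CGLayerCreateWithContext(ctx, self.bounds.size, NULL);
    }

    CGContextDrawLayerInRect(ctx, self.bounds, self.cacheLayer);
    [self.currentPath stroke]; // the stroke currently being drawn
}

// When a stroke is finished, bake it into the cache so drawRect:
// never has to re-stroke it again.
- (void)commitCurrentPath {
    CGContextRef layerCtx = CGLayerGetContext(self.cacheLayer);
    UIGraphicsPushContext(layerCtx); // make it the current UIKit context
    [self.currentPath stroke];
    UIGraphicsPopContext();
    self.currentPath = nil;
}
```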
I'm working on yet another drawing app with canvas that is many times bigger than screen.
I need some advice/direction on how to do that.
Basically what I want is to scroll around this big canvas, drawing only in the visible region.
I was thinking of two approaches:
Have 64x64 (or whatever) "tiles" to draw on, and then on scroll just load new tiles.
Record all user strokes (points) and, on scroll, calculate which fall in the specified region and draw them, using only a screen-size canvas.
If this matters, I'm using cocos2d for the prototype.
Forget the 2000x2000 limitation; I have an open source project that draws 18000 x 18000 NASA images.
I suggest you break this task into two parts. First, scrolling. As was suggested by CodaFi, when you scroll you will provide CATiledLayers. Each of those will be a CGImageRef that you create: a sub-image of your really huge canvas. You can then easily support zooming in and out.
The second part is interacting with the user to draw or otherwise affect the canvas. When the user stops scrolling, you create an opaque UIView subclass, which you add as a subview of your main view, overlaying the view hosting the CATiledLayers. At the moment you need to show this view, you populate it with the proper information so it can draw that portion of your larger canvas properly (say, a circle at such-and-such a point in such-and-such a color, etc.).
You would do your drawing using the drawRect: method of this overlay view. So as the user takes an action that changes the view, you call setNeedsDisplayInRect: as needed to force iOS to call your drawRect:.
When the user decides to scroll, you need to update your large canvas model with whatever changes the user has made, then remove the opaque overlay, and let the CATiledLayers draw the proper portions of the large image. This transition is probably the most tricky part of the process to avoid visual glitches.
Suppose you have a large array of object definitions for your canvas. When you need to create a CGImageRef for a tile, you scan through it looking for overlap between each object's frame and the tile's frame, and only then draw those items that are required for that tile.
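A rough sketch of that tile-generation step; `CanvasObject` and its `drawInContext:` method are hypothetical model types standing in for whatever your canvas model uses:

```objc
// Build one tile's CGImageRef by drawing only the objects that overlap it.
- (CGImageRef)newImageForTileRect:(CGRect)tileRect {
    UIGraphicsBeginImageContextWithOptions(tileRect.size, YES, 0);
    CGContextRef ctx = UIGraphicsGetCurrentContext();

    // Shift the context so canvas coordinates land inside this tile.
    CGContextTranslateCTM(ctx, -tileRect.origin.x, -tileRect.origin.y);

    for (CanvasObject *object in self.objects) {
        if (CGRectIntersectsRect(object.frame, tileRect)) {
            [object drawInContext:ctx]; // draw only what overlaps the tile
        }
    }

    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return CGImageRetain(image.CGImage); // transfer ownership to the caller
}
```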
Many mobile devices don't support textures over 2048x2048. So I would recommend:
make your big surface out of large 2048x2048 tiles
draw only the visible part of the currently visible tile to the screen
you will need to draw up to 4 tiles per frame, in case the user has scrolled to a corner of four tiles, but make sure you don't draw anything extra if there is only one visible tile.
This is probably the most efficient way. 64x64 tiles are really too small, and will be inefficient since there will be a large repeated overhead for the "draw tile" calls.
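A minimal sketch of the visible-tile math; `scrollOffset`, `viewportSize`, and `drawTileAtRow:col:` are illustrative names:

```objc
// Work out which 2048x2048 tiles intersect the viewport; at most four
// can be visible at once, and often it is just one.
static const CGFloat kTileSize = 2048.0f;

NSInteger firstCol = (NSInteger)floorf(scrollOffset.x / kTileSize);
NSInteger lastCol  = (NSInteger)floorf((scrollOffset.x + viewportSize.width  - 1) / kTileSize);
NSInteger firstRow = (NSInteger)floorf(scrollOffset.y / kTileSize);
NSInteger lastRow  = (NSInteger)floorf((scrollOffset.y + viewportSize.height - 1) / kTileSize);

for (NSInteger row = firstRow; row <= lastRow; row++) {
    for (NSInteger col = firstCol; col <= lastCol; col++) {
        [self drawTileAtRow:row col:col]; // blit one tile, clipped to the viewport
    }
}
```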
There is a tiling example in Apple's ScrollViewSuite sample code. It doesn't have anything to do with the drawing part, but it might give you some ideas about how to manage the tile side of things.
You can use CATiledLayer.
See WWDC2010 session 104
But for cocos2d, it might not work.
To best illustrate the issue I'm having, I created a short screen grab. Watch it here: http://cl.ly/1o3p3x2e2J1a1d3d2N1Q
Basically, the stars on the screen, as they're animated across the screen from right to left, are dimming and brightening on their own. I'm not intending for this to happen. When you zoom in, the issue disappears.
My hunch is that this has to do with the size of the objects being drawn and the pixel boundaries. Is this correct? What is the best way to go about fixing this issue?
Thanks!
---Edit---
Here's how I'm loading the texture: http://pastebin.com/RDc8x7Te
And, here's how I'm setting up OpenGL ES: http://pastebin.com/SpvAqPqA
You're using nearest and linear filtering for scaling textures, which are both not very accurate. You might want to use linear filtering for both minification and magnification, or build mipmaps. Also, if you're using an orthographic projection, try aligning your geometry to pixel boundaries.
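For instance, a minimal sketch of the mipmap route, assuming an OpenGL ES 2.0 context and that `starTexture` (a hypothetical texture name) is already loaded with power-of-two dimensions, as ES 2.0 mipmapping requires:

```objc
glBindTexture(GL_TEXTURE_2D, starTexture);
glGenerateMipmap(GL_TEXTURE_2D); // build the mipmap chain from the base level
// Trilinear filtering: blend between mipmap levels as the stars shrink,
// instead of snapping to the nearest texel and shimmering.
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR_MIPMAP_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
```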
Hello, this weekend I started to watch the 2011 WWDC videos, and I've found really interesting topics about iOS. My favorites were about performance and graphics, but I've found two of them apparently in contradiction. Of course there is something that I didn't get.
The sessions I'm talking about are Understanding UIKit Rendering (session 121) and Polishing Your App (session 105).
Unfortunately the sample code from 2011 is still not downloadable, so it's pretty hard to get an overall view.
In one session they explain that most of the time offscreen rendering should be avoided when displaying content in scroll views and the like. They fix the performance issues in the sample code by drawing almost everything inside the -drawRect: method.
In the other session, the performance issue (on a table view) seems to be due to too much code in the -drawRect: method of the table's cells.
First, it's not clear to me when offscreen rendering is required by the system. I've seen in the video that some Quartz features, such as cornerRadius, shadowOffset, and shadowColor, require it, but is there a general rule?
Second, I don't know if I understood correctly, but it seems that when there is no offscreen rendering, adding layers or views is the way to go.
I hope someone can shed some light on this.
Thanks,
Andrea
I don't think there is a rule written down anywhere, but hopefully this will help:
First, let's clear up some definitions. I think offscreen vs onscreen rendering is not the overriding concern most of the time, because offscreen rendering can be as fast as onscreen. The main issue is whether the rendering is done in hardware or software.
There is also very little practical difference between using layers and views. Views are just a thin wrapper around CALayer and they don't introduce a significant performance penalty most of the time. You can override the type of layer used by a view using the +layerClass method if you want to have a view backed by a CAShapeLayer or CATiledLayer, etc.
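For example, a minimal sketch of the +layerClass override; ShapeView is just an illustrative name:

```objc
#import <UIKit/UIKit.h>
#import <QuartzCore/QuartzCore.h>

// A UIView backed by a CAShapeLayer instead of a plain CALayer.
@interface ShapeView : UIView
- (CAShapeLayer *)shapeLayer;
@end

@implementation ShapeView

+ (Class)layerClass {
    return [CAShapeLayer class]; // UIKit now creates this class for the backing layer
}

- (CAShapeLayer *)shapeLayer {
    return (CAShapeLayer *)self.layer;
}

@end

// Usage: the path is rendered by Core Animation, with no drawRect: involved.
// shapeView.shapeLayer.path =
//     [UIBezierPath bezierPathWithOvalInRect:shapeView.bounds].CGPath;
```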
Generally, on iOS, pixel effects and Quartz / Core Graphics drawing are not hardware accelerated, and most other things are.
The following things are not hardware accelerated, which means that they need to be done in software (offscreen):
Anything done in a drawRect. If your view has a drawRect, even an empty one, the drawing is not done in hardware, and there is a performance penalty.
Any layer with the shouldRasterize property set to YES.
Any layer with a mask or drop shadow.
Text (any kind, including UILabels, CATextLayers, Core Text, etc).
Any drawing you do yourself (either onscreen or offscreen) using a CGContext.
Most other things are hardware accelerated, so they are much faster. However, this may not mean what you think it does.
Any of the above types of drawing are slow compared to hardware accelerated drawing, however they don't necessarily slow down your app because they don't need to happen every frame. For example, drawing a drop shadow on a view is slow the first time, but after it is drawn it is cached, and is only redrawn if the view changes size or shape.
The same goes for rasterised views or views with a custom drawRect: the view typically isn't redrawn every frame, it is drawn once and then cached, so the performance after the view is first set up is no worse, unless the bounds change or you call setNeedsDisplay on it.
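For instance, rasterizing a composited view is just two properties (a minimal sketch; if you skip the scale, the cache comes out blurry on Retina displays):

```objc
view.layer.shouldRasterize = YES;
// Rasterization happens in pixels, not points; match the screen scale.
view.layer.rasterizationScale = [UIScreen mainScreen].scale;
```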
For good performance, the trick is to avoid using software drawing for views that change every frame. For example, if you need an animated vector shape you'll get better performance using CAShapeLayer or OpenGL than drawRect and Core Graphics. But if you draw a shape once and then don't need to change it, it won't make much difference.
Similarly, don't put a drop shadow on an animated view because it will slow down your frame rate. But a shadow on a view that doesn't change from frame to frame won't have much negative impact.
Another thing to watch out for is slowing down the view setup time. For example, suppose you have a page of text with drop shadows on all the text; this will take a very long time to draw initially since both the text and shadows all need to be rendered in software, but once drawn it will be fast. You will therefore want to set up this view in advance when your application loads, and keep a copy of it in memory so that the user doesn't have to wait ages for the view to display when it first appears on screen.
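A rough sketch of that pre-rendering idea, with `complexView` and `cachedImageView` as illustrative names: flatten the expensive view into a UIImage once, then display the bitmap.

```objc
// Render the text-and-shadows view into a bitmap once, at load time.
UIGraphicsBeginImageContextWithOptions(complexView.bounds.size, NO, 0);
[complexView.layer renderInContext:UIGraphicsGetCurrentContext()];
UIImage *snapshot = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();

// Later, show the cached bitmap instead of re-rendering in software.
cachedImageView.image = snapshot;
```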
This is probably the reason for the apparent contradiction in the WWDC videos. For large, complex views that don't change every frame, drawing them once in software (after which they are cached and don't need to be redrawn) will yield better performance than having the hardware re-composite them every frame, even though it will be slower to draw the first time.
But for views that must be redrawn constantly, like table cells (the cells are recycled so they must be redrawn each time one cell scrolls offscreen and is re-used as it scrolls back onto the other side as a different row), software drawing may slow things down a lot.
Offscreen-rendering is one of the worst defined topics in iOS rendering, today. When Apple's UIKit engineers refer to offscreen-rendering, it has a very specific meaning, and a ton of third-party iOS dev blogs are getting it wrong.
When you override "drawRect:", you're drawing via the CPU and spitting out a bitmap. The bitmap is packaged up and sent to a separate process that lives in iOS, the render server. Ideally, the render server just displays the data on screen.
If you fiddle with properties on CALayer, like turning on drop shadows, the GPU will perform additional drawing. This additional work is what UIKit engineers mean when they say "off-screen rendering." This is always performed with hardware.
The issue with off-screen drawing isn't necessarily the drawing. The off-screen pass requires a context switch, as the GPU switches its drawing destination. During this switch, the GPU is idle.
While I don't know a full list of properties that trigger an off-screen pass, you can diagnose this with the Core Animation Instrument's "Color Offscreen-rendered layer" toggle. I assume any property other than alpha is performed via an offscreen pass.
With early iOS hardware, it was reasonable to say "do everything in drawRect." Nowadays GPUs are better, and UIKit has features like shouldRasterize. Today, it's a balancing act between the time spent in drawRect, the number of off-screen passes, and the amount of blending. For the full details, watch the 2014 WWDC session 419, "Advanced Graphics and Animation for iOS Apps."
That all said, it's good to understand what's going on behind-the-scenes, and keep it in the back of your head so you don't do anything insane, but you should start from the simplest solution. Then test it on the slowest hardware you support. If you aren't hitting 60FPS, use Instruments to measure things and figure it out. There are a few possible bottlenecks, and if you aren't using data to diagnose things, you're just guessing.
Offscreen rendering / Rendering on the CPU
The biggest bottlenecks to graphics performance are offscreen rendering and blending – they can happen for every frame of the animation and can cause choppy scrolling.
Offscreen rendering (software rendering) happens when it is necessary to do the drawing in software (offscreen) before it can be handed over to the GPU. Hardware does not handle text rendering and advanced compositions with masks and shadows.
The following will trigger offscreen rendering:
Any layer with a mask (layer.mask)
Any layer with layer.masksToBounds / view.clipsToBounds set to true
Any layer with layer.allowsGroupOpacity set to YES and layer.opacity less than 1.0
Any layer with a drop shadow (layer.shadow*). Tips on how to fix: https://markpospesel.wordpress.com/tag/performance/ (see also the sketch after this list)
Any layer with layer.shouldRasterize set to true
Any layer with layer.cornerRadius, layer.edgeAntialiasingMask, or layer.allowsEdgeAntialiasing
Any layer with layer.borderWidth and layer.borderColor? (missing reference / proof)
Text (any kind, including UILabel, CATextLayer, Core Text, etc.)
Most of the drawing you do with CGContext in drawRect:. Even an empty implementation will be rendered offscreen.
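As one concrete fix from the list above, a drop shadow stops triggering an offscreen pass if you tell Core Animation its shape up front (a minimal sketch):

```objc
view.layer.shadowColor = [UIColor blackColor].CGColor;
view.layer.shadowOpacity = 0.5f;
view.layer.shadowOffset = CGSizeMake(0.0f, 2.0f);
// With an explicit shadowPath, the GPU no longer has to render the layer
// offscreen just to discover the shadow's outline.
view.layer.shadowPath = [UIBezierPath bezierPathWithRect:view.bounds].CGPath;
```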
This post covers blending and other things affecting performance: What triggers offscreen rendering, blending and layoutSubviews in iOS?