Is it possible to place an EAGLView object inside a UIViewController and make the EAGLView's background transparent, so that whatever is behind it shows through?
Thanks.
UPDATE:
I've tried what is suggested in this post, but the EAGLView's layer still appears as a black square. :(
Any idea why this is not working for me?
openGLView.opaque = NO;
Also make sure the alpha channel of your OpenGL framebuffer holds correct opacity values.
And note Tark's answer about the performance drop.
There is no such thing as an EAGLView, but it is definitely possible to have a CAEAGLLayer with a transparent background on top of UIKit content. It is not recommended, however. From the docs:
Because an OpenGL ES rendering surface is presented to the user using Core Animation, any effects and animations you apply to the layer affect the 3D content you render. However, for best performance, do the following:
Set the layer’s opaque attribute to TRUE.
Set the layer bounds to match the dimensions of the display.
Make sure the layer is not transformed.
Avoid drawing other layers on top of the CAEAGLLayer object. If you must draw other, non-OpenGL content, you might find the performance cost acceptable if you place transparent 2D content on top of the GL content and also make sure that the OpenGL content is opaque and not transformed.
When drawing landscape content on a portrait display, you should rotate the content yourself rather than using the CAEAGLLayer transform to rotate it.
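If you decide to do it anyway, here is a minimal sketch of the setup (the view subclass and method names are placeholders): the CAEAGLLayer needs a color format with an alpha channel, both the layer and the view must be non-opaque, and the GL clear color needs zero alpha.

+ (Class)layerClass {
    return [CAEAGLLayer class]; // back this UIView subclass with a CAEAGLLayer
}

- (void)configureLayer {
    CAEAGLLayer *eaglLayer = (CAEAGLLayer *)self.layer;
    eaglLayer.opaque = NO; // let UIKit content behind the layer show through
    eaglLayer.drawableProperties = @{
        kEAGLDrawablePropertyRetainedBacking : @NO,
        kEAGLDrawablePropertyColorFormat : kEAGLColorFormatRGBA8 // a color format with an alpha channel
    };
    self.opaque = NO;
}

// When rendering, clear with zero alpha so uncovered pixels stay transparent:
// glClearColor(0.0f, 0.0f, 0.0f, 0.0f);
// glClear(GL_COLOR_BUFFER_BIT);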
I'm trying to make a kind of old-looking app, so I want my UIViews to be rendered without anti-aliasing. In other words, I want my views to be pixelated, especially the view.layer.cornerRadius, since in this case I am able to lay out my views using Auto Layout.
This would be much easier than drawing different pixel art for different iPhone sizes. Moreover, if I used pixel art, I would have to create new artwork whenever a view is resized, as scaling the images would distort the pixel art.
The only thing I have found is view.layer.allowsEdgeAntialiasing, which is already set to false by default. I was also thinking of using Core Graphics Bézier paths to draw the pixelated shadows and corners; would this be a viable way to achieve what I want?
You can go down to the Core Graphics CGContext and friends, and specifically tell the graphics context to turn off anti-aliasing with CGContextSetShouldAntialias(context, false);
Here are two small screen grabs, one with no jaggies, and the other with jaggies.
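As a minimal sketch (assuming a UIView subclass that overrides drawRect:), that looks like this:

- (void)drawRect:(CGRect)rect {
    CGContextRef context = UIGraphicsGetCurrentContext();
    CGContextSetShouldAntialias(context, false); // hard, jagged edges instead of smoothing
    CGContextSetFillColorWithColor(context, [UIColor blackColor].CGColor);
    CGContextFillEllipseInRect(context, self.bounds); // the curved edge renders pixelated
}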
So I just discovered QuartzCore, and I am now considering replacing a UIImageView containing a bitmap with a UIView subclass that does things like
CGContextFillEllipseInRect(contextRef, rect);
They would look exactly the same: the bitmap is just a little filled circle.
The very view I'm replacing plays an important role in my app: it gets dragged around a lot.
Question: performance-wise, should I bother? I can imagine that the vector circle is recalculated all the time while the bitmap is just buffered, or alternatively that the vector is easier to digest than a bitmap.
Can anyone advise?
Thanks in advance.
All UIViews on iOS are layer-backed, so drawRect: will only be called once and you will draw into the CALayer backing the view. You can make it draw again by calling setNeedsDisplay. When you drag the view around, it renders from the layer backing rather than redrawing. A UIImageView is also layer-backed, so the end result is two layer-backed views. The one place where you may see a difference is in low-memory situations when the view is not visible (though I am not sure).
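For illustration, here is a minimal sketch of such a view (the class name CircleView is hypothetical). drawRect: runs once, the result is cached in the backing CALayer, and dragging simply moves that layer around:

@interface CircleView : UIView
@end

@implementation CircleView
- (void)drawRect:(CGRect)rect {
    CGContextRef contextRef = UIGraphicsGetCurrentContext();
    CGContextSetFillColorWithColor(contextRef, [UIColor redColor].CGColor);
    CGContextFillEllipseInRect(contextRef, self.bounds); // the same little filled circle as the bitmap
}
@end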
My iOS application draws into a bitmap (same size as my view) using Core Graphics. I want to push updated regions of the bitmap to the screen. (I've used the standard UIView drawRect method but I have some good reasons to switch to OpenGL).
I just want to replicate the same behavior as UIView/CALayer drawRect but in an OpenGL view. Essentially I would like to update dirty rectangles on my OpenGL view. Nothing more.
So far I've been able to create an OpenGL ES 1.1 view and push my entire bitmap on screen using a single quad (texture on a vertex array) for each update of my bitmap. Of course, this is pretty inefficient since I only need to refresh the dirty rectangle, not the whole view.
What would be the most efficient way to do that in OpenGL ES? Should I use a lattice of quads and update the texture of the quads that intersect with my dirty rectangle? (If I were to use that method, should I use VBO?) Is there a better way to do that?
FYI (just in case), I won't need rotation but will need to scale the entire OpenGL view.
UPDATE:
This method does indeed work. However, there's a bug in iOS 5.x on Retina display devices that produces an artifact when using single buffering. The problem has been fixed in iOS 6; I don't yet have a workaround.
You could simply update part of the texture using glTexSubImage2D and redraw your standard full-screen quad, but with the scissor rect set (glScissor) to the "dirty" part. GL will then not draw any fragments outside this rect.
For this to work, you must of course use single buffering.
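A minimal sketch of that approach (textureId, dirty, viewHeight, dirtyPixels, and the quad-drawing helper are placeholders): upload only the dirty pixels, then scissor the redraw to the same rectangle.

glBindTexture(GL_TEXTURE_2D, textureId);
glTexSubImage2D(GL_TEXTURE_2D, 0,
                dirty.origin.x, dirty.origin.y,         // offset into the texture
                dirty.size.width, dirty.size.height,
                GL_RGBA, GL_UNSIGNED_BYTE, dirtyPixels); // pixels of the dirty sub-rect only

glEnable(GL_SCISSOR_TEST);
glScissor(dirty.origin.x,
          viewHeight - dirty.origin.y - dirty.size.height, // GL's origin is bottom-left
          dirty.size.width, dirty.size.height);
drawFullScreenQuad(); // the usual textured quad; fragments outside the scissor rect are discarded
glDisable(GL_SCISSOR_TEST);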
I'm drawing a line using OpenGL on the iPhone. I used GLView and the GLKView delegate methods to draw the line. Now I want to set a background image on the GLKViewController and draw the line on top of that background image. I have tried setting a background image using a UIImageView, but the image appears on top of the line drawn in the GLKView.
How can I set a background image to the GLKViewController?
The easiest way would be to add an additional view with the background image (as you proposed, but you need to place it behind the OpenGL view). Try this:
[rootViewController.view insertSubview:backgroundView belowSubview:openGLView];
openGLView.opaque = NO;
But this will incur an FPS penalty (Apple recommends using an opaque OpenGL view for performance reasons). The more correct approach is to draw the background via OpenGL as a fullscreen quad.
I just want to state that the accepted answer is a very bad one... You will lose roughly 20 fps almost immediately, even on newer devices (iPhone 5, iPad 2 and newer), by turning opaque off. The performance penalty is severe and should not be taken lightly.
You can set the background in OpenGL and keep your view opaque: convert the UIImage to an OpenGL texture and render it. Depending on your OpenGL environment, this can be done easily in a number of ways.
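Since the question already uses GLKit, one easy route is GLKTextureLoader plus GLKBaseEffect. A minimal sketch (the image name is a placeholder, error handling omitted):

NSError *error = nil;
GLKTextureInfo *backgroundTexture =
    [GLKTextureLoader textureWithCGImage:[UIImage imageNamed:@"background"].CGImage
                                 options:nil
                                   error:&error];

GLKBaseEffect *effect = [[GLKBaseEffect alloc] init];
effect.texture2d0.name = backgroundTexture.name;
effect.texture2d0.enabled = GL_TRUE;
// In glkView:drawInRect:, call [effect prepareToDraw] and render a
// full-screen textured quad first, then draw the line on top of it.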
I'm attempting to optimize my app. It's quite visually rich, so it has a lot of layered UIViews with large images, blending, and so on.
I've been experimenting with the shouldRasterize property on CALayers. In one case in particular, I have a UIView that consists of lots of subviews, including a table. As part of a transition where the entire screen scrolls, this UIView also scales and rotates (using transforms).
The content of the UIView remains static, so I thought it would make sense to set view.layer.shouldRasterize = YES. However, I didn't see an increase in performance. Could it be that it's re-rasterizing every frame at the new scale and rotation? I was hoping that it would rasterize at the beginning when it has an identity transform matrix, and then cache that as it scales and rotates during the transition?
If not, is there a way I could force it to happen? Short of adding a redundant extra super-view/layer that does nothing but scale and rotate its rasterized contents...
You can answer your own question by profiling your application with the Core Animation instrument. Note that this instrument is only available when profiling on a device.
You can enable "Color Hits Green and Misses Red". If your layer stays red, it means it is indeed being re-rasterized every frame.
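For reference, a minimal sketch of the rasterization setup being profiled (the view name is a placeholder):

contentView.layer.shouldRasterize = YES;
contentView.layer.rasterizationScale = [UIScreen mainScreen].scale; // avoid blur on Retina displays
// If the instrument keeps showing red, the cached bitmap is being
// invalidated and re-rendered every frame instead of reused.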