OpenGL ES: Drawing small objects - iOS

To best illustrate the issue I'm having, I created a short screen grab. Watch it here: http://cl.ly/1o3p3x2e2J1a1d3d2N1Q
Basically, as the stars are animated across the screen from right to left, they dim and brighten on their own. This isn't intentional. When you zoom in, the issue disappears.
My hunch is that this has to do with the size of the objects being drawn and the pixel boundaries. Is this correct? What is the best way to go about fixing this issue?
Thanks!
---Edit---
Here's how I'm loading the texture: http://pastebin.com/RDc8x7Te
And, here's how I'm setting up OpenGL ES: http://pastebin.com/SpvAqPqA

You use nearest filtering for one direction of texture scaling and linear for the other, and neither is very accurate on its own. You might want to use linear filtering for both, or build mipmaps. Also, if you are using an orthographic view, try aligning your geometry to pixel boundaries.

Related

SpriteKit SKTileMapNode vertical line glitch

I am making a 2D platformer and I decided to use multiple tile map nodes as my backgrounds. Even with one tile map, I get these vertical or horizontal lines that appear and disappear when I'm moving the player around the screen. See image below:
My tiles are 256x256 and I'm storing them in a tile set .sks file. I'm not exactly sure why I'm getting this or how to get rid of it, and it's quite annoying. I'm wondering if others experience this as well.
I'm considering not using the tile maps, but I would prefer to use them if I can.
Thanks for any help with this!!!
I had the same issue and was able to solve it by "extruding" the tiled image a couple of pixels. This provides a little cushion of pixels to use when the floating-point issue occurs, instead of displaying nothing (hence the gap). This video sums it up pretty well:
Unity: extruding tile map images
If you're using TexturePacker to generate your sprite atlases, there is an option to add this automatically without having to do it to your tile images yourself.
Hope that helps!
Sort of like the "extruding" suggested by @cheaze, I simply make the tile size in the drawing code a tiny amount larger than the required tile size. This means the assets themselves do not have to be changed.
E.g., if your assets are sized 256 x 256 and all of your calculations are based on that, draw the textures as 256.02 x 256.02 pixels in size:
[SKSpriteNode spriteNodeWithTexture:texture size:CGSizeMake(256.02, 256.02)];
Adding only 0.02 of a pixel per side will overlap your tiles automatically and remove the line glitches, depending on your camera speed and frame rate.
If the problem is really bad, you can even go so far as to add half a pixel (+0.5) or an entire pixel to remove the glitches, and the user will not be able to see the difference (since a one-pixel difference on a retina screen is hard to distinguish).

Cross-viewport occlusion culling

I'm currently working on an XNA project where I need to create a picture-in-picture overlay to display a 3D scene from multiple angles. Currently I'm trying to use two viewports to do this. The main one fills the whole screen and is working as desired. The second is placed in one of the corners of the first (overlapping that corner) and is less than a fifth of the size of the first. Apart from the size and placement of the viewports, the only real difference between the two is the placement of their cameras within the scene.
As long as the second viewport is drawn second and there are no objects close to the first viewport's camera in the overlapping corner, this actually works great. However, if there is an object close to the camera and in that corner of the first viewport, objects seem to be occlusion culled by the first viewport's object. The occluding object from the first viewport is not shown in the second viewport's space, though.
My question is: how would one prevent this "cross-viewport culling" from happening? I've searched all over, and the closest threads I could find suggest drawing the second viewport to a RenderTarget2D and using a SpriteBatch to display the resulting texture. Though doing so does fix the occlusion issue, it wreaks havoc on the z-ordering, CCW culling, and my water effects, none of which I've ever had issues with using the default render target.
This issue, at least in my case, seems to have been resolved by simply clearing the depth buffer between drawing the two viewports. I did this by adding the following line between the function calls that draw the individual viewports:
GraphicsDevice.Clear(ClearOptions.DepthBuffer, Color.Black, 1, 0);
I ran into this solution while trying to use the stencil buffer to prevent the main viewport from drawing under the second, but nothing I tried before this discovery had a noticeable effect. After I removed all the stencil code, it looks like I'm getting the desired effect. Sorry, but I can't really explain how or why this works without messing up the first viewport, because I have no clue myself.
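For context, here is a minimal sketch of where that clear sits in a Draw pass. The viewport, camera, and DrawScene names are placeholders I'm assuming for illustration, not code from the original post:
// Inside your Game subclass (usual Microsoft.Xna.Framework and
// Microsoft.Xna.Framework.Graphics usings assumed).
// mainViewport/overlayViewport and the camera/DrawScene helpers
// are hypothetical placeholders.
protected override void Draw(GameTime gameTime)
{
    GraphicsDevice.Clear(Color.Black);

    GraphicsDevice.Viewport = mainViewport;
    DrawScene(mainCamera);

    // Reset only the depth buffer so geometry from the main pass
    // cannot occlude anything drawn in the overlay pass.
    GraphicsDevice.Clear(ClearOptions.DepthBuffer, Color.Black, 1, 0);

    GraphicsDevice.Viewport = overlayViewport;
    DrawScene(overlayCamera);

    base.Draw(gameTime);
}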

Can I draw and animate vectors in iOS without resorting to bitmap images?

Can I do this?
My question arises from needing a button that animates when the user touches it.
This animation has been made with a set of 30 PNG images (half a second of animation at 60 FPS). That totals 60 images for regular and retina screens. It works quite well this way, but I'm not happy about it.
My goals are:
1 - Drastically reduce the size of my app (e.g. my background is a 400 KB PNG file, but with Quartz I can draw it with a dozen lines of code).
2 - Do it with a perfect, smooth animation that is as light on the CPU/GPU as I can make it.
So, is there any way I can do this?
I have the images in pure vector form, and I can draw them with Quartz, but I can't animate them without having to redraw everything for every frame. (Well, the animation is a "two-way street"; it's the coming back that would be problematic to redraw.)
Are there any APIs/Frameworks that would help me do this? How would I go about it?
Thank you!
Take a look at CAShapeLayer. Its path property is animatable. For an animation to look good, it's important that the from and to shapes in the animation have the same number of points. So depending on your shapes, this might or might not work.

WebGL z-buffer artifacts?

We are working on a Three.js-based WebGL project and have trouble understanding how transparency is handled in WebGL. The image shows a double-sided surface drawn with alpha = 0.7, which behaves correctly on its right side. However, closer to the middle, strange artifacts appear, and on the left side the transparency does not seem to work at all.
http://emilaxelsson.se/sandbox/vis1/alpha.png
The problem can also be seen here:
http://emilaxelsson.se/sandbox/vis1/
Has anyone seen anything similar before? What could the reason be?
Your problem is that transparent objects need to be sorted and rendered in back-to-front order (if you change the opacity of your mesh from 0.7 (transparent) to 1.0 (opaque), you can see that the z-buffer works just fine).
See:
http://www.opengl.org/wiki/Transparency_Sorting
http://www.opengl.org/archives/resources/faq/technical/transparency.htm (15.050)
In your case it might be less trivial to solve, since I assume that you only have one mesh.
Edit: just to summarize the discussion below: it is possible to achieve correct rendering of such a double-sided transparent mesh. To do this, you need to create six versions of the mesh, corresponding to the six sides of a cube. Each version needs to be sorted in back-to-front order based on its 'side of the cube' (front, back, left, right, top, bottom).
When rendering choose the correct mesh (based on the camera viewing direction) and render that single mesh.
The easy solution for your case (based on the picture you attached), without resorting to expensive sorting and multiple meshes, is to disable the depth test and enable face culling. That produces acceptable results if you do not have any opaque objects in front of the mesh.

Xna game development - Game background issue

I'm starting with XNA and I need some advice about the following.
I have a .jpg file with my spaceship game's background, with the following size:
width: 5000px
height: 4800px
When I try to load the texture, I get the following error:
Texture width or height is larger than the device supports
What is the most common technique for moving the background at the same time that the ship is moving?
Thanks a lot.
Kind Regards.
Josema.
One way would be to separate your image into smaller tiles and draw the visible ones.
However, this technique suffers from a problem when bilinear sampling is used, because colors bleed from one side of a texture to the other. You can probably compensate by disabling texture WRAP sampling (clamp instead) or by borrowing a single line of pixels from the neighboring tiles.
For example, if you want 256x256 textures, you would only display 255x255 of each tile, because one line of pixels (on the right and bottom) is a copy from the tiles next to it.
Hope it makes sense, otherwise I'll have to paint a picture :-)
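Instead of a picture, here is a rough sketch of the tiling idea in XNA code. The tiles array, tile size, camera position, and view size variables are assumptions for illustration, not from the original question:
// Inside Draw(), with the usual System and Microsoft.Xna.Framework usings.
// 'tiles' is a 2D array of 256x256 Texture2D tiles, 'cameraPosition' is the
// top-left corner of the view in world pixels, and 'viewWidth'/'viewHeight'
// are the screen size; all of these are hypothetical placeholders.
const int tileSize = 256;
int firstCol = Math.Max(0, (int)(cameraPosition.X / tileSize));
int firstRow = Math.Max(0, (int)(cameraPosition.Y / tileSize));
int lastCol = Math.Min(tiles.GetLength(0) - 1, (int)((cameraPosition.X + viewWidth) / tileSize));
int lastRow = Math.Min(tiles.GetLength(1) - 1, (int)((cameraPosition.Y + viewHeight) / tileSize));

// LinearClamp sampling avoids the edge-bleeding problem described above.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.Opaque, SamplerState.LinearClamp, null, null);
for (int x = firstCol; x <= lastCol; x++)
    for (int y = firstRow; y <= lastRow; y++)
        spriteBatch.Draw(tiles[x, y], new Vector2(x * tileSize, y * tileSize) - cameraPosition, Color.White);
spriteBatch.End();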
The texture size limit is determined by the graphics card, I believe.
You want to break the texture down to smaller images.
Try something like this. He's tiling a simple 40x40 image, but you might use it as a guideline for how to tile yours:
http://forums.xna.com/forums/p/19835/103704.aspx
To move the background at the same time that your ship is moving, you can implement a camera (there's a minimal sketch after the links below).
The following links might help:
http://adambruenderman.wordpress.com/2011/04/05/create-a-2d-camera-in-xna-gs-4-0/
http://www.dreamincode.net/forums/topic/237979-2d-camera-in-xna/
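The core of such a 2D camera is just a transform matrix passed to SpriteBatch.Begin. A minimal sketch, assuming a 'shipPosition' the camera should follow and 'screenWidth'/'screenHeight' variables (all placeholders, not from the question):
// Build a view matrix that keeps the ship centered on the screen.
Matrix cameraTransform =
    Matrix.CreateTranslation(-shipPosition.X, -shipPosition.Y, 0f) *
    Matrix.CreateTranslation(screenWidth / 2f, screenHeight / 2f, 0f);

// Everything drawn inside this batch is specified in world coordinates;
// the matrix shifts it all so the view follows the ship.
spriteBatch.Begin(SpriteSortMode.Deferred, BlendState.AlphaBlend,
    null, null, null, null, cameraTransform);
// ... draw background tiles and the ship here ...
spriteBatch.End();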
