I'm drawing triangles and they're invisible from the side they should be visible from, so how can I flip their facing direction?
Thanks
Invert the vertex order in the vertex or index buffer, or change the backface culling settings, e.g. set the CullMode of the RasterizerState to CullMode.None.
Additionally, make sure that there are no problems with lighting that make your triangles black / invisible.
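In XNA the relevant settings look roughly like this (a sketch; in XNA 4.0 culling lives on RasterizerState, while 3.1 exposes it through RenderState):

```csharp
// XNA 4.0: assign one of the built-in rasterizer states to disable culling ...
GraphicsDevice.RasterizerState = RasterizerState.CullNone;

// ... or flip which winding gets culled instead of reordering your vertices:
GraphicsDevice.RasterizerState = RasterizerState.CullClockwise;

// XNA 3.1 equivalent:
GraphicsDevice.RenderState.CullMode = CullMode.None;
```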
I am trying to learn WebGL and would like to have shader that gives a mesh a gradient effect from top to bottom. For example, the bottom of a ball or wall having no blue color and the top having all blue. I know I need to modify the fragment color with the y component of gl_Position but my implementations have thus far given me a black screen. Any help would be appreciated.
Are you sure the fragments are actually being drawn on screen (try disabling culling and the depth test), and that there are no GL errors? If so, the only likely issue is the alpha value with blending enabled. Try disabling GL_BLEND, or setting alpha to 1.0 as below, with RGB set to your colors:
gl_FragColor = vec4(R,G,B, 1.0);
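For the gradient itself: gl_Position can't be read in the fragment shader, so pass the height down as a varying instead. A minimal sketch (uMinY/uMaxY are assumed uniforms giving the mesh's height range):

```glsl
// Vertex shader: pass the model-space height to the fragment shader.
attribute vec4 aPosition;
uniform mat4 uMVP;
varying float vHeight;
void main() {
    gl_Position = uMVP * aPosition;
    vHeight = aPosition.y;   // model-space y, not gl_Position.y
}
```

```glsl
// Fragment shader: map height into [0,1] and use it as the blue channel.
precision mediump float;
varying float vHeight;
uniform float uMinY;   // assumed: bottom of the mesh
uniform float uMaxY;   // assumed: top of the mesh
void main() {
    float t = clamp((vHeight - uMinY) / (uMaxY - uMinY), 0.0, 1.0);
    gl_FragColor = vec4(0.0, 0.0, t, 1.0);   // alpha pinned to 1.0
}
```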
I'm developing a 3D game for iOS with openGL ES2.
The 3D sprites should be semi-transparent, with an alpha of about 0.5, so the background shows through.
The problem is that I want the back side of the 3D sprites to be completely invisible. In other words, I want to see only the front side of the sprite (just as it would appear with alpha = 1) but with the background visible through it.
Is there any blend function or some shader setting to obtain this effect?
Presumably your sprites are textured onto geometry (quads drawn using triangles or triangle strips)? All you need to do is enable face culling:
glEnable(GL_CULL_FACE);
This will prevent drawing the "back" side of any polygon well before it gets to the blending stage of the graphics pipeline -- so you get a performance win in addition to the visual effect you're after.
You do need to make sure that your "front" and "back" sides are defined consistently, though. By default, OpenGL considers any polygon whose vertices are in counter-clockwise order to be front-facing (and vice versa). If enabling face culling makes all your sprites disappear, it's because their vertices are in clockwise order. Either reorder your vertices, or tell OpenGL that they're all backwards with glFrontFace(GL_CW).
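Put together, the one-time state setup might look like this (a sketch for GL ES 2.0; the blend function is an assumption for straight-alpha sprites):

```c
glEnable(GL_CULL_FACE);    /* discard back faces before blending happens */
glCullFace(GL_BACK);       /* the default, shown for clarity */
glFrontFace(GL_CCW);       /* or GL_CW if your winding is reversed */

glEnable(GL_BLEND);
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
```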
I'm building a thick line from triangles. The problem I'm having is that when the curve is semi-transparent and some triangles overlap, I get the effect in the picture. I would like the triangles' alpha values not to be added together.
I'm using this blend function:
glBlendFuncSeparate(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA, GL_CONSTANT_ALPHA, GL_CONSTANT_ALPHA);
You could render the curve to a separate render target with full opacity and then draw that target with your custom alpha. Otherwise you need to avoid the overlap.
You can use the stencil test to block fragments at pixels that have already been drawn, which prevents the blending in the first place.
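A sketch of that stencil setup (assumes the framebuffer has a stencil attachment): the first fragment at a pixel passes the test and increments the stencil value, so any later overlapping fragment fails the test and never reaches blending.

```c
glClear(GL_STENCIL_BUFFER_BIT);
glEnable(GL_STENCIL_TEST);
glStencilFunc(GL_EQUAL, 0, 0xFF);        /* pass only where nothing drawn yet */
glStencilOp(GL_KEEP, GL_KEEP, GL_INCR);  /* on pass, mark the pixel as drawn  */

/* draw the curve's triangles here; overlaps fail the stencil test */

glDisable(GL_STENCIL_TEST);
```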
I am rendering point sprites (using OpenGL ES 2.0 on iOS) as a user's drawing strokes. I am storing these points in vertex buffer objects such that I need to perform depth testing in order for the sprites to appear in the correct order when they're submitted for drawing.
I'm seeing an odd effect when rendering these drawing strokes, as shown by the following screenshot:
Note the background-coloured 'border' around the edge of the blue stroke, where it is drawn over the green. The user drew the blue stroke after the green stroke, but when the VBOs are redrawn the blue stroke gets drawn first. When it comes to drawing the green stroke, depth testing kicks in and sees that it should be behind the blue stroke, and handles this with some success. It appears to me to be some kind of blending issue, or something to do with incorrectly calculating the colour in the fragment shader? The edges of all strokes should be transparent, however it appears that the fragment shader combines them with the background texture when processing those fragments.
In my app I have created a depth renderbuffer and called glEnable(GL_DEPTH_TEST) using glDepthFunc(GL_LEQUAL). I have experimented with glDepthMask() to no avail. Blending is set to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), and the point sprite colour uses premultiplied alpha values. The drawing routine is very simple:
Bind render-to-texture FBO.
Draw background texture.
Draw point sprites (from a number of VBOs).
Draw this FBO's texture to the main framebuffer.
Present the main framebuffer.
EDIT
Here is some code from the drawing routine.
Setup state prior to drawing:
glDisable(GL_DITHER);
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);
Drawing routine:
[drawingView setFramebuffer:drawingView.scratchFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:[self textureForBackgroundType:self.backgroundType]];
glUseProgram(programs[PROGRAM_POINT_SPRITE]);
// ...
// Draw all VBOs containing point sprite data
// ...
[drawingView setFramebuffer:drawingView.defaultFramebuffer andClear:YES];
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:drawingView.scratchTexture];
[drawingView presentFramebuffer:drawingView.defaultFramebuffer];
Thanks for any help.
If you want to draw non-opaque geometry you have to z-sort it from back to front. This has been the only way to get proper blending for many years. These days there are some algorithms for order-independent transparency, like dual depth peeling, but they are not applicable on iOS.
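The CPU-side sort can be as simple as the sketch below (the struct and field names are illustrative, not from the question; "depth" stands for view-space distance from the camera):

```c
#include <stdlib.h>

/* Illustrative per-object record: what you'd sort before issuing draws. */
typedef struct { float depth; int id; } Sprite;

/* Comparator for qsort: larger depth (farther away) sorts first. */
static int farther_first(const void *a, const void *b) {
    float da = ((const Sprite *)a)->depth;
    float db = ((const Sprite *)b)->depth;
    return (da < db) - (da > db);   /* descending order by depth */
}

/* Sort transparent objects back to front, then draw them in this order. */
void sort_back_to_front(Sprite *sprites, size_t count) {
    qsort(sprites, count, sizeof(Sprite), farther_first);
}
```

With the objects in this order, each stroke blends over everything behind it, so no depth-test artifacts appear at the transparent edges.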
I am writing a simple hex engine for an action RPG in XNA 3.1. I want to light the ground near the hero and torches just as they were lit in Diablo II. I thought the best way to do so was to calculate the field of view, hide any tiles and their contents that the player can't see, and draw a special "light" texture on top of each light source: a texture that is black with a white, blurred circle in its center.
I wanted to multiply this texture with the background (as in the "multiply" blending mode), but unfortunately I do not see an option for doing that in SpriteBatch. Could someone point me in the right direction?
Or perhaps there is another, better way to achieve a lighting model like the one in Diablo II?
If you were to multiply your light texture with the scene, you would darken the area, not brighten it.
You could try rendering with additive blending; this won't quite look right, but it is easy and may be acceptable. You will have to draw your light with a fairly low alpha so the light texture doesn't oversaturate that part of the image.
Another, more complicated, way of doing lighting is to draw all of your light textures (for all the lights in the scene) additively onto a second render target, and then multiply this texture with your scene. This should give much more realistic lighting, but has a larger performance overhead and is more complex.
Initialisation:
RenderTarget2D lightBuffer = new RenderTarget2D(graphicsDevice, screenWidth, screenHeight, 1, SurfaceFormat.Color);
Color ambientLight = new Color(0.3f, 0.3f, 0.3f, 1.0f);
Draw:
// set the render target and clear it to the ambient lighting
graphicsDevice.SetRenderTarget(0, lightBuffer);
graphicsDevice.Clear(ambientLight);
// additively draw all of the lights onto this texture. The lights can be coloured etc.
spriteBatch.Begin(SpriteBlendMode.Additive);
foreach (var light in lights)
spriteBatch.Draw(lightFadeOffTexture, light.Area, light.Color);
spriteBatch.End();
// change render target back to the back buffer, so we are back to drawing onto the screen
graphicsDevice.SetRenderTarget(0, null);
// draw the old, non-lit, scene
DrawScene();
// multiply the light buffer texture with the scene
spriteBatch.Begin(SpriteBlendMode.Additive, SpriteSortMode.Immediate, SaveStateMode.None);
graphicsDevice.RenderState.SourceBlend = Blend.Zero;
graphicsDevice.RenderState.DestinationBlend = Blend.SourceColor;
spriteBatch.Draw(lightBuffer.GetTexture(), new Rectangle(0, 0, screenWidth, screenHeight), Color.White);
spriteBatch.End();
As far as I know there is no way to do this without using your own custom shaders.
A custom shader for this would work like so:
Render your scene to a texture
Render your lights to another texture
As a post process on a blank quad, sample the two textures and the result is Scene Texture * Light Texture.
This will output a lit scene, but it won't do any shadows. If you want shadows I'd suggest following this excellent sample from Catalin Zima.
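The multiply in step 3 can be sketched as an HLSL pixel shader (shader model 2.0; sampler and function names are illustrative):

```hlsl
// Post-process: modulate the scene by the accumulated light buffer.
sampler SceneSampler : register(s0);   // the rendered scene texture
sampler LightSampler : register(s1);   // the additively drawn light texture

float4 PixelShaderFunction(float2 texCoord : TEXCOORD0) : COLOR0
{
    float4 scene = tex2D(SceneSampler, texCoord);
    float4 light = tex2D(LightSampler, texCoord);
    return scene * light;   // lit scene = scene colour * light colour
}

technique LitScene
{
    pass P0 { PixelShader = compile ps_2_0 PixelShaderFunction(); }
}
```

When drawing the full-screen quad with SpriteBatch, s0 is the texture you pass to Draw; the light texture would go in the second slot via GraphicsDevice.Textures[1].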
Perhaps using the same technique as in the BloomEffect component could be an idea.
Basically what the effect does is grab the rendered scene, calculate a bloom image from the brightest areas in the scene, then blur and combine the two. The result is highlighting areas depending on color.
The same approach could be used here. It will be simpler since you won't have to calculate the bloom image based on the background, only based on the position of the character.
You could even reuse this further to provide highlighting for other light sources as well, such as torches, magic effects and whatnot.