Is it possible in OpenGL ES 2.0 to use GL_LINE_STRIP to draw a shape and then apply a texture to that shape?
For example, if I draw a triangle, can I then apply a triangle texture?
GL_LINE_STRIP draws only lines. If you want polygons filled in (whether with color, lighting, or texturing), you need one of the solid polygon modes: GL_TRIANGLES, GL_TRIANGLE_STRIP, or GL_TRIANGLE_FAN.
If you want to both fill and stroke your polygons, you'll need two draw calls, one with each mode. And if you're using depth testing, you'll probably want to look into glPolygonOffset to avoid z-fighting.
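A minimal sketch of the two-pass approach, assuming a buffer holding the three triangle vertices is bound and the attribute pointers are already set up (the offset values are illustrative; note that ES 2.0 can only offset filled primitives, so the fill is pushed back rather than the lines pulled forward):

glEnable(GL_POLYGON_OFFSET_FILL);   // ES 2.0 offsets filled primitives only
glPolygonOffset(1.0f, 1.0f);        // push the fill slightly away from the viewer
glDrawArrays(GL_TRIANGLES, 0, 3);   // filled/textured pass
glDisable(GL_POLYGON_OFFSET_FILL);
glDrawArrays(GL_LINE_LOOP, 0, 3);   // outline pass over the same vertices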
I'm looking for the most efficient way of drawing a 2-dimensional background in Metal. This requires rendering a textured rectangle.
The basic geometry example shows how to draw a triangle. Is there an easy and non-bloated way to draw a rectangle (a polygon with 4 corners)?
The Basic Texturing sample draws a textured rectangle.
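As a rough sketch of the idea (the variable names here are placeholders, not taken from the sample): drawn as a four-vertex triangle strip, a rectangle needs no index buffer at all.

// Hypothetical sketch: a unit quad as a 4-vertex triangle strip.
// Interleaved position (x, y) and texture coordinate (u, v).
static const float quad[] = {
    -1.0f, -1.0f,   0.0f, 1.0f,   // bottom left
     1.0f, -1.0f,   1.0f, 1.0f,   // bottom right
    -1.0f,  1.0f,   0.0f, 0.0f,   // top left
     1.0f,  1.0f,   1.0f, 0.0f,   // top right
};
// `encoder` is an already-configured id<MTLRenderCommandEncoder>.
[encoder setVertexBytes:quad length:sizeof(quad) atIndex:0];
[encoder drawPrimitives:MTLPrimitiveTypeTriangleStrip vertexStart:0 vertexCount:4];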
I know how to draw a line on a 2D surface, but I can't find a way to draw a line in space.
I have written a demo, and now I want to finish it by drawing a line in space.
I have already made the 2D surface rotate in space using CATransform3D, but I still don't know how to draw a line in space.
Thanks a lot.
Normal drawing on iOS is 2D. Core Animation is "2.5D", where it can draw flat images with fake 3D perspective. It doesn't let you "draw in space."
If you want real 3D perspective drawing you should use OpenGL, SceneKit, Metal, or some other 3D API.
You're trying to draw a 3D image on a 2D surface, so you need some sort of mapping.
https://en.m.wikipedia.org/wiki/3D_projection
This has some options for you; orthographic projection is probably what you want, though.
The equations you would want are b_x = s_x * a_x + c_x and b_y = s_z * a_z + c_z, where s is a scaling factor and c is an offset.
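A minimal sketch of that mapping in C (the names follow the Wikipedia article, not any iOS API):

typedef struct { float x, y, z; } Point3D;
typedef struct { float x, y; } Point2D;

// Orthographic projection parallel to the y axis (y as the forward/depth
// direction): drop a.y and map x and z into screen space with a scale and offset.
static Point2D orthoProject(Point3D a, float sx, float sz, float cx, float cz) {
    Point2D b;
    b.x = sx * a.x + cx;
    b.y = sz * a.z + cz;
    return b;
}

Projecting both endpoints of a 3D line segment gives you the two 2D points to pass to your existing line-drawing code.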
I am wondering if there is a way to draw a polygon on top of an MKMapView where each vertex in the polygon could potentially have a different color. The triangles of the polygon would correctly blend the colors together when rendered.
I know I can do this with OpenGL, but I would like to try CoreGraphics or a MapKit drawing function first, as I don't know how well OpenGL works on top of an MKMapView.
I know there is support to fill a polyline with a gradient. Is there a similar way to fill a polygon/triangle with multiple gradients to achieve the desired effect?
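For reference, a single gradient fill of a polygon in plain CoreGraphics looks roughly like this (ctx and the CGPoint values p0 through p2 are placeholders); true per-vertex colour interpolation across a triangle has no direct CoreGraphics equivalent, which is what makes the question hard:

CGContextSaveGState(ctx);
CGContextBeginPath(ctx);
CGContextMoveToPoint(ctx, p0.x, p0.y);    // p0..p2 are the triangle's vertices
CGContextAddLineToPoint(ctx, p1.x, p1.y);
CGContextAddLineToPoint(ctx, p2.x, p2.y);
CGContextClosePath(ctx);
CGContextClip(ctx);                       // restrict the gradient to the polygon

CGColorSpaceRef space = CGColorSpaceCreateDeviceRGB();
CGFloat components[] = { 1, 0, 0, 1,      // red at location 0.0
                         0, 0, 1, 1 };    // blue at location 1.0
CGFloat locations[] = { 0.0, 1.0 };
CGGradientRef gradient = CGGradientCreateWithColorComponents(space, components, locations, 2);
CGContextDrawLinearGradient(ctx, gradient, p0, p2, 0);
CGGradientRelease(gradient);
CGColorSpaceRelease(space);
CGContextRestoreGState(ctx);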
I'm trying to load an image into a texture, then draw into the texture.
Am I right in thinking that to display a texture you always need an array of vertices and texture coordinates? So, to draw a square texture I would need to draw 2 triangles to make a square and attach the texture to them?
Yes, that's right. For a square texture, the alternative is to use point sprites with GL_POINTS.
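A minimal sketch of that setup in ES 2.0, assuming positionAttrib and texCoordAttrib were obtained with glGetAttribLocation() and no VBO is bound (client-side arrays); a triangle strip yields the two triangles from only four vertices:

// Interleaved position (x, y) and texture coordinate (u, v).
static const GLfloat quad[] = {
    -1.0f, -1.0f,   0.0f, 1.0f,
     1.0f, -1.0f,   1.0f, 1.0f,
    -1.0f,  1.0f,   0.0f, 0.0f,
     1.0f,  1.0f,   1.0f, 0.0f,
};
const GLsizei stride = 4 * sizeof(GLfloat);
glVertexAttribPointer(positionAttrib, 2, GL_FLOAT, GL_FALSE, stride, quad);
glVertexAttribPointer(texCoordAttrib, 2, GL_FLOAT, GL_FALSE, stride, quad + 2);
glEnableVertexAttribArray(positionAttrib);
glEnableVertexAttribArray(texCoordAttrib);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);   // two triangles forming the square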
I am rendering point sprites (using OpenGL ES 2.0 on iOS) as a user's drawing strokes. I am storing these points in vertex buffer objects, which means I need to perform depth testing for the sprites to appear in the correct order when they're submitted for drawing.
I'm seeing an odd effect when rendering these drawing strokes, as shown by the following screenshot:
Note the background-coloured 'border' around the edge of the blue stroke where it is drawn over the green. The user drew the blue stroke after the green stroke, but when the VBOs are redrawn the blue stroke gets drawn first. When the green stroke is drawn, depth testing kicks in, sees that it should sit behind the blue stroke, and places it there, with some success. This looks to me like a blending issue, or perhaps an incorrectly calculated colour in the fragment shader. The edges of all strokes should be transparent, yet the fragment shader appears to blend them with the background texture when processing those fragments.
In my app I have created a depth renderbuffer, enabled depth testing with glEnable(GL_DEPTH_TEST), and set glDepthFunc(GL_LEQUAL). I have experimented with glDepthMask() to no avail. Blending is set to glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA), and the point sprite colours use premultiplied alpha values. The drawing routine is very simple:
Bind render-to-texture FBO.
Draw background texture.
Draw point sprites (from a number of VBOs).
Draw this FBO's texture to the main framebuffer.
Present the main framebuffer.
EDIT
Here is some code from the drawing routine.
Setup state prior to drawing:
glDisable(GL_DITHER);                          // dithering not needed for this pass
glEnable(GL_BLEND);
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);   // premultiplied-alpha blending
glEnable(GL_DEPTH_TEST);
glDepthFunc(GL_LEQUAL);                        // pass fragments at equal or nearer depth
Drawing routine:
[drawingView setFramebuffer:drawingView.scratchFramebuffer andClear:YES];   // bind render-to-texture FBO
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:[self textureForBackgroundType:self.backgroundType]];     // draw background texture
glUseProgram(programs[PROGRAM_POINT_SPRITE]);
// ...
// Draw all VBOs containing point sprite data
// ...
[drawingView setFramebuffer:drawingView.defaultFramebuffer andClear:YES];   // switch to main framebuffer
glUseProgram(programs[PROGRAM_TEXTURE]);
[self drawTexture:drawingView.scratchTexture];                              // composite the FBO's texture
[drawingView presentFramebuffer:drawingView.defaultFramebuffer];
Thanks for any help.
If you want to draw non-opaque geometry, you have to z-sort it from back to front. This has been the only way to get proper blending for many years. These days there are algorithms for order-independent transparency, such as dual depth peeling, but they are not applicable to iOS.
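As a rough sketch of what that sorting step could look like (Sprite, sprites, and spriteCount are hypothetical names, not taken from the code above), the point is just a qsort() on view-space depth before the points are uploaded and drawn:

#include <stdlib.h>   // for qsort()

typedef struct { float x, y, z; } Sprite;   // hypothetical per-point record

// Back to front: with the camera looking down -z, a more negative z is
// farther away, so sorting in ascending z order draws the farthest sprites first.
static int compareDepth(const void *a, const void *b) {
    float za = ((const Sprite *)a)->z;
    float zb = ((const Sprite *)b)->z;
    return (za < zb) ? -1 : (za > zb) ? 1 : 0;
}

// Sort, re-upload the VBO, then draw with depth testing still on but depth
// writes off (glDepthMask(GL_FALSE)) so blended edges don't occlude later sprites.
qsort(sprites, spriteCount, sizeof(Sprite), compareDepth);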