We are profiling our application and noticing that most of the CPU time is spent on calls to texImage2D, which is what we use to populate a texture. An example is shown below. I'd like to know whether there are faster methods available in WebGL 1/2, or proprietary browser extensions, that would speed this up.
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texImage2D(gl.TEXTURE_2D,
              0,         // level
              gl.R32F,   // internal format (sized, as WebGL2 requires for gl.RED/gl.FLOAT)
              width,
              height,
              0,         // border
              gl.RED,    // format
              gl.FLOAT,  // type
              data);
gl.bindTexture(gl.TEXTURE_2D, null);
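One approach worth trying (my suggestion, not something confirmed in this thread): allocate the texture storage once and overwrite it each frame with texSubImage2D, so the driver doesn't reallocate storage on every upload. A minimal WebGL2 sketch, reusing texture, width, height, and data from above (the uploadFrame helper is made up):

// One-time allocation of immutable R32F storage (WebGL2).
gl.bindTexture(gl.TEXTURE_2D, texture);
gl.texStorage2D(gl.TEXTURE_2D, 1, gl.R32F, width, height);

// Per-frame update: overwrite the existing storage instead of reallocating.
function uploadFrame(data) {
  gl.bindTexture(gl.TEXTURE_2D, texture);
  gl.texSubImage2D(gl.TEXTURE_2D, 0, 0, 0, width, height,
                   gl.RED, gl.FLOAT, data);
}

Whether this helps depends on the browser: part of the cost is the CPU-side copy of data into the GPU process, which texSubImage2D does not eliminate.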
Related
I am creating a framebuffer and attaching a texture to it. Here is the texture that I would like to attach (but it is not working):
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R32F, sphere_texture.width, sphere_texture.height, 0, gl.RED, gl.FLOAT, null);
However, when I use this as the texture format, it works:
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGB, sphere_texture.width, sphere_texture.height, 0, gl.RGB, gl.UNSIGNED_BYTE, null)
Does anyone know how I could render to a floating point framebuffer texture?
This is how I am creating the framebuffer:
framebuffer = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, framebuffer);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, scale_factor_texture, 0);
gl.bindFramebuffer(gl.FRAMEBUFFER, null);
For WebGL2 contexts (which I assume you're working with, going by your intended use of the R32F format), you need to enable the EXT_color_buffer_float extension for those formats to be renderable:
if (!ctx.getExtension('EXT_color_buffer_float'))
  throw new Error('Rendering to floating point textures is not supported on this platform');
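With the extension enabled, a complete WebGL2 setup for a renderable R32F attachment looks roughly like this (a sketch assembled from the calls in your question plus a completeness check; note that R32F is not filterable without OES_texture_float_linear, hence NEAREST):

const tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R32F, width, height, 0, gl.RED, gl.FLOAT, null);
// Avoid the default mipmapped/linear filters, which R32F does not support.
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST);
gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST);

const fb = gl.createFramebuffer();
gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
if (gl.checkFramebufferStatus(gl.FRAMEBUFFER) !== gl.FRAMEBUFFER_COMPLETE)
  throw new Error('R32F color attachment is not complete on this platform');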
For WebGL1 contexts there's WEBGL_color_buffer_float, as well as implicit support when enabling OES_texture_float (which you can probe for by attaching such a texture to a render target and checking its completeness). However, with WebGL 1, rendering to single-channel textures is not supported either way.
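A minimal sketch of that WebGL1 completeness probe (my illustration; the helper name floatRenderable is made up):

function floatRenderable(gl) {
  if (!gl.getExtension('OES_texture_float')) return false;
  const tex = gl.createTexture();
  gl.bindTexture(gl.TEXTURE_2D, tex);
  // WebGL1 takes an unsized internal format; RGBA, since single-channel
  // render targets are off the table here anyway.
  gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 1, 1, 0, gl.RGBA, gl.FLOAT, null);
  const fb = gl.createFramebuffer();
  gl.bindFramebuffer(gl.FRAMEBUFFER, fb);
  gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, tex, 0);
  const ok = gl.checkFramebufferStatus(gl.FRAMEBUFFER) === gl.FRAMEBUFFER_COMPLETE;
  gl.bindFramebuffer(gl.FRAMEBUFFER, null);
  gl.deleteFramebuffer(fb);
  gl.deleteTexture(tex);
  return ok;
}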
These OpenGL ES formats are driving me nuts... I upgraded my project from ES 2 to ES 3, and apparently you now have to declare the internal format with a sized type... According to https://www.khronos.org/opengles/sdk/docs/man3/docbook4/xhtml/glTexImage2D.xml these combinations are perfectly valid:
glTexImage2D(GL_TEXTURE_2D, 0, GL_R8, width, height, 0, GL_RED, GL_UNSIGNED_BYTE, NULL);
...
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16F, width, height, 0, GL_RED, GL_HALF_FLOAT, NULL);
But they give me GL_INVALID_OPERATION. Single-channel textures in ES are poorly documented by Khronos/Apple, and the community barely uses them. If there is another soul out there who attempted to use them and succeeded, please let me know. I wish I could just use Metal.
Meh, being on a chip older than the A7 was the explanation for the errors. No OpenGL ES 3.0 on those.
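For anyone hitting the same wall from WebGL2 (which is built on ES 3.0), the sized-internal-format rule is identical; a quick illustrative sketch (width and height are placeholders):

// Valid: sized internal formats paired with a matching format/type.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R8, width, height, 0,
              gl.RED, gl.UNSIGNED_BYTE, null);
gl.texImage2D(gl.TEXTURE_2D, 0, gl.R16F, width, height, 0,
              gl.RED, gl.HALF_FLOAT, null);

// Invalid: gl.RED is not accepted as an internal format; only the legacy
// WebGL1 unsized formats (gl.RGB, gl.RGBA, gl.LUMINANCE, ...) still are.
gl.texImage2D(gl.TEXTURE_2D, 0, gl.RED, width, height, 0,
              gl.RED, gl.UNSIGNED_BYTE, null);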
I have a question about a circle with a texture mapping. My code works well, but the edges are not antialiased, so the circle is not smooth and doesn't look good. I have been reading for about 3 hours now and found some solutions, but I don't know how to implement them in my code. Two solutions sounded pretty good.
The first was to bind a blurry texture instead of a sharp one to get smooth edges.
The second was to add color vertices with opacity along the edges to get smooth edges. My current draw function looks like this:
CC_NODE_DRAW_SETUP();
[self.shaderProgram use];
ccGLBindTexture2D( _texture.name );
glTexParameterf( GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
ccGLBlendFunc( _blendFunc.src, _blendFunc.dst);
ccGLEnableVertexAttribs( kCCVertexAttribFlag_Position | kCCVertexAttribFlag_TexCoords );
// Send the texture coordinates to OpenGL
glVertexAttribPointer(kCCVertexAttrib_TexCoords, 2, GL_FLOAT, GL_FALSE, 0, _textCoords);
// Send the polygon coordinates to OpenGL
glVertexAttribPointer(kCCVertexAttrib_Position, 2, GL_FLOAT, GL_FALSE, 0, _triangleFanPos);
// Draw it
glDrawArrays(GL_TRIANGLE_FAN, 0, _numOfSegements+2);
I am currently using cocos2d version 3. I asked a similar question before, and the only solution I found was enabling multisampling in cocos2d, but that drops my FPS to 30.
So maybe there is someone who can help me.
I'm trying to get a game I made for iOS to work on OS X. So far I have been able to get everything working except for the drawing of some randomly generated hills using a GL-bound texture.
It works perfectly on iOS, but somehow this part is the only thing not visible when the app is run on OS X. I checked all coordinates and color values, so I'm pretty sure it has to do with OpenGL somehow.
glDisable(GL_TEXTURE_2D);
glDisableClientState(GL_TEXTURE_COORD_ARRAY);
glDisableClientState(GL_COLOR_ARRAY);
glBindTexture(GL_TEXTURE_2D, _textureSprite.texture.name);
glColor4f(_terrainColor.r,_terrainColor.g,_terrainColor.b, 1);
glVertexPointer(2, GL_FLOAT, 0, _hillVertices);
glTexCoordPointer(2, GL_FLOAT, 0, _hillTexCoords);
glDrawArrays(GL_TRIANGLE_STRIP, 0, (GLsizei)_nHillVertices);
glEnableClientState(GL_COLOR_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glEnable(GL_TEXTURE_2D);
You're disabling the texture coordinate (and color) arrays along with the texturing unit, yet you are binding a texture coordinate pointer; with GL_TEXTURE_COORD_ARRAY disabled, that pointer is never used, and with GL_TEXTURE_2D disabled, the bound texture isn't sampled at all.
Is this really what you intend to do?
Apparently it was being drawn after all, only as a 1/2-pixel line. Somehow there is some scaling in effect on the vertices; I will have to check my code.
Check out the following test:
http://binks.knobbits.org/webgl/texture3.html
It's a simple test of cube textures. It also has a 2D texture in there for good measure.
I discovered that in some browsers (so far, Chrome) the image is not displayed properly if I re-use the same texture unit for the cube texture as for the 2D texture.
There is a checkbox at the bottom marked "Use separate texture units for the cube texture on the sphere and the 2D texture on the floor" that shows this.
Is this a bug in Chrome or in my code?
I don't see anything wrong with your code, but:
1) You can't use the same texture for two different targets. In other words, you can't do this:
tex = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex);
gl.bindTexture(gl.TEXTURE_CUBE_MAP, tex);
2) You can't use both a TEXTURE_2D and a CUBE_MAP on a texture unit AT THE SAME TIME.
You can assign both, but when you render you're only allowed to reference one of them in your shaders. In other words:
gl.activeTexture(gl.TEXTURE0);
tex1 = gl.createTexture();
tex2 = gl.createTexture();
gl.bindTexture(gl.TEXTURE_2D, tex1);
gl.bindTexture(gl.TEXTURE_CUBE_MAP, tex2);
That is okay, but a shader that tried to use both textures from texture unit 0 would fail.
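To make the failing case concrete (my illustration, with made-up uniform names, not code from the question): with both textures bound to unit 0 as above, the draw fails only if both samplers are told to read unit 0.

// Assume the program declares:
//   uniform sampler2D u_tex;  uniform samplerCube u_cube;
const texLoc  = gl.getUniformLocation(program, 'u_tex');
const cubeLoc = gl.getUniformLocation(program, 'u_cube');

// Invalid: two samplers of different types reference texture unit 0,
// so the next draw call generates INVALID_OPERATION.
gl.uniform1i(texLoc, 0);
gl.uniform1i(cubeLoc, 0);

// Valid: move the cube map to unit 1 and point its sampler there.
gl.activeTexture(gl.TEXTURE1);
gl.bindTexture(gl.TEXTURE_CUBE_MAP, tex2);
gl.uniform1i(texLoc, 0);
gl.uniform1i(cubeLoc, 1);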
I have reorganized the code of the drawing functions a bit, and now they are working.
Square:
TexturedSquare.prototype.draw = function() {
    gl.bindBuffer(gl.ARRAY_BUFFER, this.v);
    gl.enableVertexAttribArray(gl.va_vertex);
    gl.enableVertexAttribArray(gl.va_normal);
    gl.enableVertexAttribArray(gl.va_tex1pos);
    // Interleaved layout: 10 floats per vertex (4 position, 4 normal, 2 texcoord).
    gl.vertexAttribPointer(gl.va_vertex, 4, gl.FLOAT, false, 10*4, 0);
    gl.vertexAttribPointer(gl.va_normal, 4, gl.FLOAT, false, 10*4, 4*4);
    gl.vertexAttribPointer(gl.va_tex1pos, 2, gl.FLOAT, false, 10*4, 4*8);
    // Unit 0 carries the 2D texture; the cube map binding is cleared.
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, this.texture);
    gl.bindTexture(gl.TEXTURE_CUBE_MAP, null);
    // The 2D sampler reads unit 0; the cube sampler points at unit 1 to avoid a clash.
    gl.uniform1i(shader.textures, 1);
    gl.uniform1i(shader.texture1, 0);
    gl.uniform1i(shader.cube_textures, 0);
    gl.uniform1i(shader.cubeTexture0, 1);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.e);
    gl.drawElements(gl.TRIANGLES, this.l, gl.UNSIGNED_SHORT, 0);
    gl.disableVertexAttribArray(gl.va_tex1pos);
}
Sphere:
GLHTexturedSphere.prototype.draw = function() {
    gl.bindBuffer(gl.ARRAY_BUFFER, this.vbuf);
    gl.enableVertexAttribArray(gl.va_vertex);
    gl.enableVertexAttribArray(gl.va_normal);
    gl.enableVertexAttribArray(this.va_cubetex0pos);
    // Interleaved layout: 8 floats per vertex (4 position, 4 normal);
    // the cube texcoord attribute deliberately reuses the normal data.
    gl.vertexAttribPointer(gl.va_vertex, 4, gl.FLOAT, false, 8*4, 0);
    gl.vertexAttribPointer(gl.va_normal, 4, gl.FLOAT, false, 8*4, 4*4);
    gl.vertexAttribPointer(this.va_cubetex0pos, 4, gl.FLOAT, false, 8*4, 4*4);
    // Unit 0 carries the cube map; the 2D binding is cleared.
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, null);
    gl.bindTexture(gl.TEXTURE_CUBE_MAP, this.texture);
    // The cube sampler reads unit 0; the 2D sampler points at unit 1 to avoid a clash.
    gl.uniform1i(shader.textures, 0);
    gl.uniform1i(shader.texture1, 1);
    gl.uniform1i(shader.cube_textures, 1);
    gl.uniform1i(shader.cubeTexture0, 0);
    gl.bindBuffer(gl.ELEMENT_ARRAY_BUFFER, this.ebuf);
    gl.drawElements(gl.TRIANGLES, this.l, gl.UNSIGNED_SHORT, 0);
    gl.disableVertexAttribArray(this.va_cubetex0pos);
}
Both of them now use TEXTURE0. Please check the WebGL state and uniform values.
The original code is a bit hard for me to follow, sorry. But I think the problem was that the texture1 and cubeTexture0 uniforms were being set to the same value.