Hi everyone, I have been trying instanced drawing in OpenGL ES 2.0 on the iOS platform. My rendering code:
glEnableVertexAttribArray(...);
glVertexAttribPointer(...);
glDrawElementsInstancedEXT(GL_TRIANGLES, IndicesCount, GL_UNSIGNED_SHORT, 0, 5);
And my vertex shader:
attribute vec4 VertPosition;
uniform mat4 mvpMatrix[600];

void main()
{
    gl_Position = mvpMatrix[gl_InstanceID] * VertPosition;
}
I'm getting ERROR: Use of undeclared identifier 'gl_InstanceID'
My GLSL version is 1.0. If the version is the issue, how can I upgrade? Is there any other way to use gl_InstanceID in GLSL?
gl_InstanceID is only available starting from GLSL ES 3.0, as stated here.
So this is, as you already suspected, a version issue. As far as I know, the only GLSL ES version available in OpenGL ES 2.0 is GLSL ES 1.0; if you want to use a higher GLSL ES version, you have to upgrade to OpenGL ES 3.0 (more details here).
Edit: I was thinking about what you want to achieve by using gl_InstanceID. This variable only makes sense when using one of the instanced draw commands (glDrawArraysInstanced etc.), which are likewise not available in ES 2.0.
Apparently, there is a way to use instanced rendering in OpenGL ES 2.0 after all: the GL_EXT_draw_instanced extension. It provides two additional draw commands for instanced drawing (glDrawElementsInstancedEXT and glDrawArraysInstancedEXT). When using the extension, you have to enable it in the shader with
#extension GL_EXT_draw_instanced : enable
and use gl_InstanceIDEXT instead of gl_InstanceID.
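Putting both pieces together, the vertex shader from the question would look something like the following. This is a sketch assuming the device actually exposes GL_EXT_draw_instanced (check the GL_EXTENSIONS string at runtime before relying on it); apart from the extension directive and the renamed built-in, it is unchanged from the question.

#extension GL_EXT_draw_instanced : enable

attribute vec4 VertPosition;
uniform mat4 mvpMatrix[600];

void main()
{
    // gl_InstanceIDEXT is the extension's counterpart of gl_InstanceID
    gl_Position = mvpMatrix[gl_InstanceIDEXT] * VertPosition;
}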
We're upgrading (!) our code on iOS from OpenGL ES 2.0 to OpenGL ES 3.0 so we can use texture arrays.
We build against the OpenGL ES 3.0 headers and create a GLES 3.0 context, and I can initialise a texture array fine, but the shader code gives an error on this line when I compile it at runtime:
uniform sampler2DArray mySamplerName;
It complains about a "syntax error" on mySamplerName, as if it doesn't understand sampler2DArray as a keyword.
If I use
uniform sampler3D mySamplerName;
it compiles ok, but that's not what I want.
Yes, I know OpenGL is deprecated, but does anyone know if this should work on iOS?
Thanks
Shaun
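One common cause of exactly this error, though it isn't confirmed in the thread: sampler2DArray only exists in GLSL ES 3.0, and the compiler only uses that language when the shader source begins with a #version 300 es directive. Without the directive, the source is compiled as GLSL ES 1.0 even inside an ES 3.0 context, and sampler2DArray is then a plain syntax error. A minimal fragment shader for comparison:

#version 300 es
precision mediump float;

uniform mediump sampler2DArray mySamplerName;

in vec3 texCoord;   // the third component selects the array layer
out vec4 fragColor;

void main()
{
    fragColor = texture(mySamplerName, texCoord);
}

Note that GLSL ES 3.0 also replaces varying and gl_FragColor with in/out variables, so the rest of the shader has to be ported along with the version bump.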
I'm trying to understand Apple's Metal and am converting some old OpenGL shaders. I'm stuck on an error that appears in only one project and not the others. I wanted to ask if there is a compiler option or something like that which I don't know about and which could cause this error.
So... I've got an audio visualizer for a music player that I wrote some time ago using OpenGL on the NDK for Android. I converted the shader to Metal in an empty project. The fragment function signature is like this:
fragment float4 spectrum_fragment_func(
    Vertex vert [[stage_in]],
    device Fragment *uniforms [[buffer(0)]],
    device float *left [[buffer(1)]],
    device float *right [[buffer(2)]]
)
I'm updating the Fragment object in code. It has a "time" value that needs to be updated for the effects to take place, meaning I cannot use constant values.
The shader compiles and works without any problems in the test application, which simply has a ViewController and my MTKView class.
When I copy the classes and shaders as-is to another project, I start getting this error:
/Volumes/Additional/Projects/.../Visuals/Shaders.metal:31:189:
Pointer argument to fragment function must be const
If I make the Fragment pointer const, the error then appears for the next argument in the signature. It seems that something is configured differently in this project, and it does not accept anything but const pointers...
If anyone has had a similar problem or knows how to solve this, I'm stuck and need some help.
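For what it's worth, the error can be taken at face value: depending on the Metal language version a project is built with, pointer arguments to fragment functions must be const unless the function writes through them, so a differing Metal Compiler / Language Revision build setting between the two projects is a plausible culprit. A sketch of the const-qualified signature follows; the Vertex and Fragment structs here are stand-ins, not the question's real types:

#include <metal_stdlib>
using namespace metal;

struct Vertex   { float4 position [[position]]; };
struct Fragment { float time; };    // updated from the CPU each frame

fragment float4 spectrum_fragment_func(
    Vertex vert                     [[stage_in]],
    const device Fragment *uniforms [[buffer(0)]],
    const device float    *left     [[buffer(1)]],
    const device float    *right    [[buffer(2)]])
{
    // const only forbids writes from inside the shader; the CPU can still
    // update the buffer contents (e.g. the time value) before every draw.
    return float4(uniforms->time, left[0], right[0], 1.0);
}

In other words, const does not conflict with updating the "time" value each frame; it only promises the compiler that the fragment function itself never writes to those buffers.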
I'm working with Qt (specifically 5.5) on an iOS device, not the simulator.
I just added a Video object to my QML code to play an HLS stream, like below:
Video {
    id: livePlayer
    anchors.fill: parent
    source: "http://content.jwplatform.com/manifests/vM7nH0Kl.m3u8"
    autoPlay: true
}
But Qt gives me an error with a log like the one below:
Failed to find shader ":/qtmultimediaquicktools/shaders/rgbvideo.vert"
Failed to find shader ":/qtmultimediaquicktools/shaders/rgbvideo.frag"
QOpenGLShader::link: "ERROR: Compiled vertex shader was corrupt.\nERROR: Compiled fragment shader was corrupt.\n"
shader compilation failed:
"ERROR: Compiled vertex shader was corrupt.\nERROR: Compiled fragment shader was corrupt.\n"
QOpenGLShader::link: "ERROR: Compiled vertex shader was corrupt.\nERROR: Compiled fragment shader was corrupt.\n"
QOpenGLShaderProgram::uniformLocation( qt_Matrix ): shader program is not linked
QOpenGLShaderProgram::uniformLocation( rgbTexture ): shader program is not linked
QOpenGLShaderProgram::uniformLocation( opacity ): shader program is not linked
I have tried everything I could think of (Clean, Run qmake, etc.), but it was useless.
Please give me some help.
Thanks, and have a good day.
This is a temporary bug in the Qt 5.5 branch.
To fix it with the current 5.5 snapshots, add this to your main() function:
Q_INIT_RESOURCE(qtmultimediaquicktools);
The issue is already fixed in the current 5.5 branch of Qt; find the fix here, or wait for the next snapshot.
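For context, a minimal main() with the workaround in place might look like the following; the application/engine setup is ordinary Qt Quick boilerplate, and the main.qml path is a placeholder rather than something from the thread:

#include <QGuiApplication>
#include <QQmlApplicationEngine>
#include <QUrl>

int main(int argc, char *argv[])
{
    QGuiApplication app(argc, argv);

    // Workaround for the Qt 5.5 snapshot bug: force registration of the
    // qtmultimediaquicktools resources (the missing rgbvideo shaders)
    // before any Video element is instantiated.
    Q_INIT_RESOURCE(qtmultimediaquicktools);

    QQmlApplicationEngine engine;
    engine.load(QUrl(QStringLiteral("qrc:/main.qml")));
    return app.exec();
}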
Working on an API for iOS, I wrote a couple of shaders that are very important for the API. These shaders are in the form of .vsh and .fsh files. Now, working on real projects with the API, I discovered that when we add the folder with the API source to an Xcode iOS project, we have to do an additional step: adding the shader files to "Copy Bundle Resources" in the Build Phases tab of the project configuration. We are thinking about the best way to handle this issue (and by "best" I mean that the step shouldn't exist at all!). Since the shaders are ultimately loaded as strings and sent to the GPU, what would be the best way to handle these shaders in the API?
One way to avoid having to bundle the shader files with your framework or static library is to embed them as string constants. I do this in this project using the following macros:
#define STRINGIZE(x) #x
#define STRINGIZE2(x) STRINGIZE(x)
#define SHADER_STRING(text) @ STRINGIZE2(text)
This lets me then do something like the following:
NSString *const kGPUImagePassthroughFragmentShaderString = SHADER_STRING
(
 varying highp vec2 textureCoordinate;

 uniform sampler2D inputImageTexture;

 void main()
 {
     gl_FragColor = texture2D(inputImageTexture, textureCoordinate);
 }
);
and use that NSString constant to provide the shader for my programs where needed. Shader files are then not needed, which simplifies the distribution process.
You don't get syntax highlighting in Xcode as specific as what you get for dedicated vertex and fragment shader files, but you do get basic C coloring.
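To illustrate the final step with standard OpenGL ES calls (this compile snippet is mine, not part of the answer): the embedded constant can be handed straight to glShaderSource when the program is built.

// Compile the embedded fragment shader string from above.
const GLchar *source = (const GLchar *)[kGPUImagePassthroughFragmentShaderString UTF8String];
GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);
glShaderSource(shader, 1, &source, NULL);
glCompileShader(shader);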
Apple introduced a new shader extension called GL_APPLE_shader_framebuffer_fetch, which allows fully programmable blending. There is also a WWDC video explaining the functionality: session 513 of WWDC 2012.
Sadly this extension doesn’t work for me.
Fragment shader:
#extension GL_APPLE_shader_framebuffer_fetch : require

varying lowp vec4 colorVarying;

void main(void)
{
    gl_FragColor = gl_LastFragData[0] + vec4(colorVarying.x, colorVarying.y, colorVarying.z, 1.0);
}
Debug output:
extension 'GL_APPLE_shader_framebuffer_fetch' is not supported
I tried to run it on the iOS 6.0 iPad simulator and on an actual iPad running iOS 6.0.
How can that be? What do I have to do to actually use this extension?
Try GL_EXT_shader_framebuffer_fetch. It was called GL_APPLE_shader_framebuffer_fetch in the beta, but it got renamed in the final release (according to the iOS 6 release notes).
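With only the directive renamed, the question's shader would then read like this (the gl_LastFragData built-in stays the same under the EXT name):

#extension GL_EXT_shader_framebuffer_fetch : require

varying lowp vec4 colorVarying;

void main(void)
{
    // gl_LastFragData[0] still holds the previous framebuffer color.
    gl_FragColor = gl_LastFragData[0] + vec4(colorVarying.x, colorVarying.y, colorVarying.z, 1.0);
}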