GLKTexture is not correctly mapped since iOS 6 - ios

I've been seeing strange behavior since Xcode 4.5 and the iOS 6 SDK when using textures on my 3D objects.
The problem also appears on my mac application when building against OS X 10.8 SDK.
I am using OpenGL ES 2.0 on iOS and OpenGL legacy profile ( < 3.0 ) on OS X 10.8.
The textures are no longer placed at their correct coordinates and I get lots of artifacts. The VAOs are correctly uploaded and they look fine without texturing. When using Xcode 4.4.1 and the iOS 5.1 SDK everything is fine.
The VAO is exactly the same (checked with OpenGL ES frame capture) and the texture uniforms are bound correctly.
Xcode 4.4 VAO Overview
Xcode 4.5 VAO Overview
Xcode 4.4.1 (iOS 5.1 SDK)
Xcode 4.5 (iOS 6 SDK)
Code / Shader Snippet
Relevant parts for uploading and processing the texture. I had to strip the shaders to the minimum.
Vertex shader
precision highp float;
attribute vec2 a_vTextureCoordinate;
uniform mat4 u_mModelViewMatrix;
uniform mat4 u_mModelViewMatrixInverse;
uniform mat4 u_mProjectionMatrix;
uniform mat4 u_mNormalMatrix;
void main()
{
....
// Transform output position
gl_Position = u_mProjectionMatrix * u_mModelViewMatrix * a_vPosition;
v_vPosition = vec3(u_mModelViewMatrix * a_vPosition);
// Pass through texture coordinate
v_vTextureCoordinate = a_vTextureCoordinate.xy;
....
}
Fragment Shader
precision highp float;
// location 1
uniform sampler2D u_diffuseTexture;
varying vec2 v_vTextureCoordinate;
varying vec3 v_vPosition;
....
void main() {
....
vec4 base = texture2D(u_diffuseTexture, v_vTextureCoordinate);
gl_FragColor = base;
....
}
Texture loading
NSDictionary *options = @{GLKTextureLoaderOriginBottomLeft: @(YES),
                          GLKTextureLoaderGenerateMipmaps: @(YES)};
NSError *error;
path = [path stringByReplacingOccurrencesOfString:@"/" withString:@""];
path = [[NSBundle mainBundle] pathForResource:[path stringByDeletingPathExtension] ofType:[path pathExtension]];
GLKTextureInfo *texture = [GLKTextureLoader textureWithContentsOfFile:path options:options error:&error];
Render loop (Only sending the uniform of the active texture)
....
[self setShaderTexture:[[materials objectForKey:@"diffuse"] objectForKey:@"glktexture"]
                forKey:@"u_diffuseTexture"
         withUniform1i:0
        andTextureUnit:GL_TEXTURE0+0];
....
#pragma mark - Texture communication
-(void)setShaderTexture:(GLKTextureInfo*)texture forKey:(NSString*)key withUniform1i:(int32_t)uniform andTextureUnit:(int32_t)unit {
glActiveTexture(unit);
glBindTexture(texture.target, texture.name);
[self.shaderManager sendUniform1Int:key parameter:uniform];
}
Has anyone had a similar problem since iOS 6?

You should report the bug to bugreport.apple.com, as already mentioned. As an aside, if you suspect that GLKTextureLoader is the problem (which seems like a good theory), you might narrow things down in one of two ways off the top of my head:
1) I would render the texture onto a trivial quad and see if the result is what you expect, i.e., is it rendering vertically flipped from what you expect? Is the source texture partially garbled in some way that you weren't expecting? (See the sketch after this list.)
2) You could try converting your image to a different size / color depth / image type and see if the problem still exists. What I'm thinking is that, unlikely as it seems, you may be hitting an unusual edge case with the image format that isn't being reported. Knowing this would be of huge help to anyone trying to fix this at Apple.
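For 1), a minimal sketch of such a sanity check in plain GLES 2.0 calls, assuming a separately compiled pass-through program (debugProgram) and the GLKTextureInfo from the question; all names here are placeholders, not from the original code:

// Pass-through shaders (compiled elsewhere into debugProgram), roughly:
//   vertex:   attribute vec2 a_position; attribute vec2 a_texcoord;
//             varying vec2 v_texcoord;
//             void main() { v_texcoord = a_texcoord; gl_Position = vec4(a_position, 0.0, 1.0); }
//   fragment: precision mediump float; varying vec2 v_texcoord; uniform sampler2D u_texture;
//             void main() { gl_FragColor = texture2D(u_texture, v_texcoord); }

// Interleaved full-screen quad: x, y, u, v
static const GLfloat quad[] = {
    -1.0f, -1.0f, 0.0f, 0.0f,
     1.0f, -1.0f, 1.0f, 0.0f,
    -1.0f,  1.0f, 0.0f, 1.0f,
     1.0f,  1.0f, 1.0f, 1.0f,
};

glUseProgram(debugProgram);                  // the pass-through program above
glActiveTexture(GL_TEXTURE0);
glBindTexture(texture.target, texture.name); // the GLKTextureInfo under suspicion
glUniform1i(glGetUniformLocation(debugProgram, "u_texture"), 0);

GLint aPos = glGetAttribLocation(debugProgram, "a_position");
GLint aTex = glGetAttribLocation(debugProgram, "a_texcoord");
glEnableVertexAttribArray(aPos);
glEnableVertexAttribArray(aTex);
glVertexAttribPointer(aPos, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad);
glVertexAttribPointer(aTex, 2, GL_FLOAT, GL_FALSE, 4 * sizeof(GLfloat), quad + 2);
glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);

// If this quad already shows the texture flipped or garbled, the problem is in loading;
// if it looks correct, the problem is in the mesh's UVs or the rest of the state setup.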
Probably not much help, but without access to all your source and assets it's pretty hard to know what to suggest. FWIW, I have some samples that do similar things to what you are doing and haven't noticed anything under GLKit on 10.8.

Related

ARKit, metal shader for ARSCNView

I'm trying to figure out how to apply shaders to my ARSCNView.
Previously, when using a standard SCNView, I was able to apply a distortion shader the following way:
if let path = Bundle.main.path(forResource: "art.scnassets/distortion", ofType: "plist") {
    if let dict = NSDictionary(contentsOfFile: path) {
        let technique = SCNTechnique(dictionary: dict as! [String: AnyObject])
        scnView.technique = technique
    }
}
Replacing SCNView with ARSCNView gives me the following error(s):
"Error: Metal renderer does not support nil vertex function name"
"Error: _executeProgram - no pipeline state"
I was thinking it's because ARSCNView uses a different renderer than SCNView. But logging ARSCNView.renderingAPI tells me nothing about the renderer, and I can't seem to choose one when I construct my ARSCNView instance. I must be missing something obvious, because I can't find a single resource when scouring for references online.
My initial idea was to instead use an SCNProgram to apply the shaders. But I can't find any resources on how to apply it to an ARSCNView, or whether it's even a correct/possible solution; SCNProgram seems to be reserved for materials.
Is anyone able to give me useful pointers on how to get vertex + fragment shaders working with an ARSCNView?
SCNTechnique for ARSCNView does not work with GLSL shaders; instead, Metal functions need to be provided in the technique's plist file under the keys metalVertexShader and metalFragmentShader.
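For reference, here is a minimal sketch of such a technique, built from an in-code dictionary rather than a plist; the pass name, the Metal function names (distortVertex, distortFragment), and the sceneView variable are placeholders, not from the original post:

import SceneKit

let techniqueDict: [String: Any] = [
    "passes": [
        "pass_distort": [
            "draw": "DRAW_QUAD",                      // post-process the rendered scene
            "metalVertexShader": "distortVertex",     // vertex function in the default Metal library
            "metalFragmentShader": "distortFragment", // fragment function in the default Metal library
            "inputs": ["colorSampler": "COLOR"],
            "outputs": ["color": "COLOR"]
        ]
    ],
    "sequence": ["pass_distort"]
]
sceneView.technique = SCNTechnique(dictionary: techniqueDict)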
On the contrary, the documentation says any combination of shaders should work:
You must specify both fragment and vertex shaders, and you must specify either a GLSL shader program, a pair of Metal functions, or both. If both are specified, SceneKit uses whichever shader is appropriate for the current renderer.
So it might be a mistake, but I guess the documentation is outdated. Since all devices that run ARKit also run Metal, GLSL support has apparently never been added to ARSCNView.
With iOS 12 deprecating OpenGL ES, this looks intentional.
I had this issue in ARKit on iOS 11.4 and 12, and it came down to a series of misspelled shaders. I hope this might help someone.

webgl replace program shader

I'm trying to swap the fragment shader used in a program. The fragment shaders all have the same variables, just different calculations; I am trying to provide alternative shaders for lower-end hardware.
I end up getting single-color output (instead of a texture). Does anyone have an idea what I could be doing wrong? I know the shaders are being used, because the color changes accordingly.
//if I don't do this:
//WebGL: INVALID_OPERATION: attachShader: shader attachment already has shader
gl.detachShader(program, _.attachedFS);
//select a random shader, all using the same parameters
_.attachedFS = fragmentShaders[~~(Math.random() * fragmentShaders.length)];
//attach the new shader
gl.attachShader(program, _.attachedFS);
//if I don't do this nothing happens
gl.linkProgram(program);
//if I don't add this line:
//globject.js:313 WebGL: INVALID_OPERATION: uniform2f:
//location not for current program
updateLocations();
I am assuming you have called gl.compileShader(fragmentShader);
Have you tried to test the code on a different browser and see if you get the same behavior? (it could be standards implementation specific)
Have you tried deleting the fragment shader (gl.deleteShader(attachedFS);) right after detaching it? The previous shader may still have a pointer in memory.
If this does not let you move forward, you may have to detach both shaders (vertex & frag) and reattach them, or even recreate the program from scratch.
I found the issue, after trying about everything else without result. It also explains why I was seeing the shader change but just getting a flat color: I was not updating some of the attributes.
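For anyone hitting the same thing, here is a rough sketch of what "updating the attributes" involves after a relink; the attribute/uniform names and buffer are illustrative, not taken from the code above:

gl.linkProgram(program);
gl.useProgram(program);

// Locations are only valid for the program as it was last linked,
// so they have to be queried again...
var aPosition = gl.getAttribLocation(program, 'aPosition');
var uSampler  = gl.getUniformLocation(program, 'uSampler');

// ...the vertex data has to be re-pointed at the (possibly new) attribute location...
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.enableVertexAttribArray(aPosition);
gl.vertexAttribPointer(aPosition, 3, gl.FLOAT, false, 0, 0);

// ...and uniforms have to be re-sent, since uniform state resets on relink.
gl.uniform1i(uSampler, 0);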

How to debug WebGL uncaught type error

I'm getting
Uncaught TypeError: Type error
When I have the WebGL Inspector enabled (in Chrome), this error originates in a file that starts with 'VM' and ends in a sequence of digits (not sure what code owns that -- is it core browser behavior or the WebGL Inspector?). This is the line:
// Call real function
var result = originalFunction.apply(context.rawgl, arguments);
I enabled the debug context and am logging all WebGL calls. This is the call that breaks:
uniform1i(3, 0)
In the WebGL Inspector, I see that the uniform at index 3 is my uniform sampler2D uSampler in my fragment shader. The API documentation says that this is a GLint, so the type is correct. I also tried setting some other uniforms first, and they fail with the same error.
I'm reworking some existing code I wrote after following tutorials, and one of the things I'm adding is interleaved vertex data. I'm sure that is the root cause; however, this is the third time I've come across an error like this, and my only recourse has been to massage the code until it goes away. It feels random and it's frustrating.
Are there any more debugging techniques? I assume it's an error in the shaders. Is there some way to get a stack trace from them?
uniform1i(3, 0)
is not valid WebGL. The uniform functions require a WebGLUniformLocation object, which can only be obtained by calling gl.getUniformLocation.
This is different from OpenGL. The reason is that you are not allowed to do math on uniform locations. In OpenGL, developers often make that mistake. They'll do something like this:
--in shader--
uniform float arrayOfFloats[4];
--in code--
GLint location = glGetUniformLocation(program, "arrayOfFloats");
glUniform1f(location, 123.0f);
glUniform1f(location + 1, 456.0f); // BAD!!!
That second line is not valid OpenGL, but it might work depending on the driver.
In WebGL they wanted to make that type of mistake impossible, because web pages need to work everywhere, whereas OpenGL programs only need to work on the platform they are compiled on. To make it work everywhere, they had gl.getUniformLocation return an object so you can't do math on the result.
The correct way to write the code above in OpenGL is
--in shader--
uniform float arrayOfFloats[4];
--in code--
GLint location0 = glGetUniformLocation(program, "arrayOfFloats[0]");
GLint location1 = glGetUniformLocation(program, "arrayOfFloats[1]");
glUniform1f(location0, 123.0f);
glUniform1f(location1, 456.0f);
And in WebGL it is:
--in shader--
uniform float arrayOfFloats[4];
--in code--
var location0 = gl.getUniformLocation(program, "arrayOfFloats[0]");
var location1 = gl.getUniformLocation(program, "arrayOfFloats[1]");
gl.uniform1f(location0, 123.0);
gl.uniform1f(location1, 456.0);

XNA 4.0 and unsolvable (by me) curious depth rendering

Big headache on XNA 4.0 concerning a depth problem:
I've already found many answers to similar problems, but none of them work for me...
The device is set like this:
xnaPanel1.Device.BlendState = BlendState.Opaque;
xnaPanel1.Device.DepthStencilState = DepthStencilState.Default;
xnaPanel1.Device.PresentationParameters.DepthStencilFormat = DepthFormat.Depth24Stencil8;
[...]
Projection = Matrix.CreatePerspectiveFieldOfView(MathHelper.PiOver4, 4.0f / 3.0f, 0.1f, 1000f);
As a brute-force problem solver, I have tried most of the DepthStencilFormat and DepthStencilState possibilities... None of them works the way I want.
Concerning the projection matrix, I've tried many near-clip and far-clip values too (cube width: 10f) but can't get the correct result.
I've tested this with many different textures, all opaque.
I don't use a BasicEffect but an effect using a texture + normal map; could that be the source of the problem?
CubeEffect.fx
[...]
sampler2D colorMap = sampler_state
{
Texture = <colorMapTexture>;
MagFilter = Linear;
MinFilter = Anisotropic;
MipFilter = Linear;
MaxAnisotropy = 16;
};
sampler2D normalMap = sampler_state
{
Texture = <normalMapTexture>;
MagFilter = Linear;
MinFilter = Anisotropic;
MipFilter = Linear;
MaxAnisotropy = 16;
};
[...]
Edit: I tried with a BasicEffect and the problem is the same...
So... thanks for any help ;)
Ok that's it.
pp.DepthStencilFormat = DepthFormat.Depth24Stencil8;
has to be set before the device creation call.
So I don't know at this time why this:
Device.PresentationParameters.DepthStencilFormat = DepthFormat.Depth24Stencil8;
previously called in my main Draw function, doesn't work...
Conclusions?
PresentationParameters pp = new PresentationParameters();
pp.IsFullScreen = false;
pp.BackBufferHeight = this.renderControl.Height;
pp.BackBufferWidth = this.renderControl.Width;
pp.DeviceWindowHandle = renderControl.Handle;
pp.DepthStencilFormat = DepthFormat.Depth24Stencil8;
this.graphicsDevice = new GraphicsDevice(GraphicsAdapter.DefaultAdapter, GraphicsProfile.HiDef, pp);
Now working fine!
PresentationParameters is a structure that defines how the device is created. You've already seen that when you create the graphics device you need to pass the structure in, which is only used for initial configuration.
The device stores the presentation parameters on it, but changing them does nothing unless you call Reset on the device, which reinitializes the device to use whatever parameters you've changed. This is an expensive operation (so you won't want to do it very often).
Basically, GraphicsDevice.PresentationParameters is an output - writing to it doesn't actually change the device state. It gets updated whenever the device is set up or reset.
Generally you will be setting up the GraphicsDevice using GraphicsDeviceManager - it handles setting up, resetting, and tearing down the device for you. It is part of the default XNA Game template project.
The correct way to modify states is to set the desired values on GraphicsDeviceManager. In your case, you can simply set PreferredDepthStencilFormat.
Once you do this, you need to either set up the device (i.e., specify your settings in your game constructor and XNA will do the rest), or reset the device by calling GraphicsDeviceManager.ApplyChanges - which you should usually only do in response to user input (and obviously not every frame). See this answer for details.
You'll note that there are some presentation parameters that are not directly settable on the GraphicsDeviceManager. To change these you would have to attach an event handler to GraphicsDeviceManager.PreparingDeviceSettings. The event argument will give you access to a version of the presentation parameters you can usefully modify (e.GraphicsDeviceInformation.PresentationParameters) - the settings in there are what gets used when the graphics device is created.
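A minimal sketch of both approaches described above, assuming a standard Game subclass (the class and field names are illustrative):

using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics;

public class MyGame : Game
{
    GraphicsDeviceManager graphics;

    public MyGame()
    {
        graphics = new GraphicsDeviceManager(this);

        // Preferred* values are picked up when the device is created
        // (or reset via GraphicsDeviceManager.ApplyChanges).
        graphics.PreferredDepthStencilFormat = DepthFormat.Depth24Stencil8;

        // For parameters without a Preferred* property, adjust the presentation
        // parameters just before the device is created.
        graphics.PreparingDeviceSettings += (sender, e) =>
        {
            e.GraphicsDeviceInformation.PresentationParameters.DepthStencilFormat =
                DepthFormat.Depth24Stencil8;
        };
    }
}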

Getting 'PERFORMANCE WARNING' messages in Chrome

I just recently started getting these messages, and was wondering if anyone has seen them or knows what may be causing them. I'm using Three.js with Chrome version 21.0.1180.57 on Mac OS X. I don't get these messages with Safari or Firefox.
PERFORMANCE WARNING: Attribute 0 is disabled. This has significant performance penalty
WebGL: too many errors, no more errors will be reported to the console for this context.
Same message on Firefox is : "Error: WebGL: Drawing without vertex attrib 0 array enabled forces the browser to do expensive emulation work when running on desktop OpenGL platforms, for example on Mac. It is preferable to always draw with vertex attrib 0 array enabled, by using bindAttribLocation to bind some always-used attribute to location 0."
This is not only a performance drawback, but will also result in bad output.
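The bindAttribLocation fix that the Firefox message refers to would look roughly like this (the attribute name and buffer are illustrative):

// Bind an always-used attribute to location 0 before linking...
gl.bindAttribLocation(program, 0, 'position');
gl.linkProgram(program);

// ...and make sure that attribute's array is enabled whenever you draw.
gl.bindBuffer(gl.ARRAY_BUFFER, positionBuffer);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 3, gl.FLOAT, false, 0, 0);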
PROBLEM: This message occurs if JavaScript tries to run a WebGL shader that expects color information in gl_Color on a mesh that does not provide a color array.
SOLUTION: Use a WebGL shader with a constant color that does not access gl_Color, or provide a color array in the mesh to be shaded.
If you are using lightgl.js from Evan Wallace, try adding the option colors: true in the new GL.Mesh statement and provide a proper mesh.colors array of the same size as your vertices array. Or try this shader:
blackShader = new GL.Shader(
'void main() { gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex; }',
'void main() { gl_FragColor = vec4(0.0, 0.0, 0.0, 1.0); }'
);
Sorry, I never used Three.js, but the problem should be similar: provide color to your mesh before shading.
Looks like a Chrome bug:
http://code.google.com/p/chromium-os/issues/detail?id=32528
