I'm trying to upscale a texture using MetalFX's MTLFXSpatialScaler. The input texture has transparency (it's rgba8Unorm), but in the scaled texture the transparency has been removed, so previously transparent areas are now rendered as black.
I've confirmed that the scaler's output texture is also rgba8Unorm. Also, to confirm this isn't a problem with drawing the scaled textures, I used the Metal debug tools to inspect each texture. Here's a pixel from the transparent part of the input texture:
And here's a pixel from the same area in the scaled output (notice how the A is now 1):
Does MTLFXSpatialScaler or MTLFXTemporalScaler support scaling a texture with alpha? Is there some additional setting or pixel format I need to use to enable this?
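For reference, here's a minimal sketch of how I create and run the scaler (the sizes, and names like device, inputTexture, outputTexture, and commandBuffer, are placeholders for my actual setup):

import MetalFX

let descriptor = MTLFXSpatialScalerDescriptor()
descriptor.inputWidth = 256            // placeholder sizes
descriptor.inputHeight = 256
descriptor.outputWidth = 512
descriptor.outputHeight = 512
descriptor.colorTextureFormat = .rgba8Unorm   // input has alpha
descriptor.outputTextureFormat = .rgba8Unorm  // output confirmed rgba8Unorm
descriptor.colorProcessingMode = .perceptual

let scaler = descriptor.makeSpatialScaler(device: device)!
scaler.colorTexture = inputTexture     // the transparent source texture
scaler.outputTexture = outputTexture   // alpha comes back as 1 here
scaler.encode(commandBuffer: commandBuffer)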
Related
I've got two YUV images (YUV 420 SP), which have been loaded in fragment shader as textures.
These textures have an overlapping area.
Now, I am trying to blend these two textures such that there should not be any difference visible in the brightness over the overlapping area. (Basically stitching the images)
Can you suggest any method for how I can mix/blend/stitch these two images?
I have come across the concept of alpha masking in OpenCV, but I am not sure how it is applicable in OpenGL. Also, the YUV images are loaded as two textures, an R component (for Y) and an RG component (for UV), and converted to RGB color space in the fragment shader.
Note: I am not considering any geometric alignment for now. I know that after the blending/stitching the image will not look quite right, as the images are not oriented or placed properly. My question is only about the photometric/color alignment, i.e., how I can handle the overlapping area.
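To make the goal concrete, here's a rough, untested sketch of the kind of fragment-shader feathering I have in mind (identifiers like u_lumaA/u_chromaA are placeholders, and it assumes both images share one texture-coordinate frame, in line with the no-alignment caveat above). It converts each image to RGB and cross-fades across the overlap:

precision mediump float;

varying vec2 v_texCoord;
uniform sampler2D u_lumaA, u_chromaA;  // Y and UV planes of image A (placeholders)
uniform sampler2D u_lumaB, u_chromaB;  // Y and UV planes of image B
uniform float u_overlapStart;          // overlap bounds in [0,1] texture space
uniform float u_overlapEnd;

// BT.601 YUV -> RGB, with chroma stored as an RG texture
vec3 yuvToRgb(sampler2D luma, sampler2D chroma, vec2 uv) {
    float y   = texture2D(luma, uv).r;
    vec2 cbcr = texture2D(chroma, uv).rg - 0.5;
    return vec3(y + 1.402 * cbcr.y,
                y - 0.344 * cbcr.x - 0.714 * cbcr.y,
                y + 1.772 * cbcr.x);
}

void main() {
    vec3 a = yuvToRgb(u_lumaA, u_chromaA, v_texCoord);
    vec3 b = yuvToRgb(u_lumaB, u_chromaB, v_texCoord);
    // 0 before the overlap, 1 after it, a smooth ramp inside it
    float w = smoothstep(u_overlapStart, u_overlapEnd, v_texCoord.x);
    gl_FragColor = vec4(mix(a, b, w), 1.0);
}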
I am rendering a simple box:
MDLMesh(boxWithExtent: ...)
In my draw loop when I turn off back-face culling:
renderCommandEncoder.setCullMode(.none)
All depth comparison is disabled and the sides of the box are drawn completely wrong, with back-facing quads in front of front-facing ones.
Huh?
My intent is to include back-facing surfaces in the depth comparison, not ignore them. This is important when I have, for example, a shape with semi-transparent textures that reveal the shape's internals, which have a different shading style. How do I force depth comparison?
UPDATE
So Warren's suggestion is an improvement but it is still not correct.
My depthStencilDescriptor:
let depthStencilDescriptor = MTLDepthStencilDescriptor()
depthStencilDescriptor.depthCompareFunction = .less
depthStencilDescriptor.isDepthWriteEnabled = true
depthStencilState = device.makeDepthStencilState(descriptor: depthStencilDescriptor)
Within my draw loop I set depth stencil state:
renderCommandEncoder.setDepthStencilState(depthStencilState)
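For completeness, the pass does have a depth attachment (otherwise the depth/stencil state would have nothing to test against). Roughly, sketched from my setup with identifiers trimmed:

metalView.depthStencilPixelFormat = .depth32Float              // gives the pass a depth buffer
pipelineDescriptor.depthAttachmentPixelFormat = .depth32Float  // pipeline declares the same format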
The resultant rendering:
Description: this is a box mesh. Each box face uses a shader that paints a disk texture. The texture is transparent outside the body of the disk. The shader paints a red/white spiral texture on front-facing quads and a blue/black spiral texture on back-facing quads. The box sits in front of a camera-aligned quad textured with a mobil image.
Notice how one of the textures paints over the rear back-facing quad with the background texture color. Notice also that the rear-most back-facing quad is not drawn at all.
Actually it is not possible to achieve the effect I am after. I basically want to do a simple composite - Porter/Duff - here but that is order dependent. Order cannot be guaranteed here so I am basically hosed.
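(For the record, the "over" operator on premultiplied colors is C_out = C_src + (1 - alpha_src) * C_dst. A over B and B over A give different results, which is exactly the order dependence that sinks me here.)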
OpenGL ES 2 on the iPhone. Some objects are made of multiple layers of sprites with alpha.
Then I also have UI elements that are also composited together from various sprites that I fade in/out over top of everything else. I do the fading by adjusting the alpha in the shader.
My textures are PNGs with alpha made in Photoshop. I don't premultiply them on purpose; I want them to be straight alpha. But in my app they're acting as if they're premultiplied: I can see a dark edge around a white sprite drawn over a white background.
If I set my blend mode to:
glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
The elements composite nicely together, no weird edges. But when elements are fading out they POP at the end. They will start to fade but won't go all the way down to alpha zero. So at the end of the fadeout animation when I remove the elements they "pop off" cause they're not completely faded out.
If I switch my blend mode to:
glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
The elements fade up and down nicely. But any white on white element has a dark edge around it from what looks like alpha premultiplication. See the white circle drawn over top of the white box:
But the other blends in the scene look good to me. The other transparent objects blend nicely.
Another important note is that my shader handles per-element opacity and colorizing. For each thing that is drawn, I multiply by an element color, and I multiply the final alpha by an opacity value:
void main(void)
{
    // sample the sprite, then tint it by the per-element color
    gl_FragColor = texture2D(u_textureSampler, v_fragmentTexCoord0) * u_baseColor;
    // fade: scale only the final alpha by the element's opacity
    gl_FragColor.a *= u_opacity;
}
This allows me to take a white object in my sprite sheet and make it any color I want. Or darken objects by using a baseColor of grey.
Every time I think I understand these blend modes something like this comes up and I start doubting myself.
Is there some other blend mode combo that will have smooth edges on my sprites and also support alpha fading / blending in the shader?
I'm assuming the GL_SRC_ALPHA is needed to blend using alpha in the shader.
Or is my problem that I need to use something other than PSD to save my sprite sheet? Cause that would be almost impossible at this point.
UPDATE:
I think the answer might just be NO that there is no way to do what I want. The 2nd blend mode should be correct. But it's possible that it's double multiplying the RGB with the alpha somewhere, or that it's premultiplied in the source file. I even tried premultiplying the alpha myself in the shader above by adding:
gl_FragColor.rgb *= gl_FragColor.a;
But that just makes the fades look bad as things turn grey as they fade out. If I premultiply the alpha myself and use the other blend mode above, things appear about the same as in my original. They fade out without popping but you can still see the halo.
Here's a great article on how to avoid dark fringes with straight-alpha textures: http://www.realtimerendering.com/blog/gpus-prefer-premultiplication/
If you're using mip-maps, that might be why your straight alpha textures have dark fringes -- filtering a straight alpha image can cause that to happen, and the best solution is to pre-multiply the image before the mip-maps are created. There's a common hack fix as well described in that article, but seriously consider pre-multiplying instead.
Using straight alpha to create the textures is often necessary and preferred, but it's still better to pre-multiply them as part of a build step, or during texture load, than to keep them straight-alpha in memory. I'm not sure about OpenGL ES, but I know WebGL lets you pre-multiply textures on the fly during load by using gl.pixelStorei with a gl.UNPACK_PREMULTIPLY_ALPHA_WEBGL argument.
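If your loader can't do it for you, premultiplying on the CPU during load is only a few lines. A minimal sketch (in Swift, since you're on iOS; assumes tightly packed RGBA8 bytes you've already decoded):

func premultiply(_ pixels: inout [UInt8]) {
    // scale each color channel by its pixel's alpha (a / 255)
    for i in stride(from: 0, to: pixels.count, by: 4) {
        let a = UInt32(pixels[i + 3])
        pixels[i]     = UInt8(UInt32(pixels[i])     * a / 255)
        pixels[i + 1] = UInt8(UInt32(pixels[i + 1]) * a / 255)
        pixels[i + 2] = UInt8(UInt32(pixels[i + 2]) * a / 255)
    }
}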
Another possibility, if you're compositing many elements, is that your straight-alpha blending function is incorrect. In order to do a correct "over" operator with a straight-alpha image, you need to use this blending function; maybe this is what you were looking for:
gl.blendFuncSeparate(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA, gl.ONE, gl.ONE_MINUS_SRC_ALPHA);
The common straight-alpha blending function you referred to (gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA)) does not correctly handle the destination alpha channel. You have to use separate blend functions for color and alpha when blending a straight-alpha source if you intend to composite many layers with an "over" operator. (Think about it: you probably don't want to interpolate the alpha channel; the result should always end up more opaque than both the source and the destination.) Also take special care when you blend, because the result of a straight-alpha blend is a premultiplied image! So if you use the result later, you still have to be prepared to do premultiplied blending. For a longer explanation, I wrote about this here: https://limnu.com/webgl-blending-youre-probably-wrong/
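(Concretely, the GL_ONE / GL_ONE_MINUS_SRC_ALPHA pair on the alpha channel computes alpha_out = alpha_src + (1 - alpha_src) * alpha_dst, the correct "over" alpha; the result is never less opaque than either layer.)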
The nice thing about using premultiplied images & blending is that you don't have to use separate blend funcs for color & alpha, and you automatically avoid a lot of these issues. You can & should create straight-alpha textures, but then pre-multiply them before or during load, and use premult blending (glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA)) throughout your code.
AFAIK, glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); is for premultiplied alpha, and it should work well if you use colors as is. glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA); is for straight alpha.
Many texture loading frameworks implicitly convert images into premultiplied-alpha format. This is because many of them re-draw the image into a new image, and CGBitmapContext doesn't support straight-alpha (non-premultiplied) images, so they usually generate a premultiplied-alpha image. So please look in your texture loading code, and check whether the image was converted into premultiplied format.
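A quick way to check what you actually loaded, sketched in Swift for brevity ("sprite" is a placeholder asset name):

import UIKit

if let cgImage = UIImage(named: "sprite")?.cgImage {
    switch cgImage.alphaInfo {
    case .premultipliedFirst, .premultipliedLast:
        print("loader produced premultiplied alpha")
    case .first, .last:
        print("straight alpha survived loading")
    default:
        print("no usable alpha channel")
    }
}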
Also, Photoshop implicitly erases the color information on fully transparent (alpha = 0) pixels when exporting to PNG. If you use linear or other texture filtering, the GPU will sample across neighboring pixels, and the colors in transparent pixels will affect the pixels at the edges. But Photoshop has already erased that color information, so those pixels carry effectively random color values.
In theory, this color bleeding can be fixed by keeping correct color values on the transparent pixels. In practice, with Photoshop, there is no way to export a PNG that keeps those color values, because Photoshop doesn't preserve invisible pixels. (A dedicated PNG exporter plug-in would be required to export them correctly; I couldn't find an existing one that supports this.)
Premultiplied alpha is good enough for just displaying the image, but it won't work well if you do any shader color manipulation, because colors are stored in integer form and usually don't have enough precision to recover the original color values. If you need precise color manipulation, use straight alpha, and avoid Photoshop.
Update
Here's my test result with @SlippD.Thompson's test code on the iPhone simulator (64-bit / iOS 7.x):
<Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 24 bits/pixel; 3-component color space; kCGImageAlphaNone; 2048 bytes/row.
cgContext with CGImageAlphaInfo 0: (null)
cgContext with CGImageAlphaInfo 1: <CGContext 0x1092301f0>
cgContext with CGImageAlphaInfo 2: <CGContext 0x1092301f0>
<Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 32 bits/pixel; 3-component color space; kCGImageAlphaLast; 2048 bytes/row.
cgContext with CGImageAlphaInfo 3: (null)
<Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 32 bits/pixel; 3-component color space; kCGImageAlphaFirst; 2048 bytes/row.
cgContext with CGImageAlphaInfo 4: (null)
cgContext with CGImageAlphaInfo 5: <CGContext 0x1092301f0>
cgContext with CGImageAlphaInfo 6: <CGContext 0x1092301f0>
<Error>: CGBitmapContextCreate: unsupported parameter combination: 8 integer bits/component; 24 bits/pixel; 0-component color space; kCGImageAlphaOnly; 2048 bytes/row.
cgContext with CGImageAlphaInfo 7: (null)
I'm using a PNG texture image to control the opacity of fragments in an OpenGL ES 2.0 shader (on iOS). The result I am after is light grey text on top of my scene's medium grey background (the fragment shader is applied to a triangle strip in the scene). The problem is that there are dark edges around my text; they look like artifacts. I'm using PNG transparency for the alpha information, but I'm open to other approaches. What is going on, and how can I do this correctly?
First, look at this answer regarding PNG transparency and premultiplied alpha. Long story short, the pixels in the PNG image that have less than 100% opacity are being premultiplied, so they are in effect getting darker as they get more transparent. Hence the dark artifacts around the edges.
Even without PNG and premultiplied transparency, you may still run into the problem if you forget to set your fragment shader's color before applying transparency.
A solution to this problem (where you want text to be a light grey color, and everything in the texture map that's not text to be transparent) would be to create a texture map where your text is white and the background is black.
This texture map will control the alpha of your fragment. The RGB values of your fragment will be set to your light grey color.
For example:
// color used for the text
lowp vec4 textColor = vec4(0.82, 0.82, 0.82, 1.0);
gl_FragColor = textColor;
// greyscale alpha texture passed in as a uniform
lowp vec4 alphaMap = texture2D(u_alpha_texture, v_texture);
// set the fragment's alpha from the texture
gl_FragColor.w = alphaMap.x;
In cases where your color texture varies, this approach would require two separate texture map images (one for the color, and one for the alpha). Clearly, this is less efficient than dealing with one PNG that has alpha transparency baked in. However, in my experience it is a good tradeoff (premultiplied pixels can be counter-intuitive to deal with, and the other approaches to loading PNG transparency without pre-multiplication introduce added complexity).
An upside to this approach is that you can vary the color of the text independently of the texture map image. For instance if you wanted red text, you could change the textColor value to:
lowp vec4 textColor = vec4(1.0,0.0,0.0,1.0);
You could even vary the color over time, etc, all independently of the alpha. That's why I find this approach to be flexible.
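To illustrate the time-varying case: with a u_time uniform fed in each frame (a hypothetical name), the tint can animate without touching the alpha texture at all:

lowp vec4 textColor = vec4(abs(sin(u_time)), 0.0, 0.0, 1.0);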
I have an image that is totally white in its RGB components, with varying alpha -- so, for example, 0xFFFFFF09 in RGBA format. But when I load this image with either UIImage or CGImage APIs, and then draw it in a CGBitmapContext, it comes out grayscale, with the RGB components set to the value of the alpha -- so in my example above, the pixel would come out 0x09090909 instead of 0xFFFFFF09. So an image that is supposed to be white, with varying transparency, comes out essentially black with transparency instead. There's nothing wrong with the PNG file I'm loading -- various graphics programs all display it correctly.
I wondered whether this might have something to do with my use of kCGImageAlphaPremultipliedFirst, but I can't experiment with it because CGBitmapContextCreate fails with other values.
The ultimate purpose here is to get pixel data that I can upload to a texture with glTexImage2D. I could use libPNG to bypass iOS APIs entirely, but any other suggestions? Many thanks.
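One direction I'm considering (untested sketch; cgImage is the CGImage I loaded) is reading the image's backing bytes directly instead of re-drawing through a bitmap context, since the dataProvider bytes stay in whatever alpha mode the image reports:

if let data = cgImage.dataProvider?.data,
   let bytes = CFDataGetBytePtr(data) {
    // bytes are in the image's native layout; check cgImage.alphaInfo,
    // cgImage.bitsPerPixel, and cgImage.bytesPerRow before handing
    // this to glTexImage2D
}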
White composited on a black background with an alpha of x IS a grey value corresponding to x in all the components. That's how multiplicative (premultiplied) blending works.
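Concretely: premultiplying 0xFFFFFF09 scales each color channel by alpha = 9/255, so each of R, G, B becomes 255 * 9 / 255 = 9, which is exactly the 0x09090909 you observed.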