Why is WebGL's Max Array Texture Layers set to 256 on the Huawei Mate series? This affects WebXR experiences - webgl

Huawei Mate Series & VR Glass WebGL Issue - Max Array Texture Layers set at 256
This issue impacts Huawei VR experiences, but the bug is in the WebGL2 implementation, so it affects all WebGL2 content when you use Huawei VR Glass.
To test, go to https://webglreport.com/?v=2
Huawei Matebook laptop with Windows 10 (2022 year model) shows Max Array Texture Layers: 2048
2013 iMac shows Max Array Texture Layers: 2048
Meta Quest 2 with QC XR2 SOC shows Max Array Texture Layers: 2048
iPhone 13 Pro Max shows Max Array Texture Layers: 2048
Nova 9 with EMUI 12 shows Max Array Texture Layers: 2048
Mate 40 Pro with Harmony OS shows Max Array Texture Layers: 256
Mate 40E with Harmony OS shows Max Array Texture Layers: 256
Mate 30 Pro with Harmony OS shows Max Array Texture Layers: 256
The Mate series (30 Pro, 40 Pro, 40E) with Huawei VR Glass gives a degraded WebXR experience when content needs more than 256 texture layers.
https://constructarcade.com/game/archery-dungeon/
Huawei Mate 30 Pro/VR Glass - Wolvic
Meta Quest 2 - Wolvic
https://constructarcade.com/game/get-off-my-lawn/
Huawei Mate 30 Pro/VR Glass - Wolvic
Meta Quest 2 - Wolvic
Questions:
Why is Max Array Texture Layers set at 256 on the Huawei Mate series?
Is this a bug in the OS (HarmonyOS) or in the chipset (Kirin 990, 9000)?
Is there a way to fix it? A 256-layer maximum is really low.
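Until the driver is fixed, the practical workaround is to detect the limit at startup and adapt. A minimal sketch, assuming a WebGL2 context; `chooseAtlasStrategy` is a hypothetical helper name, not part of any engine:

```javascript
// Sketch: detect a low MAX_ARRAY_TEXTURE_LAYERS limit and fall back.
// `gl` is any WebGL2 context (or a test mock exposing the same interface).
function chooseAtlasStrategy(gl) {
  const maxLayers = gl.getParameter(gl.MAX_ARRAY_TEXTURE_LAYERS);
  // WebGL2 only guarantees 256 layers; the Huawei Mate devices above
  // report exactly that minimum.
  if (maxLayers <= 256) {
    // Split content across several smaller texture arrays, or pack
    // layers into a 2D atlas instead of one big array texture.
    return { strategy: 'multi-array', layersPerArray: maxLayers };
  }
  return { strategy: 'single-array', layersPerArray: maxLayers };
}
```

On the Mate 30/40 devices listed above this would take the fallback path; on Quest 2 or the Matebook it would not.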

Related

How to set resolution of image

I am using OpenCV to generate images with depth of 1 bit to cut in a laser cutter (Github repo here). I save them with:
cv2.imwrite(filepath, img, [cv2.IMWRITE_PNG_BILEVEL, 1])
Each pixel corresponds to 0.05mm (called "scan gap" in the laser cutter). A sample image has 300 x 306 pixels and appears in the laser cutter software (LaserCut Pro 5) with size 30 mm x 30 mm. This corresponds to a resolution of 254 pixels per inch and the uncommon value could be from the software. I want a size of 15 mm x 15.3 mm and want to set a higher resolution to achieve that. I could resize by hand, but if I make a mistake, the pixels are no longer exactly aligned with the scan gap of the laser, resulting in inaccuracies in the engraving.
Does OpenCV have a way to set the resolution or final size of the image?
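As far as I know, `cv2.imwrite` exposes no flag for a PNG's physical resolution; that metadata lives in the PNG pHYs chunk (pixels per metre), which can be spliced into the file after the IHDR chunk in a post-processing step. A hedged sketch in Node.js that builds such a chunk (helper names are illustrative; whether LaserCut Pro actually honours pHYs is an assumption to verify):

```javascript
// Standard bitwise CRC-32 (PNG uses the reflected 0xEDB88320 polynomial).
function crc32(bytes) {
  let crc = 0xFFFFFFFF;
  for (const b of bytes) {
    crc ^= b;
    for (let i = 0; i < 8; i++) crc = (crc >>> 1) ^ (0xEDB88320 & -(crc & 1));
  }
  return (crc ^ 0xFFFFFFFF) >>> 0;
}

// Build a PNG pHYs chunk for a given physical pixel pitch in millimetres.
// For a 0.05 mm scan gap: 1 / 0.00005 m = 20000 pixels per metre (~508 dpi).
function buildPhysChunk(mmPerPixel) {
  const ppm = Math.round(1000 / mmPerPixel); // pixels per metre
  const data = Buffer.alloc(13);             // 4-byte type + 9-byte payload
  data.write('pHYs', 0, 'ascii');
  data.writeUInt32BE(ppm, 4);                // x axis
  data.writeUInt32BE(ppm, 8);                // y axis
  data.writeUInt8(1, 12);                    // unit specifier: metre
  const chunk = Buffer.alloc(4 + 13 + 4);
  chunk.writeUInt32BE(9, 0);                 // length field counts payload only
  data.copy(chunk, 4);
  chunk.writeUInt32BE(crc32(data), 17);      // CRC covers type + payload
  return chunk;
}
```

For a typical PNG the chunk would be inserted at byte offset 33, right after the 8-byte signature and the 25-byte IHDR chunk; the rest of the file is untouched, so OpenCV's bilevel output is preserved.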

WebGL: Are there cases where gl.MAX_TEXTURE_IMAGE_UNITS == 1

Sorry to ask such a strange question, but I'm working on some logic for a WebGL visualization and would like to know, are there cases where:
gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS)
equals 1?
I ask because I'm trying to figure out how many vertices I can draw in each draw call, and each vertex needs some content from one of several textures. The minimal case I'm wanting to support is one in which I load two textures for each draw call, but if there are cards that don't support multiple textures per draw call I'll need to rethink my life.
The minimum value for MAX_TEXTURE_IMAGE_UNITS that WebGL is required to support is 8. You can look up the limits in the spec, section 6.2. Note: search for "MAX TEXTURE IMAGE UNITS" (with spaces, not underscores).
That said, WebGL has a different limit for textures used in a fragment shader vs. textures used in a vertex shader.
For a vertex shader the minimum required is 0 in WebGL1. You can check the number of textures supported in a vertex shader by looking at MAX_VERTEX_TEXTURE_IMAGE_UNITS.
Fortunately, most machines support at least 4 in the vertex shader.
There is also yet another limit, MAX_COMBINED_TEXTURE_IMAGE_UNITS, which is how many textures total you can use combined. In other words, if MAX_COMBINED_TEXTURE_IMAGE_UNITS is 8, MAX_TEXTURE_IMAGE_UNITS is 8, and MAX_VERTEX_TEXTURE_IMAGE_UNITS is 4, that means you could use 8 textures at once, of which up to 4 could be used in the vertex shader. You could not use 12 textures at once.
Other minimums:
MAX_VERTEX_ATTRIBS: 8
MAX_VERTEX_UNIFORM_VECTORS: 128
MAX_VARYING_VECTORS: 8
MAX_COMBINED_TEXTURE_IMAGE_UNITS: 8
MAX_VERTEX_TEXTURE_IMAGE_UNITS: 0
MAX_TEXTURE_IMAGE_UNITS: 8
MAX_FRAGMENT_UNIFORM_VECTORS: 16
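The three limits interact, so a program has to satisfy all of them at once. A minimal sketch of that check; `fitsTextureLimits` is an illustrative helper, not a real API:

```javascript
// Sketch: does a planned shader program fit the texture-unit limits?
// `gl` is any WebGL context (or a mock with the same interface).
function fitsTextureLimits(gl, vertexTextures, fragmentTextures) {
  const maxVertex = gl.getParameter(gl.MAX_VERTEX_TEXTURE_IMAGE_UNITS);
  const maxFragment = gl.getParameter(gl.MAX_TEXTURE_IMAGE_UNITS);
  const maxCombined = gl.getParameter(gl.MAX_COMBINED_TEXTURE_IMAGE_UNITS);
  // Each stage's limit must hold, and so must the combined total.
  return vertexTextures <= maxVertex &&
         fragmentTextures <= maxFragment &&
         vertexTextures + fragmentTextures <= maxCombined;
}
```

With the example limits above (8 combined, 8 fragment, 4 vertex), 4 vertex + 8 fragment textures would be rejected because 12 exceeds the combined limit.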

OpenCV: charuco (diamonds) not working on bigger images. Which parameters to tune?

Somehow, detecting ChArUco diamonds does not work for me on bigger images. With my original 1920x1080 images it does not recognize the IDs reliably (the diamond ID elements switch places every time). In the first image, you can see it recognizes (7, 9, 45, 2).
Then I tried downsampling the images to 960x540 and halving the calibration parameters f and c, and it works! The ID is correctly recognized as (2, 7, 45, 9) and the pose estimation is accurate.
How can I make it work for bigger images? I tried changing the detection parameters that depend on absolute pixel units (not relative to image size). Here is a list of my current parameters. I realized that increasing the window size for thresholding helps with recognizing the squares, but not with ID or pose estimation.
nmarkers: 1024
adaptiveThreshWinSizeMin: 13
adaptiveThreshWinSizeMax: 113
adaptiveThreshWinSizeStep: 10
adaptiveThreshWinSize: 42
adaptiveThreshConstant: 7
minMarkerPerimeterRate: 0.1
maxMarkerPerimeterRate: 4.0
polygonalApproxAccuracyRate: 0.05
minCornerDistance: 10.0
minDistanceToBorder: 10
minMarkerDistance: 10.0
minMarkerDistanceRate: 0.05
doCornerRefinement: false
cornerRefinementWinSize: 5
cornerRefinementMaxIterations: 30
cornerRefinementMinAccuracy: 0.1
markerBorderBits: 1
perspectiveRemovePixelPerCell: 8
perspectiveRemoveIgnoredMarginPerCell: 0.13
maxErroneousBitsInBorderRate: 0.04
minOtsuStdDev: 5.0
errorCorrectionRate: 0.6
Any hints?
thank you!
In the end I needed to patch the OpenCV aruco module. It was a matter of a certain threshold scaling too fast (with the 4th power) relative to the image size (closestCandidateDistance in refineDetectedMarkers). The solution was to make minRepDistance in detectCharucoDiamond scale only linearly with the image size.
Full answer and patch in the opencv forum.
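To see why the 4th-power scaling breaks at higher resolutions, compare the two growth laws. The base resolution and threshold constants below are illustrative, not OpenCV's actual values:

```javascript
// Sketch: a reprojection-distance threshold that grows with the 4th power
// of image scale quickly becomes useless, while linear scaling stays
// proportional to the pixel units it is compared against.
const BASE_HEIGHT = 540; // reference resolution at which t0 was tuned

function quarticThreshold(height, t0) {
  return t0 * Math.pow(height / BASE_HEIGHT, 4);
}

function linearThreshold(height, t0) {
  return t0 * (height / BASE_HEIGHT);
}
```

Doubling the image height doubles a linear threshold but multiplies a quartic one by 16, so at 1920x1080 the quartic threshold accepts wildly wrong candidate matches, which is consistent with the shuffled diamond IDs described above.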

Is WebGL's MAX_TEXTURE_SIZE the max area, diameter, or bytes?

It's kind of interesting how much documentation avoids disambiguating what WebGLRenderingContext#getParameter(WebGLRenderingContext.MAX_TEXTURE_SIZE) means. "Size" is not very specific.
Is it the maximum storage size of textures in bytes, implying lowering bit-depth or using fewer color channels increases the maximum dimensions? Is it the maximum diameter in pixels of textures, implying you are much more limited in terms of addressable-area if your textures are highly rectangular? Is it the maximum number of pixels?
As it says in the WebGL spec section 1.1
The remaining sections of this document are intended to be read in conjunction with the OpenGL ES 2.0 specification (2.0.25 at the time of this writing, available from the Khronos OpenGL ES API Registry). Unless otherwise specified, the behavior of each method is defined by the OpenGL ES 2.0 specification
The OpenGL ES 2.0.25 spec, section 3.7.1 says
The maximum allowable width and height of a two-dimensional texture image must be at least 2^(k−lod) for image arrays of level zero through k, where k is the log base 2 of MAX_TEXTURE_SIZE and lod is the level-of-detail of the image array.
It's the largest width and/or height you can specify for a texture. Note that this has nothing to do with memory, as @Strilanc points out. So while you can probably create a 1 x MAX_TEXTURE_SIZE or a MAX_TEXTURE_SIZE x 1 texture, you probably cannot create a MAX_TEXTURE_SIZE x MAX_TEXTURE_SIZE texture, as you'd run out of memory.
It's the maximum diameter in pixels. If M is the maximum texture size, then you can create textures of size M x M, M/2 x M/4, M x 1, and so on; but you can't make a texture of size 2M x 2M or 1 x 2M.
Consider that the largest MAX_TEXTURE_SIZE reported in this OpenGL capability report is 16384 (2^14). If that were the maximum number of pixels (never mind bytes) instead of the maximum diameter, you'd be unable to create even a 256x256 texture (65,536 pixels), which is really small.
(Note that limits besides MAX_TEXTURE_SIZE apply. For example, my machine returns a maximum texture size of 2^15. But a 2^15 x 2^15 texture with rgba float pixels would take 16 gibibytes of space. It wouldn't fit in the available memory.)
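The two points above can be separated into a per-dimension check and a memory estimate. A small sketch; the function names are illustrative:

```javascript
// MAX_TEXTURE_SIZE limits each dimension independently; it says nothing
// about total memory.
function canCreateTexture(maxTextureSize, width, height) {
  return width <= maxTextureSize && height <= maxTextureSize;
}

// Memory is a separate constraint: width * height * bytes per pixel.
// RGBA float32 is 16 bytes per pixel.
function estimateBytes(width, height, bytesPerPixel) {
  return width * height * bytesPerPixel;
}
```

So a 1 x 32768 texture passes the dimension check on a machine reporting 2^15, while a 32768 x 32768 RGBA float texture also passes it yet needs 16 GiB, matching the memory caveat above.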

Pixels of an image

I have a stupid question:
I have a black circle on a white background, something like:
I have code in Matlab that gets an image with a black circle and returns the number of pixels in the circle.
Will I get the same number of pixels from a 5-megapixel camera and an 8-megapixel camera?
The short answer is: under most circumstances, no. 8 MP has more pixels than 5 MP. However...
That depends on many factors related to the camera and the images that you take:
Focal length of the cameras, and other optics parameters. Consider a fish-eye lens to understand my point.
Distance of the circle from the camera. Obviously, closer objects appear larger.
What the camera does with the pixels from the sensor. For example, some 5 MP cameras work in a down-scaled regime, outputting 3 MP instead.
It depends on the resolution, which is how many pixels are counted horizontally and vertically when describing a stored image.
Higher-megapixel cameras offer the ability to print larger images.
For example, a 6 MP camera offers a resolution of 3000 x 2000 pixels. If you allow 300 dpi (dots per inch) for print quality, this would give you a print of approx. 10 in x 7 in: 3000 divided by 300 = 10, and 2000 divided by 300 = approx. 7.
A 3.1 MP camera offers a resolution of 2048 x 1536 pixels, which gives a print size of approx. 7 in x 5 in.
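The print-size arithmetic above is just pixels divided by dots per inch. A one-line sketch; the function name is illustrative:

```javascript
// Print dimensions in inches = pixel dimensions / printing resolution (dpi).
function printSizeInches(widthPx, heightPx, dpi) {
  return { width: widthPx / dpi, height: heightPx / dpi };
}
```

At 300 dpi the 6 MP example gives exactly 10 inches wide and roughly 6.7 (approx. 7) inches tall, matching the figures above.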
