How to render WebGL content on high DPI devices?

What is the right way to set up WebGL to render to all native pixels on a high dots-per-inch display (such as a Retina MacBook or a Chromebook Pixel)?

For WebGL it's relatively simple:
var desiredCSSWidth = 400;
var desiredCSSHeight = 300;
var devicePixelRatio = window.devicePixelRatio || 1;

// size the drawing buffer in device pixels
canvas.width = desiredCSSWidth * devicePixelRatio;
canvas.height = desiredCSSHeight * devicePixelRatio;

// keep the displayed size in CSS pixels
canvas.style.width = desiredCSSWidth + "px";
canvas.style.height = desiredCSSHeight + "px";
See http://www.khronos.org/webgl/wiki/HandlingHighDPI
There are conformance tests verifying that these rules are followed; specifically, the browser is not allowed to change the size of the backing store for a WebGL canvas.
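One step the snippet above leaves implicit: WebGL does not track the canvas size automatically, so after changing the drawing buffer size you also need to update the viewport. A minimal sketch, assuming gl is the canvas's WebGL context:
// after changing canvas.width/height, re-point the viewport at the
// drawing buffer; drawingBufferWidth/Height report the size you actually got
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);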
For regular 2D canvas it's less simple but that was not the question asked.

Issue when drawing image on canvas on iPad

I'm working on an HTML5 project that will run in a WKWebView on iPad.
I'm doing everything programmatically.
My WKWebView frame takes the full screen and the viewport is 1180x820,
so I set my canvas width and height to 1180x820 too.
My source image is 1920x1080. When I draw it on the canvas with the drawImage function,
the image does not fit on the screen (obviously) but is displayed really well (not blurred):
_canvasContext.drawImage(imgMenu, 0, 0, 1920, 1080);
So I rescale the image when drawing it, still with the drawImage function:
_canvasContext.drawImage(imgMenu, 0, 0, 1920/2, 1080/2);
The image fits on the screen, but is blurred.
The downscaling is really, really bad (it could be much better; this example is not even the worst one).
I already tried the parameters
_canvasContext.imageSmoothingEnabled = true;
_canvasContext.webkitImageSmoothingEnabled = true;
_canvasContext.mozImageSmoothingEnabled = true;
_canvasContext.imageSmoothingQuality = "high";
It does not help.
Maybe I'm doing something wrong; I don't understand what else to try.
The screen resolution of my iPad is 2360x1640, so displaying a 1920x1080 picture should not be a problem.
If anyone could help me, that would save my life :)
Best regards,
Alex
This is one of those annoying and confusing things about canvas. What you need to do is size the canvas using devicePixelRatio. This increases the actual size of the canvas to match the pixel density of the device; the ratio could be 2, 1.5, etc. A Retina screen is often 2.
For drawing your image, the smart way is to support any image size and fit it into the canvas area (usually scaling down). With a tiny image, this code will scale up and lose resolution.
const IMAGE_URL = 'https://unsplash.it/1920/1080';

const canvas = document.createElement('canvas');
const _canvasContext = canvas.getContext('2d');

// you will probably want this, unless you want to support many screen
// sizes, in which case you will actually want the window size:
/*
const width = 1180;
const height = 820;
*/
const width = window.innerWidth;
const height = window.innerHeight;
const ratio = window.devicePixelRatio;

// size the canvas to use the pixel ratio
canvas.width = Math.round(width * ratio);
canvas.height = Math.round(height * ratio);

// downsize the canvas with css - making it retina compatible
canvas.style.width = width + 'px';
canvas.style.height = height + 'px';

document.body.appendChild(canvas);

_canvasContext.fillStyle = 'gray';
_canvasContext.fillRect(0, 0, canvas.width, canvas.height);

function drawImage(url) {
  const img = new Image();
  img.addEventListener('load', () => {
    // find a good scale value to fit the image on the canvas
    const scale = Math.min(
      canvas.width / img.width,
      canvas.height / img.height
    );

    // calculate image size and padding
    const scaledWidth = img.width * scale;
    const scaledHeight = img.height * scale;
    const padX = scaledWidth < canvas.width ? canvas.width - scaledWidth : 0;
    const padY = scaledHeight < canvas.height ? canvas.height - scaledHeight : 0;

    _canvasContext.drawImage(img, padX / 2, padY / 2, scaledWidth, scaledHeight);
  });
  img.src = url;
}

drawImage(IMAGE_URL);
body, html {
  margin: 0;
  padding: 0;
}
As mentioned in a comment in the code snippet: if you want your canvas to always use 1180x820, be sure to change the width and height variables:
const width = 1180;
const height = 820;
For the purposes of the snippet I used the window size, which may be better for you if you wish to support other device sizes.

Empty WebGL context uses a lot of memory

For example, for my 940M video card, the canvas created with the following code takes 500 MB of video memory:
var c = document.createElement('canvas');
var ctx = c.getContext('webgl');
c.width = c.height = 4096;
At the same time, an OpenGL context of the same size uses only 100 MB of video memory:
glutInit(&argc, argv);
glutInitDisplayMode(GLUT_SINGLE);
int s = 4096;
glutInitWindowSize(s, s);
glutCreateWindow("Hello world :D");
Why does WebGL use so much memory? Is it possible to reduce the amount of memory used for a context of the same size?
As LJ pointed out, the canvas is double buffered, antialiased, and has an alpha channel and a depth buffer by default. You made the canvas 4096x4096, so that's
16 million pixels * 4 bytes (RGBA), or 64 MB for one buffer.
You get that times at least 4:
front buffer = 1
antialiased backbuffer = 2 to 16
depth buffer = 1
So that's 256 MB to 1152 MB depending on what the browser picks for antialiasing.
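A quick back-of-the-envelope version of that arithmetic; the 2x and 16x multipliers are the assumed antialiasing range from the list above:
var side = 4096;
var bytesPerPixel = 4; // RGBA, one byte per channel
var oneBufferMB = side * side * bytesPerPixel / (1024 * 1024); // 64 MB

// front buffer + antialiased backbuffer (2x to 16x) + depth buffer
var minMB = oneBufferMB * (1 + 2 + 1);  // 256 MB
var maxMB = oneBufferMB * (1 + 16 + 1); // 1152 MB
console.log(minMB, maxMB);              // 256 1152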
In answer to your question: you can try not asking for a depth buffer, an alpha channel, and/or antialiasing:
var c = document.createElement('canvas');
var ctx = c.getContext('webgl', { alpha: false, depth: false, antialias: false });
c.width = c.height = 4096;
Whether the browser actually skips allocating an alpha channel, or allocates one and just ignores it, is up to the browser and driver. Whether it will actually skip the depth buffer is also up to the browser. Passing antialias: false should at least make the second buffer 1x instead of 2x to 16x.
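You can also check which attributes the browser claims to have honored. getContextAttributes reports the context's actual attributes, though not necessarily how the driver allocates memory underneath:
var c2 = document.createElement('canvas');
var gl = c2.getContext('webgl', { alpha: false, depth: false, antialias: false });

// the attributes the browser actually granted
var attrs = gl.getContextAttributes();
console.log(attrs.alpha, attrs.depth, attrs.antialias);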

Adobe AIR Stage.contentsScaleFactor always 1 on iPhone4s and iPad mini

My application is just the template AIR Mobile AS3 project from FlashDevelop: an application.xml file and a Main class.
In the Main class, I create a text field that displays the stage.contentsScaleFactor value after the first Event.RESIZE:
var textField:TextField = new TextField();
textField.appendText("Size: " + stage.stageWidth + " x " + stage.stageHeight + "\n");
textField.appendText("Scale: " + stage.contentsScaleFactor + "\n");
addChild(textField);
On my iPhone with Retina support, I get
Size: 960 x 640
Scale: 1
for
<requestedDisplayResolution>high</requestedDisplayResolution>
and
Size: 480 x 320
Scale: 1
for
<requestedDisplayResolution>standard</requestedDisplayResolution>.
Almost the same for iPad,
Size: 2048 x 1536
Scale: 1
for high, and
Size: 1024 x 768
Scale: 1
for standard.
I'm compiling with the latest AIR SDK 18.0.0.142 (beta), -swf-version=29.
Same results for release AIR SDK 18.
For AIR 14 SDK and -swf-version=25, I get some garbage values for size (it looks like my swf width and height multiplied by what contentsScaleFactor should be), but still 1 for contentsScaleFactor.
Edit:
I have encountered various questions around the web that mention contentsScaleFactor (for example, this one). They claim that contentsScaleFactor should be 2 on a Retina display.
This is how the property is documented:
Specifies the effective pixel scaling factor of the stage. This value is usually 1 on standard screens and 2 on HiDPI (a.k.a. Retina) screens. When the stage is rendered on HiDPI screens the pixel resolution is doubled, even if the stage scaling mode is set to StageScaleMode.NO_SCALE. Stage.stageWidth and Stage.stageHeight continue to be reported in classic pixel units. Note: this value can change dynamically depending on whether the stage is on a HiDPI or standard screen.
Also, with
<requestedDisplayResolution>standard</requestedDisplayResolution>
and configureBackBuffer with wantsBestResolution set to true, I still get a 1024x768 buffer on iPad. I verified that by drawing a sharp 256x256 texture on a 128x128 quad - the result is blurry. Doing the same with
<requestedDisplayResolution>high</requestedDisplayResolution>
I get a sharp-looking image of the same physical size.
My actual questions are:
Is contentsScaleFactor supposed to be 2 on a Retina iPad/iPhone? If so, are there some compiler/packager options I'm missing?
How can I determine, for requestedDisplayResolution=standard, that my stage was scaled?
And if contentsScaleFactor doesn't work on mobile, what is this property for? Should it work, for example, on a Mac with a Retina display?
Edit 2:
On a Mac with a Retina display, contentsScaleFactor works just fine, reporting 2.
Use Capabilities.screenDPI to get the DPI, then divide stage.stageWidth by screenDPI. On an iPad you'll get around 8 (one DPI unit = around 2 fingers of real estate); the DPI gives you your density. Now compare both: a non-Retina iPad = 8 DPI units at 132 pixels per inch, a Retina iPad = 8 DPI units at 264 pixels per inch (HD), etc.
From the comments:
Yes, that is how I'm doing it now for iOS devices, but unfortunately Capabilities.screenDPI is unreliable in general. It is always 72 on desktop, reports some "random" values on Android, and doesn't take the current monitor's DPI into account.
Maybe it's not so much Dots Per Inch that you need; perhaps you are looking for Pixels Per Inch instead?
DPI applies when printing ink dots on paper. PPI means how many pixels per real-world inch. Don't worry: even the Wikipedia article admits "... It has become commonplace to refer to PPI as DPI, even though PPI refers to input resolution" (so we understand that DPI is printing resolution).
This means you should forget Capabilities.screenDPI and instead include the Capabilities.screenResolutionX (display width) and Capabilities.screenResolutionY (display height) numbers in your calculations.
Here's some code for feedback; hopefully it will be useful to you in some way. PS: Double-check the Flash traces by holding a ruler against the computer screen.
var scrn_W : uint = Capabilities.screenResolutionX;
var scrn_H : uint = Capabilities.screenResolutionY;
var scrn_DPI : uint = Capabilities.screenDPI;
var init_stageW : int = stage.stageWidth;
var init_stageH : int = stage.stageHeight;

//Diagonal in Pixels
var scrn_Dg_Pix : Number = int( Math.sqrt( scrn_W * scrn_W + scrn_H * scrn_H ) );

//Diagonal in Inches
//NOTE: dividing by 100 assumes a ~100 PPI screen; substitute the real
//physical diagonal (in inches) here if you know it
var scrn_Diag : Number = scrn_Dg_Pix / 100;

//Pixels per inch (with the 100 PPI assumption above this is simply 100;
//it becomes meaningful once scrn_Diag holds the true physical diagonal)
var scrn_PPI : uint = scrn_Dg_Pix / scrn_Diag;

//Dot pitch: physical size of one pixel, in millimetres along the diagonal
var dot_Pitch : Number = ( scrn_Diag * 25.4 ) / scrn_Dg_Pix;

var scrn_Inch_W : Number = scrn_W / scrn_PPI;
var scrn_Inch_H : Number = scrn_H / scrn_PPI;

var result_Num : Number = 0;
var temp_Pix_Num : Number = 0;
var testInch : Number = 0;
var my_PixWidth : Number = 0;

stage.scaleMode = StageScaleMode.NO_SCALE;
stage.align = StageAlign.TOP_LEFT;
stage.addEventListener(Event.RESIZE, resizeListener); //just in case

////////////////////
//// FUNCTIONS

function inch_toPixels( in_Num : Number ) : Number
{
    temp_Pix_Num = in_Num;
    result_Num = scrn_PPI * temp_Pix_Num;
    return result_Num;
}

function pixels_toInch( in_Num : Number ) : Number
{
    temp_Pix_Num = in_Num;
    result_Num = temp_Pix_Num / scrn_PPI;
    return result_Num;
}

function cm_toInch( in_CM : Number ) : Number
{
    //inch = cm * 0.393700787; //from centimetre to inch
    result_Num = in_CM * 0.393700787;
    return result_Num;
}

function inch_toCM( in_Inch : Number ) : Number
{
    //cm = inch * 2.54; //from inch to centimetre
    result_Num = in_Inch * 2.54;
    return result_Num;
}

function resizeListener( e : Event ) : void
{
    // Handle Stage resize here (ie. app window's Scale Drag / Minimize etc)
    //trace("new stageWidth: " + stage.stageWidth + " new stageHeight: " + stage.stageHeight);
}
///////////////
//// TRACES
trace("Stage Init Width : " + init_stageW);
trace("Stage Init Height : " + init_stageH);
trace("Screen Width : " + scrn_W);
trace("Screen Height : " + scrn_H);
trace("Screen DPI : " + scrn_DPI);
trace("Screen PPI : " + scrn_PPI);
trace("Screen Diag : " + scrn_Diag);
trace("Screen Diag Pix : " + scrn_Dg_Pix);
trace("Dot Pitch : " + dot_Pitch);
trace("Disp Width (inches) : " + scrn_Inch_W );
trace("Disp Height (inches) : " + scrn_Inch_H );
How about making use of Application.applicationDPI and Application.runtimeDPI to calculate the actual scale factor?

Draw image on canvas with the Retina display

It is a web app in PhoneGap.
I have used a 320x480 image to draw, but it comes out fuzzy.
html
<canvas id="canvas" height=480 width=320>
Your browser does not support the HTML5 canvas tag.</canvas>
javascript
var canvas = document.getElementById('canvas');
var ctx = canvas.getContext('2d');
ctx.drawImage(images[index],0,0,320,480);
How can I draw clearly on the Retina display?
If you have access to larger versions of your images, you can double the visible resolution.
The source images would need to be 640x960:
This is the code to "pixel double" the resolution of an image.
canvas.width = 640;
canvas.height = 960;
canvas.style.width = "320px";
canvas.style.height = "480px";
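After resizing, the draw call itself should target the doubled backing store. A minimal sketch, reusing the ctx and images from the question:
// draw the 640x960 source into the doubled backing store;
// the CSS size scales it back down to 320x480 on screen
ctx.drawImage(images[index], 0, 0, 640, 960);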
If not, you could use the same "pixel doubling" effect and present a smaller but clearer version using your existing images:
canvas.width = 320;
canvas.height = 480;
canvas.style.width = "160px";
canvas.style.height = "240px";
This is a general answer on how to draw (anything) on a canvas so it looks sharp on Retina displays or any other high-DPI display.
Get the screen density with:
var screenDensity = window.devicePixelRatio
Then just multiply your path coordinates, stroke widths, font sizes, canvas size, etc. by screenDensity.
Example:
var canvas = document.getElementById('canvas');
var ctx = canvas.getContext('2d');
var screenDensity = window.devicePixelRatio;
// Make it visually fill the positioned parent
canvas.style.width = '100%';
canvas.style.height = '100%';
// ...then set the internal size to match
canvas.width = canvas.offsetWidth * screenDensity;
canvas.height = canvas.offsetHeight * screenDensity;
ctx.beginPath();
ctx.moveTo(42 * screenDensity, 0);
ctx.lineTo(42 * screenDensity, 4 * screenDensity);
// (...more stuff goes here...)
ctx.closePath();
ctx.strokeStyle = '#000000';
ctx.lineWidth = 2 * screenDensity;
ctx.stroke();
Alternatively, you can scale the drawing context by screenDensity. See here for more info and examples: https://developer.mozilla.org/en-US/docs/Web/API/Window/devicePixelRatio
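A minimal sketch of that alternative, using the same setup as the example above: after the one-time scale() call you draw in CSS pixels, with no per-coordinate multiplication:
var canvas = document.getElementById('canvas');
var ctx = canvas.getContext('2d');
var screenDensity = window.devicePixelRatio;

// backing store in device pixels, displayed size in CSS pixels
canvas.style.width = '100%';
canvas.style.height = '100%';
canvas.width = canvas.offsetWidth * screenDensity;
canvas.height = canvas.offsetHeight * screenDensity;

// one global scale instead of multiplying every coordinate
// (call after sizing - resizing the canvas resets the transform)
ctx.scale(screenDensity, screenDensity);

ctx.beginPath();
ctx.moveTo(42, 0); // CSS-pixel coordinates now
ctx.lineTo(42, 4);
ctx.closePath();
ctx.strokeStyle = '#000000';
ctx.lineWidth = 2; // CSS pixels as well
ctx.stroke();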

WebGL gl.viewport change

I have a problem keeping canvas resizing and gl.viewport in sync.
Let's say I start with a 300x300 canvas and initialize gl.viewport to the same size (gl.viewport(0, 0, 300, 300)).
After that, in the browser's console, I run my tests:
I change the size of my canvas using jQuery, calling something like $("#scene").width(200).height(200).
After this, I call my resizeWindow function:
function resizeWindow(width, height){
    var ww = width === undefined ? w.gl.viewportWidth : width;
    var h = height === undefined ? w.gl.viewportHeight : height;
    h = h <= 0 ? 1 : h;
    w.gl.viewport(0, 0, ww, h);
    mat4.identity(projectionMatrix);
    mat4.perspective(45, ww / h, 1, 1000.0, projectionMatrix);
    mat4.identity(modelViewMatrix);
}
This function is supposed to synchronize the viewport with the required dimensions.
Unfortunately, after this call my gl.viewport covers only part of my canvas.
Could anyone tell me what is going wrong?
There is no such thing as gl.viewportWidth or gl.viewportHeight.
If you want to set your perspective matrix you should use canvas.clientWidth and canvas.clientHeight as your inputs to perspective. Those will give you the correct results regardless of what size the browser scales the canvas to, as in when you let the canvas auto-scale with CSS:
<canvas style="width: 100%; height:100%;"></canvas>
...
var width = canvas.clientWidth;
var height = Math.max(1, canvas.clientHeight); // prevent divide by 0
mat4.perspective(45, width / height, 1, 1000, projectionMatrix);
As for the viewport: use gl.drawingBufferWidth and gl.drawingBufferHeight. That's the correct way to find the size of your drawing buffer:
gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
Just to be clear, there are several things conflated here:
canvas.width, canvas.height = size you requested the canvas's drawingBuffer to be
gl.drawingBufferWidth, gl.drawingBufferHeight = size you actually got. In 99.99% of cases this will be the same as canvas.width, canvas.height.
canvas.clientWidth, canvas.clientHeight = size the browser is displaying your canvas.
To see the difference
<canvas width="10" height="20" style="width: 30px; height: 40px"></canvas>
or
canvas.width = 10;
canvas.height = 20;
canvas.style.width = "30px";
canvas.style.height = "40px";
In these cases canvas.width will be 10, canvas.height will be 20, canvas.clientWidth will be 30, canvas.clientHeight will be 40. It's common to set canvas.style.width and canvas.style.height to a percentage so that the browser scales it to fit whatever element it is contained in.
On top of that there are the two things you brought up:
viewport = generally you want this to be the size of your drawingBuffer
aspect ratio = generally you want this to be the size your canvas is displayed at
Given those definitions, the width and height used for the viewport are often not the same as the width and height used for the aspect ratio, as the sketch below shows.
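Putting both together, a resize function along these lines is a reasonable sketch (it assumes the same mat4 library as the question, and multiplies by devicePixelRatio as described at the top of this page):
function resize(gl, projectionMatrix) {
    var canvas = gl.canvas;
    var dpr = window.devicePixelRatio || 1;

    // match the drawing buffer to the displayed size, in device pixels
    canvas.width = Math.round(canvas.clientWidth * dpr);
    canvas.height = Math.round(canvas.clientHeight * dpr);

    // viewport: the size of the drawing buffer you actually got
    gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);

    // aspect ratio: the size the canvas is displayed at
    var aspect = canvas.clientWidth / Math.max(1, canvas.clientHeight);
    mat4.perspective(45, aspect, 1, 1000, projectionMatrix);
}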
