A client wants my app to use real-world dimensions (it's a modelling app). However, I'm finding it difficult to get the dimensions of the screen itself, rather than of the entire iPad unit (which are available from Apple). I'll need this for all iPad versions, including the mini. Does anyone have a source for these?
7.76 inches (197 mm) high by 5.82 inches (147 mm) wide.
I'm trying to programmatically adapt my website's image sizes for different-sized devices, but I'm having trouble telling what sizes I actually need. In the Google Chrome emulator I'm seeing some of my images upscaled, e.g. on iPhone 6 from a natural 230x230 to 357x357 displayed. The image takes up nearly the entire width of the emulated screen and looks only slightly degraded, suggesting the iPhone 6's width isn't much larger than 357 pixels.
But Apple says the iPhone 6 has a resolution of 750x1334! If that were true, the image should look much worse, I would think.
I've found some contradictory information on iPhone 4 as well.
This site talks about iPhone 4 at 640x960 pixels. Chrome emulator again shows it at half those dimensions, 320x480.
This Stack Overflow question says that "the iPhone screen is 320x480 definitely."
What am I missing here? Why do some sources (including Apple) supply dimensions that are twice what Chrome emulator (and my images) say?
Relax, you're about to understand this mess. Just notice that 2 * 375x667 = 750x1334.
A pixel is not a pixel
The key thing is: one device pixel is different from one CSS pixel.
They are the same on low pixel density devices like your computer screen (96 dpi). However, high pixel density devices like smartphones and printers (upwards of 160 dpi) try to follow the W3C CSS3 spec's guidance that one CSS pixel should always be close to 1/96th of an inch (about 0.26 mm) when viewed from the usual distance (arm's length).
They don't obey the spec to the letter, since that would mean 1px being exactly 1/96th of a real inch in high-DPI settings, which as far as I know no browser has ever implemented. They do, however, keep their CSS pixels from becoming minuscule at very high pixel densities by making one CSS pixel equal to two or more device pixels.
Chrome Device Mode works with CSS pixels, which are what you should use to design text, navbars, headings, etc., but not high-resolution images. For those, read the next section.
Note that Chrome Device Mode also shows you the device scale, i.e. how many device pixels make up one CSS pixel.
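If you want to check the scale on a real device rather than trust the emulator, every modern browser exposes it to scripts. A minimal console snippet (TypeScript):

// Viewport width in CSS pixels and the device scale factor.
// On an iPhone 6 this prints 375 CSS px at a ratio of 2,
// i.e. 750 physical device pixels across.
const cssWidth: number = window.innerWidth;     // CSS pixels
const ratio: number = window.devicePixelRatio;  // device pixels per CSS pixel
console.log(`${cssWidth} CSS px * ${ratio} = ${cssWidth * ratio} device px`);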
Fixing image resolution
As you already know, this hurts images, since the browser scales them up along with everything else: your 230x230 CSS-pixel picture is painted across 460x460 device pixels without gaining any detail. To fix that, use the srcset attribute to give the browser links to different-resolution files of the same image.
Example (adapted from the link above):
<img src="wolf-400.jpg" srcset="wolf-400.jpg 400w, wolf-800.jpg 800w, wolf-1600.jpg 1600w">
An iPhone 6 will look at that and think "oh, I pretend to be 375px wide but I'm actually 750px, so I'll download wolf-800.jpg."
Just don't forget to keep src="" for compatibility. Also, unless you use sizes="", the browser assumes the image will be displayed at the full width of the viewport.
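For intuition, here is a rough sketch of the selection logic (not the exact algorithm from the spec; the wolf-*.jpg names are just the example files above):

interface Candidate { url: string; width: number }  // width = the "400w" descriptor

// Roughly what the browser does: take the image's layout width in CSS pixels
// (from sizes="", or the viewport width by default), multiply by the device
// scale to get the device pixels needed, then pick the smallest candidate
// that is at least that wide.
function pickCandidate(candidates: Candidate[], layoutCssWidth: number, scale: number): Candidate {
  const needed = layoutCssWidth * scale;
  const sorted = [...candidates].sort((a, b) => a.width - b.width);
  return sorted.find(c => c.width >= needed) ?? sorted[sorted.length - 1];
}

const wolves = [
  { url: "wolf-400.jpg", width: 400 },
  { url: "wolf-800.jpg", width: 800 },
  { url: "wolf-1600.jpg", width: 1600 },
];

// iPhone 6: 375 CSS px viewport at scale 2 needs 750 device px -> wolf-800.jpg
console.log(pickCandidate(wolves, 375, 2).url);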
Apple writes in its user interface guidelines that a tappable element should be at least 44 x 44 points, and provides the Calculator app as an example. When I measured its tappable buttons I got a height of 10 mm / 38 px, which tools I found online convert to approximately 29 PostScript points.
My question is what does Apple mean by 44 points? How do I convert this correctly?
Apple's use of 'point' is not the same as a typographical point. It's an abstract unit that doesn't have a fixed correlation to pixels, nor to any physical measurement (e.g., millimeters).
The first part of that is easy to see; a point width on a retina iPhone covers two pixels, twice as many as on a non-retina iPhone.
The iPad Mini makes the second part of that first statement pretty clear. From an iOS developer's point of view, 44 points is the same measurement on an iPad 2 as it is on the mini. But to a user, those represent very different physical (millimeter) sizes.
There's a bit more written under Points vs Pixels in their drawing guide.
It does seem a bit strange for the HIG to prescribe sizes that don't correspond to, well, humans. In practice, for a UX designer, your approximations are probably fine.
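To make the iPad 2 vs. mini comparison concrete, here is the arithmetic in a short sketch, using the commonly quoted panel densities of 132 and 163 ppi (approximate values):

// Physical size of a target: points -> pixels (via the scale factor)
// -> inches (via the panel's pixels per inch) -> millimetres.
function pointsToMm(points: number, scale: number, ppi: number): number {
  const pixels = points * scale;
  return (pixels / ppi) * 25.4;
}

// Both screens are 1024x768 points at scale 1, yet 44 points differs physically:
console.log(pointsToMm(44, 1, 132).toFixed(1)); // iPad 2:    ~8.5 mm
console.log(pointsToMm(44, 1, 163).toFixed(1)); // iPad mini: ~6.9 mm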
A point is 1/72 of an inch, so 44 points is 0.6 inch or 1.5 cm. I don't think Apple can change this definition all by themselves.
I designed my first iPad app.
But when I run it in the simulator, the resolution is really small, smaller than a real iPad's resolution. Right now I don't have an iPad and can't test the app on one.
What is the problem with the resolution?
UPDATE: I connected a big display to my MacBook, but the resolution is still small. I think that's because the MacBook's display is primary and the big monitor is secondary. How can I change that?
The problem is that the physical resolution of your screen is lower than the physical resolution of the iPad's screen. For example, my iMac's screen has 1920 pixels per ~19 inches of width, or roughly 100 pixels per inch. The iPad's screen density is about 130 ppi, a third higher. If you're serious about your app, you have to test on the real device anyway. (By the way, if you are simply talking about the window size, that can be changed in the simulator's Window menu or using the ⌘1–⌘3 shortcuts.)
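If you want to verify the mismatch yourself, density is just pixel count over physical size. A small sketch (the 24-inch desktop screen is illustrative):

// Pixels per inch from a resolution and the screen's diagonal in inches.
function ppi(widthPx: number, heightPx: number, diagonalInches: number): number {
  return Math.hypot(widthPx, heightPx) / diagonalInches;
}

console.log(ppi(1024, 768, 9.7).toFixed(0)); // iPad (1st/2nd gen): ~132 ppi
console.log(ppi(1920, 1200, 24).toFixed(0)); // a 24" desktop panel: ~94 ppi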
I'm planning on making my first game in XNA (a simple 2D game), and I wonder which screen resolution would be appropriate to target.
Resolution for a 2D game is a difficult issue.
Some people ignore it. World of Goo (for PC), for one very famous example, simply always runs at 800x600 on the PC, no matter what. And look how successful it was.
It helps to think about what kind of device you will be targeting. Here are some common resolutions and the devices they apply to:
1280x720 (720p, Xbox 360 "safe" resolution - free hardware scaling, works everywhere)
1920x1080 (1080p, Xbox 360 maximum resolution - can't auto-scale to all resolutions)
800x480 (Windows Phone 7)
1024x768 (iPad)
480x320 (iPhone 3GS and earlier)
960x640 (iPhone 4 retina display)
Android devices also have similar resolutions to WP7 and iOS devices.
(Note that consoles require you to render important elements inside a "title-safe" or "action-safe" area, typically 80% and 90% of the full resolution respectively.)
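A centered safe region is just a scaled-down rectangle. A sketch, using the conventional fractions above (check your console's certification docs for the exact values):

interface Rect { x: number; y: number; width: number; height: number }

// Centered inner rectangle covering `fraction` of each screen dimension,
// e.g. 0.8 for title-safe, 0.9 for action-safe.
function safeArea(screenW: number, screenH: number, fraction: number): Rect {
  const width = Math.round(screenW * fraction);
  const height = Math.round(screenH * fraction);
  return { x: (screenW - width) / 2, y: (screenH - height) / 2, width, height };
}

console.log(safeArea(1280, 720, 0.8)); // { x: 128, y: 72, width: 1024, height: 576 }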
Here is the Valve Hardware Survey, which you can see lists the common PC resolutions (under "Primary Display Resolution").
Targeting 800x480 for a mobile game, or 1280x720 for a desktop/console game, is a good place to start.
If you do want to support multiple resolutions, it is important to think about aspect ratio. Here is an excellent question that lists off some options. Basically your options are letter/pillar-boxing or bleeding (allowing for extra rendering outside "standard" screen bounds - like a title-safe area), or some combination of the two.
If your graphics need to be "pixel perfect" and simply scaling them won't work, then I would recommend targeting a series of base resolutions, and then boxing/bleeding to cover any excess screen on a particular device. When I do this, I usually provide assets for these target screen heights: 320, 480, 640, 720, 1080. Note that providing 5 versions of each asset is a huge amount of work - so try to use scaling wherever possible.
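As a sketch of that "base resolutions plus boxing" approach, using the asset heights listed above (picking the largest base that fits is one reasonable rule, not the only one):

const baseHeights = [320, 480, 640, 720, 1080];

// Pick the largest asset height that does not exceed the screen height,
// scale assets up to fill it, and letter/pillar-box whatever is left over.
function chooseAssets(screenH: number): { base: number; scale: number } {
  const fitting = baseHeights.filter(h => h <= screenH);
  const base = fitting.length > 0 ? fitting[fitting.length - 1] : baseHeights[0];
  return { base, scale: screenH / base };
}

console.log(chooseAssets(768)); // { base: 720, scale: ~1.07 }
console.log(chooseAssets(480)); // { base: 480, scale: 1 }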
Many choices about resolution handling will depend on what style of game you are making. For example: whether you try to match a horizontal or vertical screen size will depend largely on what direction your game will mostly scroll in.
When I first started with C++ graphics I used 320*240, or 800*600 when I had to use larger images. But it's really up to you, whatever you prefer, as long as you don't use odd values like 123*549.
'normal' resolutions include but are not limited to:
160*120
320*240
640*480 (probably the most common)
800*600
1024*768
I'm developing a reader application for Bada and have a silly question.
Does a clean way to convert pt size to pixel size exist?
I found something like this, but I'm still hoping there's a formula you can just apply and be done with it.
Points are a "real-world" length unit (they are generally defined as 1/72 in), but pixels do not have a definite real world size, since this depends on the resolution of the device.
For example, the pixels on my screen are about 0.3 mm wide, while the ones of my phone are about 0.15 mm, and the "pixels" of my laser printer are 0.02 mm wide. Thus, to go from pixels to real world units, you need the resolution of the specific device, i.e. the pixels/real world unit ratio, which, most often, is expressed in DPI (dots per inch, where "dot" is intended as "pixel" for devices that work with pixels).
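With the device's DPI in hand, the conversion itself is one line. A minimal sketch:

// 1 pt = 1/72 in, so: pixels = points * (pixels per inch) / 72.
function ptToPx(pt: number, dpi: number): number {
  return pt * dpi / 72;
}

console.log(ptToPx(12, 72));  // 12 px: at 72 dpi, 1 pt equals 1 px
console.log(ptToPx(12, 300)); // 50 px on a 300 dpi device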
When dealing with printing/scanning devices the "real world size" is important, so it's almost always provided by the OS in some way and is correct; on the other hand, with screens the situation is quite different.
In most situations you don't really care about the "real world size" of stuff displayed on screens, since no one is ever really measuring anything on the screen. Also, onscreen layouts are often partly done in pixels for a variety of reasons (simplicity being the first).
On the other hand, text and other elements' sizes are often specified in points, twips and other "real world units", and in general good window layouts should be done in "real world units" so they adapt easily to screens with high pixel densities, where pixel-based layouts would be unreadable.
For this reason, the OS usually provides a DPI value for the screen, but in general it's left at the same default value (usually 72 DPI) regardless of the actual attached screen (partly to avoid breaking badly designed interfaces), while remaining configurable so the user can adjust it to a comfortable value.
As for Bada, I read here that the OS provides neither a real nor a "fake" DPI value, so there's no real way to convert from points to pixels. On the other hand, you could simply use the usual "default" value of 72 DPI for your conversions. Notice that 72 DPI isn't an arbitrary choice: at 72 pixels per inch and 72 points per inch, one point comes out equal to one pixel. Not correct, but in your case "good enough".
Assuming 72 DPI for Bada is not a good choice, since modern mobile devices have pixel densities around 200-300 DPI.
Unfortunately, Bada went the iPhone way: there are only a few devices, each with fixed features, and you release your application for each of them. That means you can visit Samsung's website, look up the real size of each device's screen, compute the real DPI yourself, and store it in a table. At runtime, you can get the device name and look it up in your table.
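A sketch of that table; the model name and DPI below are illustrative, so verify them against Samsung's published specs (a 480x800 panel at 3.3 inches works out to roughly 283 ppi):

// Hand-maintained table: device model name -> screen DPI computed from the
// published physical screen size. Entries here are illustrative examples.
const deviceDpi: Record<string, number> = {
  "GT-S8500": 283, // Samsung Wave: 480x800 on a 3.3" panel (approximate)
};

const FALLBACK_DPI = 233; // assumed default for models missing from the table

function ptToPixels(pt: number, modelName: string): number {
  const dpi = deviceDpi[modelName] ?? FALLBACK_DPI;
  return pt * dpi / 72; // 1 pt = 1/72 in
}

console.log(Math.round(ptToPixels(12, "GT-S8500"))); // ~47 px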
AFAIK you have to upload your application to the Bada shop for each device separately, and the Bada SDK assumes you will compile a different build for each target. A target is specified by screen resolution, so I guess you can expect the real screen to be that size.
Well, I think this design is stupid, but it might really be the way they expect you to develop for their platform.