Apple's Human Interface Guidelines say that a tappable element should be at least 44 x 44 points, and they give the Calculator app as an example. When I measured the tappable buttons I got a height of 10 mm / 38 px, which the tools I found online convert to approximately 29 PostScript points.
My question is what does Apple mean by 44 points? How do I convert this correctly?
Apple's use of 'point' is not the same as a typographical point. It's an abstract unit that doesn't have a fixed correlation to pixels, nor to any physical measurement (e.g., millimeters).
The first part of that is easy to see; a point width on a retina iPhone covers two pixels, twice as many as on a non-retina iPhone.
The iPad Mini makes the second part of that first statement pretty clear. From an iOS developer's point of view, 44 points is the same measurement on an iPad 2 as it is on the mini. But to a user, those represent very different physical (millimeter) sizes.
There's a bit more written under Points vs Pixels in their drawing guide.
It does seem a bit strange for the HIG to prescribe sizes that don't correspond to, well, humans. In practice, for a UX designer, your approximations are probably fine.
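To make the difference concrete, here is a minimal Swift sketch of the point → pixel → millimetre chain, assuming looked-up density values of 132 ppi for the iPad 2 and 163 ppi for the original iPad mini (iOS does not expose the screen's physical density, so the ppi figures are hard-coded assumptions):

```swift
import CoreGraphics

// Rough sketch: convert a size in points to millimetres on a given screen.
// `scale` is the screen's point-to-pixel factor; `ppi` is the hardware density.
func millimetres(forPoints points: CGFloat, scale: CGFloat, ppi: CGFloat) -> CGFloat {
    let pixels = points * scale      // points -> pixels via the screen scale
    let inches = pixels / ppi        // pixels -> inches via the hardware density
    return inches * 25.4             // inches -> millimetres
}

// The same 44 pt target on two non-Retina screens (scale 1):
let onIPad2    = millimetres(forPoints: 44, scale: 1, ppi: 132)  // ≈ 8.5 mm
let onIPadMini = millimetres(forPoints: 44, scale: 1, ppi: 163)  // ≈ 6.9 mm
```

So the same 44 points comes out noticeably smaller, in millimetres, on the denser screen.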
A point is 1/72 of an inch, so 44 points is about 0.61 inch, or roughly 1.55 cm. I don't think Apple can change this definition all by themselves.
Related
I've spent hours searching on this topic, but nothing deals exactly with my problem. Maybe you can help me.
I want to ask what the proper way is to design a layout that supports all the iPhones, ideally in the easiest way possible.
My style is to design for the smallest device first and then scale things up for the other devices. Is that alright, or is this not how it is supposed to be done?
My problem is that I am creating a layout that should scale up according to display size.
Take, for example, the original iPhone, which has a screen of 320 x 480 points (equal to 320 x 480 pixels), and the iPhone 6, which has a screen of 375 x 667 points (750 x 1334 pixels because of the higher pixel density).
Now I want a button, text box, or rectangle (whatever) designed on the original iPhone to get bigger when the app runs on a bigger device like the iPhone 6.
So if I have a button with a height of 30 pt on the original iPhone, it should be bigger on the iPhone 6. But how much bigger?
Do I have to convert those 30 pt into the equivalent on the bigger device myself every time, or is there a better method that works automatically, so I don't have to recalculate all the dimensions for every device? What is the correct way?
By the way: I am aware that when designing icons, buttons, or anything that is a .png file, I have to create them at 1x, 2x and 3x dimensions so they scale on all devices, but how do I work with these while constructing the actual layout?
What is the logic, or how is it meant to be done?
THIS PICTURE shows the dimension proportions I found on the web and gives a little idea of my problem below.
EDIT: Please look at this website: https://designcode.io/iosdesign-guidelines
It seems there are some of Apple's measurement standards given in points (pt).
For example, one of the pictures shows that the margin from the sides should be 8 pt. Since pt is a universal unit, I would expect it to adapt on other devices. So if I set a margin of 8 in the editor, will it translate to a different number of pixels on other devices so that it looks visually the same?
If you calculate every view like this one, it will not work.
Just set constraints on the view that you want to behave consistently on other devices.
For example, if you fix a view's height at 30, it will be 30 on every device; it is not going to be bigger or smaller on other devices.
But if you instead pin the view 20 points from the right, 20 points from the left, and 10 points from the top bar and the bottom layout guide, then the view keeps the same proportions on every device.
I hope that makes clear what I want to say.
Please comment if anything is still unclear. :)
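To illustrate the difference, here is a minimal Swift sketch using Auto Layout anchors (the view names and colours are placeholders; the same constraints could just as well be set in Interface Builder):

```swift
import UIKit

final class ExampleViewController: UIViewController {
    override func viewDidLoad() {
        super.viewDidLoad()

        // A view pinned to the edges: only the margins (in points) are fixed,
        // so its actual size follows the screen size.
        let stretchyView = UIView()
        stretchyView.backgroundColor = .systemBlue
        stretchyView.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(stretchyView)

        // A button with a fixed height: it is 30 pt tall on every device.
        let fixedButton = UIButton(type: .system)
        fixedButton.setTitle("Always 30 pt tall", for: .normal)
        fixedButton.translatesAutoresizingMaskIntoConstraints = false
        view.addSubview(fixedButton)

        NSLayoutConstraint.activate([
            // 20 pt from the left and right, 10 pt below the top.
            stretchyView.leadingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.leadingAnchor, constant: 20),
            stretchyView.trailingAnchor.constraint(equalTo: view.safeAreaLayoutGuide.trailingAnchor, constant: -20),
            stretchyView.topAnchor.constraint(equalTo: view.safeAreaLayoutGuide.topAnchor, constant: 10),
            stretchyView.heightAnchor.constraint(equalTo: view.heightAnchor, multiplier: 0.3),

            // Fixed 30 pt height, centred horizontally below the stretchy view.
            fixedButton.topAnchor.constraint(equalTo: stretchyView.bottomAnchor, constant: 10),
            fixedButton.centerXAnchor.constraint(equalTo: view.centerXAnchor),
            fixedButton.heightAnchor.constraint(equalToConstant: 30)
        ])
    }
}
```

On a wider screen the pinned view simply gets wider; the fixed-height button stays 30 pt everywhere.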
A client wants my app to use real-world dimensions (it's a modelling app). However, I am finding it difficult to get the dimensions of the screen itself, as opposed to the entire iPad unit (which are available from Apple). I'll need this for all iPad versions, including the Mini. Does anyone have a source for these?
7.76 inches high and 5.82 inches wide, i.e. roughly 197 mm high by 147 mm wide.
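If you want to derive (or double-check) such figures yourself, you can compute them from the pixel resolution and the pixels-per-inch value Apple publishes in the tech specs. A quick sketch, assuming 2048 x 1536 px at 264 ppi for the full-size Retina iPad:

```swift
import CoreGraphics

// Physical display size = pixel resolution / pixels per inch.
func displaySizeInInches(pixelsWide: CGFloat, pixelsHigh: CGFloat, ppi: CGFloat) -> (width: CGFloat, height: CGFloat) {
    (pixelsWide / ppi, pixelsHigh / ppi)
}

// Assumed figures for the full-size Retina iPad: 1536 x 2048 px portrait at 264 ppi.
let retinaIPad = displaySizeInInches(pixelsWide: 1536, pixelsHigh: 2048, ppi: 264)
// retinaIPad ≈ (width: 5.82, height: 7.76) inches — matching the dimensions above
```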
We are currently working with a designer who is supplying Retina images to us with odd dimensions, e.g. 28 x 15 px, which I believe is incorrect, as when you halve it you get a fractional size like 14 x 7.5 px.
This is a rule I have always worked to, but the designer is not getting the point, and I thought I should double-check what the exact rules are.
I've had a look on the web but cannot seem to find any references on this, so it would be great to hear what everyone thinks on this matter.
Thanks
Yes you can, but it is NOT recommended.
For example, if you have a @2x image of 28 x 15 px, your normal image will have to be 14 x 8 px (rounded up from 7.5).
If you look closely at the normal image, the pixels will not be aligned well.
It is always recommended to use an even number of pixels in each dimension.
The @2x image needs to be exactly twice the width and twice the height of the standard image, or the automatic loading of it won't happen - your app will load and pixel-double the non-Retina image.
The standard image file will, as a matter of course, be a whole number of pixels wide and high, so you'll need the @2x to be even in its dimensions.
Tell your designer to catch on ;)
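If it helps settle the argument, here is a tiny sketch (a hypothetical helper, not part of any Apple API) that flags @2x assets whose pixel dimensions aren't both even:

```swift
import CoreGraphics

// A @2x asset only halves cleanly to a 1x asset if both of its pixel
// dimensions are even numbers.
func isValidAt2xSize(_ pixelSize: CGSize) -> Bool {
    Int(pixelSize.width) % 2 == 0 && Int(pixelSize.height) % 2 == 0
}

isValidAt2xSize(CGSize(width: 28, height: 15))  // false — 15 px can't be halved to whole pixels
isValidAt2xSize(CGSize(width: 28, height: 16))  // true  — the 1x version is 14 x 8 px
```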
It's not possible, because in Xcode you design your application against the standard-resolution pictures, and you can't use a fractional width or height. So you would end up with a one-pixel gap between your standard and Retina designs. Maybe the easiest way to solve your problem is to add a transparent row of pixels to your high-resolution picture.
When programming for the iPad, font (and other) sizes are specified in "points." I have seen reference to a point as a pixel that is independent of screen resolution. But I am having trouble finding definite confirmation of how big a point is in real terms (that is, in terms of inches). Is a point equal to one pixel on the standard iPad screen, so 1pt = 1/132in? And then, to confirm, this means that an "iOS point" is a different unit than the printer's point = 1/72in?
Thanks.
See here and here (scroll down to points vs. pixels) for the official word. Basically, a point is one pixel on a non-retina device (so the size varies between the iPad and the iPhone - it isn't related to a printer's point) and 2 pixels on a retina device (which has twice the number of pixels in each direction).
Drawing and positioning is done in points to allow the same code to run on both types of device - the frameworks will fill in the gaps to make drawing smoother on retina devices.
An iPad point is different from an iPhone point, which is different from a printer's point, to answer your question.
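In code, the screen's scale factor is what bridges the two units; a short sketch using UIScreen's standard scale property to see how many pixels a point covers on the current device, and to draw a border that is exactly one pixel thick:

```swift
import UIKit

// 1.0 on non-Retina screens, 2.0 (or 3.0 on later iPhones) on Retina screens.
let scale = UIScreen.main.scale

// How many pixels a 44 pt touch target spans on this particular screen.
let pixelsPerSide = 44 * scale

// A border exactly one *pixel* thick regardless of device: divide by the
// scale so the point value shrinks as pixel density goes up.
let box = UIView(frame: CGRect(x: 0, y: 0, width: 100, height: 44))
box.layer.borderWidth = 1.0 / scale
```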
I'm developing a reader application for Bada and have a silly question.
Is there a clean way to convert a pt size to a pixel size?
I found something like this, but I'm still hoping there is some formula I could apply and be happy with.
Points are a "real-world" length unit (they are generally defined as 1/72 in), but pixels do not have a definite real world size, since this depends on the resolution of the device.
For example, the pixels on my screen are about 0.3 mm wide, while the ones of my phone are about 0.15 mm, and the "pixels" of my laser printer are 0.02 mm wide. Thus, to go from pixels to real world units, you need the resolution of the specific device, i.e. the pixels/real world unit ratio, which, most often, is expressed in DPI (dots per inch, where "dot" is intended as "pixel" for devices that work with pixels).
When dealing with printing/scanning devices the "real world size" is important, so it's almost always provided by the OS in some way and is correct; on the other hand, with screens the situation is quite different.
In most situations you don't really care about the "real world size" of stuff displayed on screens, since no one is ever really measuring anything on the screen. Also, onscreen layouts are often partly done in pixels for a variety of reasons (simplicity being the first).
On the other hand, text and other elements' sizes are often specified in points, twips and other "real-world units", and in general good window layouts should be done in "real-world units" so they adapt easily to screens with high pixel densities, where pixel-based layouts would be unreadable.
For this reason, the OS usually provides a DPI value for the screen, but in general it is left at the same default value (usually 72 DPI) regardless of the screen actually attached (also to avoid breaking badly designed interfaces), while remaining configurable so the user can adjust it to a comfortable value.
As for Bada, I read here that the OS provides neither a real nor a "fake" DPI value, so there's no real way to convert from points to pixels. On the other hand, you could simply use the usual "default" of 72 DPI for your conversions. Notice that 72 DPI wasn't chosen by chance: with 72 pixels per inch and 72 points per inch, you can simply assume that a point equals a pixel. Not correct, but in your case "good enough".
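Put as a formula, pixels = points × DPI ÷ 72. A minimal sketch (written in Swift for consistency with the other examples here rather than Bada's own C++ API), with the DPI value as an explicit assumption you have to supply:

```swift
// pixels = points * dpi / 72, rounded to the nearest whole pixel.
// With the 72 DPI fallback discussed above, a point simply maps to one pixel.
func pixels(fromPoints points: Double, dpi: Double = 72.0) -> Int {
    Int((points * dpi / 72.0).rounded())
}

pixels(fromPoints: 12)             // 12 px with the 72 DPI fallback
pixels(fromPoints: 12, dpi: 300)   // 50 px on a (hypothetical) 300 DPI screen
```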
Assuming 72 DPI for Bada is not a good choice, since modern mobile devices have DPIs around 200-300.
Unfortunately, Bada went the iPhone way: there are only a few devices, each with fixed features, and you release your application for each device. That means you can visit Samsung's website, read the real physical size of each screen, compute the real DPI yourself for each device, and store it in a table. At runtime, you can get the device name and look it up in your table.
AFAIK you have to upload your application to the Bada shop for each device separately, and the Bada SDK assumes you will compile a different application for each target. A target is specified by screen resolution, so I guess you can expect the real screen to be that size.
Well, I think this design is stupid, but it might really be the way they expect you to develop for their platform.
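A sketch of that lookup-table idea (again in Swift for consistency; the device names and DPI figures are placeholders to be filled in from the manufacturer's published screen specifications):

```swift
// Placeholder table: real entries would come from the published screen specs
// of each supported device.
let dpiByDeviceName: [String: Double] = [
    "Device A": 283.0,
    "Device B": 233.0,
]

// Fall back to 72 DPI (point == pixel) when the device is unknown.
func dpi(forDevice name: String, fallback: Double = 72.0) -> Double {
    dpiByDeviceName[name] ?? fallback
}

let pixelSize = 12.0 * dpi(forDevice: "Device A") / 72.0   // points -> pixels via the looked-up DPI
```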