[personal profile] mjg59
In the past I used to argue that accurate desktop DPI was important, since after all otherwise you could print out a 12 point font and hold up the sheet of paper to your monitor and it would be different. I later gave up on the idea that accurate DPI measurement was generally useful to the desktop, but there's still something seductively attractive about the idea that a 12 point font should be the same everywhere, and someone recently wrote about that. Like many seductively attractive ideas it's actually just trying to drug you and steal your kidneys, so here's why it's a bad plan.

Solutions solve problems. Before we judge whether a solution is worthwhile or not, we need to examine the problem that it's solving. "My 12pt font looks different depending on the DPI of my display" isn't a statement of a problem, it's a statement of fact. How do we turn it into a problem? "My 12pt font looks different depending on which display I'm using and so it's hard to judge what it'll look like on paper" is a problem. Way back in prehistory when Apple launched the original Mac, they defined 72dpi as their desktop standard because the original screen was approximately 72dpi. Write something on a Mac, print it and hold the paper up to the monitor and the text would be the same size. This sounds great! Except that (1) this is informative only if your monitor is the same distance away as your reader will be from the paper, and (2) even in the classic Mac period of the late 80s and early 90s, Mac monitors varied between 66 and 80dpi and so this was already untrue. It turns out that this isn't a real problem. It's arguably useful for designers to be able to scale their screen contents to match a sheet of paper the same distance away, but that's still not how they'll do most of their work. And it's certainly not an argument for enforcing this on the rest of the UI.

So "It'll look different on the screen and paper" isn't really a problem, and so that's not what this is a solution for. Let's find a different problem. Here's one - "I designed a UI that works fine on 100DPI displays but is almost unusable on 200DPI displays". This problem is a much better one to solve, because it actually affects real people rather than the dying breed who have to care about what things look like when turned into ink pasted onto bits of dead tree. And it sounds kind of like designing UIs to be resolution independent would be a great solution to this. Instead of drawing a rectangle that's 100 pixels wide, let me draw one that's one inch wide. That way it'll look identical on 100dpi and 200dpi systems, and now I can celebrate with three lines of coke and wake up with $5,000 of coffee table made out of recycled cinema posters or whatever it is that designers do these days. A huge pile of money from Google will be turning up any day now.

Stop. Do not believe this.

Websites have been the new hotness for a while now, so let's apply this to them. Let's imagine a world in which the New York Times produced an electronic version in the form of a website, and let's imagine that people viewed this website on both desktops and phones. In this world the website indicates that content should be displayed in a 12pt font, both the desktop and phone software stacks render this at an identical physical size, and as such the site would look identical if the desktop's monitor and the phone were the same distance away from me.

The flaw in this should be obvious. If I'm reading something on my phone then the screen is a great deal closer to me than my desktop's monitor usually is. If the fonts are rendered identically on both then the text on my phone will seem unnecessarily large and I won't get to see as much content. I'll end up zooming out and now your UI is smaller than you expected it to be, and if your design philosophy was based on the assumption that the UI would be identical on all displays then there's probably now a bunch of interface that's too small for me to interact with. Congratulations. I hate you.

So "Let's make our fonts the same size everywhere" doesn't solve the problem, because you still need to be aware of how different device types are used differently and ensure that your UI works on all of them. But hey, we could special case that - let's have different device classes and provide different default font sizes for each of them. We'll render in 12pt on desktops and 7pt on phones. Happy now?

Not really, because it still makes this basic assumption that people want their UI to look identical across different machines with different DPI. Some people do buy high-DPI devices because they want their fonts to look nicer, and the approach Apple have taken with the Retina Macbook Pro is clearly designed to cater to that group. But other people buy high-DPI devices because they want to be able to use smaller fonts and still have them be legible, and they'll get annoyed if all their applications just make the UI larger to compensate for their increased DPI. And let's not forget the problem of wildly differing displays on the same hardware. If I have a window displaying a 12pt font on my internal display and then drag that window to an attached projector, what size should that font be? If you say 12pt then I really hope that this is an accurate representation of your life, because I suspect most people have trouble reading a screen of 12pt text from the back of an auditorium.

That covers why I think this approach is wrong. But how about why it's dangerous? Once you start designing with the knowledge that your UI will look the same everywhere, you start enforcing that assumption in your design. 12pt text will look the same everywhere, so there's no need to support other font sizes. And just like that you've set us even further back in terms of accessibility support, because anyone who actually needs to make the text bigger only gets to do so if they also make all of your other UI elements bigger. Full marks. Dismissed.

The only problem "A 12pt font should be the same everywhere" solves is "A 12pt font isn't always the same size". The problem people assume it solves is "It's difficult to design a UI that is appropriate regardless of display DPI", and it really doesn't. Computers aren't sheets of paper, and a real solution to the DPI problem needs to be based on concepts more advanced than one dating back to the invention of the printing press. Address the problem, not your assumption of what the problem is.
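To make the size mismatch concrete: a point is defined as 1/72 of an inch, so the number of pixels a nominal 12pt glyph occupies is a pure function of DPI. A minimal sketch of the arithmetic (plain unit conversion, not any particular toolkit's API):

```python
def pt_to_px(points, dpi):
    """Convert a nominal point size to device pixels: 1pt = 1/72 inch."""
    return points * dpi / 72.0

# The same "12pt" request lands on very different pixel counts:
for dpi in (72, 96, 132, 220):
    print(dpi, "dpi ->", round(pt_to_px(12, dpi), 1), "px")
```

On a classic 72dpi Mac the conversion is the identity, which is exactly why 72dpi made screen and paper line up; at 220dpi the same request needs roughly three times as many pixels.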

Thank You!

Date: 2012-07-13 05:10 am (UTC)
From: (Anonymous)
A very thoughtful and, in my opinion, accurate analysis of the problem. I'm headed to bed now as it is quite late, but the first thing I'm doing in the morning is making everyone I know read this post.

DPI and distance

Date: 2012-07-13 06:09 am (UTC)
From: [personal profile] jjsarton
I have a notebook with a 132 DPI display. Because of this, normal web pages are very small, and the content must be zoomed before I get a sufficient font size. Given that, I would say that setting the font size to, for example, 12pt (or x em) will help. My desktop is configured to give a nice size for text output. If I attach a second display with a lower DPI, all text shown on the external screen will be too big; modern systems don't use a per-display DPI setting, which is wrong.
The problem you write about is primarily the viewing distance / output medium, and secondarily the user's preferences.
If systems looked first at the display DPI, then at the type of output device (computer display, television, tablet, e-reader: viewing distance), and finally at the user's preferences (small, medium, or large size), the problem could be solved in a straightforward way.
Unfortunately, most developers don't think about this.

Date: 2012-07-13 06:20 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
Excellent post. I hope you're cross-publishing this elsewhere because it's too good to stay on Dreamwidth and Dreamwidth alone. I actually thought I was reading a re-post from one of the bigger design websites and had to keep glancing up at your username to check that I wasn't.

This probably isn't too helpful to the overall point you're making, but my personal best answer to "12pt isn't 12pt everywhere you look", as someone who gets to decide what the font size is in certain situations, is to ignore the problem and just scale fonts by percentage. With the sheer variety of resolutions out there now, that lets you basically pick your own font size.

(I'm personally not a great fan of high DPI - just picked up an ancient laptop that has incredibly high screen res considering how small and old it is and what I don't like is how I have to keep it at very high res or everything looks so blurry. So I have sites of my own that work and read best at 1024 x 768 or 1440 x 900 displaying at, I don't know, I guess 1600 x 900 on this thing, and the text is so small that my eyes are straining - even with fonts scaled by percentage in the style sheets. Makes me wish there was a way to keep Firefox permanently zoomed in to compensate for it. I think a Retina display would actually kill me.)

Date: 2012-07-13 06:34 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
Doubling pixel size? How does one do that? And how does the renderer have "more" pixels - aren't they just "double the size" but still the same amount, if I'm following what you said correctly? Unfortunately I don't own an Apple anything so I haven't kept up with their past or current pixel plays - but after reading this I will read up on the Retina display (which I always look at at Best Buy, and I swear, for my own personal use, it looks like it would kill me - even Android is getting pretty bad in that regard, with the new tablets they have coming out).

Date: 2012-07-13 07:02 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
No, you've probably explained it not too badly. So sites I've designed to work best at 1440 x 900 will display as they would on a regular unRetina'd 1440 x 900 display unless I give the Retina the magic flag... I wonder if double pixel density in that case makes font hinting go through the roof. I've never been fond of Apple's dark, blurry fonts, but this is just making me envision more of the same.

That's right

Date: 2012-07-13 11:05 am (UTC)
From: (Anonymous)
As with most things, it's a little more complicated than that. You have the effective resolution, say 1440x900, and the physical resolution, 2880x1800. Apple has decoupled them. By default the effective resolution is half the physical resolution. The OS renders everything at twice the effective resolution and scales it to the physical resolution. For an effective resolution of 1440x900, there's no scaling. But you can set the effective to higher values (though not all the way up to 2880x1800). The result is a screen that remains easy to read but can "dig down" into higher resolution when needed.

Every GUI element has a scaling factor, on a retina display it's set to 2. But apps can set it to 1. So iPhoto (the Mac photo handling software) displays thumbnails and photos at much higher resolution. So does Aperture (their professional photo software). And, so does Final Cut X (their video software). That means a video editor can have a full 1080p copy of video running in a "standard-sized" UI.

It sounds like overkill until you look at what Windows does in comparison. Windows lets you adjust the scaling of fonts and reports DPI to applications. But when you run Windows on a retina display you get pretty, high-resolution, readable fonts (assuming you've upped the scaling factor) and tiny little GUI elements. Turns out Windows and Windows apps ignore the DPI completely. The result is a hot mess.
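The effective/physical split described above is simple arithmetic; here's a sketch of the mechanism as the comment describes it (a simplified model for illustration, not Apple's actual API):

```python
def retina_backing_store(effective, scale=2):
    """The OS renders the UI at effective_resolution * scale into a
    backing store, then scales that store down to the panel's
    physical resolution."""
    w, h = effective
    return (w * scale, h * scale)

# Default on a 2880x1800 panel: effective 1440x900 means the backing
# store is already 2880x1800, so no final rescale is needed.
assert retina_backing_store((1440, 900)) == (2880, 1800)

# Picking a higher effective resolution, e.g. 1680x1050: render at
# 3360x2100, then downscale to the physical 2880x1800.
print(retina_backing_store((1680, 1050)))
```

The downscale step is why higher effective resolutions stay readable: everything is still drawn with twice the detail before being fitted to the panel.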

Re: That's right

Date: 2012-07-14 06:16 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
You know what's an even hotter mess? That devs for various software and hardware outfits never came to an agreement years ago, long before all the possible resolutions came to be, on what DPI is. On how big a pixel is. On how a screen ought to exactly render anything you can throw at it.

If there had been manufacturer-agnostic standards in place to begin with (which I think is the better if much subtler point of mjg's article) then we wouldn't have so much sheer chaos and confusion today (it would also be easier for designers to design, which right now is a rather difficult task considering the sheer amount of cross-res testing we must do everywhere).

Google has one way, Apple another, MS yet another, so no one ever can or will design anything to an agreed-upon spec. It's frustrating, really, that the look of an app, program or web page on one size screen can't be easily scaled to another, with everyone knowing ahead of time exactly what the outcome will be, then making adjustments accordingly off of a wholly-expected result, not the current manufacturer-specific standard (in other words, this isn't just about MS, sorry to burst any bubble you're having over that).

They didn't think to scale app DPI for cross-OS compatibility? Big deal - meanwhile every software company on Earth screwed this up years ago by not agreeing in the first place on what the standards should be.
Edited (typos/cleanup) Date: 2012-07-14 06:20 am (UTC)

Date: 2012-07-13 08:03 am (UTC)
ideological_cuddle: (Default)
From: [personal profile] ideological_cuddle
The NoSquint addon may be your friend here.

Date: 2012-07-13 08:59 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
Yes, you're right, I forgot about that, but I did see it out in the wild years ago, long before this issue cropped up. Thanks for mentioning it! And at a glance I like your DW, so I'll be subscribing, if you don't mind.

Date: 2012-07-17 09:22 pm (UTC)
maco: pink sakura (Default)
From: [personal profile] maco
The problem you described in your last paragraph looks to me like the problem that is SOLVED by having correct DPI.

My netbook has 136DPI. The default fonts for the taskbar in Kubuntu Netbook are 7pt. X assumed my screen was 96DPI. Result? Impossibly small text (about a millimeter high!). As soon as I informed X that my display is 136DPI, the text became legible.

This is why I don't understand hating on getting DPI correct. Getting DPI correct is what makes fonts legible on high DPI screens. The alternative is to be constantly hitting Ctrl+ to try to make the fonts bigger and bigger (which is only useful on the content anyway, not the UI elements).
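The fix described here is just arithmetic over the panel dimensions the display reports via EDID. A sketch (the 1366x768 / 257x145 mm panel below is hypothetical, chosen to land near the netbook's 136 DPI):

```python
def dpi_from_edid(px_w, px_h, mm_w, mm_h):
    """Derive horizontal/vertical DPI from the pixel resolution and the
    physical size (in mm) that the display reports via EDID."""
    return (px_w / (mm_w / 25.4), px_h / (mm_h / 25.4))

def pt_to_px(points, dpi):
    return points * dpi / 72.0

# Hypothetical 1366x768 panel measuring 257x145 mm -> roughly 135 DPI.
h_dpi, v_dpi = dpi_from_edid(1366, 768, 257, 145)

# A 7pt font laid out under an assumed 96 DPI gets far fewer pixels
# than it actually needs on the real panel:
print(round(pt_to_px(7, 96), 1), "px assumed vs",
      round(pt_to_px(7, h_dpi), 1), "px needed")
```

The gap between the two numbers is exactly the "impossibly small text" effect: X sizes the glyphs for a coarser screen than the one actually attached.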

Date: 2012-07-18 03:54 pm (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
I hadn't thought of trying to adjust the DPI on my laptop but after reading your comment I went ahead and did that (since the person here who recommended NoSquint for Firefox was really solving only half my problem - I've got a tiny and nearly unusable desktop, too). Turns out my DPI was also set to 96, which kind of surprises me. So I ratcheted it up to 132, and while that solves the problem of the fonts being too small, the default MS icons don't scale at all, making the whole setup look kind of ridiculous. It's just *gah* kind of frustrating.
Edited (typos) Date: 2012-07-18 03:55 pm (UTC)

96dpi

Date: 2012-07-13 06:49 am (UTC)
From: (Anonymous)
> But other people buy high-DPI devices because they want to be able to use smaller fonts and still
> have them be legible, and they'll get annoyed if all their applications just make the UI larger to
> compensate for their increased DPI

Or get annoyed because someone decided that Xorg was at 96dpi.

Re: 96dpi

Date: 2012-07-13 07:34 am (UTC)
From: (Anonymous)
Only if you believe that because we can't get it perfect on all displays, we should give up completely. The perfect has definitely murdered the good here.

Back In The Day, Xorg detected the DPI of all my laptop and desktop monitors perfectly, set its DPI value accordingly, and everything using any modern rendering technology would scale appropriately. As a result, I ended up with usable font and display element sizes out of the box on every system I used.

Then someone decided that because DPI doesn't completely solve every UI scaling problem in the known universe, X should give up and always assume a hardcoded DPI.

Now, Xorg detects the DPI of all my laptop and desktop monitors perfectly, throws that information away, and sets a hardcoded DPI. As a result, I end up with fonts way too small on my laptop and most monitors, the right size on others, and too small on a few, until I manually set GNOME2's DPI setting, at which point a subset of the apps that used to scale properly now do so. (GNOME3, naturally, removed this setting so I can't set DPI anymore without creating an xorg.conf.)

So, because DPI doesn't solve the problem perfectly, all the cases that used to work now no longer do. In what way does this represent an improvement?

Re: 96dpi

Date: 2012-07-13 07:42 am (UTC)
From: (Anonymous)
> until I manually set GNOME2's DPI setting

...to the autodetected value read out of Xorg.0.log... :)

Re: 96dpi

Date: 2012-07-13 07:54 am (UTC)
From: (Anonymous)
But it's not set to 96, it's set to this-year's-currently-assumed-standard-dpi * gnome-preferences-scale-factor.

Maybe it would have been better to let X set it to something sensible and then invent a toolkit-agnostic method to coordinate the scale factor.

Date: 2012-07-13 07:41 am (UTC)
From: (Anonymous)
So, rather than measuring a video output's pixel density in dots-per-inch, what about measuring it in arc-seconds-per-pixel, based on the expected average position of the user?

That would handle phones, monitors, TVs, and projectors, and nicely sidesteps all the issues you mentioned in your post. Then, a 12pt font really would look the same everywhere in a much more useful sense: a 12pt font on your laptop screen at the distance you view a laptop screen will have the same size as a 12pt font on your phone screen at the distance you view your phone screen, and the same size as a 12pt font on your projector at the distance the average audience member sees the projector output.

Yes, I do realize that we can't autodetect field-of-view in all cases. X couldn't always autodetect DPI, but it still represented a good way to calibrate monitor scaling even if you had to set it manually. And some devices could autodetect field-of-view properly.

As for the issue of accessibility, that seems completely orthogonal to the issue of DPI or field-of-view. Both accessibility and personal preference on font sizes could work via a separate scaling factor that affects the actual sizes of fonts: 12pt should look like 12pt everywhere, but the user might want to explicitly prefer 9pt (because they have good vision) or 16pt (because they have poor vision).
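The arithmetic behind this proposal is the standard visual-angle formula; a sketch with assumed (illustrative, not measured) viewing distances for each device class:

```python
import math

def angular_size_arcmin(size_mm, distance_mm):
    """Visual angle subtended by an object of a given physical size at a
    given viewing distance, in arcminutes."""
    return math.degrees(2 * math.atan(size_mm / (2 * distance_mm))) * 60

# A 12pt glyph is nominally 12/72 inch = ~4.23 mm tall.
glyph_mm = 12.0 / 72 * 25.4

# Assumed typical viewing distances (illustrative only):
for device, dist_mm in (("phone", 300), ("laptop", 500), ("desktop", 700)):
    print(device, round(angular_size_arcmin(glyph_mm, dist_mm), 1), "arcmin")
```

The same physical 12pt glyph subtends a much larger angle on the close-held phone than on the desktop monitor, which is the mismatch the arc-seconds-per-pixel scheme would correct: keep the angle constant and let the physical size vary with distance.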

Arc-seconds

Date: 2012-07-13 09:55 am (UTC)
ewen: (Default)
From: [personal profile] ewen
Yes, size in arc-seconds at the assumed viewing position was what came to my mind reading the original description; it's sort of a "gamma curve for sizing". If combined with an accessibility option to allow users to choose scaling ratios from a default "normal vision" size (e.g. super vision at 75% or 50% of the size, with twice as much visible at once; "assistance required" at 120% or 150% of the size, potentially with less visible at once), it would seem to get much closer to the desired result.

Stealing a printing term (viz, points), and then using it to mean things less and less related to the original concept (by now about all you can say is: bigger numbers mean bigger text, smaller numbers mean smaller text, no direct relationship to real world sizes implied) seems rather unfortunate. The abuse of "pixel" sizing in web design standards is getting nearly as bad. Both of these due to a poor choice of units in the first place, for anything intended to be portable.

Ewen
Edited (Fixed typo) Date: 2012-07-13 10:01 am (UTC)

Date: 2012-07-14 06:49 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
I like this. It seems limited for now to how Apple handles raster graphics but that doesn't stop me from thinking it could be the next Big Thing (it would absolutely solve the problem I keep envisioning of how to get to the next level in graphics design and screen rendering - by finally giving us a good reason to use hard and fast measurements, then depending on the software to do the scaling for us in a platform-agnostic fashion).

That idea ties to the point I was making in my first comment - that there is no answer to the problem we're having with using pixel-based/DPI-scaled measurements cross-platform, and no way to make either platform-agnostic, so let's stop doing that. Wherever possible I've mindfully converted (with fonts, especially) from using pixel measurements to fluid values - percentages, ems, picas, anything that can or will scale. Because it's the only "somewhat better than pixels" answer I can come up with. But it's not the best answer there is.

Brilliant

Date: 2012-07-13 08:15 am (UTC)
From: (Anonymous)
Yes, there are situations where having a 1:1 relationship between your screen and reality really makes sense. There are even situations where having a 1:1 relationship between your finger and your screen makes sense.

But "designers" should not be allowed to dictate these choices.

I worked in print media as a systems analyst and beta tester for 20 years--the tools needed to do accurate print media work are still at the stone knife stage in Linux. We still do kill a lot of trees and paste things on them. Think of it as sequestering CO2.

Fonts do need to appear on paper in the size they purport to be.

On a screen, not so much, unless that screen is being used to produce paper or other real world objects. Then accuracy in scaling is quite handy. As you point out it is not, and should not be, a universal requirement.

The UI problem is caused by "designers" whose primary qualifications appear to be umbrage and arrogance these days. They design the perfect UI for themselves and their kitten without consulting anyone at all. The end product is in fact a UI--an Unusable Interface. See Gnome-3 for cellphones and kittens to get the full modern UI experience.

I tried it for a month. Being able to pick my nose while I admired the big shiny icons that turned my efficient multitasking platform into an appliance for cretins just didn't outweigh being able to use my computer productively.

I'm sure there are kittens who think Gnome-3 is simply fabulous, but to most human beings who use computers that don't have fruit icons on them and strictly require a one finger-one click interface the end result is a productivity failure of immense proportions.

I did a quick survey of mad scientists at Los Alamos National Labs and they assured me that Gnome-3 was in fact a very successful marketing effort by the KDE team. I am writing this from one of my very cool and easily customized KDE desktops.

The "designers" tell us we just don't understand sophisticated fingerpainting and the efficiency of clicking three times to do what we used to do literally at the press of a single button. When we tell them we do not work 20 hours a day on a cellphone updating tweets and self-promotional FaceBook pages with our trusty kitten at our side while drinking yummy espresso they tell us we are in a tiny minority who just don't appreciate good design when we see it.

The egocentric need to protect the design itself from being modified by people, you know who I mean, to fit their purposes and needs is central to the UI design process these days.

The users are entirely left out of the design process as the "designers" work to make something cool and edgy and different that will look good in a portfolio rather than producing a useful and boring old UI people are familiar with. We all love additional learning curves heaped on our plates by "designers."

The "designers" obsess about making something like the Gnome-3 desktop fingerprint-ready and ignore the people who use their computers in productive ways rather than as cool fruit shaped nightlights. Kittens around the world rejoiced when they could run their supercomputing clusters using only their noses while still playing with their balls of yarn--I know I did.

The problem you speak of so lucidly is more often than not the result of untreated OCD and control issues rather than any legitimate design factor. It is a psychological problem with designers that Freud would say has something to do with domineering mothers and nothing at all to do with valid design considerations.

It is MY design and don't you dare change any of it you, you, you... users. I want MY design to look the same way everywhere. It is all about ME! ME! ME! and MY design! If they designed shoes they would all be the same size, style, and color they wear themselves.

If all fonts are not 12 pts all of the time I will get on my pony and take my kitten and my computer shaped night light and all of my edgy coolness and leave! So there!

Would that we could drive them to that point at Gnome.

Devices should be flexible and UI's should be boring and transparent. No, I don't mean rife with translucent panels and gels! Don't go there.

The Gnome team should try designing some amazing shoes or fabulous fall fashions for Lady Gaga and get out of the UI and waterboarding business. At the very least having to go to Milan to foist their coolness on the world would get them out of their mom's basement from time to time.

Perhaps we should all ponder pixels, cubits and other arbitrary and hardware specific units of measure that cause more trouble than they are worth...

Again thanks for the great analysis of why all of those good ideas about fonts don't hold up to the light of day.

If any of the Gnome folks show up I've found that alcohol is better than gasoline for dealing with them. There is less environmental impact and it doesn't leave those telltale spots on the concrete after the fire goes out.

Re: Brilliant

Date: 2012-07-14 05:29 am (UTC)
From: (Anonymous)
wow… rant much?

you seem to be conflating a few unrelated things here – like the point of the article and your utter distaste for Gnome3's UI.

I haven't heard the Gnome3 UI/UX designers clamoring for "12pt needs to be 12pt everywhere". And to provide a counterpoint, I'm a happy Gnome 3 user on both a small laptop screen and dual screen desktop.

More on topic: perfect resolution independence seems indeed very tempting, and at one point I was in this camp too. But I realised that what I really wanted is a scale factor for the entire UI, not necessarily matching inches or centimeters on screen with the real world.

Point is that we have wildly different resolutions for approximately the same size class of screens (from 1024x600 up to 2880x1800 for laptops, for example – I'm counting phones, 27"/30" monitors, TVs and projectors in a different class), and it would be nice if we had better/less crude tools than pixel doubling to scale our UIs.

Re: Brilliant

Date: 2012-07-14 07:03 am (UTC)
mm_writes: Sheep go to heaven, goats go to hell (Default)
From: [personal profile] mm_writes
you seem to be conflating a few unrelated things here – like the point of the article and your utter distaste for Gnome3's UI

Don't forget his distaste for "designers" whom he thinks are a rather specious lot. Us designers spoiling his boringly transparent UI party? No way.

I'm sure no one wants (or should want) 12pt to render (look exactly the same) on both a smartphone and a huge wide screen. Most designers might want something much more rational (that should be attainable but still is not - not without hacking or adding special, platform-specific instructions): text that's perfectly readable on any size screen without distorting it too far from its original scale - I mean, duh.
Edited (typos) Date: 2012-07-14 07:10 am (UTC)

What is the point of points then?

Date: 2012-07-13 08:17 am (UTC)
From: (Anonymous)
Point is 1/72", period. If you think that physical dimensions are not suitable for UI and web design use pixels or degrees or whatever. On the other hand, fixing this *regression* in X server is a matter of one-line combination of xrandr and sed in .xsession, no need for lengthy flame wars ;)

zimous

Re: What is the point of points then?

Date: 2012-07-13 08:42 am (UTC)
From: (Anonymous)
The beauty of using points or ems on computer systems is that the measure is abstracted from the constraints of the hardware. So a point isn't just 1/72 of an inch, period. It is a means of ensuring consistency across myriad platforms.

For the designer who wants consistent presentation across a diverse range of devices the use of traditional font measures provides a far more reliable mechanism than using pixels or percentages in underlying style sheets.

At the very least it prevents us from having to rely on Microsoft, Webkit, or Gecko to do math and get it right.

Re: What is the point of points then?

Date: 2012-07-13 10:22 am (UTC)
From: (Anonymous)
No, 1 pt is roughly 1/72" (well, today it might be exactly, but originally it was a different unit). We can "overload" the unit to mean something else, but saying a text size is 12pt is the same as saying it is 1/6 of an inch.

Now, it is my opinion that operating systems should offer reasonable font sizes to their users, and 12pt is probably good for desktops (hum, how about using webcam to estimate the distance of the person's head [using the average human face width :]), whereas something else is good for phones and projectors.

All applications should use relative font sizes for everything else that's not in the default font size.

Gtk+ applications scale beautifully on the desktop (imho, at least, though I am sure it's easy to find counter-examples) with differing font sizes (and text lengths, very common with localization), but they do not scale well to smaller devices (projectors, otoh, which are just enlarged desktops, are usually fine): it seems to work well with a certain minimum size.

- Danilo

Re: What is the point of points then?

Date: 2012-07-13 10:08 pm (UTC)
From: [identity profile] thub.myopenid.com
I'm definitely in the "points should measure physical size" camp, but that may be because I have a desktop publishing background and when HTML started using point sizes I found it baffling and still do. I'm perfectly happy to have 12 pt text look the same size on my 10" netbook as it does on my 22" desktop monitor. However, when I plug my netbook (with old Fedora 12 and Gnome 2) into my 720P, 37" TV, my previously legible text is now a 3 pixel high smudge. Clearly a better experience can be had here.

I think all text should be one of two things (unless I'm missing some important use case that doesn't fit with either):
a) Relative to an arbitrary size the individual user finds comfortable and usable or
b) relative to actual point sizes with a user-controlled scaling factor, such as when working with a document for printing.

In the first case, as developers, we (you/they/me/whoever) could either ask the user to select a comfortable text size or use some magical set of rules to choose a reasonable default text size. I imagine the rules would be based on things like the size and pixel density relative to some common font type (e.g. 70% of the base size should be at least eight pixels high, ensuring some legibility for many fonts at small size; 500% of base size should allow a reasonable amount of text across the overall screen width; etc.). The user should also be able to set this base size to the equivalent of, say, 12 pt real-life print-on-paper text. This base size would be the reference against which all other text is scaled. Perhaps a percentage could be used for this scaling, as is done in HTML and CSS.

In the second case, it could be assumed that if the developer or designer is specifying a point size, they are implying that the text is a reference to printed text, and the text would default to being rendered at its accurate physical point size based on pixel density, subject to application- or document-specific user controls for scaling, such as viewing a word processor document at 150% or zooming to the width of the page and scaling the text as one would expect.

I realise what I'm really asking for is for all desktop and mobile platforms to abandon their current standards and conform to the product of my brilliance :-) but really I'm just whining about the misappropriation of a unit of measurement.

Some unrealistic demands, just for fun:
1) Hardware manufacturers, shape up and provide good quality EDID information in all your monitors, TVs, and other displays.
2) Platform developers, use that EDID data and a healthy dose of prescience to determine ideal default text sizes algorithmically and provide APIs based around my brilliant scheme detailed above.
3) Designers, stop being such control freaks and use relative text sizes wherever possible to take advantage of the APIs in demand #2.

Whew! Being right all the time takes a lot of typing! :-)
- Thub

Re: What is the point of points then?

Date: 2012-07-14 10:21 am (UTC)
From: [personal profile] jjsarton
I also think that 1 pt, one inch, one cm, etc. should have the intended physical size.
Your unrealistic demands were made unrealistic by people and companies that ignore everyone else: the web developers who insist that everything must be expressed in pixels; Microsoft, ...; Apple, which decided that a new "inch" has a size of 72/96 on the display in order to render broken web pages correctly.
1) The OS can read the display's EDID and guess the type of display from the vertical and horizontal size, but this can't be perfect. The type of display, and accordingly the viewing distance, may be inferred from the screen width/height as reported by EDID, but what is the width/height beyond which the device should be considered a TV (long viewing distance) rather than a tablet (short distance)?
2) This can be handled by X, which can report the physical size of the display with an accuracy of +/- 5 mm if the EDID is taken into account.
3) Not only text sizes but also picture sizes.

Another point which has to be taken into account, and which is always ignored, is the width of a text area. For good readability the width should be no larger than approximately 20 cm (~8 inches) at a typical viewing distance of ~60 cm (~24 inches) on a computer display.
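A sketch of the EDID arithmetic being described, in Python. The physical-size-in-millimetres fields are real EDID data, but the device-class thresholds are invented purely to illustrate the open question above.

```python
def dpi_from_edid(h_px, v_px, width_mm, height_mm):
    # EDID reports the panel's physical size in millimetres; combined
    # with the current mode's pixel counts this gives the true DPI.
    h_dpi = h_px / (width_mm / 25.4)
    v_dpi = v_px / (height_mm / 25.4)
    return h_dpi, v_dpi

def guess_device_class(width_mm):
    # Hypothetical thresholds: at what physical width does a
    # short-distance device (tablet) become a long-distance one (TV)?
    if width_mm < 300:
        return "tablet"
    if width_mm < 700:
        return "monitor"
    return "tv"
```

For a typical 23" 1920x1080 monitor (about 509 mm x 286 mm) this comes out very close to 96 DPI, which is part of why the fixed-DPI assumption usually looks roughly right on the desktop.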

Re: What is the point of points then?

Date: 2012-07-13 10:37 am (UTC)
From: (Anonymous)
1. Web designers create non-scalable designs and use 16px font.
2. "Hey guys, don't use pixels, it's device dependent, which is bad. Use points."
3. Web designers create non-scalable designs and use 12pt font.
4. Web browsers on Linux mess up webpages.
5. "Hey guys, we need to standardize point on Linux, let's make it equal to 96/72 pixels. M$ does it that way and look at their market share!"

Sorry for the sarcasm, but it seems so braindead to me. A fixed DPI is just the lesser evil given the current state of things. Technically it is a regression.

z.

Re: What is the point of points then?

Date: 2012-07-14 11:22 am (UTC)
From: [personal profile] jjsarton
1. They mostly use 12 px, sometimes also 10 px.
2. Actually a point is then the same as a pixel, only a little bit larger (1.333 px).
3. Non-scalable, but with a 12 px Arial/Helvetica font, which makes the text less readable.
4. ?
5. No: a physical size is a physical size, not a size dependent on the display's DPI.

No. Tell the developers that they are doing it wrong.

Thank you

Date: 2012-07-13 08:26 am (UTC)
From: (Anonymous)
Thank you 1000 times for writing this topic up so nicely. The problem seems so incredibly obvious the way you wrote it down, yet many people (including me) have failed to express this observation properly.

Now the big question is: Will your initial target audience finally get this?

Yes, but...

Date: 2012-07-13 11:01 am (UTC)
From: [personal profile] nacho
Nice post, but I personally disagree. Since everyone in the blogosphere seems to be against "accurate" DPI handling ;-), I thought I'd write up a couple of reasons.

As a user, I do expect to be able to hold up a piece of paper and have it match what I see perfectly. I've needed this numerous times to make replicas of door signs, letterheads, etc. I've got used to the fact that I have to set the zoom level to ~90% on Windows and Linux, and to 130% on Mac. That's ridiculous, and as a user I want it fixed. I don't really care how it is implemented.
  • Maybe you can redefine 130% -> 100% in the word processor.
  • Maybe you can just declare that the whole UI is shrunk by 77% from "paper size" per default.

As someone interested in design and typography, it makes me cringe when I see someone set their brand new LCD to a non-native resolution because their "fonts are too small". I also see people choose poor font sizes, because what they see on screen is different from what they get on paper. When I try to help and change the font size on screen, I've seen adults throw a fit because I "changed the perfectly good fonts on screen" rather than the "wrong" printout.

As a developer, it's frustrating to have no control over pixels and display sizes. Sometimes I want real pixels, sometimes I want "virtual pixels" (like MacOS HiDPI or in mobile browsers). Sometimes I want "old-fashioned pt" (such that 12 pt on Mac == 9 pt on Linux), sometimes I want real-world pt (such that I can hold up a piece of paper, unless the user has zoomed or is using accessibility features).

As a consumer, I'm angry that it is 2012 and we still don't have high resolution displays (>> 200 dpi). The main reason is that there is no software support; it's a chicken-and-egg problem. I think this is the single most important argument for proper DPI handling.

Btw, I think people often purposely misunderstand arguments in this discussion (a la build up the straw man, tear down the straw man). For example, nobody wants 12 pt to be the same size on projectors, cell phones and displays. That's crazy. But I do want to be able to put two monitors of different DPI next to each other, drag a window over, and not have the physical size of everything jump.

I think the way forward would be to:
  • Have all new programs use units. Distinguish between "traditional units" and "physical units".
  • Have the OS acknowledge the screen DPI. Leave an option to override it (the infamous on-screen ruler). Warn the user not to change it just to make things larger. (This changes physical units.)
  • Have a "scale factor for tradition's sake" that makes 12 pt large on Linux, and normal on Mac (for "pretend 72, 96dpi"), so users get the pt size they expect. Use this also as a screen-wide zoom, and for accessibility. (This changes traditional units.)
  • Make themes render everything smoothly with vectors, except 1 or 2 px sharp highlights.
  • Provide good tools to allow people to test their applications.
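A sketch of the two unit systems being proposed; the function names and the 96 DPI "traditional" assumption are mine, for illustration only.

```python
def physical_pt_to_px(pt, real_dpi):
    # Physical points: 72 pt per inch, mapped through the display's
    # true DPI (e.g. from EDID), so 12 pt measures the same on a ruler
    # held against any screen.
    return pt * real_dpi / 72

def traditional_pt_to_px(pt, assumed_dpi=96):
    # Traditional points: rendered as if every display were 96 DPI,
    # so 12 pt is always 16 px regardless of the actual panel.
    return pt * assumed_dpi / 72
```

On a 96 DPI panel the two agree; on a 200 DPI panel a physical 12 pt becomes about 33 px while the traditional one stays at 16 px, which is exactly the size jump in the window-dragging example above.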

Something similar has already been implemented once for Gtk (but not included), so I think it shouldn't be too hard to do.
Edited Date: 2012-07-13 11:04 am (UTC)

Re: Yes, but...

Date: 2012-07-16 11:55 am (UTC)
From: (Anonymous)
No. We are digital. Everything should be in pixels. If you are developing a program that expects its output to go to a printed page, then include a calibration utility with it that allows your user to hold a page up to the screen and calibrate the 'as on paper' scale factor.

Forget about DPI *completely*; leave that to the print driver.

Screens are in pixels. All sizes for things to be displayed on screen must be in pixels. A laptop screen at 1080p is equal to a 100-foot projector screen at 1080p. Anything else is just madness and must be killed with fire. As a convenience, web designers might use 'em' to describe sizes of non-textual elements when those must line up with textual elements, and that would be translated to pixels depending on the width of the 'm' letter in the font type and size used.

Re: Yes, but...

Date: 2012-07-16 05:24 pm (UTC)
From: [personal profile] nacho
This is 2012, and we shouldn't be bound by the pixel grid anymore. With so many devices with different resolutions and sizes, it's just an artifact that shouldn't matter to the user or the developer (unless he or she is designing UI themes, icons, or working with low-res screens).

Computer interfaces nowadays use a broken, leaky abstraction. We have objects like windows, buttons, and so on, but they don't always seem to be "real" ("real" as in genuine 2D objects in their own right, not just as skeuomorphic copies of physical things). We see the technology beneath leaking through. Two reasons for this are:
1) Computers are not aware of the physical display size, so screen elements change physical size willy-nilly between comparable devices (like two different PCs).
2) High resolution hasn't spread out much yet.

Touch technology, "Retina" displays and the ubiquity of small computing devices have gone a long way toward rectifying this, but we're not quite there yet. That's one reason I think we need DPI awareness.

Imagine what would be possible with a fully dpi aware, vector based system (crazy ideas):
- A document on the screen could look exactly like it does on paper (except for the glow, and with less detail).
- You could use a tablet as an extension to your PC, in a whole new way. Objects on both would have the same physical size, and you could move them back and forth, or between two screens.
- You'd still have font hinting and pixel-snapping so stuff looks crisp on legacy devices.
- High-DPI screens would suddenly be viable. You buy a new screen and things don't get smaller, but prettier. Imagine UI elements with very fine structured backgrounds, like leather or fine paper (or just absolutely fine and pretty gradients if you're not into skeuomorphism).
- You could zoom in as much as you like, without things getting ugly. Great for people with vision problems.

If anything should be killed with fire, then it is the pixel.
Edited Date: 2012-07-16 05:25 pm (UTC)

Date: 2012-07-13 02:12 pm (UTC)
From: (Anonymous)
http://dev.w3.org/csswg/css3-values/#reference-pixel

You are both right and wrong.

Date: 2012-07-13 03:08 pm (UTC)
From: (Anonymous)
When you talk about resolution independent interfaces you assume they are meant to enable UI elements to have identical sizes across devices. That's not the only reason they are so important. They are important because they enable elements to have *any* size. And *any* solves *all* your problems.

From the moment you have a resolution independent UI, you can set element sizes to be different depending on form factor (viewing distance). Now, in the same form factor, you may have screens that have high DPI and screens that don't. But because interface elements are independent of both resolution and pixel density, users can simply adjust to their liking. It's really that simple.

Re: You are both right and wrong.

Date: 2012-07-13 05:50 pm (UTC)
From: (Anonymous)
It's not. Any UI can be hostile to text size; there's no correlation with the UI itself being resolution-independent, and there's no intrinsic reason why it should "work badly" with respect to supporting different form factors. Quite the opposite, actually.

Re: You are both right and wrong.

Date: 2012-07-14 07:30 am (UTC)
From: [personal profile] jjsarton
What do you mean by "resolution-independent UI that always has the same association between text size and UI element size"?
The text may be "Save" (English) or "Enregistrez" (French). If the UI element has a size expressed in pixels, the text may be cut off. This is only indirectly an accessibility issue.

Re: You are both right and wrong.

Date: 2012-07-16 05:35 pm (UTC)
From: [personal profile] nacho
Right, resolution independence doesn't solve the problem of having to design different interfaces for different device classes and form factors. There's no magic bullet for that, and that's why I'm skeptical about Microsoft's Metro (using almost the same interface for 3" phones and 23" PCs).

There are at least three (almost orthogonal) problems here:
- resolution independence (better: "awareness")
- designing the proper interface for a certain device class
- making UI elements scale with text (because we're not using VB6 anymore, and because German words are really long), but not necessarily fixed to the text (bad for the reasons you mentioned)

Re: You are both right and wrong.

Date: 2012-07-17 09:36 pm (UTC)
maco: pink sakura (Default)
From: [personal profile] maco
How is it hostile to accessibility? The thing I see as hostile to accessibility is when I hit ctrl+ to get a larger font size and suddenly the end of the text gets cut off because the div was sized in px (hardcoded to the display) instead of em (variable, based on the text).

equate "pt" with "viewing angle"...

Date: 2012-07-17 08:12 am (UTC)
From: (Anonymous)
... and you are a good step closer to a sane solution. 1 pt = 2 arcminutes usually works quite well.
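A quick check of that rule of thumb in Python; the ~60 cm figure is just a typical desktop viewing distance, not part of the rule itself.

```python
import math

def size_at_distance_mm(distance_mm, arcminutes=2.0):
    # Physical size subtended by a given visual angle at a distance.
    return distance_mm * math.tan(math.radians(arcminutes / 60))

# A real point is 1/72 inch = ~0.353 mm. At a ~60 cm desktop viewing
# distance, 2 arcminutes subtends ~0.349 mm, so "1 pt = 2 arcminutes"
# reproduces a physical point almost exactly at the desk, while
# automatically growing it for TVs and shrinking it for phones.
```

In other words, defining the point as a visual angle rather than a paper length keeps apparent text size constant across form factors, which is the sanity the comment is pointing at.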

Profile

Matthew Garrett

About Matthew

Power management, mobile and firmware developer on Linux. Security developer at Nebula. Member of the Linux Foundation Technical Advisory Board. Ex-biologist. @mjg59 on Twitter. Content here should not be interpreted as the opinion of my employer.
