Nice post, but I personally disagree. Since everyone in the blogosphere seems to be against "accurate" DPI handling ;-), I thought I'd write up a couple of reasons.
As a user, I do expect to be able to hold up a piece of paper and have it match what I see on screen exactly. I've needed this numerous times to make replicas of door signs, letterheads, etc. I've got used to the fact that I have to set the zoom level to ~90% on Windows and Linux, and to ~130% on Mac. That's ridiculous, and as a user I want it fixed. I don't really care how it is implemented.
Maybe you can redefine 130% -> 100% in the word processor.
Maybe you can just declare that the whole UI is shrunk to 77% of "paper size" by default.
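For what it's worth, those zoom factors fall out of simple arithmetic: to make on-screen sizes match paper, the zoom must be the screen's physical DPI divided by the DPI the software assumes. A minimal sketch (the 86 and 94 dpi panel values are my illustrative assumptions, not measurements):

```python
def paper_match_zoom(physical_dpi, assumed_dpi):
    """Zoom factor needed so 1 inch in the document is 1 inch on screen."""
    return physical_dpi / assumed_dpi

# A ~86 dpi panel with software assuming 96 dpi (typical Windows/Linux):
print(round(paper_match_zoom(86, 96) * 100))   # 90 (%)
# A ~94 dpi panel with software assuming 72 dpi (traditional Mac):
print(round(paper_match_zoom(94, 72) * 100))   # 131 (%)
```

And "shrunk to 77% of paper size" is just the inverse of the ~130% zoom: 100 / 130 ≈ 0.77.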
As someone interested in design and typography, it makes me cringe when I see someone set their brand-new LCD to a non-native resolution because their "fonts are too small". I also see people choose poor font sizes, because what they see on screen differs from what they get on paper. When I try to help by changing the font size on screen, I've seen adults throw a fit because I "changed the perfectly good fonts on screen" instead of fixing the "wrong" printout.
As a developer, it's frustrating to have no control over pixels and display sizes. Sometimes I want real pixels; sometimes I want "virtual pixels" (like MacOS HiDPI or in mobile browsers). Sometimes I want "old-fashioned pt" (such that 12 pt on Mac == 9 pt on Linux); sometimes I want real-world pt (such that I can hold up a piece of paper to compare, unless the user has zoomed or is using accessibility features).
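To make the distinction concrete, here is a hypothetical sketch of how those four unit models could resolve to device pixels. The function name and the 72/96 dpi constants are my assumptions, not any existing API:

```python
def to_device_pixels(value, unit, physical_dpi, hidpi_scale=1.0, user_zoom=1.0):
    """Convert a length to device pixels under one of four unit models.

    'px'          - real device pixels, untouched
    'vpx'         - virtual pixels, multiplied by the HiDPI scale factor
    'pt_legacy'   - traditional points, rendered as if the screen were 96 dpi
    'pt_physical' - real-world points (1/72 inch), honouring the user's zoom
    """
    if unit == "px":
        return value
    if unit == "vpx":
        return value * hidpi_scale
    if unit == "pt_legacy":
        return value * 96 / 72          # 12 pt -> 16 px regardless of screen
    if unit == "pt_physical":
        return value / 72 * physical_dpi * user_zoom
    raise ValueError(unit)
```

For example, legacy 12 pt is always 16 px, while real-world 12 pt on a 144 dpi screen comes out at 24 px.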
As a consumer, I'm angry that it is 2012 and we still don't have high-resolution displays (>> 200 dpi). The main reason is that there is no software support; it's a chicken-and-egg problem. I think this is the single most important argument for proper DPI handling.
By the way, I think people often purposely misunderstand arguments in this discussion (à la build up the straw man, tear down the straw man). For example, nobody wants 12 pt to be the same size on projectors, cell phones and displays. That's crazy. But I want to put two monitors with different DPI next to each other, drag a window over, and not have the physical size of everything jump.
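That multi-monitor expectation reduces to rescaling a window's pixel dimensions by the DPI ratio when it crosses monitors, so the physical size stays put. A sketch under that assumption:

```python
def rescale_window(width_px, height_px, old_dpi, new_dpi):
    """Rescale a window's pixel size so its physical size stays constant
    when it is dragged from an old_dpi monitor to a new_dpi one."""
    factor = new_dpi / old_dpi
    return round(width_px * factor), round(height_px * factor)

# An 800x600 window on a 96 dpi monitor keeps its physical size on a
# 192 dpi monitor by doubling its pixel dimensions:
print(rescale_window(800, 600, 96, 192))  # (1600, 1200)
```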
I think the way forward would be to:
Have all new programs use units. Distinguish between "traditional units" and "physical units".
Have the OS acknowledge the screen DPI. Leave an option to override it (the infamous on-screen ruler). Warn the user not to change it just to make things larger. (This changes physical units.)
Have a "scale factor for tradition's sake" that makes 12 pt large on Linux, and normal on Mac (for "pretend 72/96 dpi"), so users get the pt size they expect. Use this also as a screen-wide zoom, and for accessibility. (This changes traditional units.)
Make themes render everything smoothly with vectors, except for 1 or 2 px sharp highlights.
Provide good tools to allow people to test their applications.
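The first few steps above could be sketched roughly like this. All names are hypothetical, and I'm assuming a 96 dpi reference screen for the tradition factor (so a Mac-style factor is 72/96 and a Linux-style one 96/96):

```python
class Screen:
    """Hypothetical OS-side screen model: acknowledged DPI with an
    override, plus a tradition/zoom/accessibility scale factor."""

    def __init__(self, measured_dpi, dpi_override=None, tradition_scale=1.0):
        # The OS acknowledges the measured DPI, but the user may override
        # it (the infamous on-screen ruler). This changes physical units.
        self.dpi = dpi_override if dpi_override is not None else measured_dpi
        # The "scale factor for tradition's sake", doubling as screen-wide
        # zoom and accessibility scaling. This changes traditional units.
        self.tradition_scale = tradition_scale

    def physical_pt_to_px(self, pt):
        # Physical units: a true 1/72 inch point on this screen.
        return pt / 72 * self.dpi

    def traditional_pt_to_px(self, pt):
        # Traditional units: a physical point scaled by the tradition factor.
        return self.physical_pt_to_px(pt) * self.tradition_scale

# On a 96 dpi screen, Mac-style tradition (72/96) gives the 12 px that
# "pretend 72 dpi" software always produced for 12 pt:
mac_style = Screen(measured_dpi=96, tradition_scale=72 / 96)
print(mac_style.traditional_pt_to_px(12))    # 12.0
# Linux-style tradition (96/96) keeps 12 pt at its full physical size:
linux_style = Screen(measured_dpi=96, tradition_scale=96 / 96)
print(linux_style.traditional_pt_to_px(12))  # 16.0
```

Note how the same 12 pt request lands at 12 px or 16 px depending on the tradition factor, matching the "12 pt on Mac == 9 pt on Linux" difference mentioned earlier.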
Something similar has already been implemented once for Gtk (but not included), so I think it shouldn't be too hard to do.
Yes, but...
Date: 2012-07-13 11:01 am (UTC)