[personal profile] mjg59
A discussion a couple of days ago about DPI detection (which is best summarised by this and this, and I am not having this discussion again) made me remember a chain of other awful things about consumer displays and EDID and there not being enough gin in the world, and reading various bits of the internet and Wikipedia seemed to indicate that almost everybody who's written about this has issues with either (a) technology or (b) English, so I might as well write something.

The first problem is unique (I hope) to 720p LCD TVs. 720p is an HD broadcast standard that's defined as having a resolution of 1280x720. A 720p TV is able to display that image without any downscaling. So, naively, you'd expect them to have 1280x720 displays. Now obviously I wouldn't bother mentioning this unless there was some kind of hilarious insanity involved, so you'll be entirely unsurprised when I tell you that most actually have 1366x768 displays. So your 720p content has to be upscaled to fill the screen anyway, but given that you'd have to do the same for displaying 720p content on a 1920x1080 device this isn't the worst thing ever in the world. No, it's more subtle than that.
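Just to make the pain concrete, here's a quick sketch of the scale factors involved (my own arithmetic, not anything from a spec) - anything non-integer means every frame gets resampled and blurred:

```python
# Scale factors needed to fill common panel resolutions with a
# 1280x720 broadcast signal. 1366x768 is a non-integer scale in
# both dimensions, so 720p content always gets resampled.
panels = [(1280, 720), (1366, 768), (1920, 1080)]
for w, h in panels:
    print(f"{w}x{h}: {w / 1280:.4f} x {h / 720:.4f}")
```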

EDID is a standard for a blob of data that allows a display device to express its capabilities to a video source in order to ensure that an appropriate mode is negotiated. It allows resolutions to be expressed in a bunch of ways - you can set a bunch of bits to indicate which standard modes you support (1366x768 is not one of these standard modes), you can express the standard timing resolution (the horizontal resolution divided by 8, followed by an aspect ratio) and you can express a detailed timing block (a full description of a supported resolution).

1366/8 = 170.75. Hm.

Ok, so 1366x768 can't be expressed in the standard timing resolution block. The closest you can provide for the horizontal resolution is either 1360 or 1368. You also can't supply a vertical resolution - all you can do is say that it's a 16:9 mode. For 1360, that ends up being 765. For 1368, that ends up being 769.
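To see why, here's a Python sketch of the standard timing encoding as I understand it from the EDID spec (the stored byte is the horizontal resolution divided by 8, minus 31, which is why the resolution has to be a multiple of 8; the vertical resolution is never stored, only derived from the aspect ratio):

```python
def encode_standard_timing_h(hres):
    """EDID standard timings store (hres / 8) - 31 in a single byte,
    so hres must be a multiple of 8 (and between 256 and 2288)."""
    if hres % 8:
        raise ValueError(f"{hres} is not a multiple of 8")
    byte = hres // 8 - 31
    if not 0 <= byte <= 255:
        raise ValueError(f"{hres} out of range")
    return byte

def vertical_for_16_9(hres):
    # No vertical resolution is stored, only an aspect ratio flag;
    # for a 16:9 mode the sink derives vres = hres * 9 / 16.
    return hres * 9 // 16

try:
    encode_standard_timing_h(1366)
except ValueError as e:
    print(e)                                   # 1366 is not a multiple of 8

for hres in (1360, 1368):
    print(hres, "x", vertical_for_16_9(hres))  # 1360 x 765, 1368 x 769
```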

It's ok, though, because you can just put this in the detailed timing block, except it turns out that basically no TVs do, probably because the people making them are the ones who've taken all the gin.

So what we end up with is a bunch of hardware that people assume is 1280x720 but is actually 1366x768, except it's telling your computer that it's either 1360x765 or 1368x769. And you're probably running an OS that does sub-pixel anti-aliasing, which requires being able to address the physical pixels directly - obviously difficult if your computer thinks the screen is one size and it's actually another. Thankfully Linux takes care of you here, and this code makes everything ok. Phew, eh?
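For illustration, the quirk amounts to something like this - a sketch of the idea only, not the actual kernel code:

```python
def fix_suspicious_mode(hres, vres):
    """Sketch of the sort of EDID quirk Linux applies: modes that are
    almost certainly a mangled 1366x768 (the two resolutions that the
    standard timing encoding can actually express) get snapped back
    to the panel's real resolution. Illustrative only."""
    if (hres, vres) in ((1360, 765), (1368, 769)):
        return (1366, 768)
    return (hres, vres)

print(fix_suspicious_mode(1360, 765))    # (1366, 768)
print(fix_suspicious_mode(1920, 1080))   # (1920, 1080)
```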

But ha ha, no, it's worse than that. And the rest applies to 1080p ones as well.

Back in the old days when TV signals were analogue and got turned into a picture by a bunch of magnets waving a beam of electrons about all over the place, it was impossible to guarantee that all TV sets were adjusted correctly and so you couldn't assume that the edges of a picture would actually be visible to the viewer. In order to put text on screen without risking bits of it being lost, you had to steer clear of the edges. Over time this became roughly standardised and the areas of the signal that weren't expected to be displayed were called overscan. Now, of course, we're in a mostly digital world and such things can be ignored, except that when digital TVs first appeared they were mostly used to watch analogue signals so still needed to overscan because otherwise you'd have the titles floating weirdly in the middle of the screen rather than towards the edges, and so because it's never possible to kill technology that's escaped into the wild we're stuck with it.

tl;dr - Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.

So you plug your computer into a TV and even though you know what the resolution really is you still don't get to address the individual pixels. Even worse, the edges of your screen are missing.

The best thing about overscan is that it's not rigorously standardised - different broadcast bodies have different recommendations, but you're then still at the mercy of what your TV vendor decided to implement. So what usually happens is that graphics vendors have some way in their drivers to compensate for overscan, which involves you manually setting the degree of overscan that your TV provides. This works very simply - you take your 1920x1080 framebuffer and draw different sized black borders until the edge of your desktop lines up with the edge of your TV. The best bit about this is that while you're still scanning out a 1920x1080 mode, your desktop has now shrunk to something more like 1728x972 and your TV is then scaling it back up to 1920x1080. Once again, you lose.
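A minimal sketch of that arithmetic (the 5% per side here is just a plausible example - as noted above, the actual amount varies by vendor):

```python
def compensated_desktop(width, height, overscan_per_side):
    """Size of the usable desktop left over after drawing black
    borders of overscan_per_side (a fraction of each dimension)
    on every edge of the framebuffer. You still scan out the full
    mode; the TV then scales this smaller picture back up."""
    return (round(width * (1 - 2 * overscan_per_side)),
            round(height * (1 - 2 * overscan_per_side)))

# 5% per side eats 10% of each dimension:
print(compensated_desktop(1920, 1080, 0.05))  # (1728, 972)
```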

The HDMI spec actually defines an extension block for EDID that indicates whether the display will overscan or not, but doesn't provide any way to work out how much it'll overscan. We haven't seen many of those in the wild. It's also possible to send an HDMI information frame that indicates whether the video source expects to be overscanned, but (a) we don't do that and (b) it'd probably be ignored even if we did, because who ever tests this stuff. The HDMI spec also says that the default behaviour for 1920x1080 (but not 1366x768) should be to assume overscan. Charming.
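For what it's worth, checking that flag looks something like this - a sketch based on my reading of the CEA-861 extension layout (tag byte 0x02, with bit 7 of byte 3 meaning "the sink supports underscan"); don't treat it as authoritative:

```python
def cea_underscan_flag(ext_block):
    """Return the 'underscan supported' bit from a CEA-861 EDID
    extension block, or None if this isn't a CEA extension.
    Layout assumed here: byte 0 is the tag (0x02 for CEA), and
    bit 7 of byte 3 advertises underscan support. Even when set,
    nothing tells you *how much* the display overscans."""
    if len(ext_block) < 4 or ext_block[0] != 0x02:
        return None
    return bool(ext_block[3] & 0x80)

# A made-up 4-byte prefix of an extension block, for illustration:
print(cea_underscan_flag(bytes([0x02, 0x03, 0x04, 0x80])))  # True
```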

The best thing about all of this is that the same TV will often have different behaviour depending on whether you connect via DVI or HDMI, but some TVs will still overscan DVI. Some TVs have options in the menu to disable overscan and others don't. Some monitors will overscan if you feed them an HD resolution over HDMI, so if you have HD content and don't want to lose the edges then your hardware needs to scale it down and let the display scale it back up again. It's all awful. I recommend you drink until everything's already blurry and then none of this will matter.

Wrong decision in 1990, not 1930

Date: 2012-01-04 01:32 pm (UTC)
From: [identity profile] ben.bucksch.myopenid.com
The wrong decision was not made in 1930 - back then, there wasn't much choice.

It was made in 1990, when the transition to digital happened. Those guys made the stupid decision to carry the old limitations over instead of getting rid of them, even though they were no longer needed.

The right approach would have been:

  • DVI and HDMI are defined to never have overscan. (Most likely, they drive digital displays which simply don't need it.)
  • Digital video material, e.g. DVD, mpg, avi, does not contain overscan. That is already the case, if you look at DVD rips or DVB (TV) recordings. We're good here.
  • If there is odd ancient video material which does contain overscan, the video player software or analog->digital converter software has to remove it. Configurable.
  • If you play digital content on an analog TV (mostly relevant in the 1990s, not anymore), the analog output (e.g. the Composite or VGA output of the graphics card) adds the overscan, as necessary for your TV. As it happens, graphics cards already do that for VGA monitors, so the existing system would simply be continued.


But no, this was probably all done by Hollywood guys who are a bit blurry in their mind, and we have to suffer from the fallout.

Lesson learned: If you design a new system, do not allow old limitations to creep into it; go head-against-wall to remove them. Usually there's a border between the old and new systems where you can strip out these old oddities. Don't simply import the old stuff 1:1 and then deal with it somewhere else in the new system - deal with it directly at the border.

Re: Wrong decision in 1990, not 1930

Date: 2012-01-04 03:56 pm (UTC)
ext_267968: bjh (Default)
From: [identity profile] bjh21.me.uk
Unfortunately, your fourth suggestion there is impossible – to be able to add back the overscan you need to know what to put in it, and if you've thrown that information away you can't regenerate it. Of course, you could just fill the overscan with black and hope no-one really cares that their display just shrank by 10%, but I can't help feeling that that approach is likely to be unpopular.

Re: Wrong decision in 1990, not 1930

Date: 2012-01-08 01:05 am (UTC)
From: (Anonymous)
The overscan area is not visible to the user on the device in question - it's off the edges of the visible screen. That's why it would be OK to fill it with black or gray or whatever: the scanned-out area could then completely fill the visible screen, instead of being scaled up by the device so that part of it stretches past the edge of the visible screen and is lost.

Re: Wrong decision in 1990, not 1930

Date: 2012-01-08 02:22 pm (UTC)
ext_267968: bjh (Default)
From: [identity profile] bjh21.me.uk
"The device in question" here is "an analog TV". My assumption is that this includes CRTs, which will display some of the overscan, but not a predictable amount (that amount varying between devices, with the age of the device, with its temperature, and even with the brightness of the rest of the display). Thus, on a CRT it's useful to have something in the overscan so that the picture goes all the way to the edge of the tube.

Re: Wrong decision in 1990, not 1930

Date: 2012-05-30 10:02 pm (UTC)
From: (Anonymous)
You don't "add back overscan" - you are displaying a digital source that never had overscan on an analog display - you expand the digital image to create overscan, not filled with black or gray, but with the edges of the picture.

Re: Wrong decision in 1990, not 1930

Date: 2012-05-31 10:06 am (UTC)
ext_267968: bjh (Default)
From: [identity profile] bjh21.me.uk
This entire discussion is about how to handle sources that do have overscan, and whether it's better (or was better in 1990) to preserve the overscan in the digital realm or cut it off. If your sources don't have any overscan then obviously preserving it isn't an option. Whether you fill the overscan with black, grey, or something heuristically-generated from the rest of the picture, it's going to look worse than a genuine overscanned picture.

My view is that overscan (or its equivalent) is valuable in any case where the edges of your display are uncertain. For instance, in printing the position of ink with respect to the edge of the page is often uncertain, so full-bleed pictures need an amount of unimportant content at the edge that can be safely lost when the page is trimmed.

Or to put it another way, PDF has overscan and does it right. The problem with digital video is not that it has overscan, but that it does it (badly) wrong.

Profile

Matthew Garrett

About Matthew

Power management, mobile and firmware developer on Linux. Security developer at Google. Ex-biologist. @mjg59 on Twitter. Content here should not be interpreted as the opinion of my employer.
