TVs are all awful
A discussion a couple of days ago about DPI detection (which is best summarised by this and this, and I am not having this discussion again) made me remember a chain of other awful things about consumer displays and EDID and there not being enough gin in the world. Reading various bits of the internet and Wikipedia seemed to indicate that almost everybody who's written about this has issues with either (a) technology or (b) English, so I might as well write something.
The first problem is unique (I hope) to 720p LCD TVs. 720p is an HD broadcast standard that's defined as having a resolution of 1280x720. A 720p TV is able to display that image without any downscaling. So, naively, you'd expect them to have 1280x720 displays. Now obviously I wouldn't bother mentioning this unless there was some kind of hilarious insanity involved, so you'll be entirely unsurprised when I tell you that most actually have 1366x768 displays. So your 720p content has to be upscaled to fill the screen anyway, but given that you'd have to do the same for displaying 720p content on a 1920x1080 device this isn't the worst thing ever in the world. No, it's more subtle than that.
EDID is a standard for a blob of data that allows a display device to express its capabilities to a video source in order to ensure that an appropriate mode is negotiated. It allows resolutions to be expressed in a bunch of ways - you can set a bunch of bits to indicate which standard modes you support (1366x768 is not one of these standard modes), you can express the standard timing resolution (the horizontal resolution divided by 8, followed by an aspect ratio) and you can express a detailed timing block (a full description of a supported resolution).
1366/8 = 170.75. Hm.
Ok, so 1366x768 can't be expressed in the standard timing resolution block. The closest you can provide for the horizontal resolution is either 1360 or 1368. You also can't supply a vertical resolution - all you can do is say that it's a 16:9 mode. For 1360, that ends up being 765. For 1368, that ends up being 769.
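To make the arithmetic concrete, here's a minimal sketch of how the standard timing encoding works (assuming I'm remembering the stored-byte offset correctly; this is an illustration, not a real EDID parser):

    # Standard timing entries store the horizontal resolution as a single
    # byte: (pixels / 8) - 31. The vertical resolution isn't stored at all;
    # it's implied by a 2-bit aspect ratio code.

    def encode_standard_timing(hres):
        """Return the byte EDID would store, or None if it can't be stored exactly."""
        if hres % 8:
            return None  # 1366 lands here: 1366 / 8 = 170.75
        return hres // 8 - 31

    def implied_vertical(hres, aspect=(16, 9)):
        # Only the aspect ratio is stored, so the vertical resolution is
        # whatever falls out of the division.
        return hres * aspect[1] // aspect[0]

    for h in (1366, 1360, 1368):
        byte = encode_standard_timing(h)
        if byte is None:
            print(f"{h}: not representable (not a multiple of 8)")
        else:
            print(f"{h}: stored as {byte}, 16:9 implies {h}x{implied_vertical(h)}")

    # 1366: not representable (not a multiple of 8)
    # 1360: stored as 139, 16:9 implies 1360x765
    # 1368: stored as 140, 16:9 implies 1368x769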
It's ok, though, because you can just put this in the detailed timing block, except it turns out that basically no TVs do, probably because the people making them are the ones who've taken all the gin.
So what we end up with is a bunch of hardware that people assume is 1280x720, but is actually 1366x768, except it's telling your computer that it's either 1360x765 or 1368x769. And you're probably running an OS that's doing sub-pixel anti-aliasing, which requires being able to address the physical pixels directly - which is obviously difficult if you think the screen is one size and it's actually another. Thankfully Linux takes care of you here, and this code makes everything ok. Phew, eh?
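For illustration, the kind of quirk involved looks roughly like this - a sketch, not the actual code linked above: if EDID hands you one of the off-by-a-few modes that the standard timing block forces, assume the panel is really 1366x768.

    # Hypothetical fixup, in the spirit of what a driver can do: map the
    # resolutions that the standard timing encoding forces back onto the
    # panel size everybody actually ships.

    def fixup_1366x768(hdisplay, vdisplay):
        if (hdisplay, vdisplay) in ((1360, 765), (1368, 769)):
            return 1366, 768
        return hdisplay, vdisplay

    print(fixup_1366x768(1368, 769))   # (1366, 768)
    print(fixup_1366x768(1920, 1080))  # untouched: (1920, 1080)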
But ha ha, no, it's worse than that. And the rest applies to 1080p ones as well.
Back in the old days when TV signals were analogue and got turned into a picture by a bunch of magnets waving a beam of electrons about all over the place, it was impossible to guarantee that all TV sets were adjusted correctly and so you couldn't assume that the edges of a picture would actually be visible to the viewer. In order to put text on screen without risking bits of it being lost, you had to steer clear of the edges. Over time this became roughly standardised and the areas of the signal that weren't expected to be displayed were called overscan. Now, of course, we're in a mostly digital world and such things can be ignored, except that when digital TVs first appeared they were mostly used to watch analogue signals so still needed to overscan because otherwise you'd have the titles floating weirdly in the middle of the screen rather than towards the edges, and so because it's never possible to kill technology that's escaped into the wild we're stuck with it.
tl;dr - Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.
So you plug your computer into a TV and even though you know what the resolution really is you still don't get to address the individual pixels. Even worse, the edges of your screen are missing.
The best thing about overscan is that it's not rigorously standardised - different broadcast bodies have different recommendations, but you're then still at the mercy of what your TV vendor decided to implement. So what usually happens is that graphics vendors have some way in their drivers to compensate for overscan, which involves you manually setting the degree of overscan that your TV provides. This works very simply - you take your 1920x1080 framebuffer and draw different sized black borders until the edge of your desktop lines up with the edge of your TV. The best bit about this is that while you're still scanning out a 1920x1080 mode, your desktop has now shrunk to something more like 1728x972 and your TV is then scaling it back up to 1920x1080. Once again, you lose.
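As a rough sketch of that arithmetic (the 5% per edge here is just an example figure, not anything standardised):

    # Shrink the desktop by a per-edge overscan fraction and see what's left.

    def compensate(width, height, overscan_per_edge=0.05):
        xborder = round(width * overscan_per_edge)
        yborder = round(height * overscan_per_edge)
        return xborder, yborder, (width - 2 * xborder, height - 2 * yborder)

    xb, yb, usable = compensate(1920, 1080)
    print(f"{xb}px left/right, {yb}px top/bottom, desktop {usable[0]}x{usable[1]}")
    # 96px left/right, 54px top/bottom, desktop 1728x972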
The HDMI spec actually defines an extension block for EDID that indicates whether the display will overscan or not, but doesn't provide any way to work out how much it'll overscan. We haven't seen many of those in the wild. It's also possible to send an HDMI information frame that indicates whether or not the video source expects to be overscanned, but (a) we don't do that and (b) it'd probably be ignored even if we did, because who ever tests this stuff. The HDMI spec also says that the default behaviour for 1920x1080 (but not 1366x768) should be to assume overscan. Charming.
The best thing about all of this is that the same TV will often behave differently depending on whether you connect via DVI or HDMI - typically overscanning HDMI but not DVI - although some TVs will overscan DVI as well. Some TVs have options in the menu to disable overscan and others don't. Some monitors will overscan if you feed them an HD resolution over HDMI, so if you have HD content and don't want to lose the edges then your hardware needs to scale it down and let the display scale it back up again. It's all awful. I recommend you drink until everything's already blurry and then none of this will matter.
1366x768 makes sense - it's manufacturers cheaping out. Again.
(Anonymous) 2012-01-03 07:04 pm (UTC)
1366x768 comes from a quirk of at least one method for manufacturing LCD panels. At one stage in manufacture, they're made as a big sheet of pixels, much, much larger than you want as screens. This is then cut into individual screens - first into strips of a fixed height, then each strip into single screens.
The trick comes in when you're cutting - defects tend to cluster in individual spots on the sheet. Any finished screen with too many defects is a failure and has to be discarded. So you want as few distinct strip heights as possible, and then to choose where you cut along each strip so as to avoid the defects.
There are no common 720-high 4:3 screen sizes - 960x720 is unusual. Lots of people still want 1024x768 4:3 screens for various applications. You may be seeing why 1366x768 is popular at this point: a 768-high strip can be cut into a mix of 1024x768 4:3 and 1366x768 16:9 screens, so you get more usable screens from the same vertical cut.
Plus, TV is all about samples of an idealised continuous function, not pixels, and we have well-understood "perfect" algorithms for scaling 1280x720 samples to 1366x768 samples, such that lighting up an LCD pixel will reproduce the "same" continuous function whether you're lighting up 1280x720 pixels, or 1366x768 pixels, or even 1920x1080 pixels. This is not so good for computers, which are pixel based, not sample based.
Re: 1366x768 makes sense - it's manufacturers cheaping out. Again.
That can't possibly be true, unless you assume your continuous function entirely lacks high-frequency components -- you end up seriously degrading areas of high detail, which was the reason you bought an HDTV in the first place. Try enlarging a highly-detailed photo by 7% in the image editor of your choice.
Re: 1366x768 makes sense - it's manufacturers cheaping out. Again.
(Anonymous) 2012-01-03 11:21 pm (UTC)
The basic theory is the infinite series form of the Whittaker-Shannon interpolation formula; you know that your input signal is band-limited by the input format (and that this limit is low enough that you can perfectly reconstruct the original analogue signal from your input samples). You can thus filter out any aliases above the sampling rate of the input format; with your filtered infinite series in hand, you sample this continuous signal at the output sample points.
A brute force implementation of this method ends up needing 921,600 MACs per output sample. There are many, many optimisations available (see the literature on spatial scaling) that massively reduce this, and of course an ASIC design is massively parallel anyway.
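For the curious, here's a one-dimensional sketch of that brute-force approach - a single scanline rather than the full 2-D case the 921,600 figure refers to, and real scalers use windowed, separable kernels rather than a full sinc sum:

    import math

    # Naive Whittaker-Shannon resampling of one scanline: every output
    # sample is a sinc-weighted sum over every input sample.

    def sinc_resample(samples, out_len):
        n_in = len(samples)
        out = []
        for j in range(out_len):
            x = j * n_in / out_len  # output position in input-sample units
            acc = 0.0
            for i, s in enumerate(samples):
                t = x - i
                acc += s * (1.0 if t == 0 else math.sin(math.pi * t) / (math.pi * t))
            out.append(acc)
        return out

    # Scale a 1280-sample line up to 1366 samples (slowly - it is the brute force).
    line = [math.sin(2 * math.pi * k / 64) for k in range(1280)]
    scaled = sinc_resample(line, 1366)
    print(len(scaled))  # 1366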
Re: 1366x768 makes sense - it's manufacturers cheaping out. Again.
(Anonymous) 2012-01-04 09:02 am (UTC)
The TV signal is designed as a digital sampling of a hypothetical continuous function; the blurring is absolutely fine in this case, as it's just the same image as you'd get if your panel really were 1280x720, not 1366x768 or 1920x1080.
It's a collision of two different worlds. The computer world makes assumptions about pixel addressability to get a picture that's as good as you can achieve in 1280x720 (or however many pixels the computer is outputting), even if the resulting picture cannot scale well. The TV world ignores pixel addressability, to get a picture that can be scaled perfectly for output at a different resolution, at the expense of not producing as good a picture as is theoretically possible.
Trying to explain things across these two worlds is an exercise in driving yourself to drink; computer people explain that pixels are physically squares, and you should be able to address each pixel. TV people respond with "but the signal on the wire is just a series of samples of a continuous function - you should be able to scale it knowing that that's the case". Computer people respond by pointing out that they can exploit pixel addressability to produce an image with information that's above the Nyquist frequency of the TV signal. Which gets hit with a "you can't do that! It's a sample of a continuous function, and if you do that, you can't reconstruct the original signal perfectly".
And round and round you go, with TV people failing to understand the mental model computer people work with, and computer people getting frustrated that TV people don't see the cool things you can do if you treat the link as transferring a framebuffer to the physical domain and drop this "sample of a continuous function" idea, while the TV people don't understand why you'd throw away all the cool things you can do if you treat it as a sample of a continuous function, in favour of gains that only apply to the set you're connected to right here, right now.
And, of course, cheap TV sets completely ignore the computer people view of the world; they're TVs - why would anyone care about anything other than watching American Idol 2012 Suicidal Fat Maniacs Edition or the latest sports matches on them?
Whisky helps. Beer helps. Giving up on caring is better for the liver, though.
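A hedged toy calculation of why those two worlds collide over fine detail: a pattern that flips every pixel sits at the Nyquist limit of a 1280-wide line, its usual sinc-interpolation reconstruction is cos(pi*t), and sampling that onto a 1366-wide grid turns the crisp one-pixel alternation into a beat pattern whose contrast drifts between full and almost nothing.

    import math

    # A one-pixel on/off pattern on a 1280-wide grid reconstructs (via sinc
    # interpolation) to cos(pi * t). Resample that onto 1366 output pixels:
    out = [math.cos(math.pi * j * 1280 / 1366) for j in range(1366)]
    window = [abs(v) for v in out[:32]]
    print(f"contrast over the first 32 output pixels: max {max(window):.2f}, min {min(window):.2f}")
    # max 1.00, min 0.01 - the detail survives as moire, not as pixels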
Re: 1366x768 makes sense - it's manufacturers cheaping out. Again.
(Anonymous) 2012-01-03 11:29 pm (UTC)
[*] i.e. the kernel you use to 'blur' the infinitely sharp points of your original sampling before re-sampling said blurry continuous signal back into infinitely sharp points.
--
Rich Wareham
(who cannot be bothered to sign in)
Re: 1366x768 makes sense - it's manufacturers cheaping out. Again.
(Anonymous) 2012-05-30 09:58 pm (UTC)
Well, in the laptop world, they're popular because you get 16:9 and still have the exact same height as a 1024x768 (XGA) panel, just with more horizontal room.
You don't want to be, say, 1024x600 like a netbook, because that sucks.
And if your screen is small (an 11" MBA) or cheap (any number of 15" generics), you don't particularly want more pixels anyway.
It's a very practical size for non-"video"-related and non-sheet-cutting reasons as well.