TVs are all awful
A discussion a couple of days ago about DPI detection (which is best summarised by this and this and I am not having this discussion again) made me remember a chain of other awful things about consumer displays and EDID and there not being enough gin in the world, and reading various bits of the internet and wikipedia seemed to indicate that almost everybody who's written about this has issues with either (a) technology or (b) English, so I might as well write something.
The first problem is unique (I hope) to 720p LCD TVs. 720p is an HD broadcast standard that's defined as having a resolution of 1280x720. A 720p TV is able to display that image without any downscaling. So, naively, you'd expect them to have 1280x720 displays. Now obviously I wouldn't bother mentioning this unless there was some kind of hilarious insanity involved, so you'll be entirely unsurprised when I tell you that most actually have 1366x768 displays. So your 720p content has to be upscaled to fill the screen anyway, but given that you'd have to do the same for displaying 720p content on a 1920x1080 device this isn't the worst thing ever in the world. No, it's more subtle than that.
EDID is a standard for a blob of data that allows a display device to express its capabilities to a video source in order to ensure that an appropriate mode is negotiated. It allows resolutions to be expressed in a bunch of ways - you can set a bunch of bits to indicate which standard modes you support (1366x768 is not one of these standard modes), you can express the standard timing resolution (the horizontal resolution divided by 8, followed by an aspect ratio) and you can express a detailed timing block (a full description of a supported resolution).
1366/8 = 170.75. Hm.
Ok, so 1366x768 can't be expressed in the standard timing resolution block. The closest you can provide for the horizontal resolution is either 1360 or 1368. You also can't supply a vertical resolution - all you can do is say that it's a 16:9 mode. For 1360, that ends up being 765. For 1368, that ends up being 769.
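If you want to see where those numbers come from, here's a minimal sketch of the arithmetic (the encoding follows the standard timing description above; the integer truncation and the helper code are my own illustration, not anything from a real driver):

    /*
     * Sketch of the EDID standard timing arithmetic: byte 1 of an entry
     * holds hactive/8 - 31, and the vertical resolution is whatever the
     * declared aspect ratio implies.  Illustration only.
     */
    #include <stdio.h>

    int main(void)
    {
        int hactive[] = { 1360, 1366, 1368 };

        for (int i = 0; i < 3; i++) {
            int h = hactive[i];

            if (h % 8) {
                printf("%d: can't be encoded (not a multiple of 8)\n", h);
                continue;
            }
            /* the 16:9 aspect ratio bits imply the vertical size */
            printf("%d: byte 1 = %d, implied 16:9 vertical = %d\n",
                   h, h / 8 - 31, h * 9 / 16);
        }
        return 0;
    }

Run that and you get 1360 giving 765, 1368 giving 769, and 1366 being unrepresentable.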
It's ok, though, because you can just put this in the detailed timing block, except it turns out that basically no TVs do, probably because the people making them are the ones who've taken all the gin.
So what we end up with is a bunch of hardware that people assume is 1280x720, but is actually 1366x768, except they're telling your computer that they're either 1360x765 or 1368x769. And you're probably running an OS that's doing sub-pixel anti-aliasing, which requires that the hardware be able to address the pixels directly which is obviously difficult if you think the screen is one size and actually it's another. Thankfully Linux takes care of you here, and this code makes everything ok. Phew, eh?
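The fix is conceptually tiny: if mode parsing produced one of the two impossible resolutions, rewrite it to what the panel actually is. A sketch of the idea, not the kernel's literal code:

    /*
     * Sketch of the 1366x768 fixup idea; the field names loosely follow
     * the DRM mode structure, but this is an illustration rather than
     * the actual drm_edid.c implementation.
     */
    struct mode {
        int hdisplay, vdisplay;
    };

    static void fixup_1366x768(struct mode *m)
    {
        if ((m->hdisplay == 1360 && m->vdisplay == 765) ||
            (m->hdisplay == 1368 && m->vdisplay == 769)) {
            m->hdisplay = 1366;
            m->vdisplay = 768;
        }
    }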
But ha ha, no, it's worse than that. And the rest applies to 1080p ones as well.
Back in the old days when TV signals were analogue and got turned into a picture by a bunch of magnets waving a beam of electrons about all over the place, it was impossible to guarantee that all TV sets were adjusted correctly and so you couldn't assume that the edges of a picture would actually be visible to the viewer. In order to put text on screen without risking bits of it being lost, you had to steer clear of the edges. Over time this became roughly standardised and the areas of the signal that weren't expected to be displayed were called overscan. Now, of course, we're in a mostly digital world and such things can be ignored, except that when digital TVs first appeared they were mostly used to watch analogue signals so still needed to overscan because otherwise you'd have the titles floating weirdly in the middle of the screen rather than towards the edges, and so because it's never possible to kill technology that's escaped into the wild we're stuck with it.
tl;dr - Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.
So you plug your computer into a TV and even though you know what the resolution really is you still don't get to address the individual pixels. Even worse, the edges of your screen are missing.
The best thing about overscan is that it's not rigorously standardised - different broadcast bodies have different recommendations, but you're then still at the mercy of what your TV vendor decided to implement. So what usually happens is that graphics vendors have some way in their drivers to compensate for overscan, which involves you manually setting the degree of overscan that your TV provides. This works very simply - you take your 1920x1080 framebuffer and draw different sized black borders until the edge of your desktop lines up with the edge of your TV. The best bit about this is that while you're still scanning out a 1920x1080 mode, your desktop has now shrunk to something more like 1728x972 and your TV is then scaling it back up to 1920x1080. Once again, you lose.
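For concreteness, the 1728x972 figure falls out of assuming the TV crops roughly 5% of each edge; a trivial sketch of the border arithmetic (the 5% is just an assumed example value):

    /*
     * Where 1728x972 comes from: shrink the desktop inside the
     * 1920x1080 scanout until the black borders cover whatever the TV
     * crops.  The per-edge percentage is an assumption for illustration.
     */
    #include <stdio.h>

    int main(void)
    {
        int scan_w = 1920, scan_h = 1080;
        int pct = 5;    /* assumed overscan percentage per edge */

        int border_w = scan_w * pct / 100;    /* 96 pixels each side */
        int border_h = scan_h * pct / 100;    /* 54 pixels top and bottom */

        printf("usable desktop: %dx%d\n",
               scan_w - 2 * border_w, scan_h - 2 * border_h);
        return 0;
    }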
The HDMI spec actually defines an extension block for EDID that indicates whether the display will overscan or not, but doesn't provide any way to work out how much it'll overscan. We haven't seen many of those in the wild. It's also possible to send an HDMI information frame that indicates whether or not the video source is expecting to be overscanned, but (a) we don't do that and (b) it'll probably be ignored even if we did, because who ever tests this stuff? The HDMI spec also says that the default behaviour for 1920x1080 (but not 1366x768) should be to assume overscan. Charming.
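For reference, that single bit lives in the CEA/HDMI EDID extension block, and reading it is about all the spec lets you do; a rough sketch (offsets per CEA-861, the function name is mine):

    /*
     * Read the "underscan supported" flag from a CEA-861 EDID extension
     * block (byte 3, bit 7).  It only tells you whether the sink can
     * show the whole frame, not how much it crops when it doesn't.
     */
    #include <stdbool.h>
    #include <stdint.h>

    #define CEA_EXT_TAG 0x02

    static bool cea_supports_underscan(const uint8_t ext[128])
    {
        if (ext[0] != CEA_EXT_TAG)
            return false;    /* not a CEA extension block */

        return ext[3] & 0x80;
    }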
The best thing about all of this is that the same TV will often have different behaviour depending on whether you connect via DVI or HDMI, but some TVs will still overscan DVI. Some TVs have options in the menu to disable overscan and others don't. Some monitors will overscan if you feed them an HD resolution over HDMI, so if you have HD content and don't want to lose the edges then your hardware needs to scale it down and let the display scale it back up again. It's all awful. I recommend you drink until everything's already blurry and then none of this will matter.
EDID's and TV resolutions
(Anonymous) 2012-01-03 09:31 pm (UTC)
I only see 1366x768 on older TVs that were marketed as HD-Ready. What they meant, and what they state in the fine print, is that they can accept an HD broadcast signal; the TV then internally downscales it to normal NTSC or PAL. I haven't seen one of the TVs you describe where it actually has a reasonable resolution.
Another very common thing is 1080i TVs. Most people get the impression that they're getting 1080 rows of resolution, but while these TVs are designed to accept a 1080i signal, they also downscale it (usually to 720i).
In terms of oversampling and active resolutions, you do get a full 1920x1080 of viewable area. It's just that the full resolution is much larger, and consequently the pixel clock frequency on your HDMI/DVI cable is much higher than it needs to be for 1920x1080.
Oversampling is just one small dumb point; the whole backwards-compatibility thing really goes way too far. As far as I can tell, DVI cables were designed for analog signals but can also carry digital on separate data lines, even though no one ever used the analog part. Given that both analog and digital went through the cable together, I guess it made sense at the time that the digital part would just be the analog equivalent, i.e. nothing more than an ADC channel. They could then use the same clock frequency and the same synchronization pulse times (to let the gun move from the right edge of the screen back to the left). WTF!!! It's digital, why are we sending pulse signals to wait for a CRT gun? HDMI was just DVI, but with audio encoded in the dead time while the synchronization pulse was being sent.
Where I think video signals really went wrong is with DisplayPort. Since DVI/HDMI were purely analog signals encoded in digital, DisplayPort would have given us the chance to drop the analog oddities like sync pulses. But no, they didn't drop that: they thought that if they kept them, it would be easy for someone to make a DisplayPort to HDMI/DVI/VGA adaptor. So now, in 2012, we still have to carry forward all of this CRT-specific crud.
Re: EDID's and TV resolutions
DVI-D (and HDMI, which is essentially the same) were never analogue interfaces. They encode pixel data as TMDS, which is much better suited to transferring data at high speed than plain TTL, and also requires fewer wires because of multiplexing, which is in turn possible because of the higher transfer rates.
I think you'd better read some literature on this specific topic.
Re: EDID's and TV resolutions
(Anonymous) 2012-01-03 10:14 pm (UTC)
There is no need for a sync pulse on a digital display! Sync pulses are only needed to synchronize the data in a DVI/HDMI/VGA cable.
Re: EDID's and TV resolutions
There is a need. The pixel data have structure: they're organised into frames, rows and columns, and so are the real physical pixels. You need pixel synchronisation pulses to synchronise pixel data (no surprise!), and data active/horizontal synchronisation to split the data into rows with minimal hardware. Frame synchronisation isn't needed, indeed, by most displays, but here you have two choices: either you leave a big gap between frames, so this can be detected automagically, or you send a synchronisation pulse. That keeps the hardware simple and cheap, and introduces as little overhead as possible into the protocol.
Also, inside the LCD panel, there *are* synchronisation pulses, they're just done differently. You can dig into this yourself if you want to.
One more thing: audio shouldn't be sent along with pixel data at all; it's completely unrelated, don't you think?
Re: EDID's and TV resolutions
(Anonymous) 2012-01-04 01:22 am (UTC)
I don't see how that increases the complexity of the schematics. For the sake of argument, let's say video was done over a dedicated Ethernet link - a system that allows easy movement of packet data. How are your schematics more complicated? You now have a front end of 4 pairs across the channel instead of TMDS pairs. From the hardware schematic's point of view, the front-end chip isn't that much different from HDMI.
I also said you could put structure into packets, instead of a raw stream with pulses. That is your structure. Same structure, but a more modern representation.
Audio gets sent with video because HDMI (and TV in general) means audio + video. Today's front-end chips separate the two, and tomorrow's will probably do the same. There are a lot of other systems that also try to bundle other things into the cable, but that's another discussion.
Re: EDID's and TV resolutions
(Anonymous) 2012-01-03 10:40 pm (UTC)
Modern HDTV standards specify the expected 24, 25, 30, 50 and 60 Hz vertical refresh modes for progressive content (and interlaced at 50 and 60 Hz). They also add (for 24, 30 and 60 Hz) the option to multiply the vertical refresh by 1000/1001, to get 23.97 Hz, 29.97 Hz and 59.94 Hz.
Why do this? Back in the days of monochrome sets, 60 Hz regions used a genuine 60 Hz signal. When NTSC was designed, the chroma subcarrier was found to result in objectionable beat patterns on monochrome sets; the only way to fix this while staying within a 6 MHz channel bandwidth was to slightly reduce the frame rate, such that older sets (designed to lock to 60 Hz) would still lock to it, but would no longer interpret the chroma subcarrier as beat patterns in the picture.
Why keep it in HD standards? In theory, it means that if you are downscaling your HD content to SD, and then putting it into a monochrome TV set that's unaware of NTSC chroma encoding, you won't get the beat pattern. Colour-aware sets, even if they're monochrome (so virtually all designs since the 60s, and many 1950s designs) won't give the beat pattern at 60 Hz. But, we get to suffer the complications of drop frame, so that if you connect your HD source to a 1940s or 1950s TV, you won't see a beat pattern.
Re: EDID's and TV resolutions
(Anonymous) 2012-01-04 01:12 pm (UTC)
Had we dropped the drop-frame options from the HD standards, we would have a nice cut-off point: anything that only does HD could forget drop-frame, while anything that does SD has to be aware of drop-frame, and may have to do FRC if it intends to mix SD and HD.
Re: EDID's and TV resolutions
You seem to have a weird misunderstanding of how DVI works. And, from the way you compare it to DisplayPort, of DisplayPort.
DVI has analog pins, yes. But the connection is modal. The source sends either analog VGA-like signaling over the analog wires - which is how DVI-VGA adaptors work - or it sends DVI-like signals over the digital wires. They're independent. The only bit that's shared is the DDC wires, and you're supposed to look at the EDID block to determine whether to send analog or digital.
In both VGA and (the digital half of) DVI, the length of time per pixel is variable. In this sense DVI is only semi-digital: what you actually have is a freely-clocked 8/10b bitstream somewhere between 25MHz and 165MHz. If you want to be CRT-compatible - and you might want to, to move the DAC inside the CRT enclosure and thus reduce analog cable noise - then you necessarily have to encode the blanking interval in this bitstream. There weren't many CRTs made like this, but there were a few. For that matter you have to encode a blanking interval for LCDs too, since they have row and column access strobes just like any other memory grid; it just doesn't have to be as long as a CRT's.
The essential difference between this and DisplayPort, though, is just that DisplayPort fixes the link clock. Now your pixel interval is one of exactly three durations and the sink hardware gets cheaper as a result. And you also have big honking blanking intervals to stuff dummy symbols into. At that point, what's the difference between "encoding the blank interval" and not? More importantly, if you didn't have horizontal and vertical framing symbols, how would you get the zeroth pixel in the top left corner?
Re: EDID's and TV resolutions
(Anonymous) 2012-05-30 10:05 pm (UTC)
DVI-A and DVI-I respectively use and can use analog signals, and in the computer world that was quite popular for a long time; it let you use that shiny DVI plug on your new monitor with a horrible old VGA plug on the computer.
So, people used that all the time - to carry VGA data.