[personal profile] mjg59
A discussion a couple of days ago about DPI detection (which is best summarised by this and this and I am not having this discussion again) made me remember a chain of other awful things about consumer displays and EDID and there not being enough gin in the world, and reading various bits of the internet and wikipedia seemed to indicate that almost everybody who's written about this has issues with either (a) technology or (b) English, so I might as well write something.

The first problem is unique (I hope) to 720p LCD TVs. 720p is an HD broadcast standard that's defined as having a resolution of 1280x720. A 720p TV is able to display that image without any downscaling. So, naively, you'd expect them to have 1280x720 displays. Now obviously I wouldn't bother mentioning this unless there was some kind of hilarious insanity involved, so you'll be entirely unsurprised when I tell you that most actually have 1366x768 displays. So your 720p content has to be upscaled to fill the screen anyway, but given that you'd have to do the same for displaying 720p content on a 1920x1080 device this isn't the worst thing ever in the world. No, it's more subtle than that.

EDID is a standard for a blob of data that allows a display device to express its capabilities to a video source in order to ensure that an appropriate mode is negotiated. It allows resolutions to be expressed in a bunch of ways - you can set a bunch of bits to indicate which standard modes you support (1366x768 is not one of these standard modes), you can express the standard timing resolution (the horizontal resolution divided by 8, followed by an aspect ratio) and you can express a detailed timing block (a full description of a supported resolution).

1366/8 = 170.75. Hm.

Ok, so 1366x768 can't be expressed in the standard timing resolution block. The closest you can provide for the horizontal resolution is either 1360 or 1368. You also can't supply a vertical resolution - all you can do is say that it's a 16:9 mode. For 1360, that ends up being 765. For 1368, that ends up being 769.
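
As a sketch of where those numbers come from: each standard timing is only two bytes, the first storing the horizontal resolution divided by 8 (minus 31, strictly speaking, but the multiple-of-8 constraint is what matters here) and the second packing an aspect ratio into its top two bits. A hedged decode, assuming the EDID 1.3 layout (the function name is mine, purely for illustration):

    #include <stdint.h>

    /* Decode one 2-byte EDID standard timing descriptor (EDID 1.3 layout). */
    static void decode_std_timing(uint8_t b0, uint8_t b1, int *hres, int *vres)
    {
        *hres = (b0 + 31) * 8;                   /* horizontal comes in units of 8 pixels */

        switch (b1 >> 6) {                       /* top two bits: aspect ratio */
        case 0: *vres = *hres * 10 / 16; break;  /* 16:10 */
        case 1: *vres = *hres * 3 / 4;   break;  /* 4:3   */
        case 2: *vres = *hres * 4 / 5;   break;  /* 5:4   */
        case 3: *vres = *hres * 9 / 16;  break;  /* 16:9  */
        }
    }

    /* 1366 isn't a multiple of 8, so the nearest encodable values are:
     *   b0 = 139 -> (139 + 31) * 8 = 1360, 16:9 gives 1360 * 9 / 16 = 765
     *   b0 = 140 -> (140 + 31) * 8 = 1368, 16:9 gives 1368 * 9 / 16 = 769 (769.5, truncated)
     */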

It's ok, though, because you can just put this in the detailed timing block, except it turns out that basically no TVs do, probably because the people making them are the ones who've taken all the gin.

So what we end up with is a bunch of hardware that people assume is 1280x720, but is actually 1366x768, except they're telling your computer that they're either 1360x765 or 1368x769. And you're probably running an OS that's doing sub-pixel anti-aliasing, which requires that the hardware be able to address the pixels directly which is obviously difficult if you think the screen is one size and actually it's another. Thankfully Linux takes care of you here, and this code makes everything ok. Phew, eh?
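
For the curious, the fix boils down to spotting the two bogus sizes that a 16:9 standard timing near 1366x768 can actually describe and snapping them back to the panel's real resolution. A minimal sketch of that kind of quirk (illustrative only, not the verbatim drm_edid.c code):

    /* If a standard timing decodes to one of the two sizes that a 16:9
     * entry near 1366x768 can actually encode, assume the panel is really
     * 1366x768 and patch the mode up. */
    static void fixup_1366x768(int *hsize, int *vsize)
    {
        if ((*hsize == 1360 && *vsize == 765) ||
            (*hsize == 1368 && *vsize == 769)) {
            *hsize = 1366;
            *vsize = 768;
        }
    }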

But ha ha, no, it's worse than that. And the rest applies to 1080p ones as well.

Back in the old days when TV signals were analogue and got turned into a picture by a bunch of magnets waving a beam of electrons about all over the place, it was impossible to guarantee that all TV sets were adjusted correctly and so you couldn't assume that the edges of a picture would actually be visible to the viewer. In order to put text on screen without risking bits of it being lost, you had to steer clear of the edges. Over time this became roughly standardised and the areas of the signal that weren't expected to be displayed were called overscan. Now, of course, we're in a mostly digital world and such things can be ignored, except that when digital TVs first appeared they were mostly used to watch analogue signals so still needed to overscan because otherwise you'd have the titles floating weirdly in the middle of the screen rather than towards the edges, and so because it's never possible to kill technology that's escaped into the wild we're stuck with it.

tl;dr - Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.

So you plug your computer into a TV and even though you know what the resolution really is you still don't get to address the individual pixels. Even worse, the edges of your screen are missing.

The best thing about overscan is that it's not rigorously standardised - different broadcast bodies have different recommendations, but you're then still at the mercy of what your TV vendor decided to implement. So what usually happens is that graphics vendors have some way in their drivers to compensate for overscan, which involves you manually setting the degree of overscan that your TV provides. This works very simply - you take your 1920x1080 framebuffer and draw different sized black borders until the edge of your desktop lines up with the edge of your TV. The best bit about this is that while you're still scanning out a 1920x1080 mode, your desktop has now shrunk to something more like 1728x972 and your TV is then scaling it back up to 1920x1080. Once again, you lose.
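
To put numbers on that, here's the arithmetic the driver is effectively doing, assuming a symmetric overscan amount on every edge (the 5% figure is just an example; real TVs vary, which is why you end up adjusting it by eye):

    #include <stdio.h>

    /* Shrink a scanout mode by a per-edge overscan percentage to find the
     * usable desktop area once the compensating black borders are drawn. */
    static void visible_area(int width, int height, double edge_pct,
                             int *vis_w, int *vis_h)
    {
        /* Each edge loses edge_pct of the dimension, so each axis shrinks
         * by twice that. */
        *vis_w = (int)(width  * (1.0 - 2.0 * edge_pct / 100.0));
        *vis_h = (int)(height * (1.0 - 2.0 * edge_pct / 100.0));
    }

    int main(void)
    {
        int w, h;
        visible_area(1920, 1080, 5.0, &w, &h);
        printf("%dx%d\n", w, h);   /* prints 1728x972 */
        return 0;
    }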

The HDMI spec actually defines an extension block for EDID that indicates whether the display will overscan or not, but doesn't provide any way to work out how much it'll overscan. We haven't seen many of those in the wild. It's also possible to send an HDMI information frame that indicates whether or not the video source is expecting to be overscanned, but (a) we don't do that and (b) it'll probably be ignored even if we did, because who ever tests this stuff. The HDMI spec also says that the default behaviour for 1920x1080 (but not 1366x768) should be to assume overscan. Charming.
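
For reference, the flag in question is the "underscan supported" bit in the CEA-861 extension block that HDMI sinks append after the base 128-byte EDID. A hedged sketch of checking it, assuming you already have the raw EDID blob in memory (byte offsets per CEA-861; this is not robust parsing):

    #include <stdbool.h>
    #include <stddef.h>
    #include <stdint.h>

    #define CEA_EXT_TAG 0x02

    /* Return true if the EDID contains a CEA-861 extension block whose
     * "underscan supported" bit is set. Byte 126 of the base block gives
     * the number of 128-byte extension blocks that follow. */
    static bool sink_claims_underscan(const uint8_t *edid, size_t len)
    {
        if (len < 128)
            return false;

        int extensions = edid[126];
        for (int i = 1; i <= extensions && (size_t)(i + 1) * 128 <= len; i++) {
            const uint8_t *ext = edid + i * 128;
            if (ext[0] == CEA_EXT_TAG)
                return (ext[3] & 0x80) != 0;   /* bit 7: underscan supported */
        }
        return false;
    }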

The best thing about all of this is that the same TV will often have different behaviour depending on whether you connect via DVI or HDMI, but some TVs will still overscan DVI. Some TVs have options in the menu to disable overscan and others don't. Some monitors will overscan if you feed them an HD resolution over HDMI, so if you have HD content and don't want to lose the edges then your hardware needs to scale it down and let the display scale it back up again. It's all awful. I recommend you drink until everything's already blurry and then none of this will matter.

This is the reason...

Date: 2012-01-03 06:27 pm (UTC)
oshepherd: (Default)
From: [personal profile] oshepherd
...why, when purchasing smaller HD TVs, you want to find the ones which advertise themselves as a "TV monitor"

At least that way you have a chance that the TV will behave sanely. Thankfully for myself, I picked a TV which has settings which make it sane and not overscan and not apply any stupid sharpening algorithms to my picture (awesome!)

Guess how you activate this feature? You rename the inputs to "PC" or "DVI-PC". So now my TV's input selection has the DTV tuner, "HDMI PC", "DVI DVI-PC", "PC PC" (the VGA input), "Component" and "Ext1" (SCART)

The name also controls where the audio comes from (On the DVI/HDMI inputs, "PC" takes sound from the TMDS link, while "DVI-PC" takes it from the solitary audio input minijack)

So, entirely sane feature design there Samsung! It doesn't at all make me want to consume one of the bottles of Whiskey on my shelf...

(In unrelated news: One of the words in my reCaptcha looks like "Add.l". Are they scanning in a book containing some assembly language? :P)

Re: This is the reason...

Date: 2012-01-04 02:48 am (UTC)
From: (Anonymous)
Samsung, go figure. They're great at making really nice, cheap electronics and then making a few ridiculous design decisions that ruin them. Like my monitor having a non-removable stand, or the horrible, horrible software in everything they've ever made.

1366x768 makes sense - it's manufacturers cheaping out. Again.

From: (Anonymous)
Matthew almost certainly already knows this, as he's steeped in the many and various ways that consumer electronics manufacturers try and do things on the cheap, but for everyone else:

1366x768 comes from a quirk of at least one method for manufacturing LCD panels. At one stage in manufacture, they're made as a big sheet of pixels, much, much larger than you want as screens. This is then cut into individual screens, first by cutting into vertical chunks, then cutting the line of screens into single screens.

The trick comes in when you're cutting - defects tend to cluster in individual spots on the screen. Any finished screen with too many defects is a failure and has to be discarded. So, vertically, you want to cut into as small a number of sizes as possible, and cut horizontal chunks avoiding any defects.

There are no common 720-high 4:3 screen sizes - 960x720 is unusual. Lots of people still want 1024x768 4:3 screens for various applications. You may be seeing why 1366x768 is popular at this point - at the cut-to-768-high stage you can now cut your strips into a mix of 1024x768 4:3 and 1366x768 16:9 screens, and can therefore get more usable screens from the same vertical cut.

Plus, TV is all about samples of an idealised continuous function, not pixels, and we have well-understood "perfect" algorithms for scaling 1280x720 samples to 1366x768 samples, such that lighting up an LCD pixel will reproduce the "same" continuous function whether you're lighting up 1280x720 pixels, or 1366x768 pixels, or even 1920x1080 pixels. This is not so good for computers, which are pixel based, not sample based.
From: [identity profile] mas90.livejournal.com
we have well-understood "perfect" algorithms for scaling 1280x720 samples to 1366x768 samples, such that lighting up an LCD pixel will reproduce the "same" continuous function whether you're lighting up 1280x720 pixels, or 1366x768 pixels, or even 1920x1080 pixels

That can't possibly be true, unless you assume your continuous function entirely lacks high-frequency components -- you end up seriously degrading areas of high detail, which was the reason you bought an HDTV in the first place. Try enlarging a highly-detailed photo by 7% in the image editor of your choice.


Date: 2012-01-03 07:34 pm (UTC)
From: [identity profile] mas90.livejournal.com
For added fun, the manufacturer of a 720p HDTV I used to have had apparently decided that this whole EDID thing was too complicated, so had copied the EDID verbatim from a Thinkpad monitor (the X server log actually printed the display's identity as "Thinkpad"). The Thinkpad monitor in question was apparently a 5:4, 1280x1024 model. So I got the middle 90% of my desktop, stretched horizontally by 40%.

The TV I currently have is detected by my graphics driver as 1360x768, for some reason. I don't understand why the vertical resolution is mysteriously correct. Thankfully it can be told not to scale/overscan at all, so I just have a 6-pixel black bar down one edge of the screen, which I can cope with.

EDIDs and TV resolutions

Date: 2012-01-03 09:31 pm (UTC)
From: (Anonymous)
In my job I spend a lot of time reading TV EDIDs and swearing.

I only see 1366x768 on older TVs that were marketed as HD-Ready. What that meant, and what the fine print states, is that they can accept an HD broadcast signal; the TV then internally downscales it to normal NTSC or PAL resolution. I haven't seen one of the TVs you describe where it actually has a reasonable resolution.

Another very common thing is 1080i TVs. Most people get the impression that they're getting 1080 rows of resolution, but while these TVs are designed to accept a 1080i signal, they also downscale it (usually to 720i).

In terms of oversampling and active resolutions, you do get a full 1920x1080 of viewable area. It's the full resolution that is much larger, and consequently the pixel clock frequency of your HDMI/DVI cable is much higher than it needs to be for 1920x1080.
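
As a rough illustration of how much larger, using the standard CEA-861 timing for 1080p60 (treat this as a worked example of the blanking overhead, not anything specific to one TV):

    #include <stdio.h>

    int main(void)
    {
        /* CEA-861 timing for 1920x1080@60: the blanking intervals inherited
         * from CRT-style signalling pad the raster out to 2200x1125 total. */
        long total  = 2200L * 1125 * 60;   /* 148,500,000 Hz - the familiar 148.5 MHz */
        long active = 1920L * 1080 * 60;   /* 124,416,000 Hz if only active pixels were sent */
        printf("total %ld Hz, active-only %ld Hz\n", total, active);
        return 0;
    }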

Oversampling is just one small dumb point; the whole backwards-compatibility thing really goes way too far. As far as I can tell, DVI cables were designed for analog signals but can also carry digital on separate data lines, even though nobody ever used the analog part. Since both analog and digital went through the cable together, I guess it made sense at the time that the digital part would just be the analog equivalent, i.e. nothing more than an ADC channel. They could then use the same clock frequency and the same synchronization pulse times (to let the gun move from the right edge of the screen back to the left). WTF!!! It's digital - why are we sending pulse signals to wait for a CRT gun? HDMI was just DVI, but with audio encoded in the dead time while the synchronization pulse was being sent.

Where I think video signals really went wrong is with DisplayPort. Since DVI/HDMI were purely analog signals encoded digitally, DisplayPort was the chance to drop the analog oddities like sync pulses. But no, they didn't drop them, since they figured that keeping them would make it easy for someone to build a DisplayPort to HDMI/DVI/VGA adaptor. So now, in 2012, we still have to carry forward all of this CRT-specific crud.

Re: EDIDs and TV resolutions

Date: 2012-01-03 09:46 pm (UTC)
From: [identity profile] https://www.google.com/accounts/o8/id?id=AItOawm9qVCUbxQoGyLJtq0cEvtCsspBzj0m3Ag
Sorry, I don't know your name, but you're completely wrong regarding analogue vs digital video systems. Synchronisation pulses *are* needed for digital displays; they just don't work otherwise. It's completely unrelated to CRTs.
DVI-D (and HDMI, which is essentially the same) was never an analogue interface. It encodes pixel data as TMDS, which is much better for transferring data at high speeds than plain TTL, and it also requires fewer wires because of multiplexing, which is in turn possible because of the higher transfer rates.
I think you'd better read some literature on this specific topic.


Any TV/monitor recommendations?

Date: 2012-01-03 09:34 pm (UTC)
From: [identity profile] https://www.google.com/accounts/o8/id?id=AItOawkbPfjNvIUlT3QjuE584SdVM9mq5jh-XuE
Can anyone recommend a good LED-backlit display that functions as a TV and a computer monitor (with DVI or Mini-DisplayPort)? I have a MacBook Air and I want a TV/Monitor that behaves like a good TV when I'm watching TV and a good monitor when I'm on my computer, including functional power saving.

Date: 2012-01-03 09:38 pm (UTC)
From: [identity profile] https://www.google.com/accounts/o8/id?id=AItOawm9qVCUbxQoGyLJtq0cEvtCsspBzj0m3Ag
Matthew, I'm sorry to disappoint you, but EDID actually can specify the proper pixel resolution. The DTD block has parameters named HACTIVE and VACTIVE which carry exactly this information. Moreover, that extension you mentioned (the DDDB, more precisely) has even more options for specifying the exact low-level structure of your display, including subpixel layout and response time.
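
For anyone following along, those fields are split across the 18-byte detailed timing descriptor: a low byte plus an upper nibble each. A minimal sketch of pulling them out, assuming the standard EDID 1.3 DTD byte layout:

    #include <stdint.h>

    /* Extract the active pixel counts from an 18-byte EDID detailed timing
     * descriptor. HACTIVE and VACTIVE are 12-bit values: a low byte plus
     * the high nibble of a shared byte. */
    static void dtd_active(const uint8_t dtd[18], int *hactive, int *vactive)
    {
        *hactive = dtd[2] | ((dtd[4] & 0xF0) << 4);   /* byte 2 + high nibble of byte 4 */
        *vactive = dtd[5] | ((dtd[7] & 0xF0) << 4);   /* byte 5 + high nibble of byte 7 */
    }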

Date: 2012-01-03 09:41 pm (UTC)
fluffymormegil: @ (Default)
From: [personal profile] fluffymormegil
Well, that's just dandy - but how many displays retailing at the going rate for the resolution etc. they provide actually do so?

None.


HD ready 1080p

Date: 2012-01-03 10:26 pm (UTC)
From: (Anonymous)
HD ready 1080p seems to be the magic keyword we are looking for!
Display 1080p and 1080i video without overscan (1:1 pixel mapping)
HD ready on Wikipedia (https://secure.wikimedia.org/wikipedia/en/wiki/HD_ready#References)

Re: HD ready 1080p

Date: 2012-01-04 11:10 am (UTC)
From: (Anonymous)
I thought HD Ready referred to the marketed ability to understand (at least) 1080i signals, even if the display could not actually display them at full resolution? Then we had Full HD for displays that really did have a 1920x1080 resolution. Now we have a new non-certification label?

Meh, I'm still going to buy a large monitor the next time I need a display to hook up to my cinema system. It's not like you need any TV functionality when you have an AV receiver, right?

Recommendations?

Date: 2012-01-03 11:05 pm (UTC)
From: (Anonymous)
Having experienced quite a bit of this, particularly the issues with overscan, and the issues you *didn't* mention with video lag due to internal processing, I personally plan on purchasing a real LCD monitor for my next TV, rather than something intended as a "TV". I don't actually care about having a built-in TV tuner, since I watch TV through a Linux box running MythTV. I do, however, want to have analog inputs (composite and component) for game consoles.

Any recommendations for decent LED/IPS monitors which have composite and component inputs? Looking for something large (>27"), rotatable (yay for portrait coding), and high resolution (1080p or better).

Why am I not surprised?

Date: 2012-01-03 11:22 pm (UTC)
From: (Anonymous)
Ha. I tried plugging in my computer (DVI) to my AV receiver (HDMI, with a simple adapter), and then to my HDTV (HDMI). I assumed that, being all-digital, it would all just work! Unfortunately, the picture was offset by half a screen width for some reason, and no amount of twiddling could make it work right.

I thought it was weird that an all-digital connection wouldn't be perfect. Now I'm impressed that anything showed up at all!

(It works great if I connect my computer directly into the TV, and all other HDMI sources I've tried can pass through my receiver just fine. It's just that particular combination that happens to fail.)

Date: 2012-01-04 12:38 am (UTC)
From: (Anonymous)
And the entire mess you describe is significantly better than all the aspect and chroma-phase clocking issues in legacy analog NTSC signals... progress.

Thank you note...

Date: 2012-01-04 02:23 am (UTC)
From: (Anonymous)
I just wanted to express how funny and informative I find your posts. I had no idea that the "enlarged" picture on my TV is not my fault but the manufacturer's. I feel relieved and can now happily pass the buck :)

Yeah, there's all sort of bizarreness

Date: 2012-01-04 03:15 am (UTC)
From: [identity profile] notting.id.fedoraproject.org
I have a TV/monitor that claims to support '1080p'.

According to its DVI EDID (and its docs) the max resolution is 1680x1050. However, if i hook up a device (Roku) via HDMI and set it to '1080p' mode, it claims via its info screen to be running in 1920x1080. Presumably that's just a description of the signal, not the resolution. It supposedly has a 1:1 pixel mode, in which case the display shows the 1920x1080 or 1280x720 signal (16:9) taking up the entire screen (1680x1050 == 16:10). Wha...?

Re: Yeah, there's all sort of bizarreness

Date: 2012-01-04 03:44 am (UTC)
From: (Anonymous)
There is an extended section for HDMI after the 128-byte EDID that occupies the next 128 bytes. 1080p is in there, as well as all the information about your TV's audio (i.e. surround sound/stereo/etc.).

Also, quite often every port of a TV has a different EDID. This is really dumb, but it's true. Try reading the EDID from the HDMI port. For a bit of background, this is a requirement for analog vs. digital, as there is a bit in the EDID that specifies whether the monitor is analog (VGA) or digital (DVI/HDMI). DVI and HDMI should really be the same, though.
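
Both of those claims are easy to check against a raw EDID dump: byte 126 of the base block says how many 128-byte extension blocks follow (the CEA/HDMI audio data lives in one of them), and bit 7 of byte 20 is the analog-vs-digital flag mentioned above. A small sketch, with no validation and the byte offsets taken from the EDID spec:

    #include <stdbool.h>
    #include <stdint.h>

    /* Number of 128-byte extension blocks appended to the base EDID. */
    static int edid_extension_count(const uint8_t edid[128])
    {
        return edid[126];
    }

    /* Bit 7 of the Video Input Definition byte: set means the EDID
     * describes a digital input (DVI/HDMI), clear means analog (VGA). */
    static bool edid_input_is_digital(const uint8_t edid[128])
    {
        return (edid[20] & 0x80) != 0;
    }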

1080i broadcast kneecapping

Date: 2012-01-04 03:59 am (UTC)
From: (Anonymous)
Also note that 1080i broadcasts are almost never in the full 1920x1080i resolution. ATSC & BBC 1080i broadcasts are 1440x1080i (stretched by receiver to 1920x1080), while satellite services crunch that even further to 1280x1080 for maximum compression efficiency. Cable co's processing standards vary. There are so many levels of up & downsampling that a 1:1 pixel mapping mode on a TV is nearly pointless save Blu-ray or computer sources.

You think TVs are bad...

Date: 2012-01-04 09:35 am (UTC)
From: (Anonymous)
Whilst TVs may be bad, mostly due to consumer price sensitivity I suspect, don't think that the situation is any better in a more controlled space such as cinema.

Whilst there may be somewhere in the region of 150,000 movie screens worldwide, when it comes to digital it is nearer half of that. You would have thought they could have resolved the overscan issues when adapting the intermediate formats used to digitize film, where the percentage reduction of the displayed image (projection safe area) versus capture (camera area(s)) is well standardized (SMPTE/ANSI/etc.).

When it came to digital projection standards, they used the pixel count for the camera area (typically 2048 wide for full aperture) as the projection area, meaning you need to up-res in much the same way that overscanning forces you to.

Ideally they would have used the same prescribed safety areas and at least allowed one of the origination formats to be mapped directly. Instead, to achieve this now you have to realign the digitization optics to sample the 'active' picture at 2048.

(The above discounts the use of oversampled sources (4096 samples), which is typical in high-end use.)

Date: 2012-01-04 10:06 am (UTC)
spodlife: Tardis and Tim (Default)
From: [personal profile] spodlife
I feel your pain - for the last 6 years I too have been looking at EDIDs and monitors, and despairing. Not every day, luckily, but particularly bad examples crop up every few months. One time IT bought a batch of HP monitors with DVI-I sockets. These presented an analogue EDID if they thought the source was analogue, a digital EDID if they thought the source was digital, and flip-flopped between the two when the source and cable could do both.

What should the source do if the EDID is corrupt, or lies, or both lies and is corrupt yet the checksum bytes are "correct"?

HDMI and DVI EDIDs tend to be different because of the audio descriptors in the HDMI EDID extension (of course nothing prevents audio packets being transmitted over DVI, because it is all the same signalling). DVI monitors also tend not to advertise TV resolutions and refresh rates (50Hz, anyone?), while, as noted, HDMI is very TV-focused.

Another rage inducing Samsung TV cropped and scaled the HDMI source in every mode (yes, even at 640x480 it didn't display every pixel), and behaved perfectly normally when using the VGA or DVI inputs.

To counter the blurries, have you noticed that TVs often apply sharpening to the image too? I first saw this when a TV edge-enhanced all the JPEG artefacts in my desktop background image.

Wrong decision in 1990, not 1930

Date: 2012-01-04 01:32 pm (UTC)
From: [identity profile] ben.bucksch.myopenid.com
The wrong decision was not made in 1930 - back then, there wasn't much choice.

It was made in 1990, when the transition to digital happened. Those guys made the stupid decision to carry the old limitations forward instead of getting rid of them, even though they're no longer needed.

The right approach would have been:

  • DVI and HDMI are defined to never have overscan. (Most likely, they drive digital displays which simply don't need it.)
  • Digital video material, e.g. DVD, mpg, avi, does not contain overscan. That is already the case, if you look at DVD rips or DVB (TV) recordings. We're good here.
  • If there is odd ancient video material which does contain overscan, the video player software or analog->digital converter software has to remove it. Configurable.
  • If you play digital content on an analog TV (mostly relevant in the 1990s, not anymore), the analog output (e.g. Composite or VGA output of the graphics card) adds the overscan, as necessary for your TV. As it happens, graphics cards already do that for VGA monitors, so the existing system would simply be continued.


But no, this was probably all done by Hollywood guys who are a bit blurry in their mind, and we have to suffer from the fallout.

Lesson learned: if you design a new system, do not allow old limitations to creep into the new system; go head-against-wall to remove them. Usually there's a border between the old and new systems where you can strip out these old oddities. Don't simply import the old stuff 1:1 and then deal with it somewhere else in the new system; handle it directly at the border.

Re: Wrong decision in 1990, not 1930

Date: 2012-01-04 03:56 pm (UTC)
ext_267968: bjh (Default)
From: [identity profile] bjh21.me.uk
Unfortunately, your fourth suggestion there is impossible – to be able to add back the overscan you need to know what to put in it, and if you've thrown that information away you can't regenerate it. Of course, you could just fill the overscan with black and hope no-one really cares that their display just shrank by 10%, but I can't help feeling that that approach is likely to be unpopular.


How to turn overscan off

Date: 2012-01-04 01:35 pm (UTC)
From: [identity profile] ben.bucksch.myopenid.com
The good news is that most digital TVs allow you to turn off overscan entirely, and I strongly recommend that you do.

The bad news is that it's typically very hard to find, wrongly named, not documented at all, and none of the hotline or sales clerks know about it and will happily and strongly claim that it's not possible, although it is.

On my Samsung, the option is called "Only scan" (Asian->English translation for "No overscan") and only exists on the remote control, not in the video settings menu.

What is the manual workaround?

Date: 2012-01-04 09:44 pm (UTC)
From: (Anonymous)
That is fascinating and depressing but at least I understand why the picture from my laptop is so poor when connected to my TV! Thanks for that.

A series of steps (xrandr commands?) that one should use to work around the broken EDID data would be very useful!

Awesome post, thanks.

Date: 2012-01-05 05:00 am (UTC)
From: (Anonymous)
I haven't spent tons of time with this stuff, but every time I have, I've been bewildered by why everything sucks so bad. I *thought* the whole point of DVI and HDMI was pixel-perfectness, and you get it with real monitors, but not with TVs. (And I also always wondered where 1366x768 came from, since it's not quite actually 16:9.) I've seen first-hand almost everything you're describing here. Being an anal-retentive neatnik this stuff *really* bugs me but I've somewhat accepted the fact that no matter what is set how, it really doesn't matter too much once you're across the room.

Also, I love 4:3 and hate 16:9 in the first place, but that's a whole other rant. (Short version: with a 4:3 screen, 4:3 content looks perfect, and anything else gets letterboxed to some extent. With a 16:9 screen, 16:9 content--the minority of what I watch--looks good (pixel accuracy notwithstanding) and everything else gets stretched, zoomed, distorted, cropped, letterboxed, or pillarboxed. Lovely. And on a computer, I want height, due to all the menu bars.)

Again, thanks for putting together a great post. Lots of good info in one spot--and a perfectly acceptable solution at the end. :-)

Re: Awesome post, thanks.

Date: 2013-05-25 02:35 am (UTC)
From: (Anonymous)
One thing that helps is to put your taskbar at the side of the screen. This works beautifully in Windows; some Linux taskbars don't support it very well, but most should by now. Not only do you gain back precious vertical space, but you have room for a ton of windows, icons, etc on the taskbar this way.
From: (Anonymous)
and I ended up here OH GOD this is the scariest part of the internet WHO ARE YOU PEOPLE and why in the world are you discussing this at length HOW do you have so much to say on this subject I swear to God you spend more time talking about TVs than any sane human would ever spend WATCHING TV let alone talking about it OH GOD OH GOD GET ME OUT OF HERE WHERE IS THE RIP CORD ON THIS THING

Overscan selection

Date: 2012-01-06 01:15 pm (UTC)
From: (Anonymous)
My TV (a Kogan KGN1080P32VAA - based on a CultraView CV119MA mainboard) has another new and exciting quirk. See rant here:

http://blog.ringerc.id.au/2010/11/kogan-kgn1080p32vaa-avoid-this-tv-and.html

Overscan compensation is used for 1920x1080 input on the HDMI port(s) ... if an HDMI audio signal is detected from the source. You cannot get a decent picture and HDMI audio at the same time. Either you get HDMI audio and a butchered picture, or you get to use the single 3.5mm minijack for audio input and get a decent picture.

This makes the 3 HDMI ports kind of useless, because to get a decent picture you have to have them all share the same audio input. Um, yay?

As if that isn't craptastic enough, the TV also sends the EDID for a 720p display on its HDMI ports. The vendor refuses to fix this and won't disclose specs or firmware info so I can fix it myself. Grr!

There is, of course, also no way to turn off the "sharpening" and other fun.

Why can't they JUST DISPLAY THE PIXELS THEY ARE SENT? Is it really that hard?

(I tried to log in with OpenID, but DreamWidth doesn't like my Google OpenID for some reason; you can get me at http://blog.ringerc.id.au/ anyway)

Date: 2012-01-10 12:24 am (UTC)
emceeaich: A close-up of a pair of cats-eye glasses (Default)
From: [personal profile] emceeaich
Your 1920x1080 TV takes a 1920x1080 signal, chops the edges off it and then stretches the rest to fit the screen because of decisions made in the 1930s.

That is a [community profile] metaquotes winner.

... and traintracks are standardized

Date: 2012-03-09 03:53 pm (UTC)
From: (Anonymous)
... because of Roman carriages. And Windows still has DOS prompts because some businesses require DOS applications. And on and on.

give it a few years and all vestiges of analogue will be removed and you'll require authentication keys to display video from one device on your perfectly calibrated digital monitor (at 4K resolutions).

ain't progress wonderful?

Re: ... and traintracks are standardized

Date: 2012-05-30 10:13 pm (UTC)
From: (Anonymous)
Windows still has DOS prompts because a console of some sort is very useful* for diagnostics and debugging, and they have no need to switch entirely to PowerShell* yet.

(* Thus going to PowerShell rather than just dumping the console, and why e.g. OS X has a shell available, even though the average user will never, ever, ever need to look at it.)

DPI 1080 Double LCD HDTV overscan with HDMI

Date: 2012-05-30 10:14 pm (UTC)
From: (Anonymous)
The fact that I don't understand anything on this page, in the article or the comments, is *exactly* why Apple will succeed overwhelmingly with whatever television offering they produce. Ninety-eight percent of us just want something that plugs in and works. Looking forward to it.

Date: 2012-05-31 08:00 am (UTC)
From: (Anonymous)
I have worked in audio visual for some years, and TVs' overscanning is such a pain in the ass. The options to just scan are either not there or incredibly convoluted to find. Take Samsung: bring up the input menu, hold Tools, and change the label to PC (a label is just a name, or is it?), as opposed to Game or Satellite or nothing. No consumer will find this easily.