Date: 2012-01-04 10:06 am (UTC)
spodlife: (Default)
From: [personal profile] spodlife
I feel your pain - for the last 6 years I too have been looking at EDIDs and monitors, and despairing. Not every day, luckily, but particularly bad examples crop up every few months. One time IT bought a batch of HP monitors with DVI-I sockets. These presented an analogue EDID if they thought the source was analogue, a digital EDID if they thought it was digital, and flip-flopped between the two when both the source and the cable could do either.
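For what it's worth, the analogue/digital distinction those monitors were flip-flopping on is a single bit in the EDID base block: bit 7 of byte 20 (the video input definition). A minimal sketch of checking it, assuming you already have the raw 128-byte base block in hand:

```python
def edid_input_type(edid: bytes) -> str:
    """Report whether an EDID base block claims a digital or analogue
    input, from bit 7 of byte 20 (the video input definition byte)."""
    if len(edid) < 128:
        raise ValueError("EDID base block must be at least 128 bytes")
    return "digital" if edid[20] & 0x80 else "analogue"
```

A source driving a DVI-I socket through a dual-mode cable can read this on every hotplug and still get a different answer each time, which is exactly the flip-flopping behaviour above.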

What should the source do if the EDID is corrupt, or lies, or both lies and is corrupt yet the checksum bytes are "correct"?
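Part of the problem is how weak that checksum is: each 128-byte EDID block merely has to sum to zero mod 256, so a block full of lies checksums just as happily as a truthful one. A sketch of the check, which is all the integrity guarantee a source gets:

```python
def edid_checksum_ok(block: bytes) -> bool:
    """A 128-byte EDID block is 'valid' if all its bytes sum to 0 mod 256
    (the final byte is chosen to make this so). Passing this check says
    nothing about whether the contents are sane or truthful."""
    return len(block) == 128 and sum(block) % 256 == 0
```

So "lies, but the checksum bytes are correct" is the expected state of affairs: any consistent set of wrong values passes.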

HDMI and DVI EDIDs tend to differ because of the audio descriptors in the HDMI EDID extension (of course nothing prevents audio packets being transmitted over DVI, because it is all the same signalling). DVI monitors also tend not to advertise TV resolutions and refresh rates (50Hz, anyone?), while, as noted, HDMI is very TV-focused.
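Those audio descriptors live in the CEA-861 extension block (extension tag 0x02). A rough sketch of spotting audio support there, assuming a raw 128-byte extension block: either the "basic audio" flag in byte 3, or an Audio Data Block (tag 1) in the data block collection that runs from byte 4 up to the DTD offset in byte 2:

```python
def cea_has_audio(ext: bytes) -> bool:
    """Check a 128-byte CEA-861 extension block for audio support:
    the basic-audio flag (byte 3, bit 6) or any Audio Data Block
    (tag code 1) in the data block collection."""
    if len(ext) != 128 or ext[0] != 0x02:
        return False                 # not a CEA-861 extension block
    if ext[3] & 0x40:                # basic audio supported flag
        return True
    dtd_offset = ext[2]              # where detailed timings begin
    i = 4
    while i < dtd_offset:
        tag = ext[i] >> 5            # bits 7..5: data block type
        length = ext[i] & 0x1F       # bits 4..0: payload length
        if tag == 1:                 # Audio Data Block
            return True
        i += 1 + length
    return False
```

A DVI monitor's EDID typically has no CEA extension at all, which is one quick way a source ends up treating the two differently even though the link signalling is identical.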

Another rage-inducing Samsung TV cropped and scaled the HDMI source in every mode (yes, even at 640x480 it didn't display every pixel), and behaved perfectly normally when using the VGA or DVI inputs.

To counter the blurries, have you noticed TVs often apply sharpening to the image too? I first saw this when a TV edge-enhanced all the JPEG artefacts in my desktop background image.