Tracy R Reed wrote:
Andrew Lentvorski wrote:
However, I second the comment about DVI cables. It is *remarkable*
how much better the screen looks. Looking at contrast and viewing
angle at Fry's is okay, but pay no attention to sharpness and
focus unless it's on a DVI cable.
I have always wondered about this. I thought it made no difference
and wondered why DVI was such a big deal. If they really are
different, why is DVI so easily converted to VGA with a dumb little
dongle?
It isn't converted. The *dongle* doesn't convert anything.
Here is the DVI pinout:
http://en.wikipedia.org/wiki/Image:DVI_pinout.svg
The analog pins are *optional*: a card doesn't have to drive them, and
a digital-only cable doesn't have to carry them. All the dongle does is
wire those analog pins straight through to a VGA connector, so it only
works at all if your card actually drives them.
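To make that concrete, a passive DVI-to-VGA dongle is essentially just
this wiring table. (The Python table below is my own illustration based
on that pinout, not anything from a spec.)

# A passive DVI-I -> VGA adapter is just straight wires between the
# DVI-I analog contacts and the VGA DE-15 connector: no conversion.
DVI_TO_VGA = {
    "C1 (analog red)":     "1 (red)",
    "C2 (analog green)":   "2 (green)",
    "C3 (analog blue)":    "3 (blue)",
    "C4 (analog hsync)":   "13 (hsync)",
    "8  (analog vsync)":   "14 (vsync)",
    "C5 (analog ground)":  "6/7/8/10 (RGB/sync returns)",
    "6  (DDC clock)":      "15 (DDC clock)",
    "7  (DDC data)":       "12 (DDC data)",
}

for dvi_pin, vga_pin in DVI_TO_VGA.items():
    print(f"DVI {dvi_pin:20} -> VGA {vga_pin}")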
Perhaps it is really just running a VGA signal over a DVI cable when
using the converter dongle, but uses better signalling when it goes
straight DVI?
The difference is that a VGA signal is analog and a DVI signal is purely
digital. So, while a VGA signal can pick up noise, crosstalk, etc. and
degrade, the DVI signal will remain completely perfect until it hits the
point where it can't function at all.
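If you want to see that "degrades gracefully vs. falls off a digital
cliff" behavior, here's a toy simulation (my own sketch; the noise
levels are arbitrary, nothing DVI-specific about them):

import random

random.seed(0)

def send_analog(level, noise):
    """Analog link: any noise the cable picks up stays in the signal."""
    return level + random.gauss(0, noise)

def send_digital(bit, noise, swing=1.0):
    """Digital link: the receiver re-thresholds, so noise below half
    the voltage swing is thrown away completely."""
    voltage = bit * swing + random.gauss(0, noise)
    return 1 if voltage > swing / 2 else 0

trials = 1000
for noise in (0.01, 0.05, 0.2, 0.6):
    analog_err = sum(abs(send_analog(0.5, noise) - 0.5)
                     for _ in range(trials)) / trials
    digital_err = sum(send_digital(1, noise) != 1
                      for _ in range(trials)) / trials
    print(f"noise={noise:.2f}  analog mean error={analog_err:.3f}  "
          f"digital bit error rate={digital_err:.3f}")

The analog error grows in proportion to the noise; the digital link
stays at zero errors until the noise approaches the threshold, then
falls apart.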
Now, with LCD monitors, the analog inputs have to go through an
Analog-to-Digital conversion, plus a PLL to lock the incoming scan
timing to the LCD's own pixel clock. After this, the 24-bit RGB value
goes through a final D-to-A to drive the screen pixel.
So, for a VGA cable on an LCD monitor you have:

video memory -> RAMDAC (D-to-A) -> VGA analog cable -> scan lock (A-to-D) -> screen pixel (D-to-A)

With a DVI signal you have:

video memory -> DVI digital cable -> screen pixel (D-to-A)
That takes a lot of analog stages out of the path, stages that can do
nothing but pick up junk.
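Here's a rough back-of-the-envelope simulation of those two paths (my
own sketch; 700 mV is the nominal VGA full-scale video level, the
noise figure is made up, and the panel's final D-to-A is left out
since both paths share it):

import random

random.seed(1)

FULL_SCALE_MV = 700.0  # nominal VGA video swing, 0-700 mV

def vga_path(pixels, noise_mv=5.0):
    """video memory -> RAMDAC (D-to-A) -> noisy cable -> monitor A-to-D."""
    out = []
    for p in pixels:
        mv = p / 255 * FULL_SCALE_MV             # RAMDAC: D-to-A
        mv += random.gauss(0, noise_mv)          # cable noise, crosstalk
        code = round(mv / FULL_SCALE_MV * 255)   # monitor's A-to-D
        out.append(min(255, max(0, code)))
    return out

def dvi_path(pixels):
    """video memory -> digital cable -> panel: bits arrive unchanged
    (until the link fails outright)."""
    return list(pixels)

pixels = [random.randrange(256) for _ in range(10000)]
vga_out = vga_path(pixels)
print(f"VGA path: {sum(a != b for a, b in zip(pixels, vga_out))} "
      f"of {len(pixels)} pixel values changed in transit")
print(f"DVI path: {sum(a != b for a, b in zip(pixels, dvi_path(pixels)))} changed")

Even a few millivolts of cable noise is more than one 8-bit step
(700/255, about 2.7 mV), so the VGA path quietly shifts a large
fraction of the pixel values while the DVI path delivers them intact.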
-a