Is a DVI connector worth it?

Yousuf Khan

A friend has got a new LCD monitor, and he wants to know if connecting
via a DVI connector would improve the quality over the VGA connector. I
would suspect it's imperceptible. He doesn't have a video card with a
DVI connector yet, but he wants to get one if there's any difference.

Yousuf Khan
 
DaveW

Using a DVI-out video card through a DVI cable to a DVI-in LCD monitor is
considerably superior to using analog VGA inputs and outputs. When using
DVI end to end, the video signal remains in the digital domain throughout.
However, when using VGA, the signal originates as a digital signal in the
video card, is converted to an analog signal at the video card output,
travels through the cable as an analog signal, and then at the LCD has to
be reconverted from an analog signal back to a digital signal that the
monitor can use internally. All of these signal conversions corrupt the
signal, leading to a degraded image on the LCD monitor.
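For a rough sense of how much error those extra conversions actually add, here is a toy model (assumptions only: an 8-bit DAC/ADC pair, a 0-700 mV video swing, and a couple of millivolts of cable noise; it is not a measurement of any real card or monitor):

import random

def vga_round_trip(pixel, noise_mv=2.0, full_scale_mv=700.0):
    """Toy model of the VGA path: 8-bit DAC -> analog cable -> 8-bit ADC.

    pixel:         original 8-bit value (0-255) from the frame buffer
    noise_mv:      assumed peak noise picked up in the cable (illustrative)
    full_scale_mv: VGA video swing is roughly 0-700 mV
    """
    step = full_scale_mv / 255.0                   # one 8-bit code ~ 2.75 mV
    analog = pixel * step                          # DAC on the video card
    analog += random.uniform(-noise_mv, noise_mv)  # noise added in the cable
    recovered = round(analog / step)               # ADC in the monitor
    return max(0, min(255, recovered))

random.seed(1)
errors = [abs(vga_round_trip(p) - p) for p in range(256) for _ in range(100)]
print("worst-case code error:", max(errors))
print("mean code error      :", sum(errors) / len(errors))

With those assumed numbers most codes come back exactly and the rest are off by a single level, which is why, as discussed further down the thread, amplitude error alone is rarely the visible problem.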
 
J. Clarke

Yousuf said:
A friend has got a new LCD monitor, and he wants to know if connecting
via a DVI connector would improve the quality over the VGA connector. I
would suspect it's imperceptible. He doesn't have a video card with a
DVI connector yet, but he wants to get one if there's any difference.

The only way to tell for sure is to try it. On some monitors there's a huge
difference; on others, maybe an expert who is looking for differences can
tell, but the average person can't. The newer monitors tend to do better
with analog signals than the older ones do.
 
Yousuf Khan

DaveW said:
Using a DVI-out video card through a DVI cable to a DVI-in LCD monitor is
considerably superior to using analog VGA inputs and outputs. When using
DVI end to end, the video signal remains in the digital domain throughout.
However, when using VGA, the signal originates as a digital signal in the
video card, is converted to an analog signal at the video card output,
travels through the cable as an analog signal, and then at the LCD has to
be reconverted from an analog signal back to a digital signal that the
monitor can use internally. All of these signal conversions corrupt the
signal, leading to a degraded image on the LCD monitor.

Yes, yes, understood, that's the theory, what's the reality? Is there
any noticeable difference?

Yousuf Khan
 
Mxsmanic

DaveW said:
Using a DVI-out video card through a DVI cable to a DVI-in LCD monitor is
considerably superior to using analog VGA inputs and outputs. When using
DVI end to end, the video signal remains in the digital domain throughout.
However, when using VGA, the signal originates as a digital signal in the
video card, is converted to an analog signal at the video card output,
travels through the cable as an analog signal, and then at the LCD has to
be reconverted from an analog signal back to a digital signal that the
monitor can use internally.

The final signal is analog, not digital, even in an LCD monitor. LCDs
are not digital devices. There aren't any digital input or output
devices, in fact.

I keep hearing that digital is vastly better, but I can't see anything
wrong with the analog signal on my LCD monitor. What _exactly_ is
different with digital? The pixels look perfect on my monitor; how
can they become "more perfect"?
 
Mxsmanic

Yousuf said:
Yes, yes, understood, that's the theory, what's the reality? Is there
any noticeable difference?

I'm currently trying to get the DVI connection on my configuration
working to test this out. However, I can say that with an Asus video
card and an Eizo monitor, examination of the screen under a magnifying
glass reveals no "bleeding" of the image from one pixel to the next, and
that is really the only thing that could be worse about analog. I
therefore wonder what DVI will bring me, which is why I've been hoping
to see for myself, now that I have both a card and a monitor with which
to try it.

Remember, no digital system can ever be superior to the best analog
system.
 
Bob Myers

Yousuf Khan said:
Yes, yes, understood, that's the theory, what's the reality? Is there
any noticeable difference?

Quite often, the answer is no.

There's actually very little accumulated error in the video signal from the
D/A and A/D conversions. (Dave forgot to mention one: at the end,
there's a conversion back to analog form, since fundamentally LCDs are
analog-drive devices. This happens at the column drivers in the panel.)
What amplitude error may be introduced in these conversions averages out
over successive frames, so there is little or no visible impact on image
quality. (Current digital interfaces may in some applications actually be
at a disadvantage here, as analog video systems are very often capable of
better than 8 bits/component of accuracy.) The real visible difference
between "analog" and "digital" inputs on monitors has to do with
pixel-level timing, and depends on how accurately the monitor can
produce a stable sampling clock with which to sample the incoming
analog video signal. (If the VGA interface carried pixel-level timing
information, visible differences between it and a "digital" connection
would vanish in almost all mainstream applications. Standards that would
provide such information have been developed, but to date they have not
been widely adopted.)

Bob M.
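That pixel-clock point can be seen in a toy simulation. The sketch below (the figures are assumptions for illustration, not a model of any particular monitor) samples an alternating one-pixel-on, one-pixel-off line, which is the worst case for an analog input because the monitor must regenerate the pixel clock and hit each pixel dead centre:

import math

PIXELS = 1024                       # pixels per line in this toy example

def sample_line(clock_error=0.0):
    """Sample an alternating black/white line (modelled as a cosine at the
    pixel rate) with a pixel clock that is off by `clock_error` pixels per
    pixel. Ideal samples are exactly +1 or -1."""
    return [math.cos(math.pi * n * (1.0 + clock_error)) for n in range(PIXELS)]

ideal = sample_line(0.0)
drift = sample_line(clock_error=0.0005)   # assumed 0.05% clock error

worst = max(abs(a - b) for a, b in zip(ideal, drift))
print("worst sample error with a 0.05% clock error:", round(worst, 3))

Even that tiny clock error lets the sample point drift by half a pixel over the line, so pixels near the right-hand edge are sampled on a transition and come out mid-gray. In practice this shows up as shimmer or soft vertical bands on fine dither patterns, and it is what the auto-adjust function on an analog LCD input tries to correct by re-tuning clock and phase.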
 
Bob Myers

Mxsmanic said:
Remember, no digital system can ever be superior to the best analog
system.

This is also a common misconception. You cannot make sweeping
claims of superiority for either digital or analog encoding per se; the
best one can ever hope to do is to compare specific implementations
of these against whatever criteria are important for a given application.
Ideas that "analog" systems somehow provide "infinite" accuracy or
the equivalent are basically nonsense.

Bob M.
 
Yousuf Khan

Bob said:
Quite often, the answer is no.

There's actually very little accumulated error in the video signal from the
D/A and A/D conversions. (Dave forgot to mention one: at the end,
there's a conversion back to analog form, since fundamentally LCDs are
analog-drive devices. This happens at the column drivers in the panel.)
What amplitude error may be introduced in these conversions averages out
over successive frames, so there is little or no visible impact on image
quality.

Yeah, that's basically what I've been hearing from asking around
recently. Several people that I've asked said they couldn't tell the
difference between the DVI interface and the VGA one.

(Current digital interfaces may in some applications actually be at a
disadvantage here, as analog video systems are very often capable of
better than 8 bits/component of accuracy.)

I see what you mean. Even if the digital cables were capable of more
than 8 bits/component, would the digital internals of the LCDs be able
to display anything greater than 8 bits/component? So far, my friend
has been unimpressed with the picture quality of his LCD compared to
his old CRT.

Yousuf Khan
 
Bob Myers

Yousuf Khan said:
I see what you mean. Even if the digital cables were capable of more
than 8 bits/component, would the digital internals of the LCDs be able
to display anything greater than 8 bits/component? So far, my friend
has been unimpressed with the picture quality of his LCD compared to
his old CRT.

So far, LCDs are typically 6 or 8 bits per primary. 10-bit and
even 12-bit drivers are now starting to come to the market, though
(this is primarily happening at the high end, i.e., the LCD TV panel
market), so eventually we will see better performance in this
regard.

Bob M.
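For a sense of scale (plain arithmetic, not the spec of any particular panel), this is what those driver widths mean in distinct levels per primary:

# Distinct drive levels per primary colour for common panel driver widths.
for bits in (6, 8, 10, 12):
    levels = 2 ** bits
    step_pct = 100.0 / (levels - 1)   # one step as a % of full scale
    print(f"{bits:2d}-bit driver: {levels:5d} levels, "
          f"smallest step ~ {step_pct:.3f}% of full range")

Many 6-bit panels approximate the missing intermediate levels with temporal dithering (frame rate control), which is one reason spec sheets rarely state the native driver depth outright.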
 
Mxsmanic

Bob said:
This is also a common misconception.

It's an unavoidable fact. You cannot build any digital system that
interfaces with the real world that is superior to the best analog
system, period. The reason for this is that all interfaces are
analog, therefore no system can ever be superior to the best analog
system.

You cannot make sweeping claims of superiority for either digital
or analog encoding per se ...

Encoding is not important. It's the physical interface with the real
world that is important. And it is always an analog interface.

Ideas that "analog" systems somehow provide "infinite" accuracy or
the equivalent are basically nonsense.

They provide it in theory, but not in practice. Conversely, some
digital systems provide nearly infinite accuracy in practice, but not
in theory.

The important thing to remember is that no physical interface can be
digital. Therefore no digital system that interfaces with the
physical world can ever be better than the best analog system.

An inevitable consequence of this is that it will always be possible
to build better analog audio or video systems than any digital system.
 
Bob Myers

Mxsmanic said:
It's an unavoidable fact. You cannot build any digital system that
interfaces with the real world that is superior to the best analog
system, period. The reason for this is that all interfaces are
analog, therefore no system can ever be superior to the best analog
system.

But the "best analog system" is more than just an interface. As
long as the digital system is capable of capturing all information
presented by this hypothetical analog interface, and then
conveying it in a lossless manner, it would be superior than an
analog system which would by necessity introduce additional
noise into the signal once you're past the "interface."
They provide it in theory, but not in practice. Conversely, some
digital systems provide nearly infinite accuracy in practice, but not
in theory.

Actually, analog systems cannot provide "infinite" accuracy
even in theory. ANY information transmission, regardless of
whether in "digital" or "analog" form, is limited in its information
capacity by the available channel bandwidth and noise level, per
Shannon's theorem. Infinite accuracy implies an infinite information
capacity (i.e., how many decimal places do you require for
"infinite" precision?), and this would require infinite bandwidth or
precisely zero noise, neither of which is even theoretically possible.
Your last sentence is nonsensical; it implies that there are digital
systems which provide, in practice, better accuracy than can be
explained by the theory underlying their operation!

The important thing to remember is that no physical interface can be
digital.

This is not correct. Transducers have been designed which essentially
convert certain types of real-world parameters directly into numeric
or "digital" form; it's just that they generally have not been very
practical to implement, or provided any real advantages over other
approaches.


Bob M.
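To put a number on that, here is the Shannon limit worked out for a hypothetical analog video channel; the bandwidth and signal-to-noise figures below are assumptions chosen for illustration, not measurements of any real cable:

import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Channel capacity C = B * log2(1 + S/N) in bits per second."""
    snr = 10 ** (snr_db / 10.0)
    return bandwidth_hz * math.log2(1 + snr)

# Assumed figures: ~400 MHz of analog video bandwidth, 50 dB SNR.
B, SNR_DB = 400e6, 50.0
c = shannon_capacity(B, SNR_DB)
print(f"total capacity        : {c / 1e9:.1f} Gbit/s (finite, not infinite)")

# At the Nyquist rate of 2B samples per second, that works out to:
print(f"information per sample: {c / (2 * B):.1f} bits")

With those assumed figures the channel tops out at roughly eight bits of amplitude information per sample, which is about where the 8-bit-per-component interfaces discussed earlier already sit; "infinite accuracy" would require an infinite signal-to-noise ratio or infinite bandwidth.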
 
Mxsmanic

Bob said:
But the "best analog system" is more than just an interface.

Yes, but since the interface is always analog, the best digital system
can never be better than the best analog system.

Digital systems are simply analog systems with a non-zero threshold
for information content in the signal-to-noise ratio. Analog systems
treat noise as signal; digital systems ignore noise below a certain
threshold. Digital sacrifices capacity in order to do this.

As long as the digital system is capable of capturing all information
presented by this hypothetical analog interface, and then
conveying it in a lossless manner, it would be superior to an
analog system, which would by necessity introduce additional
noise into the signal once you're past the "interface."

It depends. See above. Digital systems achieve lossless information
recording and transfer by sacrificing bandwidth. This works as long
as one remains in the realm of pure information. It doesn't and
cannot work for the final physical interfaces (which are always
analog) at either end.

Ultimately, then, you can build an analog system that will meet or
beat any digital system. The reason this doesn't actually happen is
that, up to a certain point, analog systems of this kind are much more
expensive than digital systems. By sacrificing capacity you can
greatly reduce cost and keep errors arbitrarily low (although you
cannot eliminate them).

Actually, analog systems cannot provide "infinite" accuracy
even in theory.

In theory, they can provide perfect accuracy; in fact, in theory, they
do this by definition.

Infinite accuracy implies an infinite information
capacity (i.e., how many decimal places do you require for
"infinite" precision?), and this would require infinite bandwidth or
precisely zero noise, neither of which is even theoretically possible.

There is nothing that theoretically forbids zero noise. At the quantum
level some interactions are lossless, in fact, but they are hard to
use in a practical way. Think of superconductivity, for example.

This is not correct. Transducers have been designed which essentially
convert certain types of real-world parameters directly into numeric
or "digital" form; it's just that they generally have not been very
practical to implement, or provided any real advantages over other
approaches.

No, they are analog devices as well. All such interfaces are analog.
 
Yousuf Khan

Bob said:
So far, LCDs are typically 6 or 8 bits per primary. 10-bit and
even 12-bit drivers are now starting to come to the market, though
(this is primarily happening at the high end, i.e., the LCD TV panel
market), so eventually we will see better performance in this
regard.

Oh really? They don't typically advertise this feature on LCDs, do they?
They talk about brightness & contrast ratios, update speeds, etc., but
not internal precision.

Yousuf Khan
 
chrisv

Mxsmanic said:
Ultimately, then, you can build an analog system that will meet or
beat any digital system. The reason this doesn't actually happen is
that, up to a certain point, analog systems of this kind are much more
expensive than digital systems. By sacrificing capacity you can
greatly reduce cost and keep errors arbitrarily low (although you
cannot eliminate them).

Not true.

In theory, they can provide perfect accuracy; in fact, in theory, they
do this by definition.

Wrong again.
 
Bob Myers

Mxsmanic said:
Yes, but since the interface is always analog, the best digital system
can never be better than the best analog system.

OK, well, we've had this discussion before. I've said what I have
to say both here and in previous posts on the subject, and those
(and other sources) are available to anyone who at this point still cares
enough to look further into this. These same sources are available to
you as well, so feel free to investigate this further. As you should.

Bob M.
 
Bob Myers

Yousuf Khan said:
Oh really? They don't typically advertise this feature on LCDs, do they?
They talk about brightness & contrast ratios, update speeds, etc., but
not internal precision.

Well, it's not exactly a secret. And the vast majority of display
interfaces and systems only provide 8 bits per component anyway,
so the point is in most cases moot.

Bob M.
 
rjn

Bob Myers wrote:
Yousuf Khan wrote:
Oh really? They don't typically advertise this feature on LCDs, do they?

Nope. They hope you don't notice that light grays are pink
(for example, your results may vary), and that any gamma
correction in your app, card driver or the monitor itself
cannot fully fix it without introducing other more
objectionable artifacts, such as visible terracing.
This is independent of whether the connection is analog
or digital.

Anyone doing carefully color-managed work needs to test
any contemplated LCD display. My guess is that few can
track a gray scale as accurately as, say, a Sony Artisan
(GDM-C520K, the only CRT computer display Sony still makes).
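The terracing mentioned above is easy to reproduce on paper: push a gray ramp through a gamma correction and then through a limited number of output levels, and some input codes get skipped while others collapse onto the same output level. The sketch below assumes a 2.2 gamma and a 6-bit output stage purely for illustration:

# Illustrative only: gamma-correct an 8-bit gray ramp, then quantize the
# result to a 6-bit output stage, and count how many levels survive.
GAMMA = 2.2      # assumed display gamma
OUT_BITS = 6     # assumed panel driver depth

def corrected(code):
    x = code / 255.0                       # normalise the 8-bit input
    y = x ** (1.0 / GAMMA)                 # gamma-correct the ramp
    return round(y * (2 ** OUT_BITS - 1))  # quantize to the 6-bit driver

levels = [corrected(c) for c in range(256)]
print("distinct output levels:", len(set(levels)), "of", 2 ** OUT_BITS)
print("inputs 240-255 map to :", levels[-16:])   # many inputs share a level

Smooth gradients and near-neutral grays show this first, which is one reason carefully color-managed work is where the difference matters most.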
 
Not Gimpy Anymore

Mxsmanic said:
The final signal is analog, not digital, even in an LCD monitor. LCDs
are not digital devices. There aren't any digital input or output
devices, in fact.

I keep hearing that digital is vastly better, but I can't see anything
wrong with the analog signal on my LCD monitor. What _exactly_ is
different with digital? The pixels look perfect on my monitor; how
can they become "more perfect"?

Indeed, the final signal to the LCD cell must have an analog value, in
order to adjust the proportional transmittance of the cell according to
the original value determined in the graphics bitmap. What is at issue
in this discussion is the integrity of passing that original value
(which, incidentally, is numeric) along until it becomes the analog
value needed at the cell.
Historically there have been issues with the integrity of passing the
values along the signal chain in analog fashion, so a lot of thought was
put into a way to maintain the numeric value's integrity until the final
conversion is made. That resulted in "digital" interfaces, which did in
fact demonstrate the ability to maintain a higher level of integrity in
preserving the desired value.
However, the integrity of the analog method has also improved in recent
years, so these days it is practically impossible to detect a visual
difference between the transmission methods for many displays,
especially when operating in the native pixel format.

Hence, ya gotta see it to decide! In most cases today it is not easy to
justify exchanging cards *only* to gain a "digital channel". Of course,
all the companies making new hardware really WANT you to make the
change... OTOH, if changing cards provides desirable new features, you
have your justification and satisfaction.

My $0.02
NGA
 
