Is DVI connector worth it?


J. Clarke

Captin said:
I’m asking everyone here. Does the performance of DVI vary a great
deal from video card to video card?
I mean is it possible we have a situation where DVI offers a step
forward with some video cards and not so with others?

It's not a simple question.

First, the result depends on the monitor. In general, with LCD displays,
using the same board, same monitor, same cable, same everything, DVI will
yield an image anywhere from imperceptibly to greatly superior to the one
analog yields. With CRT displays, DVI generally yields inferior results,
simply because the DAC in a DVI-equipped CRT is usually not of very high
quality.

Then it depends on the resolution: the resolution limits for DVI are lower
than for analog. If the monitor and video board can support a higher
resolution over analog than DVI allows, then analog can generally give a
better image by using that higher resolution.

Then it depends on the configuration. If your monitor does not scale the
DVI input, you will always have a 1:1 correspondence between physical and
logical pixels, and you won't get sharper than that. If it _does_ scale,
however, and the signal sent by the video board differs from the monitor's
native resolution, the image will be degraded to some extent. Whether the
result is better or worse than analog at the same resolution then depends
on the details of the monitor's scaling implementation.
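To see why scaling a non-native resolution degrades the image, here is a toy sketch in Python of the crudest possible scaler (nearest-neighbor). It is an illustration only, not any real monitor's scaling algorithm, which would typically interpolate rather than duplicate pixels:

```python
def nearest_neighbor_scale(row, out_width):
    """Map each output pixel back to the nearest source pixel --
    a crude stand-in for a monitor's built-in scaler."""
    in_width = len(row)
    return [row[i * in_width // out_width] for i in range(out_width)]

# A 1-pixel alternating grid, stretched to a non-native width:
src = [1, 0] * 4                           # 8 source pixels, evenly spaced
scaled = nearest_neighbor_scale(src, 11)   # stretch to 11 output pixels
# Some source pixels now occupy one output pixel and others two, so the
# evenly spaced pattern comes out uneven -- on screen this shows up as
# blur or shimmer on fine detail.
```

At the native resolution (output width equals input width) the mapping is exactly 1:1 and nothing is lost, which is why running an LCD at native resolution matters so much.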
 

Not Gimpy Anymore

Captin said:
I'm following threads on LCD monitors because I'm in the market. It
seems some opinion is that DVI is not a step forward.
The way I see it, even if DVI is no advantage, the better monitors will
have the option regardless?
Also, what I'm wondering about is how much the video card contributes
towards DVI performance. Are some cards simply holding back the
benefits of the DVI interface?

--
Posted via http://www.hardwareforumz.com
Topic URL:
http://www.hardwareforumz.com/General-DVI-connector-worth-ftopict58538.html

IMHO, cards without DVI are made that way mostly for reasons of economy or
space.

IMEO (In My Expert Opinion), most newer LCD monitors do well enough on VGA
that the extra expense of the DVI link(s) is not justified. That said,
YMMV, depending on how fussy you are about artifacts. Think of DVI as just
a different way of delivering the detailed pixel information to the
display. How the display uses that information is the key, and those
details are what have improved significantly over the last several years.

It is worth taking the time to test things personally; unfortunately,
getting access to the necessary components to DIY is a barrier for most of
us "casual" users.

Regards,
NGA
 

Captin

True for all digital systems. Analog systems always have some sort of
error, and this error increases gradually and gracefully as noise
increases. Digital systems draw an artificial line below which all is
noise and above which all is signal. As long as noise actually remains
below this line in the channel, digital transmission is error-free. But if
it rises above the line, there is a _sudden_ (and often catastrophic)
appearance of serious, uncorrectable errors in the channel.

The whole idea of digital is to draw the line at the right place, so that
you always have error-free transmission. You sacrifice the bit of channel
capacity below the line in order to get error-free transmission at a
slightly slower rate than analog might provide.
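That "cliff" behavior is easy to demonstrate with a toy simulation. The sketch below (an illustration, not a model of the actual TMDS signaling DVI uses) sends random ±1 symbols through additive Gaussian noise and decodes by sign, which is exactly "drawing the line" at zero:

```python
import random

def bit_error_rate(noise_sigma, trials=10_000, seed=42):
    """Send random +/-1 symbols through additive Gaussian noise and
    decode by sign; return the fraction decoded incorrectly."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        bit = rng.choice((-1, 1))
        received = bit + rng.gauss(0, noise_sigma)
        decoded = 1 if received >= 0 else -1
        if decoded != bit:
            errors += 1
    return errors / trials

# Well below the line: essentially zero errors.
# Noise comparable to the signal: the error rate shoots up abruptly.
for sigma in (0.1, 0.3, 0.5, 1.0):
    print(sigma, bit_error_rate(sigma))
```

Running this shows the error rate staying at (or indistinguishable from) zero for small noise, then climbing steeply once the noise amplitude approaches the signal amplitude, rather than degrading gradually the way an analog picture does.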

Who is setting the pace with 19" LCD monitors currently?
 
