DVI - any real difference?

Bobby

I am about to purchase a TFT monitor to replace my 19" CRT.

I have an ATI 9600 Pro graphics card in my PC which has 15 pin D-sub and DVI
outputs.

How important is it to buy a display with DVI input? Does it make any real
difference?

Cheers.

Bobby
 
Jerry G.

On a 19 inch screen, you will see a slight difference in sharpness. I
have two 17 inch DVI monitors here in the office. The older machine's ATI
display card is too old to have DVI; the newer one has it. I can see the
difference on the newer one if I look very closely at the details. Since the
bandwidth of the display data is very wide, there is slightly less loss with
DVI than when the signal is run through a VGA cable.
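
To put a rough number on it: at 1280x1024 and a 60 Hz refresh the pixel
clock is about 108 MHz, so over the analog cable the monitor has to sample
well over a hundred million pixel values every second and get each one
right. DVI avoids that entirely by sending each pixel as a digital value.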

Since you paid a little extra for your display card to have DVI, you should
use it. When you connect the new LCD monitor and the computer has started up,
simply press the Auto Setup button on the monitor. This will set up the
monitor's data timing to match the display card. You will be extremely
impressed with the results.

--

Jerry G.
======


 
Hawkeye

From a CNET review


DVI support is found primarily on LCDs. However, the advantage of
digital signals for LCDs is of somewhat less importance now than it
was a few years ago. Analog signal processing has improved to the
point where major differences in image quality can be difficult to
detect. Unless you're a pro photographer, a prepress professional, or
someone else who needs superprecise, top-notch image quality, you
should be fine using a CRT or an LCD on an analog signal.
 
Connected

Samsung say the difference is minor, but that you should notice slightly
sharper text. So overall it gives a slightly sharper image, but it is a
minor difference.
 
DevilsPGD


If you're planning on buying an LCD (flat panel) monitor, definitely go
DVI. If you're sticking with a CRT it doesn't make a huge difference.

Either way it's not the end of the world: you can get a pretty decent
picture on any modern LCD from an analog signal, but it's noticeably
better using DVI-D on my Dell 2005FPW (20.1" widescreen LCD).
 
J. Clarke

Jerry said:
On a 19 inch screen, you will

^H^H^H^H may
see a slight difference in sharpness. I have two 17 inch DVI monitors here
in the office. The older machine's ATI display card is too old to have DVI;
the newer one has it. I can see the difference on the newer one if I look
very closely at the details. Since the bandwidth of the display data is very
wide, there is slightly less loss with DVI than when the signal is run
through a VGA cable.

Not a valid comparison--try it on the analog and digital outputs of the
_same_ board and see if you can see the difference.
 
Barry Watzman

Yes, it does.

There are a number of issues, but from what I've seen, #1 is that most
analog cables -- even those that come with the monitor -- are not
impedance matched and introduce ringing ("ghosts") around sharp
transitions. It's most noticeable on small text; it's not noticeable at
all (unless it's really bad) on TV, movies, or games. It's very subtle,
and many people won't notice it, but I'm in the display industry and I see
it quite often.

The #2 problem is time base accuracy and stability -- the analog monitor
doesn't sample the pixel at exactly the moment that the video card
"sends" it. [This used to be the #1 problem, but between monitors
getting better and increased display resolution (more 1280x1024 vs.
1024x768), I'd say it's now #2]. You can diagnose this very quickly
(and often adjust to eliminate it) by putting up a test pattern of
alternating black and white vertical bars, each a single pixel wide; you
should see a perfect reproduction, with zero moire present. The key is
perfect adjustment of the dot clock frequency and phase. Note, however,
that on the majority of monitors, the "Auto" function doesn't produce an
exactly correct adjustment. Close, in many cases, but not exact.
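
If you want to put up that pattern yourself, a quick sketch like the one
below will do it (this uses Python with the Pillow imaging library; the
resolution and file name are just placeholders, so set WIDTH and HEIGHT to
your panel's native mode):

# Quick sketch: a full-screen test pattern of alternating black and white
# vertical bars, each exactly one pixel wide, for checking dot clock and phase.
# Assumes the Pillow library is installed; resolution and file name are
# placeholders.
from PIL import Image

WIDTH, HEIGHT = 1280, 1024               # set to your display's native resolution

img = Image.new("RGB", (WIDTH, HEIGHT))  # starts out all black
pixels = img.load()
for x in range(0, WIDTH, 2):             # paint every other column white
    for y in range(HEIGHT):
        pixels[x, y] = (255, 255, 255)

img.save("dotclock_test.png")            # view full screen, unscaled

Displayed unscaled over the analog input, any shimmering or moire in that
pattern means the dot clock or phase is off; over DVI it should come through
as perfectly uniform one-pixel lines.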

DVI simply eliminates the issues that cause both forms of distortion.
With an analog monitor, maybe it's ok, maybe it's not (and it's usually
not without test pattern adjustment). With DVI, assuming that the
display works, there is no distortion from either cable issues or dot
clock matching to the video card. In those regards, it's perfect.
 
DaveW

The DVI output of the video card gives a pure digital signal that is much
sharper and cleaner than the analog signal from the VGA output.
Yes, get an LCD monitor with a DVI connection.
 
singha_lvr

How important is it to buy a display with DVI input? Does it make any real
difference?

I see a noticeable difference on my NEC MultiSync LCD 1860NX using a
Radeon 9600XT card.

The monitor has both analog and DVI inputs. It took me a while to find a
DVI cable, but when I switched, the picture quality was remarkably
improved.
 
Jeff McNulty

Bobby said:
How important is it to buy a display with DVI input? Does it make any real
difference?
How about on a CRT? I have a NEC 1350X 22" monitor. It has both analog and
digital inputs. The video card is an ATI 9800 Pro.
 
Barry Watzman

When interfacing to a CRT, you don't have the "dot clock" issue, so to
that extent there is no benefit to DVI. However, you still have the
issue with analog signal integrity on the cable (noise, ringing,
impedance mismatching, ghosting). If this isn't an issue, then there
is unlikely to be any difference, but if it is an issue, then DVI will
still be superior.
 
J. Clarke

Barry said:
If this isn't an issue, then there is unlikely to be any difference, but
if it is an issue, then DVI will still be superior.

If, and ONLY if the D/A converter in the CRT provides signal quality greater
than that of the video board+cable. That has not usually been the
case--the D/A converters used in CRTs were for the most part pretty dismal.
 
Barry Watzman

Re: "If, and ONLY if the D/A converter in the CRT provides signal
quality greater than that of the video board+cable."

The D/A converter will never be worse than the video board+cable. For
all practical purposes, the D/A converter can be considered to be "perfect".
 
