D-Sub DVI vs old 15-pin VGA connection

AnotherAnonymous

Will there be a huge performance increase if I use the DVI connection
from my video card to the monitor instead of the legacy 15-pin VGA
connector? I just got a monitor that supports both types of connections,
but I don't have a DVI cable yet. Thanks for any info.
 
Barry Watzman

Your question suggests that you don't really understand the situation.

Your computer (or a video card that you are thinking of getting)
probably has a DVI-I connector. This single connector includes BOTH a
digital interface and an analog interface using different pins of the
same connector. Thus simply switching from a 15-pin analog VGA
connector to the DVI CONNECTOR doesn't specify which interface (analog
or digital) you will be using.

However, if your monitor accepts a digital signal (and the presence of
both a DVI and an analog 15-pin connector on the monitor is a clear
indication that it does), then yes, there will be a quality improvement
in using the digital interface (a cable with DVI connectors on both
ends, and you MIGHT still have to select analog or digital input in the
monitor's menu). How great the improvement will be is impossible to
say. On some monitors it is possible to adjust an analog signal so well
that there is no visible difference; in other cases, the difference is
tremendous. Quite a lot of this has to do with the quality of the
analog video cable, which is highly variable.
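
Incidentally, if you want to check what the monitor itself claims to
accept: every DDC-capable monitor supplies an EDID block, and byte 20
of the base block (the Video Input Definition byte) has its top bit set
when the input is digital. Here is a minimal sketch, assuming a Linux
box where the kernel exposes the raw EDID under /sys/class/drm; the
connector name in the path is a made-up example.

    # Minimal sketch: report whether a monitor's EDID advertises a
    # digital or an analog input. Assumes Linux, which exposes the raw
    # EDID under /sys/class/drm; the connector name is hypothetical.
    from pathlib import Path

    EDID_PATH = Path("/sys/class/drm/card0-DVI-I-1/edid")  # example path

    def input_type(edid: bytes) -> str:
        # EDID base block, byte 20 (Video Input Definition):
        # bit 7 set = digital input, clear = analog input.
        if len(edid) < 21:
            raise ValueError("EDID block too short")
        return "digital" if edid[20] & 0x80 else "analog"

    if __name__ == "__main__":
        print("Monitor reports a %s input" % input_type(EDID_PATH.read_bytes()))

An empty edid file simply means nothing is attached to that output.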
 
Andrew Rossmann

One caveat: on some monitors, some picture functions are disabled when
using DVI, and if the monitor is not properly configured at the
factory, you may end up with a worse picture than over the analog
input.
 
Barry Watzman

Since DVI-D (or -I if using the digital interface) is digital, many
analog functions simply don't make sense. I have NEVER seen a situation
in which a digital display was inferior.
 
R. C. White

Hi, Barry.

Without getting into the main topic of this thread, could I ask a simple
question on what seems to me a very minor point: Why does my monitor insist
on making such a big deal about D-Sub?

It's a year-old cheapie 17" LCD from Wal*Mart (from Balance Digital
Technology in China). The box makes a point of saying that it has D-Sub,
and every time the computer boots (or black-screens), the monitor displays a
small plain white-on-black logo, just the text "D-SUB" in a rectangle with
rounded corners.

My understanding is that D-Sub refers only to the shape of the connector on
either end of the monitor cable: a D-shaped shell, a little over a
half-inch wide, holding 15 pins. This hardly seems worthy of such
emphasis. Is there something more significant than that?

RC
--
R. C. White, CPA
San Marcos, TX
(e-mail address removed)
Microsoft Windows MVP
(currently running Windows Mail 7 in Vista x64 Build 5472)
 
Benjamin Gawert

I'm not Barry, but I'll try to answer your question...

Nope. Most better CRTs have more than one signal input. Very old
monitors (until about 1993/1994) also had a 9-pin input that could take
digital signals (EGA, for example). Newer monitors often also have BNC
inputs. In those cases the OSD tells the user which signal input the
monitor is using ("D-SUB" for the VGA input and "BNC" or something like
that for the BNC inputs). Some monitors have only a single VGA input,
but the same chassis and firmware are used for models with additional
BNC inputs; the OSD then shows "D-SUB" every time the monitor detects a
signal change, even though there is no secondary input.

Benjamin
 
Barry Watzman

It would be better if they said "analog" instead of "D-SUB", but they
mean the analog 15-pin VGA interface. Normally you only see an
on-screen indication like that when the monitor has multiple interfaces
(e.g. analog and digital). You will sometimes see the analog interface
referred to as "Analog", "VGA" or "D-SUB" (which I think is the worst
of the three choices). But it's no worse than the English in some of
the manuals for Chinese products.
 
R. C. White

Thanks, Barry - and Benjamin.

That pretty much confirms what I thought.

This LCD has only the single input, which is D-Sub, so I don't see the
need to emphasize it and remind me, as though it were something
special. But I guess it doesn't hurt anything. Just makes some curious
folks wonder. ;^}

RC
--
R. C. White, CPA
San Marcos, TX
(e-mail address removed)
Microsoft Windows MVP
(currently running Windows Mail 7 in Vista x64 Build 5472)
 
Barry Watzman

It's possible that the controller and firmware, and perhaps even the PC
board, support DVI input as well, but the model you have simply didn't
fully implement it (in an extreme case, everything for DVI support is
present except the connector itself, for a combination of cost and
marketing reasons).
 
