DVI connection possible?

hantekisalp

I have a Gateway CX210X with an ATI Radeon Mobility X1400 video card.
Neither the notebook, nor the optional port replicator, has DVI out,
only VGA. My Dell 2405 monitor has DVI inputs. Is there any way at all
to use a DVI (or even S-Video) connection between the two?

Thanks.
 
peter

In my hand I hold a VGA to DVI converter which came with my Nvidia video
card.
I am sure a Google search would turn one up somewhere for sale.
peter
 
Barry Watzman

That will not work for what he wants to do. What you have allows
connection of a VGA monitor to a DVI-I port. A DVI-I port has both
analog and digital signals present at the DVI connector; the adapter
you have simply connects the VGA monitor to the analog pins of the
DVI port.

BUT it won't work backwards. That device cannot convert digital signals
to analog signals or vice versa. His computer has an analog port, which
does not have digital signals. The only way it would work (and this is
very doubtful, although not totally impossible) would be if the monitor
had a DVI-I input that accepted analog signals through the DVI port, and
this is rare (normally, DVI-I is an output port only, not an input
port). And even if it did work, it would not be a digital DVI
connection, it would be an analog connection.

The answer to his question is that with the one exception in the
previous paragraph (which is not a digital connection), what he wants to
do is generally not possible.
 
Michael W. Ryder

hantekisalp said:
I have a Gateway CX210X with an ATI Radeon Mobility X1400 video card.
Neither the notebook, nor the optional port replicator, has DVI out,
only VGA. My Dell 2405 monitor has DVI inputs. Is there any way at all
to use a DVI (or even S-Video) connection between the two?

Thanks.

According to Dell's site, the monitor has a VGA connector along with the
DVI and S-Video connectors. You won't be able to use the DVI connector
with your video card, but that shouldn't make too much difference.
 
Fidelis K

Michael W. Ryder said:
According to Dell's site the monitor has a VGA connector along with the
DVI and S-Video connectors. You won't be able to use the DVI connector
with your video card but that shouldn't make too much difference.

I used to own a Dell 2405. Its VGA input was terrible (check out the
reviews and the Dell forum). Granted, the native resolution is high
(1920x1200), but Dell did a poor job implementing VGA. In contrast, my
Sony P234/B (1920x1200) has an excellent VGA input; the quality of video
signals from that input almost matches that of its DVI. So forget the
2405 if you can't use its DVI input.
 
Geoff

peter said:
In my hand I hold a VGA to DVI converter which came with my Nvidia video
card.
I am sure a Google search would turn one up somewhere for sale.
peter

They are for plugging a VGA monitor into a DVI connection (since most
DVI connectors carry both analogue and digital signals).
 
Barry Watzman

The DVI connection that carries both analog and digital signals is
called DVI-I, and on almost all video cards with a DVI connection, it is
DVI-I. But DVI-D connections (digital only) do exist (most HDTV set-top
boxes with DVI outputs are of this type), and there is even a DVI-A
connection defined, analog ONLY, but I've never actually seen it used
(it would be a VGA output on a DVI connector).
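The distinction above can be captured in a few lines. This is only an illustrative sketch (the connector names are real; the table and function are mine): a passive adapter just re-routes pins, so it only helps when both ends already carry a common signal type.

```python
# Which signal sets each connector carries (per the post above).
SIGNALS = {
    "VGA":   {"analog"},
    "DVI-D": {"digital"},            # digital only, e.g. HDTV set-top boxes
    "DVI-A": {"analog"},             # analog only; rarely seen in practice
    "DVI-I": {"digital", "analog"},  # "integrated": both sets on one connector
}

def passive_adapter_works(source: str, sink: str) -> bool:
    # A passive adapter only re-routes pins; it cannot convert analog
    # to digital or vice versa, so source and sink must share at least
    # one signal type.
    return bool(SIGNALS[source] & SIGNALS[sink])

# The thread's scenario: a VGA-only laptop into the monitor's
# (typically DVI-D) digital input.
print(passive_adapter_works("VGA", "DVI-D"))  # -> False
# The case Peter's adapter handles: a VGA monitor on a card's DVI-I port.
print(passive_adapter_works("DVI-I", "VGA"))  # -> True
```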
 
James Colbert

Fidelis K said:
I used to own a Dell 2405. Its VGA input was extremely terrible (check out
reviews and Dell forum). Granted, the native resolution is high
(1920x1200), but Dell did a poor job implementing VGA. In contrast, my
Sony P234/B (1920x1200) has an excellent VGA input. The quality of video
signals from that input almost matches that of its DVI. So, forget using
the 2405 if you can't utilize the DVI.

That's not the case here. I have a 2405 and am using both the DVI and
the VGA connectors. My setup is two computers, one using VGA, the other
DVI, with a KVM to switch the mouse and keyboard.

The VGA quality, while not quite as good as the DVI, is quite good. No
issues at all. 1920x1200.

James
 
hantekisalp

Poor VGA quality is precisely the problem -- more specifically, lack of
sharpness. I do a lot of photo retouching and can see substantially
more detail and sharpness when looking at the same photo on other
monitors -- even on notebook LCDs.

I was hoping to connect via DVI to gain some sharpness, but it seems I
am SOL on this one.

Thanks to all for the replies. I'll have a gander at the Sony.
 
Barry Watzman

Most probably the monitor's dot clock is not properly adjusted, or you
have bad cables.

To properly adjust the dot clock frequency and phase of an analog LCD
monitor, download this test program:

www.winsite.com/bin/Info?500000030936

or (same site)

http://ns.winsite.net/bin/Info?500000030936

This program is variously known as CRTAT, CRTAT2, and CRT Align
(crtalign), and was written by Stephen Jenkins in about 1992 or 1993.
It is a very old Windows 3.1 program written in Visual Basic, but it
runs under XP just fine -- absolutely perfectly, in fact, even with
today's high-resolution monitors. You do need VBRUN300.DLL (the Visual
Basic version 3 runtime library), which may or may not come with the
program depending on where you download it; if you don't have
VBRUN300.DLL, it can easily be found on the web.

This program is totally non-invasive; its "installation" makes NO
changes to your registry or to ANY system components or files. In fact,
if you just unzip the program and double-click the exe file, it will run
fine without actual "installation" (but the program and the help file
need to be in the same directory, and VBRUN300.DLL needs to be available
in \Windows\System).

To use the program for this purpose, after installation, select the
leftmost of the 3 functions in the "Test" group (or "resolution" in the
drop-down menu) and then check both "mode" check-boxes.

When you display this pattern, you should see an absolutely perfect and
uniform field of alternating (but very, very fine) black and white
vertical bars each only one single pixel wide. If you see "moire"
distortion, or smearing, your display isn't adjusted correctly. Digital
monitors (with DVI interfaces) will always be "perfect". Analog
monitors will usually show an initial moire distortion pattern until
they are adjusted (dot clock frequency and phase). In most cases,
perfect adjustment can be achieved (and is "remembered" by the display),
but in some cases you can't achieve this. Note that the "auto"
(auto-adjust) function on almost all analog LCD monitors gets "close"
but usually does not get to the best possible adjustment.

[On many monitors, the dot clock frequency is called Horizontal Size or
Width; phase is usually just called Phase.]

If you have an analog monitor and you don't use this program to adjust
your monitor, you are doing yourself a real disservice.
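If you can't track down the program, the pattern it uses for this test is easy to recreate. Here is a minimal sketch (not the original CRTAT, just the same idea): one-pixel-wide alternating black and white vertical bars, written as a binary PPM file that most image viewers can open full-screen.

```python
# Generate the dot-clock test pattern described above: alternating
# single-pixel black and white vertical bars. Plain binary PPM output,
# no third-party libraries needed.
def write_bars(path: str, width: int = 1920, height: int = 1200) -> None:
    row = bytearray()
    for x in range(width):
        v = 255 if x % 2 == 0 else 0   # even columns white, odd columns black
        row += bytes((v, v, v))        # one R, G, B triple per pixel
    with open(path, "wb") as f:
        f.write(b"P6 %d %d 255\n" % (width, height))  # PPM header
        f.write(bytes(row) * height)   # every scanline is identical

# Use your panel's native resolution here (1920x1200 for the Dell 2405).
write_bars("dotclock_bars.ppm")
```

Display the file at 100% zoom, full screen, and adjust clock and phase until the moire disappears, exactly as described for the original program.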

Two other comments:

First, you MUST run the video card only at the native pixel resolution
of the LCD panel. NO EXCEPTIONS OF ANY KIND ON THIS POINT, PERIOD.

Second, poor-quality video cables are a huge issue with analog LCD
monitors. The cable issue is self-explanatory, but MOST of the analog
cables offered for sale are poor quality. You can almost judge the
quality by the thickness of the cable: you want something significantly
larger than a number 2 pencil, maybe even approaching the size of a
garden hose (there are 5 individual coax cables inside a good analog
video cable, and the larger their individual diameters, the lower their
loss and capacitance). Unfortunately, really good video cables are both
hard to find and expensive.
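Some rough arithmetic shows why cable quality matters so much at this resolution. The timing figures below are the VESA CVT reduced-blanking totals for 1920x1200 at roughly 60 Hz (an assumption about the mode actually in use); the point is the order of magnitude, not the exact number.

```python
# Pixel clock = total pixels per line x total lines x refresh rate.
# CVT reduced-blanking totals for a 1920x1200 active area (assumed mode).
h_total, v_total, refresh_hz = 2080, 1235, 60
pixel_clock_hz = h_total * v_total * refresh_hz
print(pixel_clock_hz / 1e6)      # roughly 154 MHz

# A pattern of alternating one-pixel bars toggles the signal once per
# pixel, i.e. a fundamental at half the pixel clock -- and the analog
# cable has to carry that cleanly from card to monitor.
print(pixel_clock_hz / 2 / 1e6)  # roughly 77 MHz
```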
 
Fidelis K

What you wrote is precisely true, but it won't make a difference if the
monitor itself is not of high quality. Dell LCDs are fine for general
business and gaming purposes, but they don't offer the image quality
that, say, medical professionals look for.
 
