Newbie question

zcarenow

I have a video card that was given to me and installed in a PC; the card has a
DVI and a VGA connection. My flat screen came with a VGA-to-VGA cable and a
DVI-to-DVI cable. The DVI-to-DVI cable doesn't work; it gives a "no signal"
message. So I used the VGA-to-VGA cable and it worked. However, that means I
lose the advantage of the digital signal, which is sharper, right? Someone told
me to buy a DVI (from the video card) to VGA (on the flat screen) cable. Don't
you lose the digital signal advantage by doing this, since the signal becomes
analog at the flat-screen end? What I actually want to know is why the
DVI-to-DVI cable isn't working. Thanks.
 
Augustus

zcarenow said:
I have a video card that was given to me and installed in a PC; the card has a
DVI and a VGA connection. My flat screen came with a VGA-to-VGA cable and a
DVI-to-DVI cable. The DVI-to-DVI cable doesn't work; it gives a "no signal"
message. So I used the VGA-to-VGA cable and it worked. However, that means I
lose the advantage of the digital signal, which is sharper, right? Someone told
me to buy a DVI (from the video card) to VGA (on the flat screen) cable?

Yes, you lose the digital signal this way. Odds are you are using the wrong
DVI cable; there are three types. Most video cards and many monitors work
with DVI-D on both ends. That's the one without the four prongs around the
spade-shaped connector. I once got a DVI-I connector that plugged into the
video card no problem, but wouldn't mate with the LCD connector because of the
four extra prongs. If your monitor actually has the four extra holes for DVI-I,
and you're using a DVI-D cable, then that's the "no signal" problem right
there. The DVI-A type is rarely seen and carries analog through the DVI port.
http://www.datapro.net/techinfo/dvi_cables.html
 
zcarenow

OK, I checked this out. The cable that came with the monitor is DVI-D male to
DVI-D male. The video card is actually DVI-I female and the monitor is DVI-D
female. So according to your guide, I need a DVI-D cable, which is what I used,
since it was provided with the flat monitor. So what could be wrong?
 
Kent_Diego

zcarenow said:
OK, I checked this out. The cable that came with the monitor is DVI-D male to
DVI-D male. The video card is actually DVI-I female and the monitor is DVI-D
female. So according to your guide, I need a DVI-D cable, which is what I used,
since it was provided with the flat monitor. So what could be wrong?
Some bottom-end video cards cannot do dual monitors, so they can only drive one
output at a time, either the VGA or the DVI. What video card do you have? Try
disconnecting the VGA cable and starting the computer.
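For what it's worth, if the PC runs Linux, one quick way to identify the card
is to look for the VGA controller entry in the PCI device list. A minimal
sketch, assuming the lspci utility (from pciutils) is installed:

    # Minimal sketch: print the video card's PCI entry on Linux,
    # assuming the `lspci` utility (pciutils) is installed.
    import subprocess

    listing = subprocess.run(["lspci"], capture_output=True, text=True).stdout
    for line in listing.splitlines():
        if "VGA compatible controller" in line:
            print(line)  # e.g. "01:00.0 VGA compatible controller: ..."

On Windows, Device Manager (under "Display adapters") shows the same
information.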
 
Augustus

zcarenow said:
OK, I checked this out. The cable that came with the monitor is DVI-D male to
DVI-D male. The video card is actually DVI-I female and the monitor is DVI-D
female. So according to your guide, I need a DVI-D cable, which is what I used,
since it was provided with the flat monitor. So what could be wrong?

OK, at this point you need to check for output from the card on the DVI port.
Use a DVI-to-D-Sub (VGA) adapter to hook up the DVI port on the card to the
D-Sub VGA connector on the monitor. Do you get output?
I'd also try hooking the monitor up to another computer over DVI, if possible,
and see if it works.
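On Linux, there's also a way to see which outputs the card currently detects by
querying the X server. A sketch, assuming an X11 system with the xrandr utility
available:

    # Sketch: list the card's outputs and their connection state,
    # assuming an X11 Linux system with `xrandr` installed.
    import subprocess

    query = subprocess.run(["xrandr", "--query"], capture_output=True, text=True).stdout
    for line in query.splitlines():
        # Output lines look like "DVI-0 connected 1280x1024+0+0 ..."
        # or "VGA-0 disconnected ...".
        if " connected" in line or " disconnected" in line:
            print(line)

If the DVI output never shows up as connected even with the cable attached, the
problem is upstream of the monitor.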
 
Augustus

Kent_Diego said:
Some bottom-end video cards cannot do dual monitors, so they can only drive one
output at a time, either the VGA or the DVI. What video card do you have? Try
disconnecting the VGA cable and starting the computer.
I don't think he's trying to run two monitors at the same time here. Just
trying to get a DVI connection to his new LCD, from what it looks like.
 
Barry Watzman

As someone else suggested, first disconnect the analog cable (before
turning on the computer) and use a DVI cable only (DVI-D at both ends).
If you still get "no signal", what's likely happening is that you have
to use the OSD (on-screen display) control panel on the monitor to
switch the monitor from VGA mode to DVI mode. The details vary by
monitor make and model. It might also be a button on the monitor that
switches it.
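On a reasonably recent Linux system with kernel modesetting, you can also read
each connector's hotplug state straight from sysfs, independent of what the
monitor's OSD says. A sketch; the connector names vary by driver:

    # Sketch: read each DRM connector's state from sysfs, assuming a
    # Linux kernel with modesetting (KMS); prints nothing otherwise.
    from pathlib import Path

    for status in Path("/sys/class/drm").glob("card*-*/status"):
        # e.g. "card0-DVI-D-1 -> connected"
        print(status.parent.name, "->", status.read_text().strip())

If the DVI connector reads "connected" there but the monitor still says "no
signal", the input-select setting on the monitor is the likely culprit.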
 
Barry Watzman

No, he should not use the adapter if he wants a digital signal. A DVI-I
port has both analog and digital signals available at the same
connector. The adapter is a passive device; it just connects an analog
15-pin VGA socket to the analog signals present at the DVI-I port.
Depending on the card, those could either be EXACTLY the same signals
that are also present at the card's own VGA socket, or they could be a
second (different) analog channel (on a card that can support dual
independent displays). (In that last case, the actual video signal will
normally be an analog version of the digital DVI signal.) Either way,
however, it's just analog VGA, and that is not what he wants. And the
presence of such signals says nothing either way about the presence or
absence of digital signals at the same connector.
 
Matt Ion

Barry said:
As someone else suggested, first disconnect the analog cable (before
turning on the computer) and use a DVI cable only (DVI-D at both ends).
If you still get "no signal", what's likely happening is that you have
to use the OSD (on-screen display) control panel on the monitor to
switch the monitor from VGA mode to DVI mode. The details vary by
monitor make and model. It might also be a button on the monitor that
switches it.

That was my thinking as well. My Viewsonic will automatically switch to
the DVI input if there's no signal on the VGA (which is a little
annoying at times, since I'm running my main desktop on the DVI but
three other machines on the VGA via a KVM), but most I've seen do need
to be switched manually. As Barry points out, there may be a button
specifically for this, or you may need to access the monitor's OSD menu.
 
