DVI Problems...New Monitor won't display DVI Input

Chad Coleman

I purchased a new video card a few months ago, a Chaintech GeForce FX 5900XT, and within the past few weeks I bought a new Sony 19" HS94P monitor. I have the latest drivers for the card, but I can't get it to display the DVI output. I have it in clone mode, and it shows that I have two monitors connected, since I have both the DVI and the standard cable hooked up at the same time. Originally I hooked up just the DVI, but I didn't get a picture...so I hooked up both to see if I could switch back and forth. When I try to flip it over, the screen just goes black, and after 15 seconds (the allotted time in Windows) it flips back to analog HD15 mode. What gives here?

Also, if you have it in DVI mode, can you set the refresh rate as high as you can in analog? I'm running this monitor at 1280x1024 at 75 Hz. The funny thing is, when I do get it to show that it's in DVI, it really isn't, and then the refresh rate drops to 60 Hz and the color options are disabled...

Someone throw this dog a bone...

Chad
 
peter

Did you load the Sony .inf files? There should (might) be two: one for analog and one for digital.
An LCD monitor usually has a factory setting at which it works best. This is usually listed in the manual that came with the monitor. Your video card should be able to display that same resolution on the DVI output.
Most LCDs run at 60 Hz; some can and will run higher depending upon that factory setting.
At 60 Hz on an LCD you will not see flicker.
peter
 
Chad Coleman

Thanks for the reply...

I have taken care of installing the INFs from the Sony website. My display properties (nView) show that there are two connections, including the DVI, but when I switch to it, it states it's in DVI mode when it really isn't. When I press the button on the monitor, the monitor's status says HD15 mode. I just don't get it...
 
peter

I would disconnect the analog connection, then go back to display properties, make sure it shows digital, and set that as the primary display.
On my Radeon card I had the DVI set as primary and the analog set as secondary, with no clone, and it worked. I could pick whether I wanted analog or digital, and since digital was primary it started with digital.
Then I finally said the hell with it and just ran with the DVI connection.
peter
 
Chad Coleman

Well, today I hooked up a different computer with a DVI output, and the DVI worked fine on the monitor. The monitor has an INPUT switch on it where I can change sources on the fly, but when the DVI is hooked up to my system it always says no signal; on the other computer it was perfect. This tells me it's not the cable or the monitor. It's the video card. I've emailed Chaintech support for help with this issue. I think I just have a dead DVI port on my Chaintech GeForce FX 5900XT.

stay tuned...
 
