"Out of Scan Range" error on Sony LCD Monitor

Peter Werner

First my setup:

Macintosh G4/466 "Digital Audio" running OS 9.2.2
ATI Radeon 32 MB DDR video card
Sony SDM-X82 TFT LCD Monitor

This works just fine through its VGA connection, but both my card and
the monitor have a DVI port and I've really been wanting to use this
for my connection. However, when I disconnect the VGA cable and
connect the DVI cable, about halfway through the startup, the screen
goes blank and I get a (Sony) error message on my monitor that says:

Out of Scan Range
Resolution > 1280 X 1024

This is odd, because my resolution is in fact set to 1280 X 1024 at 60
Hz. That's supposed to be well within the range for my monitor and
what I use for VGA.

Some further reading on the subject leads me to believe that it might
have something to do with the fact that the DVI port on the video card
is DVI-I while the port on the monitor is DVI-D. The cable itself is
the one that came with the monitor and is DVI-D to DVI-D. However,
I've also read that this kind of cable should plug into a DVI-I port
without any problem.

Anybody know what's going on? If I need to use a different
resolution/refresh rate, then what do I use? Should I get some kind of
DVI-I to DVI-D adapter?

Let me know,
Peter
 

Harry Muscle

Peter Werner said:
[snip]

As far as I recall, DVI-I carries both digital and analog signals
while DVI-D is digital only; the digital part is exactly the same,
only the plugs differ. You're connecting this to an LCD monitor, so is
its native resolution 1280 x 1024, or is it lower? If it's lower, try
lowering the resolution on the computer and see what you get.
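For what it's worth, a quick back-of-envelope check (my own numbers, not from the thread): the VESA standard timing for 1280 x 1024 @ 60 Hz uses a total raster of 1688 x 1066 pixels, which puts the pixel clock at roughly 108 MHz, comfortably under the 165 MHz single-link DVI limit. So the mode itself shouldn't be out of range for the DVI link:

```python
# Sanity check: pixel clock for 1280x1024 @ 60 Hz vs. the single-link DVI limit.
# Totals below are the VESA DMT standard timing (active area plus blanking).

H_TOTAL = 1688        # 1280 active + horizontal blanking
V_TOTAL = 1066        # 1024 active + vertical blanking
REFRESH_HZ = 60

pixel_clock_mhz = H_TOTAL * V_TOTAL * REFRESH_HZ / 1e6
SINGLE_LINK_DVI_LIMIT_MHZ = 165.0

print(f"Pixel clock: {pixel_clock_mhz:.1f} MHz")
print("Within single-link DVI limit:",
      pixel_clock_mhz < SINGLE_LINK_DVI_LIMIT_MHZ)
```

If the card were driving a non-standard timing over DVI (some cards do), the monitor could still reject it even at a nominally supported resolution, which would fit the error Peter is seeing.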

Harry
 
