1680 x 1050 on a Radeon 9200


a7yvm109gf5d1

Hey guys and gals,
I don't have all the details since this isn't happening on my own PC, but
here's the problem.

We just bought an Acer AL2216W 22" monitor. It is a 1680 x 1050, 60 Hz
panel. That resolution didn't show up in the ATI control panel, so we
used PowerStrip to create it as a custom resolution. This is exactly what
I did a few weeks ago on my own machine, and there it worked right away.
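
For reference, here is a quick sanity check on the kind of numbers PowerStrip should be producing. The figures below are the standard CVT reduced-blanking timings for 1680 x 1050 @ 60 Hz, not the exact values from either of our machines, so treat them as an example only:

# Sanity-check a 1680x1050 timing set (example values only; these are the
# generic CVT reduced-blanking numbers, not what PowerStrip generated here).
pixel_clock_hz = 119_000_000.0  # 119.00 MHz
h_total = 1840                  # 1680 active + 160 blanking
v_total = 1080                  # 1050 active + 30 blanking

h_sync_khz = pixel_clock_hz / h_total / 1000
v_refresh_hz = pixel_clock_hz / (h_total * v_total)
print("Horizontal sync: %.2f kHz" % h_sync_khz)    # ~64.67 kHz
print("Vertical refresh: %.2f Hz" % v_refresh_hz)  # ~59.88 Hz

Whatever timings end up on the new machine, the horizontal and vertical rates worked out this way have to land inside the ranges the AL2216W lists in its manual, otherwise "input not supported" is exactly what the panel will say.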

On the new setup here's what happens:

1) PowerStrip forces the resolution onto the card just fine until we
reboot. When Windows XP starts up again, the monitor shows "input not
supported". Plugging a CRT into the machine still works, though, so the
card is putting out some sort of 1680 x 1050 the LCD doesn't like, even
though it worked before ...

or

2) The monitor reports it is receiving 1280 x 1024, but a virtual
desktop appears when we ask for a larger mode such as 1600 x 1200. We
can scroll around, but I am fairly sure the card is still outputting
1280 x 1024 (see the quick check below).
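
If that machine has Python with the pywin32 package on it, here is a quick way to confirm what Windows itself thinks the card is outputting (just a sketch assuming pywin32 is installed; the Display Properties dialog shows the same numbers):

# Ask Windows for the current mode on the primary display.
# Requires the pywin32 package (win32api / win32con).
import win32api
import win32con

dm = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
print("Desktop mode: %d x %d @ %d Hz, %d bpp" % (
    dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency, dm.BitsPerPel))

If that still reports 1280 x 1024 while the desktop pans around, the 1680 x 1050 mode never actually took and Windows is just panning a larger virtual desktop over it.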

Something is coming between what I want (1680 x 1050) and what the
card actually outputs. I tried copying the timing parameters from
PowerStrip on my machine to the new machine, but somehow on reboot they
get wiped out and replaced with an "unsupported" mode.

What more info do you guys need? I'll try to get more when the person
leaves and I can tool around their machine.
 
