DVI->VGA Radeon 9200SE

Tim

Hello all,

I have a Radeon 9200SE card installed on Win XP SP2, with two Philips LCD
screens (190S5CB) attached.
Both screens have only VGA input. The card, however, has one VGA and one DVI
(DVI-I, I guess...) output.

As I've read on many newsgroups, a lot of people, including myself, have
quality issues with the DVI-to-VGA adapter on their LCD screens.

As far as my reading-up goes, DVI-I or DVI-A should give slightly better
quality, despite the conversion to analogue VGA.

However, my second screen (the one on the DVI output) is rather blurry.
Enough to give you a headache if you try to read on it for more than five
minutes.

Could this be the card, i.e. the on-board DVI-VGA circuitry not doing its
job very well? Or what?

I've been looking for dual-head VGA cards, but can only find the Matrox
G450 (which I feel is a bit simple).

So, second question is: if the card is the culprit, what new card should I
look for? It has to be dual VGA. I do no gaming, but quite a lot of 2D
design and video editing using Premiere Pro, and just the occasional 3D
design in 3ds Max.

Thanks a lot!

Best regards,
Tim
 
Barry Watzman

First of all, a DVI to VGA adapter is ONLY a "connector" adapter. It
doesn't DO ***ANYTHING***, it just makes a connection between pins of
the 15-pin VGA connector and pins of the DVI-I connector.

In other words, the video board actually has THREE outputs, two VGA and
one DVI. The DVI-I connector has the pins for both a DVI-D signal and a
VGA (or DVI-A) analog signal, all in one connector, and the 2nd VGA
output and the digital output are, by definition, always showing the
same image.
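
To make the "connector only" point concrete, here is a small illustrative
sketch (Python, purely documentary; the pin names follow the published
DVI-I and VGA pinouts) of the straight-through wiring a passive adapter
provides:

# A passive DVI-I -> VGA adapter just wires the analog pins of the DVI-I
# connector straight through to the 15-pin VGA connector. No conversion,
# no active circuitry.
DVI_I_TO_VGA = {
    "C1 (analog red)":    "VGA pin 1 (red)",
    "C2 (analog green)":  "VGA pin 2 (green)",
    "C3 (analog blue)":   "VGA pin 3 (blue)",
    "C4 (analog h-sync)": "VGA pin 13 (h-sync)",
    "C5 (analog ground)": "VGA pins 6/7/8 (RGB returns)",
    "8 (analog v-sync)":  "VGA pin 14 (v-sync)",
    "6 (DDC clock)":      "VGA pin 15 (DDC clock)",
    "7 (DDC data)":       "VGA pin 12 (DDC data)",
}

for dvi, vga in DVI_I_TO_VGA.items():
    print(f"DVI-I pin {dvi:20} -> {vga}")

Any signal degradation therefore happens in the card's RAMDAC output
stage and the cabling, not in the adapter itself.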

Analog LCD monitors need to have their dot clock frequency and phase
adjusted. I just can't overstate how critical this is, and the "auto"
or "self-adjustment" just doesn't "get it" more than perhaps 10% of the
time. To do the adjustment, you need a test pattern consisting of
vertical bars, alternating black and white, each bar being only and
exactly one single pixel wide. Without proper adjustment of the dot
clock frequency and phase, to be candid about it, most analog monitors
look like crap.
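
Not Barry's program, but a minimal sketch of such a test pattern generator
in Python, assuming the Pillow imaging library is installed and a
1280x1024 native panel (the resolution of a 19" SXGA screen such as the
190S5):

# Draw vertical bars, alternating black and white, each exactly one
# pixel wide, at the panel's native resolution, and save to a file.
from PIL import Image

WIDTH, HEIGHT = 1280, 1024  # use your panel's native resolution

img = Image.new("RGB", (WIDTH, HEIGHT))
pixels = img.load()
for x in range(WIDTH):
    color = (255, 255, 255) if x % 2 else (0, 0, 0)
    for y in range(HEIGHT):
        pixels[x, y] = color

img.save("dotclock_test.png")

Display the result at 100% zoom while adjusting clock and phase; any
shimmer or vertical banding means the settings are still off.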

[I have a free test pattern generator program that does the job, if
someone has a web site where they can post it, I'd be happy to provide
it. I don't want to send out dozens of copies to individual end users
by E-Mail.]
 
Tim

I know the converter just 'reformats' the connector, but I thought it
could be responsible for some quality loss.
Anyway, Philips did provide an adjustment program, but it doesn't get the
concept of dual screens. So, in order to adjust the second screen, I have
to connect it to the VGA output.

So, if your tool can be dragged to the second screen, I'll be more than
happy to post it on a website. Or, if you give me its name, I'll Google
for it.

Thanks!

Tim
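
One possible workaround when an adjustment tool won't move to the second
screen, sketched here with Tkinter under the assumption that the desktop
extends to the right of a 1280x1024 primary display: open the test
pattern in an undecorated window placed at the second monitor's offset.

# Show the test pattern full screen on the secondary monitor. Tkinter's
# own fullscreen mode only covers the primary screen, so we use a
# borderless window positioned past the primary display instead.
import tkinter as tk
from PIL import Image, ImageTk

PRIMARY_WIDTH = 1280        # adjust to your primary screen's width
WIDTH, HEIGHT = 1280, 1024  # secondary panel's native resolution

root = tk.Tk()
root.overrideredirect(True)  # no title bar or borders
root.geometry(f"{WIDTH}x{HEIGHT}+{PRIMARY_WIDTH}+0")  # onto 2nd screen

photo = ImageTk.PhotoImage(Image.open("dotclock_test.png"))
tk.Label(root, image=photo, borderwidth=0).pack()
root.bind("<Escape>", lambda e: root.destroy())  # press Esc to quit
root.mainloop()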
