ATI to nvidia or vice versa - image quality?

BillL

Hello all

In February or March '05 (don't want to rush ;o) I've decided to go for a
major upgrade. I've already decided on an AMD 64 (3000+ or 3200+, 90 nm), an
nForce4 Ultra socket 939 board and a SATA II HDD, but I'm a bit stumped on
the video card choice. I'm currently using a Sapphire 9700 (non-pro, 128 MB)
which I'm very happy with, but I'll be going down the PCI Express route (not
SLI .... gotta eat after all!! ;o).

Currently I'd be looking at either an ATI X700 or nvidia 6600 vanilla/GT.
However, I'm very aware that when I upgraded from my GF Ti200 to the
Sapphire I really noticed that image quality (both 2D & 3D) really improved
(of course performance did as well!).

I, like anyone else, would like the best 'bang for buck' in terms of
performance, but has nvidia been able to improve in the area of image quality
over the earlier GFx cards, or is ATI still the 'best' when it comes to
image quality?

Thanks for your time.

Bill
 
Tod

I'm very happy with my ATI 9600 PRO AIW's image quality.
Before I got the ATI (as a birthday present), I had checked out the
Nvidia 5200; its image quality was as good as my old ATI 9100's.
So yes, Nvidia has pressured the card manufacturers to improve image quality.
 
First of One

2D image quality of both cards is about equal. In 3D, most of the trilinear
filtering anomalies seen in either brand's drivers have been eliminated.
ATi's FSAA implementation still yields better quality (and can be faster).
However, the 6600 is capable of "high dynamic range" rendering, which makes
a huge difference in games that support it (so far only Far Cry, though).

3 months is a long time, though, and you'll find that neither the X700 nor
the 6600 is significantly faster than your 9700.
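
For anyone wondering what HDR buys you: the scene is rendered in floating
point, where brightness values can go far above 1.0, and is only squashed
into the displayable 0..1 range at the end by a tone-mapping pass. A minimal
sketch in Python (Reinhard's operator is just one common choice, and the
pixel values are made up for illustration):

import numpy as np

def reinhard_tonemap(hdr, exposure=1.0):
    # Map linear HDR radiance values into the displayable 0..1 range.
    scaled = hdr * exposure
    return scaled / (1.0 + scaled)

# A sun-lit pixel (50.0), a mid-tone (1.0) and a shadowed wall (0.05):
# an 8-bit pipeline would clip the first and crush the last, while tone
# mapping keeps usable contrast between all three.
pixels = np.array([50.0, 1.0, 0.05])
print(reinhard_tonemap(pixels))   # ~[0.980, 0.500, 0.048]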
 
Mirek Fídler

BillL said:
Currently I'd be looking at either an ATI X700 or nvidia 6600 vanilla/GT.
However, I'm very aware that when I upgraded from my GF Ti200 to the
Sapphire I really noticed that image quality (both 2D & 3D) really
improved (of course performance did as well!).

Several years ago I replaced my Matrox Millennium with an Nvidia TNT and
could live with it for only a month (just long enough to finish HL1 :),
because of the terrible 2D quality. Then I went back to Matrox (G400), and
two years later I was VERY afraid of upgrading to an Nvidia Ti4200. But to
my surprise, this time the 2D quality was even better than the Matrox's!
Something had changed during that time....

Also, if you are concerned about 2D image quality and your major upgrade will
include an LCD panel, I believe a DVI connection is likely to eliminate any
differences in 2D image quality.

Mirek
 
Ed Light

Mirek Fídler said:
Also, if you are concerned about 2D image quality and your major upgrade will
include an LCD panel, I believe a DVI connection is likely to eliminate any
differences in 2D image quality.
In a recent test, two MSI nvidia cards had inferior DVI quality, whereas two
ATI cards were fine.
Actually, on the nvidia cards the DVI signal was inferior when it came from
the GPU itself, and OK when it came from a separate transmitter chip (both
outputs were on a dual-DVI card).

It was only compromised above 1280 or whatever that resolution is.

--
Ed Light

Smiley :-/
MS Smiley :-\

Send spam to the FTC at
(e-mail address removed)
Thanks, robots.
 
SteveK

BillL said:
Hello all

In February or March '05 (don't want to rush ;o) I've decided to go for a
major upgrade. I've already decided on an AMD 64 (3000+ or 3200+, 90 nm), an
nForce4 Ultra socket 939 board and a SATA II HDD, but I'm a bit stumped on
the video card choice. I'm currently using a Sapphire 9700 (non-pro, 128 MB)
which I'm very happy with, but I'll be going down the PCI Express route (not
SLI .... gotta eat after all!! ;o).

Currently I'd be looking at either an ATI X700 or nvidia 6600 vanilla/GT.
However, I'm very aware that when I upgraded from my GF Ti200 to the
Sapphire I really noticed that image quality (both 2D & 3D) really
improved (of course performance did as well!).

I, like anyone else, would like the best 'bang for buck' in terms of
performance, but has nvidia been able to improve in the area of image
quality over the earlier GFx cards, or is ATI still the 'best' when it
comes to image quality?

ATi is better at image clarity - I find that the overall brightness/contrast
behaviour is much better balanced on the ATi card. The image simply appears
clearer and more true to life.
 
Nicholas Buenk

First of One said:
2D image quality of both cards is about equal. In 3D, most of the trilinear
filtering anomalies seen in either brand's drivers have been eliminated.
ATi's FSAA implementation still yields better quality (and can be faster).
However, the 6600 is capable of "high dynamic range" rendering, which makes
a huge difference in games that support it (so far only Far Cry, though).

And it takes away about half of your FPS; in Far Cry it's barely usable.
Also, nvidia now uses ATI's crappy method of anisotropic filtering. Sure,
it's faster, but you see lots of shimmering (see the sketch below).
3 months is a long time, though, and you'll find that neither the X700 nor
the 6600 is significantly faster than your 9700.

Yep, he should go for a 6800 GT.
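
On the shimmering point: full trilinear filtering blends smoothly between
adjacent mip levels, while the "optimized" modes narrow the blend band so
most texels get cheap bilinear samples. A toy sketch of the idea in Python
(the band parameter is purely illustrative, not either vendor's actual
driver logic):

def mip_blend(lod, band=1.0):
    # Blend weight toward the next mip level for a fractional LOD.
    # band=1.0 blends across the whole interval (full trilinear);
    # a small band leaves abrupt jumps between mip levels, which
    # show up as shimmering bands when the scene moves.
    frac = lod % 1.0
    lo = 0.5 - band / 2.0
    return min(max((frac - lo) / band, 0.0), 1.0)

for lod in (2.1, 2.3, 2.5, 2.7, 2.9):
    print(lod,
          round(mip_blend(lod, band=1.0), 2),   # smooth ramp
          round(mip_blend(lod, band=0.2), 2))   # near-step transition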
 
Mac Cool

Ed Light:
In a recent test, two MSI nvidia cards had inferior DVI quality, whereas
two ATI cards were fine.
Actually, on the nvidia cards the DVI signal was inferior when it came
from the GPU itself, and OK when it came from a separate transmitter chip
(both outputs were on a dual-DVI card).

Any more info on that, like who conducted the test?
 
BillL

SteveK said:
ATi is better at image clarity - I find that the overall brightness/contrast
behaviour is much better balanced on the ATi card. The image simply appears
clearer and more true to life.

Thanks for all the replies. It appears that image quality (3D or 2D) isn't
something to worry about between ATI & nvidia, so I'll see what's offering
the best bang-for-buck ratio in three months' time - it might well be an
nvidia 6800 GT.

BillL
 
Mirek Fídler

Ed Light said:
In a recent test, two MSI nvidia cards had inferior DVI quality, whereas two
ATI cards were fine.
Actually, on the nvidia cards the DVI signal was inferior when it came from
the GPU itself, and OK when it came from a separate transmitter chip (both
outputs were on a dual-DVI card).

It was only compromised above 1280 or whatever that resolution is.

OK, if you are planning on a 21" LCD at 1600x1200, then you should be careful
about this too. It is most likely a matter of timing and maximum transfer
speed (the DVI link has to be VERY fast; higher resolutions need more
bandwidth).
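
To put rough numbers on that (a sketch using standard VESA timing totals,
and the single-link DVI spec limit of a 165 MHz pixel clock):

# Rough pixel-clock check for the DVI speed point. Single-link DVI is
# specified up to a 165 MHz pixel clock; the totals below include the
# blanking intervals from standard VESA timings.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    return h_total * v_total * refresh_hz / 1e6

# 1280x1024 @ 60 Hz: ~108 MHz, comfortably inside the single-link limit.
print(pixel_clock_mhz(1688, 1066, 60))   # ~108.0
# 1600x1200 @ 60 Hz: 162 MHz, right at the limit, which is why marginal
# transmitters start to show artifacts above 1280-ish resolutions.
print(pixel_clock_mhz(2160, 1250, 60))   # 162.0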

Mirek
 
