ATI Rant and Some Questions (HD 2900XT)

hbscott

First off, I'm no expert on graphics cards. I may be ignorant, but
I'm willing to learn. Please correct me where I am wrong.

I ordered an HD 2900XT on 9/28/07. The card had been out for a while
and at that time I was not hearing any real negatives, so I thought
it would be a long-term solution for 3D gaming. I looked forward to
adding a second card in Crossfire at a later date when prices dropped.
My major concern, now that I've gone looking for that second card, is
that I keep reading reviews stating, "AMD confirmed they used shaders
and not ROPs (hardware) to resolve anti-aliasing on the HD 2900XT.
This design flaw is causing the poor performances I've seen when
enabling AA in games. Using shader AA instead of ROP AA was a poor
design decision that no driver updates or patches will ever correct.
This piece of hardware is flawed from the start and doomed to a quick
death."

This got me researching some more, and I found that the upcoming
DirectX 10.1 will begin to implement more shader AA "features". I
googled "2900XT and DirectX 10.1" and couldn't find anything telling
me that 10.1 will help make the 2900XT a long-term solution for 3D
gaming. Yet when I googled "ATI and DirectX 10.1" there was no end
to the articles telling me that the 3xxx series of cards will take
advantage of 10.1.

The 3870 is listing for ~$170 less than a 2900XT, and from what I
understand there is no significant difference in 3D speed and
features. Good for the person who gets the 3870 at that price; more
power to them. As for me, I am pissed. The X800Pro that I replaced
lasted for a few great years of gaming, and I looked forward to having
a few more years of top-of-the-line ATI graphics to take me through
the DirectX 10 era. Three months later and, from what I have read, I
am sitting on a (soon-to-be) dud. Quite honestly, I am about ready to
try out the other guys as soon as their 10.1-compliant cards are
mainstream.

END RANT

Questions: Am I looking at not being able to crank up graphics
settings in near-future games or am I overreacting?

Can anyone say for sure whether the 2900XT will improve with age as
new DirectX versions are made available?

Why won't ATI Support reply to my questions? Have I stated myself
clearly? Can I assume that they realize they are going to piss off a
lot of their high-end customers?

My home-built system: P5K3 Deluxe, C2Q Q6600 @ 2.4GHz, 2GB DDR3, HD
2900XT, ITZ 700W, built in early 10/07. This is my second self-built
system (after three years). So far I am pleased with all of my
long-term decisions except for the graphics card. I mostly play
first-person shooters (single player and online).

Thanks for taking the time to read and for any replies.
 
KlausK

hbscott said:
Questions: Am I looking at not being able to crank up graphics
settings in near-future games or am I overreacting?

AA is overrated. Once the rez goes beyond 1280, the effect of AA on image
quality is almost zero. A long time ago, when games ran at 640x480 or
320x240, AA did make them look better.

With your system, you can run games at a high rez. Unless you have vision
like an eagle's, you won't notice a difference. I'm running COD4 at
1920x1080 on an HDTV and have played with the graphics settings; I notice
zero difference between AA and no-AA.
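
For what it's worth, here is a rough back-of-the-envelope sketch of that
point. The 42" screen size and ~8 ft viewing distance are assumed example
numbers, not anyone's actual setup; the idea is just that at living-room
distances a single 1080p pixel ends up close to the roughly one arc-minute
resolving limit usually quoted for normal vision, so the jagged edges AA
smooths over are far harder to see than they were at 640x480.

/* Rough back-of-the-envelope sketch: how big one pixel looks from the couch.
 * The 42" 16:9 panel and ~8 ft viewing distance are assumed example numbers. */
#include <math.h>
#include <stdio.h>

int main(void)
{
    const double PI       = 3.14159265358979323846;
    const double diag_in  = 42.0;    /* assumed 42" 16:9 HDTV          */
    const double horiz_px = 1920.0;  /* horizontal resolution          */
    const double view_in  = 96.0;    /* assumed viewing distance (~8') */

    /* Horizontal width of a 16:9 panel, from its diagonal. */
    const double width_in = diag_in * 16.0 / sqrt(16.0 * 16.0 + 9.0 * 9.0);
    const double pitch_in = width_in / horiz_px;  /* pixel pitch */

    /* Angle one pixel subtends at the eye, in arc-minutes.
     * Normal visual acuity is usually quoted as about 1 arc-minute. */
    const double arcmin = atan(pitch_in / view_in) * (180.0 / PI) * 60.0;

    printf("pixel pitch: %.4f in, ~%.2f arc-minutes per pixel\n",
           pitch_in, arcmin);   /* prints roughly 0.68 arc-minutes */
    return 0;
}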
 
First of One

You seem to have DX10.1 capability confused with faster AA. In fact, the
HD38x0 cards are afflicted by the same shader-resolve-only design, and
suffer a similar performance hit with AA enabled.
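
For anyone wondering what "shader resolve" actually means here, a minimal
sketch of the per-pixel math is below (my own illustration in plain C, not
ATI's hardware or driver code). A ROP-based resolve does this averaging in
dedicated fixed-function hardware; a shader-based resolve spends shader ALU
time and memory bandwidth on the same arithmetic, which is the usual
explanation for the AA performance hit on these cards.

/* Illustrative sketch only: the per-pixel averaging a 4x MSAA "box filter"
 * resolve performs. On the HD 2900XT this step reportedly runs on the
 * shader core rather than in fixed-function ROP hardware; the arithmetic
 * is the same either way. */
#include <stdio.h>

#define SAMPLES 4

typedef struct { float r, g, b; } color;

/* Average the sub-samples of one pixel into a single output color. */
static color resolve_pixel(const color samples[SAMPLES])
{
    color out = {0.0f, 0.0f, 0.0f};
    for (int i = 0; i < SAMPLES; ++i) {
        out.r += samples[i].r;
        out.g += samples[i].g;
        out.b += samples[i].b;
    }
    out.r /= SAMPLES;
    out.g /= SAMPLES;
    out.b /= SAMPLES;
    return out;
}

int main(void)
{
    /* One edge pixel: two sub-samples covered by a white triangle,
     * two covered by the black background. */
    color samples[SAMPLES] = {
        {1.0f, 1.0f, 1.0f}, {1.0f, 1.0f, 1.0f},
        {0.0f, 0.0f, 0.0f}, {0.0f, 0.0f, 0.0f},
    };
    color c = resolve_pixel(samples);
    printf("resolved edge pixel: %.2f %.2f %.2f\n", c.r, c.g, c.b); /* 0.50 0.50 0.50 */
    return 0;
}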

If you want fast AA, get a GeForce 8800GT or GTS (like many have done) - of
course, these cards don't support DX10.1 either.

Your X800Pro lacked SM 3.0 capability, so it could not, for example, do
high-dynamic-range rendering in games like Oblivion. Despite the image
quality improvements with HDR, you didn't seem to lament this missing
feature. It's gonna be a similar deal with DX10.1.
 
hbscott

First of One said:
You seem to have DX10.1 capability confused with faster AA. In fact, the
HD38x0 cards are afflicted by the same shader-resolve-only design, and
suffer a similar performance hit with AA enabled.

Ahhh...just the kind of info I was looking for.

First of One said:
If you want fast AA, get a GeForce 8800GT or GTS (like many have done) - of
course, these cards don't support DX10.1 either.

Your X800Pro lacked SM 3.0 capability, so it could not, for example, do
high-dynamic-range rendering in games like Oblivion. Despite the image
quality improvements with HDR, you didn't seem to lament this missing
feature. It's gonna be a similar deal with DX10.1.


Thanks for the reply. My X800Pro served me well for a long time. I
didn't miss SM 3.0 until I tried to play Bioshock. That was a wake-up
call. When I researched a solution, I saw that ATI had not included
this feature even though most other cards shipping at that time were
fully compliant. I was also aware that the card wouldn't have run
Bioshock well anyway. That's when I decided to get out of the AGP
market and build from the ground up.

I guess I'm just mad because I found out only three months later that
my 2900XT was not as "top of the line" as I thought. Feeling screwed
twice (that quickly) by the same company is not good for getting my
business again. I didn't look into the reviews closely enough, and I
take some of the blame, but not all. Like I said, I'm no expert, but I
figured their best card at the time would be able to take me further
down the road than I suspect this one will.

I will mostly agree with KlausK that AA is overrated. But it is also
my opinion that a card that costs as much as the 2900XT shouldn't blink
when you turn on AA, particularly with older games. I haven't found
this to be the case and feel like I didn't get my money's worth. This
is all behind me now. I've decided to overclock and experiment with
this card until Nvidia has a 10.1 card on the market. I hope I can
hang on to the Christmas money I was going to use on a second card.

Happy New Year
 
