DX9 (HL2 & Doom3) on ATI vs Nvidia


Ben Pope

YanquiDawg said:
Don't you think this problem can be remedied with software?


I suspect that the Pixel Shader 2.0 hardware architecture is to blame for the
performance issues. I guess only time will tell, but as soon as HL2 comes out
nVidia will have to work hard to compete on the benchmarks. It looks like
it'll be the first true test for DX9 and PS 2.0 hardware.

Ben
 

methylenedioxy

YanquiDawg said:
Don't you think this problem can be remedied with software?
Yes, because even the article, if you read it, reckons it's only a
driver issue anyway....
 

Ben Pope

methylenedioxy said:
Yes, because even the article, if you read it, reckons it's only a
driver issue anyway....

Would you like to quote that paragraph?

Ben
 

methylenedioxy

"Yes. NV30 class hardware can run the ARB2 path that uses
ARB_fragment_program, but it is very slow, which is why I have a separate
NV30 back end that uses NV_fragment_program to specify most of the
operations as 12 or 16 bit instead of 32 bit."
 

Ben Pope

methylenedioxy said:
"Yes. NV30 class hardware can run the ARB2 path that uses
ARB_fragment_program, but it is very slow, which is why I have a
separate NV30 back end that uses NV_fragment_program to specify most
of the operations as 12 or 16 bit instead of 32 bit."

That looks like pixel shader code. Not driver code.

Of course lowering the precision of calculations is going to make it run
faster.

Not much nVidia can do about that - unless they re-write Pixel Shader code
for games (wouldn't be the first time somebody has done this) - but then
that would result in different graphics - far from what nVidia have been
selling themselves on with this "The way it's meant to be played" stuff.

Ben
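The speed-for-precision trade Carmack describes can be sketched numerically. The following is just an illustration, using numpy's float16 as a stand-in for the 16-bit "half" precision he mentions and float32 for full precision; the lighting term is a made-up Blinn-Phong-style example, not code from any actual game or driver:

```python
import numpy as np

def specular_term(n_dot_h, shininess, dtype):
    """Blinn-Phong-style specular highlight, computed at a given precision."""
    x = np.asarray(n_dot_h, dtype=dtype)
    return np.power(x, dtype(shininess))

# A grazing-angle highlight: a base very close to 1 raised to a large
# exponent, which amplifies any rounding of the input.
full = specular_term(0.999, 64, np.float32)   # 32-bit: close to 0.999**64
half = specular_term(0.999, 64, np.float16)   # 16-bit: visibly off

print(float(full))
print(float(half))
```

Because float16 keeps only about three decimal digits, 0.999 is rounded before the exponentiation and the error grows 64-fold, which is the kind of difference that shows up as banding or shifted highlights on screen.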
 

methylenedioxy

Ben Pope said:
That looks like pixel shader code. Not driver code.

Of course lowering the precision of calculations is going to make it run
faster.

Not much nVidia can do about that - unless they re-write Pixel Shader code
for games (wouldn't be the first time somebody has done this) - but then
that would result in different graphics - far from what nVidia have been
selling themselves on with this "The way it's meant to be played" stuff.

Ben
Well, whatever it is, it's a software fix :)
I'm not justifying nvidia, had my first and last geforce until recently and
came back to ati :)
 

Strontium

ginfest stood up at show-n-tell, in cmt7b.406472$Ho3.61245@sccrnsc03, and
said:
Evidentially he regrets purchasing his ATI card.

Why would he 'evident[ly]' regret purchasing an ATI card?
From the article:

I don’t know how anyone could objectively look at the performance we’ve seen
in these benchmarks and conclude that a 5900 Ultra is a smarter buying
decision than a 9800 Pro – even the 128MB 9800 Pro (as used in the tests
here) trumps the lofty 256MB 5900 Ultra. If you’re still “stuck in the past”
and think that ATI is plagued with driver issues, then go ahead and keep
your head stuck in the sand like an ostrich, buy a 5900 Ultra and then start
crying when your pals are smoking your ass in games like Half Life 2 and
Halo because they’re running ATI hardware.

What can NVIDIA do right now to turn things around? First off, lower the
damn price of its high-end cards – even if they were priced the same, a 9800
Pro would be a better choice for Pixel Shader 2.0 performance. Secondly,
they’d better pull out every stop in their playbook of tricks to fine-tune
their drivers.

We have no doubt that NVIDIA is hard at work right now on its next-gen
silicon, which will undoubtedly be extremely fast – for their sake and the
sake of gamers like us, it had better be!
 

Ben Pope

methylenedioxy said:
Well, whatever it is, it's a software fix :)
I'm not justifying nvidia, had my first and last geforce until
recently and came back to ati :)

Yeah, "software" as in changing the game... We're supposed to be advancing
graphics as we go along, not keeping them the same and disabling bits 'cos
video cards can't cope!

"All performance issues can be fixed by software. Don't run any."

LOL.

Ben
 

Ben Pope

ginfest said:
Evidentially he regrets purchasing his ATI card.

Actually I don't. I'm incredibly impressed by the speed and visual quality
of the graphics. Turning on 2xAA works for a start. I could have spent
more money on an nVidia card and had less speed and less graphics
quality, but that sounds like a lose, lose, lose situation to me.

Admittedly Linux 3D Driver support is far better from nVidia than ATI, but
it's not like I'll be giving up Windows completely and playing games in
Linux. Not for a considerable while anyway. For a start, most games
don't even support Linux. I have very little use for 3D graphics in Linux.

I'm happy with my purchase, and if somebody offered me the two cards (nVidia
5900 Ultra vs 9800 Pro) at the same price (Under £300) - I'd go with the ATI
again.

Thank you.

Ben
 

methylenedioxy

Ben Pope said:
Actually I don't. I'm incredibly impressed by the speed and visual quality
of the graphics. Turning on 2xAA works for a start. I could have spent
more money on an nVidia card and had less speed and less graphics
quality, but that sounds like a lose, lose, lose situation to me.
I'm just wondering what card you have? I don't run ANY games at less than
6XAA and they run superb (apart from command and conquer generals that is,
but that's badly coded anyway, could have a super super system and it
wouldn't run)
 

Ben Pope

methylenedioxy said:
I'm just wondering what card you have? I don't run ANY games at less
than 6XAA and they run superb (apart from command and conquer
generals that is, but that's badly coded anyway, could have a super
super system and it wouldn't run)

I'm using a Crucial 9800 Pro.

I've only really played a couple of games... GTA Vice City and Splinter
Cell, both at 1600x1200 with 6xAA and 8xAF, and everything set to quality
and no slow downs whatsoever. Some of the scenes in Splinter Cell are very
pretty - some really nice lighting effects - haven't played it very much to
be honest.

That was when I had my XP2500 at 200MHz x 11 - it's at default at the
moment, pending troubleshooting of the occasional error in Prime95. I don't
expect any slowdowns at this CPU speed though.

Ben
 

methylenedioxy

Ben Pope said:
I'm using a Crucial 9800 Pro.

I've only really played a couple of games... GTA Vice City and Splinter
Cell, both at 1600x1200 with 6xAA and 8xAF, and everything set to quality
and no slow downs whatsoever. Some of the scenes in Splinter Cell are very
pretty - some really nice lighting effects - haven't played it very much to
be honest.

That was when I had my XP2500 at 200MHz x 11 - it's at default at the
moment, pending troubleshooting of the occasional error in Prime95. I don't
expect any slowdowns at this CPU speed though.

Ben
Interesting read. Were you aware that Splinter Cell doesn't actually support
AA? It also makes the game go "wonky", as in not run properly, with AA
switched on...
 

Ben Pope

methylenedioxy said:
Interesting read. Were you aware that Splinter Cell doesn't actually
support AA? It also makes the game go "wonky", as in not run properly,
with AA switched on...

I couldn't find the options in either game - I forced it in the ATI control
panel and it does work...

No wonkiness here from what I've seen.

Ben
 

Guest

Ben Pope said:
I couldn't find the options in either game - I forced it in the ATI control
panel and it does work...

No wonkiness here from what I've seen.
The light sources shine through things with FSAA enabled.
 
