confusion about doom3 vs HL2 benchmarks

Sumedh

Hi all,

This recent controversy about Nvidia cards underperforming in HL2
benchmarks sure makes me feel good about my 9600Pro purchase, not to
mention seeing the FPS/$ chart!

Anyway, in AnandTech's review of the 5900 Ultra, they ran Doom3
benchmarks, and Nvidia clearly shines compared to ATI.

What's the discrepancy here? Doesn't Doom3 make use of DX9 features? Is
that benchmark reliable? Or are we caught up in media hype after
Gabe Newell's damning report on Nvidia?

any thoughts?

sumedh
 
JAD

You know what's worse than this? The AMD vs. Intel crap....

And that's about it (as far as newsgroup posts go, anyway)
 
Lenny

Anyway, in AnandTech's review of the 5900 Ultra, they ran Doom3
benchmarks, and Nvidia clearly shines compared to ATI.
What's the discrepancy here?

The difference is that Doom3 makes heavy use of non-textured stencil buffer
writes in order to generate accurate shadows for every object. While the
GFFX gets its butt handed to it in most pixel-shading situations, it does
do well when drawing stencil volumes.

Also, Doom3 features an optimized GFFX rendering path that lowers image
quality for (a lot) more speed. I'm not sure if Anand used that code path or
the standard path (I haven't read the article yet).
Doesn't Doom3 make use of DX9 features?

*Cough* Yes, and no. Doom3 is an OpenGL game, but it does make use of DX9
*level* features, yes.
Is that benchmark reliable?

I'd think so; both games are made by very reputable companies and very
trustworthy people. D3 and HL2 are simply two very different games, and
that they behave differently isn't strange. There's no reason to think
either benchmark is untrustworthy.
 
Nada

Hi all,

This recent controversy about Nvidia cards underperforming in HL2
benchmarks sure makes me feel good about my 9600Pro purchase, not to
mention seeing the FPS/$ chart!

Anyway, in AnandTech's review of the 5900 Ultra, they ran Doom3
benchmarks, and Nvidia clearly shines compared to ATI.

That test was done at medium quality settings. It should have been a
maximum-quality test to really mean anything, but Nvidia's policy
seems to be that medium quality is "the way it's meant to be played".
 
Richard Dower

Nada said:
(e-mail address removed) (Sumedh) wrote:
That test was done at medium quality settings. It should have been a
maximum-quality test to really mean anything, but Nvidia's policy
seems to be that medium quality is "the way it's meant to be played".

Madness...this translates to: the card can't handle maximum detail levels
at playable frame rates.
 
B

NVIDIA is quite aware of the Half-Life 2 results, and that is the reason
they have been working with Valve on the series 50.XX drivers, which should
be released prior to the release of Half-Life 2. NVIDIA does not understand
why the comments were written, as they have been working together with Valve
for this very reason. In addition, the results shown were obtained with the
series 45.XX drivers, which aren't maximised for this particular game.

regards

B
 
Richard Dower

B said:
NVIDIA is quite aware of the Half-Life 2 results, and that is the reason
they have been working with Valve on the series 50.XX drivers, which should
be released prior to the release of Half-Life 2. NVIDIA does not understand
why the comments were written, as they have been working together with Valve
for this very reason. In addition, the results shown were obtained with the
series 45.XX drivers, which aren't maximised for this particular game.

"maximised"?...is that like "optimised"? Why do they have to do this? It
just leads to accusations of cheating and missing detail levels. Can't we
just get them to work and kick ATI's ass without "optimisation"?
 
who be dat?

That's an interesting question. The first thing that pops into my head:
what Doom 3 benchmark? As I recall, id hasn't released a Doom 3 benchmark.
However, an extremely early beta of Doom 3 from an E3 preview was leaked and
has been floating around. Like I said, this was a very early beta with more
optimizing still to be done. If this is the Doom 3 benchmark that was
posted, then the numbers are frankly a waste of time, as they're nowhere
near an accurate reflection of the final code. Plus, who's to say that,
since it's an early beta, the game wasn't more optimized for one card than
for the other? Since the game is in OpenGL, you in effect write the game for
each video card. Speaking of OpenGL...

It should be pointed out that Doom3 uses OpenGL while HL2 uses D3D, which
is a big difference. I think the video cards actually have more capabilities
exposed in OpenGL than they do in D3D, for example. If Nvidia has a feature
that ATi doesn't, and it's exposed in their drivers, then Nvidia may well
perform faster than ATi. I do seem to recall that Nvidia handles shadows a
bit better than ATi in OpenGL.

Because the games are on different subsystems, it will be tough to draw any
performance comparisons between the two.

Chris Smith
 
ho alexandre

Richard said:
"maximised"?...is that like "optimised"? Why do they have to do this? It
just leads to accusations of cheating and missing detail levels. Can't we
just get them to work and kick ATI's ass without "optimisation"?

Another explanation is that ATI made their optimization (or whatever you
call it) sooner :)
Either way, ATI did well, and much better than nVidia, for the time being.
 
methylenedioxy

who be dat? said:
That's an interesting question. The first thing that pops into my head:
what Doom 3 benchmark? As I recall, id hasn't released a Doom 3 benchmark.
However, an extremely early beta of Doom 3 from an E3 preview was leaked and
has been floating around. Like I said, this was a very early beta with more
optimizing still to be done. If this is the Doom 3 benchmark that was
posted, then the numbers are frankly a waste of time, as they're nowhere
near an accurate reflection of the final code. Plus, who's to say that,
since it's an early beta, the game wasn't more optimized for one card than
for the other? Since the game is in OpenGL, you in effect write the game for
each video card. Speaking of OpenGL...

It should be pointed out that Doom3 uses OpenGL while HL2 uses D3D, which
is a big difference. I think the video cards actually have more capabilities
exposed in OpenGL than they do in D3D, for example. If Nvidia has a feature
that ATi doesn't, and it's exposed in their drivers, then Nvidia may well
perform faster than ATi. I do seem to recall that Nvidia handles shadows a
bit better than ATi in OpenGL.

Because the games are on different subsystems, it will be tough to draw any
performance comparisons between the two.

Chris Smith
Except the Doom makers have already admitted they have had trouble with FX
cards and have had to write code specifically for them.
Whatever is going on, I think it is clear that the Nvidia cards ARE having
hardware problems, and no amount of wriggling is going to get them out of it.
Have you been reading things? Nvidia's press release has already tried to
say there is no picture quality difference between DX8 and DX9 and the
shaders used; they have had MS breathing down their necks about this; they
also blamed Valve for not using beta drivers; what else was it, 3DMark03,
they blamed them for the problems; and they also blamed the poor quality of
Tomb Raider for the problems there too. How much can one company squirm?
They have to accept responsibility at some point.
As more and more games come out, and these elusive 50.XX drivers appear, we
shall see what's really going on. Nvidia will be caught out then; until that
time they can keep squirming and keep blaming everyone else, but it will
come out.
 
NAZGUL

Neither game is out yet, neither has gone gold, and since drivers are
always being improved, it's way too soon to judge either of the cards.
 
methylenedioxy

NAZGUL said:
Neither game is out yet, neither has gone gold, and since drivers are
always being improved, it's way too soon to judge either of the cards.
Except this isn't about drivers, as you'd see if you actually read anything.
The controversy is that Nvidia will "optimise" drivers, meaning you don't
get what you pay for in terms of the advertised technology; this is purely a
hardware issue, not a driver one. Yeah, sure, drivers will help, but like
Nvidia said, there's no difference between pixel shader 1.4 and 2.0, and
there's no real difference between DX8 and DX9. Enjoy your FX :s
 
tHatDudeUK

ho alexandre said:
Either way, ATI did well, and much better than nVidia, for the time being.

Not just for the time being, period. The Nvidia hardware isn't up to the
task at all.
 
Chris Ciccarello

Wrong. Both games are far into development, and tweaks to the engines and
improvements in the drivers aren't going to drastically change the results,
especially for HL2, which is almost done.

Anyone who thinks that Nvidia is going to magically improve shader
performance without impacting quality is confused.
 
