Doom3 Benchmarks out!

JB

At 1024x768 with very high settings
(4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).

Those numbers were timedemos, not actual in-game framerates,
which would be much lower.

Jeff B
 
RayO

magnulus said:
If you go out and buy a GeForce FX 6800 just because it runs faster in
Doom III, you're a fool. End of line.

Couldn't agree more, especially when you consider that in three years
it's going to be selling on eBay for 40 bucks. Video cards have very very
short life-cycles.


RayO
 
Gnu_Raiz

NightSky 421 said:
I thought it was a good article and it makes me happy I have a 9800 Pro
video card. However, I can't wait to see how Doom 3 plays on systems that
are a little more "real world". For example, I hope they bench it on
processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
like to see an all-round comparison with as many combinations of CPU and
video cards as possible.

Thanks for posting that link!


According to the article, Doom 3 will come with a timedemo, so just run
the timedemo with your card and start a thread with your hardware.
Then after a couple of weeks someone can put all the data in a
spreadsheet and give an accounting of the cards that are listed. What
gets me is that there is no mention of multiplayer gameplay anywhere.
When I get the game, that will be one of the first things I check
out, because it will determine the longevity of the game.
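
Just to sketch the tallying part (nothing from the article; the
results.csv file name and its "card name, fps" layout are my own
invention), something like this would spit out per-card averages once
people paste their numbers in:

    // Minimal sketch: average timedemo FPS per card from a hand-made CSV.
    // Assumed input format (hypothetical): card_name,fps  -- one line per report.
    #include <fstream>
    #include <iostream>
    #include <map>
    #include <sstream>
    #include <string>
    #include <utility>

    int main() {
        std::ifstream in("results.csv");                       // hypothetical file name
        std::map<std::string, std::pair<double, int>> tally;   // card -> (fps sum, run count)
        std::string line;
        while (std::getline(in, line)) {
            std::istringstream row(line);
            std::string card, fps;
            if (std::getline(row, card, ',') && std::getline(row, fps)) {
                tally[card].first += std::stod(fps);
                tally[card].second += 1;
            }
        }
        for (const auto& [card, sums] : tally)
            std::cout << card << ": " << sums.first / sums.second
                      << " fps average over " << sums.second << " runs\n";
    }

Nothing fancy, but it would save whoever volunteers from averaging a
few hundred forum posts by hand.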

Gnu_Raiz
 
HeadRusch

Agreed... seriously, they couldn't test a Radeon 9800 Pro? It was the
definitive ATI card to buy for more than a year. Another thing:
is there a particular reason why these guys claim to be "just publishing
straight-up FPS numbers", and yet they don't test with AA and filtering OFF?

Those last batches of tests leave 8x AF *ON*... seriously, there are
plenty of gamers out there (like... ME) who never turn on AA or AF. AF
puts more of a hit on framerates than low-level AA does. I'm guessing
those Radeon XT scores would be higher if you turned off that 8x AF.
 
HeadRusch

Damn, I ****ed up... their last benchmark shows 1024x768, medium detail
level, with no AA and no AF... 50+ fps.

Perfectly acceptable for me, and an overclocked 9800 Pro should come in
somewhere in that ballpark.
 
noman

If you go out and buy a GeForce FX 6800 just because it runs faster in
Doom III, you're a fool. End of line.

The GeForce 6800 line works fine in other games too. They do trail behind
the X800XT-PE in some DX9 games, but not by much. Granted, ATI still has to
optimise their memory controller (which, I read somewhere, is running
at 60-70% efficiency) and they are also rewriting their OpenGL drivers
from scratch. You can expect more optimisations from nVidia as well.

IMO, the X800XT-PE is a better choice (if you can find it, that is) than
the 6800Ultra, and the 6800GT is better than the X800Pro, given their MSRPs and
also the power requirements.

The bottom line is that these are all great cards and should run most
of the Source/Doom 3/CryEngine/UT-based games without any problems.
 
NightSky 421

magnulus said:
No... I parted with 400 dollars for the GeForce FX 5900 card. I'm not
going to fall for NVidia's crap a second time. Nothing sucks worse than to
have a brand new videocard become underpowered technology in only five
months.


Even Maximum PC magazine was sucked into using a high-end GeForce FX card
as part of their Dream Machine last year. A few months later, they
admitted that they made the wrong decision. It sounds like you and them
experienced the same thing.
 
Mark Morrison

From the article:

"As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
box with a GeForce 4 MX440 video card and having a surprisingly good
gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
3 video card that is two years old will deliver a solid gaming
experience that will let you enjoy the game the way id Software
designed it to be."

Not a benchmark, but at least it's positive (if subjective).

Nick

Fingers crossed then.

--

Bunnies aren't just cute like everybody supposes !
They got them hoppy legs and twitchy little noses !
And what's with all the carrots ?
What do they need such good eyesight for anyway ?
Bunnies ! Bunnies ! It must be BUNNIES !
 
Mark Morrison

Mark Morrison left a note on my windscreen which said:


I do a fair bit. 22" monitor.

Jesus - on my old 21", I thought 1024x768 was a good resolution for
games.

Shows how often I upgrade my gfx card, I suppose. :)

--

Bunnies aren't just cute like everybody supposes !
They got them hoppy legs and twitchy little noses !
And what's with all the carrots ?
What do they need such good eyesight for anyway ?
Bunnies ! Bunnies ! It must be BUNNIES !
 
BRanger

magnulus said:
ATI's OpenGL drivers aren't so great. They are workable but not great.

The only thing impressive about the new Geforce cards is instancing
support in Vertex Shader 3.0. And so far it's been used in exactly one
game, and I don't expect that to change much for a long time.

ATI had their cards out first. Unlike NVidia, they don't need to cook
their drivers. NVidia will have to work very hard to earn back my trust.

The funny thing is, ATI is the company that gets caught "optimizing" their
drivers in this article. Give it a close read.

NVidia made some unwise design decisions in the last round of cards. As
such, they had to make some tradeoffs in image quality to get the
performance up, basically making the best of a bad situation.

It's funny how different people can interpret the same data differently.
I've had an ATI card in my box for quite some time but I feel that NVidia
has the better product this round. If you feel the need to "punish" NVidia
for the FX series this go around I guess you can do that but I think it's
your loss.

B
 
Inglo

The funny thing is, ATI is the company that gets caught "optimizing" their
drivers in this article. Give it a close read.

NVidia made some unwise design decisions in the last round of cards. As
such, they had to make some tradeoffs in image quality to get the
performance up, basically making the best of a bad situation.

It's funny how different people can interpret the same data differently.
I've had an ATI card in my box for quite some time but I feel that NVidia
has the better product this round. If you feel the need to "punish" NVidia
for the FX series this go around I guess you can do that but I think it's
your loss.

B
As consumers we should be pleased that these two big video card
companies are engaged in quality competition. nVidia catching and
perhaps surpassing ATI on this round of card releases should lead to
further innovation by ATI, and better products all around for us in the
future.
 
Nada

NightSky 421 said:
LOL, I love that description!

My younger cousins have a GeForce 4 MX and I'm expecting a few dozen
panic calls at midnight.

Well when I read the article, I was under the impression myself that the
game details would have to be turned down in order to get a decent playing
experience with GeForce3 and Radeon 8500 cards. As to what low detail
will actually look like, we will see. Not that I'm immediately inclined
to find out myself, of course. :)

As the release date for Doom 3 draws nearer, I for whatever reason find
myself willing to loosen up the purse strings somewhat. Still, I'm going
to wait and see if there are any technical or driver issues before taking
the plunge. I very much look forward to seeing this newsgroup next week!

I'm sure the game will still look better than most average FPS games
with medium details, but to me "Doom 3" is one of those games where I
couldn't turn the details down even if my life depended on it. It's meant to
be played in full regalia. I might have to crush my piggy bank as
well to purchase a new monitor, which is as expensive as getting a new
graphics card.
 
Nada

HeadRusch said:
Agreed... seriously, they couldn't test a Radeon 9800 Pro? It was the
definitive ATI card to buy for more than a year. Another thing:
is there a particular reason why these guys claim to be "just publishing
straight-up FPS numbers", and yet they don't test with AA and filtering OFF?

Those last batches of tests leave 8x AF *ON*... seriously, there are
plenty of gamers out there (like... ME) who never turn on AA or AF. AF
puts more of a hit on framerates than low-level AA does. I'm guessing
those Radeon XT scores would be higher if you turned off that 8x AF.

HardOCP does benchmark tests in a different way. In most cases they
will choose a "sweet spot" for each card where the performance won't
drop into the early teens. My guess is that with the 5900 XT and 9800 Pro we
have to turn AF off, but can still play "Doom 3" with maximum graphic
effects. That was just a preview test, and I'm sure the web will be
flooded with "Doom 3" benchmarks once the game is installed in most
homes. If anything, "Doom 3" will become the most used benchmark of the
next two years, just like Quake III was at the time of its release.
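
Just to picture the "sweet spot" idea (this is my own rough sketch, not
HardOCP's actual procedure; the settings list and the 15 fps floor are
made up), it boils down to stepping down from the highest quality level
until the minimum framerate stays out of the early teens:

    // Sketch of a "sweet spot" search: pick the highest quality level whose
    // minimum FPS stays above a floor. Settings and numbers are illustrative only.
    #include <iostream>
    #include <string>
    #include <vector>

    struct Setting { std::string name; double min_fps; };  // min_fps from a timedemo run

    int main() {
        // Hypothetical results for one card, best quality first.
        std::vector<Setting> runs = {
            {"1024x768, High, 4xAA/8xAF", 12.0},
            {"1024x768, High, 0xAA/8xAF", 14.5},
            {"1024x768, High, 0xAA/0xAF", 22.0},
            {"1024x768, Medium, 0xAA/0xAF", 31.0},
        };
        const double floor_fps = 15.0;  // keep minimums out of the early teens
        for (const auto& s : runs) {
            if (s.min_fps >= floor_fps) {
                std::cout << "Sweet spot: " << s.name << " (min " << s.min_fps << " fps)\n";
                return 0;
            }
        }
        std::cout << "No setting stays above " << floor_fps << " fps\n";
    }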
 
Nada

Yes, 6800GT seems to be a great card to buy. The only difference
between this card and 6800Ultra is the clock speed. Reminds me of
Ti4200 in some regards.

Since I have every intention of keeping my 9800 (overclocked past Pro
speeds) at least till the end of next year, I find the Fx5950 and
9800XT scores very encouraging. At 1024x768 with very high settings
(4xAA, 16xAF) they are close to 30 fps and 45+ (with no AA and 8xAF).

I think my graphics card should be able to hit an average of 30 fps at
1024x768 with 2xAA and 8xAF. That's all I need for Doom 3 and the games that
will be based on its engine, for now.

It'll do pretty well for the next six months and perhaps can stretch
its life to late spring 2005.

As far as the pricing of new graphics cards goes, the next few months will be
very interesting.

It's been very harsh when it comes to prices. I don't know how
it is in Canada and the USA at the moment, but here in Europe we're seeing
prices of 500 euros for Nvidia's biggest guns, and ATI's top-of-the-line
cards aren't too cheap either. I read somewhere on the
internet that the Ultra was priced at up to 800 dollars, which is
absolutely insane! I've never had a problem giving up 230 euros,
but over 500 euros is just way too damn much, even for the performance
these new cards have to offer. I remember 1994, when we'd struggle
with "Doom" on 386s, so maybe there's a way to get through the
autumn without a panic inside the piggy bank.
 
noman

The funny thing is, ATI is the company that gets caught "optimizing" their
drivers in this article. Give it a close read.

Here's what John Carmack said,

"On the other hand, the Nvidia drivers have been tuned for Doom's
primary light/surface interaction fragment program, and innocuous code
changes can "fall off the fast path" and cause significant performance
impacts, especially on NV30 class cards."

It may be that the 'fast path' is the way shaders are compiled to get
around the NV3x series restrictions.

Both cards have optimizations. The valid ones are good for everybody.
I'd be worried if ATI and nVidia had given up on them and were just
relying on brute force to solve all the issues.
It's funny how different people can interpret the same data differently.
I've had an ATI card in my box for quite some time but I feel that NVidia
has the better product this round. If you feel the need to "punish" NVidia
for the FX series this go around I guess you can do that but I think it's
your loss.

The good thing about this generation is that both series of cards are
equally capable and you can't make a wrong choice.

X800 is still ahead in shader heavy DX9 games. The new FarCry
benchmarks (using SM2.0b on X800) show X800PE to be 15-20% ahead of
6800Ultra (which is using SM3.0). This should be good news for X800
owners who are waiting for STALKER or Half Life 2. 6800 is clearly
ahead in DOOM3 and it's likely that the lead will be carried over to
other DOOM3 engine games. However, to me the more important thing is
that nVidia in DX9 and ATI in OpenGL are competitive enough that most
people would not regret purchasing either of the 6800 or X800 cards.

It comes down to price then. It's hard to beat the 6800GT if you can get
it for $300-340, and the X800XT-PE is great in the $400-450 range...
(says the person who doesn't buy graphics cards over $200 :) )
 
Mark Steen

It's funny how different people can interpret the same data differently.
I've had an ATI card in my box for quite some time but I feel that NVidia
has the better product this round. If you feel the need to "punish" NVidia
for the FX series this go around I guess you can do that but I think it's
your loss.

B

Why do you feel they have the better product? The ATI X800PE still
beats the 6800u in many benchmarks and doesn't require a beefed-up PSU
or two power connectors.
 
Eric

But what about high detail with no AF? That's what I want to see (the
hell with medium quality). I'm hoping my new 5900XT can run Doom 3 at 40 fps or
above at high-quality settings, at 1024x768 (with no AA and no AF).
Note that I have a P4 2.6 (800MHz FSB) and 1 GB of DDR RAM.

This article claims that there is little visual benefit to AF:

http://www.extremetech.com/article2/0,1558,1157434,00.asp

So if I can turn off AF and turn off AA at "high quality" Doom 3 settings --
and run at 1024 x 768 with at least 40 fps -- I'll be happy.
 
Blig Merk

Andrew said:
iD games are OpenGL, not D3D.

This is urban legend bullshit. iD (Carmack) prefers OpenGL, but
market sensibilities require them to use M$-Direct3D. A bit of
clarification: M$-Direct3D is a subset of M$-DirectX. DirectX contains
Direct3D, which is the primary graphics-handling portion of DirectX.
OpenGL, on the other hand, is its own API.

A game developer, or any graphics rendering programmer, can choose
whether to call the DirectX (Direct3D) or OpenGL APIs. And if
they want to be totally safe, they can program separate modules that
let you choose which API to use. Anybody who remembers the original
Half-Life would know that it had an option to choose between OpenGL,
Direct3D or Software. Most games are programmed for both OpenGL and
Direct3D. The code to select one or the other is fairly trivial; what
is not trivial are the pipelines afterwards. OpenGL does not have all
the features of Direct3D, and Direct3D does not have some of the
performance that OpenGL does. Also, providing modules for both
increases the programming effort.

The difference these days is that a lot of games try to
decide on their own which API to use based on what hardware is being
used, sometimes with mixed results.
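
Just to make the "separate modules" point concrete, here's a rough
sketch of what a selectable back-end looks like (the Renderer,
GLRenderer, D3DRenderer and makeRenderer names are invented for
illustration; real engines obviously do far more than this):

    // Minimal sketch of an engine choosing a rendering back-end at startup.
    // Names are illustrative, not from any real engine.
    #include <iostream>
    #include <memory>
    #include <string>

    struct Renderer {                       // common interface both back-ends implement
        virtual ~Renderer() = default;
        virtual void drawFrame() = 0;
    };

    struct GLRenderer : Renderer {          // would wrap OpenGL calls
        void drawFrame() override { std::cout << "OpenGL frame\n"; }
    };

    struct D3DRenderer : Renderer {         // would wrap Direct3D calls
        void drawFrame() override { std::cout << "Direct3D frame\n"; }
    };

    std::unique_ptr<Renderer> makeRenderer(const std::string& api) {
        if (api == "opengl") return std::make_unique<GLRenderer>();
        return std::make_unique<D3DRenderer>();   // fall back to Direct3D
    }

    int main() {
        auto renderer = makeRenderer("opengl");   // could come from a config file or options menu
        renderer->drawFrame();                    // the rest of the game never cares which API it got
    }

The point is that everything above the makeRenderer() call is
API-agnostic; only the two back-end modules (the non-trivial part) know
anything about OpenGL or Direct3D.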
 
