Doom 3 Benchmarks out!

NightSky 421

rms said:
http://www2.hardocp.com/article.html?art=NjQy

Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
version was $345 shipped from provantage).

rms


I thought it was a good article and it makes me happy I have a 9800 Pro
video card. However, I can't wait to see how Doom 3 plays on systems that
are a little more "real world". For example, I hope they bench it on
processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
like to see an all-round comparison with as many combinations of CPU and
video cards as possible.

Thanks for posting that link!
 
noman

rms said:
http://www2.hardocp.com/article.html?art=NjQy

Looks like the 6800GT is the sweet spot, if you can get it cheap (my pny
version was $345 shipped from provantage).

Yes, the 6800GT seems to be a great card to buy. The only difference
between this card and the 6800 Ultra is the clock speed. Reminds me of
the Ti4200 in some regards.

Since I have every intention of keeping my 9800 (overclocked past Pro
speeds) at least till the end of next year, I find the FX 5950 and
9800XT scores very encouraging. At 1024x768 with very high settings
(4xAA, 16xAF) they are close to 30 fps, and 45+ with no AA and 8xAF.

I think my graphics card should be able to hit an average of 30 fps at
1024x768 with 2xAA and 8xAF. That's all I need for Doom 3 and the games
that will be based on its engine, for now.

As far as pricing of new graphics cards goes, the next few months will be
very interesting.
 
David Besack

As far as pricing of new graphics cards goes, the next few months will be
very interesting.

The last couple of months saw some good price drops when the 6800 and X800
became available. Now you can get a 9800 Pro for under $200. I'm still
clinging to my 5200 Ultra until I am forced to part with it :)
 
Humga

David Besack said:
The last couple of months saw some good price drops when the 6800 and X800
became available. Now you can get a 9800 Pro for under $200. I'm still
clinging to my 5200 Ultra until I am forced to part with it :)

I follow this rule (Humga's 1st Law of Graphics Card Upgrade):

Buy the new card when your current card's performance (frame rate usually
being a good measure) drops to roughly half that of the new card. Then you
must get at least **some** cash back for the 'old' card.

This will ensure that you'll be able to play both old and new games with
decent performance without it costing you too much :D

Please note that the 'new' card isn't necessarily the fastest card in the
market...think about it.
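
To put the rule in code, here's a toy sketch in Python (the cards, frame
rates and the exact 50% threshold are made-up illustrations, not
measurements):

```python
# Humga's 1st Law of Graphics Card Upgrade: upgrade once your current
# card manages only about half the frame rate of the card you'd buy.

def should_upgrade(current_fps: float, new_card_fps: float) -> bool:
    """True once the current card falls to ~half the new card's rate."""
    return current_fps <= new_card_fps / 2

# Made-up numbers: an ageing card at 28 fps vs. a 6800GT at 60 fps.
print(should_upgrade(current_fps=28, new_card_fps=60))  # True: sell while it's worth something
print(should_upgrade(current_fps=45, new_card_fps=60))  # False: hold on to it
```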
 
Toby Newman

Darkfalz said:
They didn't bench anything older than 5950... what a bunch of clowns.

It says in the article that a broader range of CPU and GFX cards will be
benchmarked in a future feature.
 
Larry Roberts

Darkfalz said:
They didn't bench anything older than 5950... what a bunch of clowns.

I'm wondering if the low benchmark scores on those cards are
because of the heavy DX9 shader use. Since a card such as the GF4 Ti
doesn't support DX9, I'm guessing the game will either switch to a DX8
code path for effects, or just omit the shaders altogether. If so, would
that mean that a GF4 Ti might get framerates that are about the same
as the DX9 cards, but just not look as good?
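
For what it's worth, here's the kind of thing I mean, as a made-up Python
sketch (nothing like the game's actual renderer; the shader-model numbers
and path names are just examples):

```python
# Illustrative only: pick a render path based on the shader model the
# card supports, falling back on older hardware.
RENDER_PATHS = [
    (2.0, "DX9-class path: full per-pixel shader effects"),
    (1.3, "DX8-class path: reduced shader effects"),
    (0.0, "fixed-function path: shaders omitted entirely"),
]

def pick_render_path(shader_model: float) -> str:
    # Take the fanciest path the hardware can actually run.
    for required, path in RENDER_PATHS:
        if shader_model >= required:
            return path
    return RENDER_PATHS[-1][1]

print(pick_render_path(2.0))  # e.g. a Radeon 9800 or FX 5950
print(pick_render_path(1.3))  # e.g. a GF4 Ti: similar frame rate, fewer effects?
```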
 
magnulus

ATI's OpenGL drivers aren't so great. They are workable but not great.

The only thing impressive about the new GeForce cards is instancing
support in Vertex Shader 3.0. And so far it's been used in exactly one
game, and I don't expect that to change much for a long time.

ATI had their cards out first. Unlike NVidia, they don't need to cook
their drivers. NVidia will have to work very hard to earn back my trust.
 
Nick Vargish

NightSky 421 said:
For example, I hope they bench it on processors 1.5GHz and up with
GeForce4 MX and GeForce3 cards and up.

From the article:

"As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4
box with a GeForce 4 MX440 video card and having a surprisingly good
gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce
3 video card that is two years old will deliver a solid gaming
experience that will let you enjoy the game the way id Software
designed it to be."

Not a benchmark, but at least it's positive (if subjective).

Nick
 
Mark Morrison

Darkfalz said:
They didn't bench anything older than 5950... what a bunch of clowns.

Yep - so much for seeing how my GeForce 3 performed at 1024x768.

I also loved the line "We figured that 1600x1200 resolution would be
the place to start..."

WTF ?

I assume this article is aimed at people with high-end systems,
presumably overclocked ones.

Who the **** plays games in 1600x1200 ? And on what ? A 23" monitor
???

--

Bunnies aren't just cute like everybody supposes !
They got them hoppy legs and twitchy little noses !
And what's with all the carrots ?
What do they need such good eyesight for anyway ?
Bunnies ! Bunnies ! It must be BUNNIES !
 
ginfest

magnulus said:
ATI's OpenGL drivers aren't so great. They are workable but not great.

The only thing impressive about the new GeForce cards is instancing
support in Vertex Shader 3.0. And so far it's been used in exactly one
game, and I don't expect that to change much for a long time.

ATI had their cards out first. Unlike NVidia, they don't need to cook
their drivers. NVidia will have to work very hard to earn back my trust.
Sour grapes?
 
Stoneskin

Mark Morrison left a note on my windscreen which said:
Yep - so much for seeing how my GeForce 3 performed at 1024x768.

I also loved the line "We figured that 1600x1200 resolution would be
the place to start..."

WTF ?

I assume this article is aimed at people with high-end systems,
presumably overclocked ones.

Who the **** plays games in 1600x1200 ? And on what ? A 23" monitor
???

I do a fair bit. 22" monitor.
 
Nada

Darkfalz said:
They didn't bench anything older than 5950... what a bunch of clowns.


I thought it was an okay preview benchmarking article, and I'm pretty
sure that once the game is out, we'll see plenty of good benchmarks.
Keep an eye on www.xbitlabs.com in the upcoming weeks. I'd say that if
those of us with average graphics cards cut out the anisotropic
filtering seen in the 5950 Ultra benchmark table, the framerate will
most likely stay around the same speeds with 9800 Pros and 5900 XTs.
As far as the engine's flexibility goes, I'd take the "high detail"
modes with a grain of salt. I personally won't consider playing the
game on anything less than a Radeon 9800 or GeForce 5900. Will GeForce
3 be able to swoop it with high details? Hell, no. That dog won't hunt.
 
Nada

NightSky 421 said:
I thought it was a good article and it makes me happy I have a 9800 Pro
video card. However, I can't wait to see how Doom 3 plays on systems that
are a little more "real world". For example, I hope they bench it on
processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up. I'd
like to see an all-round comparison with as many combinations of CPU and
video cards as possible.

GeForce 4 MX will perform like a turd stuck in a toilet seat. Heck,
even GeForce 3 will drown in the quicksand. I have no idea how much
difference there is between the "medium detail" mode and the "high
detail" mode, but I just refuse to believe that a GeForce 3 would
surf the game with high details. I couldn't even turn on all the
details in "Unreal 2" without diving to the bottom of the chart.
 
magnulus

ginfest said:
Sour grapes?

No... I parted with 400 dollars for the GeForce FX 5900 card. I'm not
going to fall for NVidia's crap a second time. Nothing sucks worse than
having a brand new video card become underpowered technology in only
five months.

ATI is more honest with their products. They don't rewrite other people's
shaders so that they use lower precision. And they don't require two molex
power connectors or large fans.

Sure, GeForce is faster for ONE GAME. Wow. That's justification for
plopping down 500 dollars on a new video card! For all we know, the ATI and
NVidia cards aren't even running on the same code paths, don't have the same
visual quality, etc. (NVidia cards running at 16-bit precision would of
course be faster than ATI's 24-bit precision). When the FX 5900 came out,
it was faster in Unreal Tournament 2003/2004, a lot faster. But as history
showed, that really didn't matter because it ran like crap in games like
Deus Ex or Thief III. And Doom III may not be an important engine in the
future of gaming, you never know. Right now the Unreal engine is pulling in
a lot of developers, and it runs on Direct3D. OpenGL is pretty much dead in
PC gaming. Doom III doesn't do anything that you cannot do with the Unreal
engine, and it will no doubt cost more to license. So why would developers
use it?
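
To give a feel for the precision gap I mean, here's a quick Python
illustration (NumPy has no 24-bit float, so float32 stands in for ATI's
fp24; the input value is arbitrary):

```python
import numpy as np

# How coarsely does 16-bit float precision round compared to higher
# precision? float32 stands in for ATI's 24-bit format here.
x = 0.1234567
fp16 = np.float16(x)
fp32 = np.float32(x)
print(float(fp16), abs(float(fp16) - x))  # ~0.123474, error ~2e-5
print(float(fp32), abs(float(fp32) - x))  # ~0.1234567, error ~4e-9
```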

The time has come for gamers to put away childish things and grow up a
little beyond these stupid pecker contests. You just cannot compare two
benchmarks nowadays without also comparing image quality. People should
also be considering power requirements, thermal and cooling requirements,
and so on. On all these counts, the GeForce 6 loses.

If you go out and buy a GeForce 6800 just because it runs faster in
Doom III, you're a fool. End of line.
 
NightSky 421

Nada said:
GeForce 4 MX will perform like a turd stuck in a toilet seat.


LOL, I love that description!

Heck,
even GeForce 3 will drown in the quicksand. I have no idea how much
difference there is between the "medium detail" mode and the "high
detail" mode, but I just refuse to believe that a GeForce 3 would
surf the game with high details. I couldn't even turn on all the
details in "Unreal 2" without diving to the bottom of the chart.


Well, when I read the article, I was under the impression myself that the
game details would have to be turned down in order to get a decent playing
experience with GeForce3 and Radeon 8500 cards. As for what low detail
will actually look like, we will see. Not that I'm immediately inclined
to find out myself, of course. :)

As the release date for Doom 3 draws nearer, I for whatever reason find
myself willing to loosen up the purse strings somewhat. Still, I'm going
to wait and see if there are any technical or driver issues before taking
the plunge. I very much look forward to seeing this newsgroup next week!
 
