John Carmack's official comments on NV40 (GeForce 6800 family)

zmike6

> I'll take SSE2 over 3DNOW! any day....


I mean, come on people... give it up... 3DNow's optimizations are now
INCLUDED on most GPUs.


3DNow only aided older cards that didn't have the bandwidth to match
the processor.


Now, the GPUs are outperforming the CPU, so it's entirely unnecessary.



AMD should definitely update this instruction set... if they did they'd
be god.

Imagine a processor with Pixel Shader 3.0, bump mapping, vertex
shading... drool


Um, you do know the Athlon64 supports SSE2, right?
 
Nada

Carmack announced:
"As DOOM 3 development winds to a close, my work has turned to
development of the next generation rendering technology. The NV40 is
my platform of choice due to its support of very long fragment
programs, generalized floating point blending and filtering, and the
extremely high performance," said John Carmack, president and
technical director of id Software

When Carmack speaketh, the world stands still and the games will stutter.
 
rms

He also seems to be in bed with Intel. His engines always run better on
nonAMD systems.

Because SSE is mysteriously disabled in Q3-engined games when run on
AMD CPUs.

Exactly. To be fair, the SSE-enabled AthlonXP did not exist when Q3 was
first programmed, and you can't blame Carmack for not wishing to revisit old
code.

There's a good chance he used a no doubt convenient Intel compiler
routine that tests for the presence of an Intel CPU, rather than
actually testing for SSE itself.

rms
 
John Reynolds

John Lewis said:
So, better start saving now for the 6800U ??? And power-supply.
Might as well just get a new case and power supply of the style
with the fan (hopefully quiet) in the middle of the cover and
close to the 6800U.........

John Lewis

Before the R420 previews are even out? Nope.

John
 
papasurf

ATI has been rather quiet so far but I will hold onto my $ till they have
spoken. May the best card win...
 
K

> I'll take SSE2 over 3DNOW! any day....

SSE2 is a second-generation SIMD instruction set; 3DNow! is first
generation, and its instructions were the first floating-point SIMD
operations on an x86 CPU. You're comparing old and new.

Besides, all AMD64 CPUs support SSE2 anyway.

K
 
Pluvious

>> From the nVidia news release:
>>
>> "As DOOM 3 development winds to a close, my work has turned to
>> development of the next generation rendering technology. The NV40 is
>> my platform of choice due to its support of very long fragment
>> programs, generalized floating point blending and filtering, and the
>> extremely high performance," said John Carmack, president and
>> technical director of id Software
>
> Here is another John Carmack quote:
>
> "The GeForce 3 is going to be THE card to run our next game, Doom 3!"

I don't care what Carmack or Sweeney say about video cards, folks...
they are in bed together, so it's invalid. I'll let the specs and
head-to-head reviews make my decisions for me, thank you. Use your
heads, guys..

Actually I was rather amused to see the preview at HardOCP
(http://www.hardocp.com/article.html?art=NjA2). The performance is
nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
is a joke. Of course I expect the 3rd-party companies to juice up the
card a bit. We'll see.. but my money is on ATI for retaining the crown
this next generation.

Pluvious
 
Courseyauto

Pluvious said:
> I don't care what Carmack or Sweeney say about video cards, folks...
> they are in bed together, so it's invalid. I'll let the specs and
> head-to-head reviews make my decisions for me, thank you. Use your
> heads, guys..
>
> Actually I was rather amused to see the preview at HardOCP
> (http://www.hardocp.com/article.html?art=NjA2). The performance is
> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
> is a joke. Of course I expect the 3rd-party companies to juice up the
> card a bit. We'll see.. but my money is on ATI for retaining the crown
> this next generation.


If you noticed, the 5950 beat the 9800 on most of the benchmarks too,
and the 6800 killed the 9800 on a lot of the games.
 
Derek Baker

Pluvious said:
>>> From the nVidia news release:
>>>
>>> "As DOOM 3 development winds to a close, my work has turned to
>>> development of the next generation rendering technology. The NV40 is
>>> my platform of choice due to its support of very long fragment
>>> programs, generalized floating point blending and filtering, and the
>>> extremely high performance," said John Carmack, president and
>>> technical director of id Software
>>
>> Here is another John Carmack quote:
>>
>> "The GeForce 3 is going to be THE card to run our next game, Doom 3!"
>
> I don't care what Carmack or Sweeney say about video cards, folks...
> they are in bed together, so it's invalid. I'll let the specs and
> head-to-head reviews make my decisions for me, thank you. Use your
> heads, guys..
>
> Actually I was rather amused to see the preview at HardOCP
> (http://www.hardocp.com/article.html?art=NjA2). The performance is
> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
> is a joke. Of course I expect the 3rd-party companies to juice up the
> card a bit. We'll see.. but my money is on ATI for retaining the crown
> this next generation.
>
> Pluvious

Yes, but note how HardOCP tests: with different settings for different
cards for many of the tests.

Look here for a case of a doubling of performance:

http://www.anandtech.com/video/showdoc.html?i=2023&p=13
 
John Lewis

John Reynolds said:
> Before the R420 previews are even out? Nope.

Ah, and at your age, I was rather hoping that a residual spark of true
adventure still existed......

As for me, when I can afford it, the 6800U, regardless of the X800 ---
not only for the graphics power, but the very powerful VPU and the
strong likelihood that it will perform excellently with both current
and my classic-legacy software, just as my current FX5900/56.72 does.
Excluding Glide games, of course. My second machine has a V5 5500
with Win Me just for that purpose.

John Lewis
 
John Lewis

>>> From the nVidia news release:
>>>
>>> "As DOOM 3 development winds to a close, my work has turned to
>>> development of the next generation rendering technology. The NV40 is
>>> my platform of choice due to its support of very long fragment
>>> programs, generalized floating point blending and filtering, and the
>>> extremely high performance," said John Carmack, president and
>>> technical director of id Software
>>
>> Here is another John Carmack quote:
>>
>> "The GeForce 3 is going to be THE card to run our next game, Doom 3!"

> I don't care what Carmack or Sweeney say about video cards, folks...
> they are in bed together, so it's invalid. I'll let the specs and
> head-to-head reviews make my decisions for me, thank you. Use your
> heads, guys..
>
> Actually I was rather amused to see the preview at HardOCP
> (http://www.hardocp.com/article.html?art=NjA2). The performance is
> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
> is a joke.

a) Different AA and Aniso settings.

b) 60.72 is the VERY FIRST PUBLISHED driver for this card.
It is probably going to take a year to fully optimise the
compiler, so what you are seeing is the WORST performance
for this card. Also, it is going to take a while for game
developers to "get their arms around" the new features.

> Of course I expect the 3rd-party companies to juice up the
> card a bit.

Not much, if you are thinking hardware, like overclocking. Silicon design
tools have got more accurate and process variations smaller. I suspect
that a 450 MHz max overclock will be the norm on this giant part without
water-cooling or similar. However, this is a hugely programmable
part. Any spectacular performance improvements will come from
software magic.

> We'll see.. but my money is on ATI for retaining the crown
> this next generation.

It will be a little tilted regardless. NVidia has executed a
brilliant design; let's hope for Ati's sake that their crown
doesn't fall off and be trampled by the herd queueing up
to buy variants of the 6800.

John Lewis
 
John Reynolds

John Lewis said:
Ah, and at your age, I was rather hoping that a residual spark of true
adventure still existed......

As for me, when I can afford it, the 6800U, regardless of the X800 ---
not only for the graphics power, but the very powerful VPU and the
strong likelihood that it will perform excellently with both current
and my classic-legacy software, just as my current FX5900/56.72 does.
Excluding Glide games, of course. My second machine has a V5 5500
with Win Me just for that purpose.

John Lewis

The wise man waits, while fools rush in. ;)

Besides, both parts are liable to reach the market relatively close to one
another. No sense making up my mind when neither is available yet.

John
 
John Reynolds

John Lewis said:
It will be a little tilted regardless. NVidia has executed a
brilliant design; let's hope for Ati's sake that their crown
doesn't fall off and be trampled by the herd queueing up
to buy variants of the 6800.

John Lewis

The herd is a good word to use to describe anyone who's already made up
their minds at this point in time.

John
 
John Reynolds

K said:
> Much rather have a herd of people who want the best for their money rather
> than a herd of fanboys saying that they'll wait for their beloved gfx
> company to come out with something better.

And where have I written that R420 will be better? I merely suggested
waiting until both parts were announced, previewed, and then making a
decision. If you can find fanboy-like behavior in that course of action,
more power to ya.

K said:
> As far as I'm concerned the numbers speak for themselves and my next card
> will be by Nvidia. I currently own a R9600 Pro, it was the best bang for
> the buck at the time, but ATI's shitty Linux support has left a bad taste
> in my mouth.

The #s speak for themselves, but against what? Last generation's products?
Well, I would certainly expect a next-gen product to outperform last-gen
parts, but I'm just funny that way.

John
 
K

Pluvious said:
> Actually I was rather amused to see the preview at HardOCP
> (http://www.hardocp.com/article.html?art=NjA2). The performance is
> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
> is a joke. Of course I expect the 3rd-party companies to juice up the
> card a bit. We'll see.. but my money is on ATI for retaining the crown
> this next generation.

It's a bad review that's typical of those ****wits at HardOCP. It's not
that they're biased, just that most of their reviews have more holes in
them than a Swiss cheese. What they are trying, and totally failing, to do
is set each card to its maximum settings that will produce similar
results and get a feel for 'the overall gaming experience'. For example,
they set the GF6800 to 1600x1200 with 8xAF and 4xAA and compare it to the
9800XT at 1280x1024 with no AA or AF. So what happens is that many people
like yourself will quickly scan over the graphs and think 'hmm, the GF
isn't much faster than the Radeon.' I suggest you check out Anandtech if
you want credible reviews.

K
 
K

John Reynolds said:
> The herd is a good word to use to describe anyone who's already made up
> their minds at this point in time.

Much rather have a herd of people who want the best for their money rather
than a herd of fanboys saying that they'll wait for their beloved gfx
company to come out with something better.

As far as I'm concerned the numbers speak for themselves and my next card
will be by Nvidia. I currently own a R9600 Pro, it was the best bang for
the buck at the time but ATI's shitty Linux support has left a bad taste
in my mouth.

K
 
Tim

Destroy said:
devil!!

He also seems to be in bed with Intel. His engines always run better on
nonAMD systems.

LOL I know you're kidding. I hear he's also cozy with Microsoft since his
games are first released on their platforms. Like Intel, their lion's share
of the market has *nothing* to do with it.
 
Pluvious

> On Fri, 16 Apr 2004 16:35:08 +0000, Pluvious wrote:
>
>> Actually I was rather amused to see the preview at HardOCP
>> (http://www.hardocp.com/article.html?art=NjA2). The performance is
>> nowhere near where I expected. 10-15 fps better than the ATI 9800 XT
>> is a joke. Of course I expect the 3rd-party companies to juice up the
>> card a bit. We'll see.. but my money is on ATI for retaining the crown
>> this next generation.
>
> It's a bad review that's typical of those ****wits at HardOCP. It's not
> that they're biased, just that most of their reviews have more holes in
> them than a Swiss cheese. What they are trying, and totally failing, to do
> is set each card to its maximum settings that will produce similar
> results and get a feel for 'the overall gaming experience'. For example,
> they set the GF6800 to 1600x1200 with 8xAF and 4xAA and compare it to the
> 9800XT at 1280x1024 with no AA or AF. So what happens is that many people
> like yourself will quickly scan over the graphs and think 'hmm, the GF
> isn't much faster than the Radeon.' I suggest you check out Anandtech if
> you want credible reviews.
>
> K

HardOCP was the first 'preview' I read. This one is a lot better and
has the numbers I would have expected from a "next generation" card.
We're talking hundreds of frames now..

http://www.xbitlabs.com/articles/video/display/nv40.html

Pluvious
 
