John Carmack's official comments on NV40 (GeForce 6800 family)


John Lewis

From the nVidia news release:-
----------------------------------------------------------------------------------------------------------------
NVIDIA Corporation (NASDAQ: NVDA), the worldwide leader in visual
processing solutions, introduced today the NVIDIA(R) GeForce(TM) 6800
models of graphics processing units (GPUs) for high-performance
desktop computers. The NVIDIA GeForce 6 Series, which includes the
flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:

-- Industry-leading 3D performance -- the new superscalar 16-pipe
architecture delivers more than twice the performance of current
industry-leading NVIDIA GPUs

-- New features, including Microsoft DirectX(R) 9.0 Shader Model 3.0
feature set -- for ultra-realistic cinematic effects

-- Unprecedented on-chip video processing engine -- enabling high-
definition video and DVD playback

"This is the biggest generation-to-generation performance leap that we
have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
of NVIDIA. "In addition to the raw performance increase, we had two
fundamental strategies with the 6800 models. First was to take
programmability to the next level with the industry's only GPU with
Shader Model 3.0. Second was to extend the reach of GPUs to the
consumer electronics market with a powerful and fully programmable
video processor capable of multiple video formats and 'prosumer' level
image processing."

"As DOOM 3 development winds to a close, my work has turned to
development of the next generation rendering technology. The NV40 is
my platform of choice due to its support of very long fragment
programs, generalized floating point blending and filtering, and the
extremely high performance," said John Carmack, president and
technical director of id Software.
-------------------------------------------------------------------------------------------------------

Still have to hear from Gabe@Valve. All quiet from him, so
far......... Must be busy tweaking the HL2 code for Shader Model 3.0?
Shader Model 2.0 must now be just a little passé..... Far Cry's V1.1
implementation of Shader Model 3.0 is apparently a little rough at the
moment, but Crytek says they are working on it. No doubt it will be
polished in a patch by the time the NV40 is available at retail.

For me personally, the 6800 is as exciting a step forward in PC
peripherals as the Voodoo1 was when it first emerged. Not only for
the 6800's enormous graphical power, but also for its potential
contribution to PC-based video production and editing, which is an
active business for me. The very powerful integrated video processor
is as important to me as the graphics capability, particularly the
MPEG-2 encoding hardware. Adobe has already declared After Effects
support for the NV40, and no doubt other video toolmakers like
Pinnacle are looking hard at its capability. Now if Intel would only
reduce the price of the P4 EE to the retail list price of the 6800
Ultra, or less, instead of fleecing potential customers at $999 a
pop, I would be very happy indeed with my video production/editing
hardware once those two were installed.

John Lewis
 

K

John Lewis said:
Still have to hear from Gabe@Valve. All quiet from him, so
far......... Must be busy tweaking the HL2 code for Shader Model 3.0?
Shader Model 2.0 must now be just a little passé..... Far Cry's V1.1
implementation of Shader Model 3.0 is apparently a little rough at the
moment, but Crytek says they are working on it. No doubt it will be
polished in a patch by the time the NV40 is available at retail.


Gabe's still counting his $5,000,000 in change he got for selling ATI
worthless pieces of paper to bundle in with their cards :)

K
 

John Reynolds

John Lewis said:
From the nVidia news release:-
----------------------------------------------------------------------------------------------------------------
NVIDIA Corporation (NASDAQ: NVDA), the worldwide leader in visual
processing solutions, introduced today the NVIDIA(R) GeForce(TM) 6800
models of graphics processing units (GPUs) for high-performance
desktop computers. The NVIDIA GeForce 6 Series, which includes the
flagship GeForce 6800 Ultra and GeForce 6800, is designed to deliver:

-- Industry-leading 3D performance -- the new superscalar 16-pipe
architecture delivers more than twice the performance of current
industry-leading NVIDIA GPUs

-- New features, including Microsoft DirectX(R) 9.0 Shader Model 3.0
feature set -- for ultra-realistic cinematic effects

-- Unprecedented on-chip video processing engine -- enabling high-
definition video and DVD playback

"This is the biggest generation-to-generation performance leap that we
have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
of NVIDIA. "In addition to the raw performance increase, we had two
fundamental strategies with the 6800 models. First was to take
programmability to the next level with the industry's only GPU with
Shader Model 3.0. Second was to extend the reach of GPUs to the
consumer electronics market with a powerful and fully programmable
video processor capable of multiple video formats and 'prosumer' level
image processing."

"As DOOM 3 development winds to a close, my work has turned to
development of the next generation rendering technology. The NV40 is
my platform of choice due to its support of very long fragment
programs, generalized floating point blending and filtering, and the
extremely high performance," said John Carmack, president and
technical director of id Software.
-------------------------------------------------------------------------------------------------------

Still have to hear from Gabe@Valve. All quiet from him, so
far......... Must be busy tweaking the HL2 code for Shader Model 3.0?
Shader Model 2.0 must now be just a little passé..... Far Cry's V1.1
implementation of Shader Model 3.0 is apparently a little rough at the
moment, but Crytek says they are working on it. No doubt it will be
polished in a patch by the time the NV40 is available at retail.

For me personally, the 6800 is as exciting a step forward in PC
peripherals as the Voodoo1 was when it first emerged. Not only for
the 6800's enormous graphical power, but also for its potential
contribution to PC-based video production and editing, which is an
active business for me. The very powerful integrated video processor
is as important to me as the graphics capability, particularly the
MPEG-2 encoding hardware. Adobe has already declared After Effects
support for the NV40, and no doubt other video toolmakers like
Pinnacle are looking hard at its capability. Now if Intel would only
reduce the price of the P4 EE to the retail list price of the 6800
Ultra, or less, instead of fleecing potential customers at $999 a
pop, I would be very happy indeed with my video production/editing
hardware once those two were installed.

John Lewis

Your post would be. . .hmmm, what's the word. . .more legit if you weren't
coming off as an nVidia fanboy flaming away at Valve, John. Newell simply
voiced what every developer knew about the FX parts: they sucked at running
DX9 code at floating-point precision. Hell, these NV40 previews show that
more clearly than anything else. And what do Carmack and Newell have in
common? Their companies' new engines both required special code paths to
get good performance out of FX boards. Think about that, John. Oh, and as
for Far Cry, whether those new screenshots require SM 3.0 support is still
up in the air. I've heard they were created using offset mapping, not
vertex texturing; that was written by Democoder, the guy who got that
Unreal 3 engine movie and some Far Cry shots yesterday (he's a regular
poster at B3D).
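
For anyone wondering, offset mapping (a.k.a. parallax mapping) is just a
per-pixel nudge of the texture coordinates along the view direction using
a height map, which is plain SM 2.0-era pixel math; vertex texture fetch
is the part that actually needs SM 3.0. A rough C sketch of the idea
(names and constants are made up for illustration, certainly not
Crytek's code):

/* Offset (parallax) mapping: shift the texture lookup toward the viewer
 * by an amount proportional to the sampled height, faking depth without
 * any vertex texturing. Purely illustrative. */
typedef struct { float x, y; } vec2;

vec2 offset_map_uv(vec2 uv,       /* interpolated texcoords            */
                   float height,  /* height map sample at uv, in [0,1] */
                   vec2 view_ts,  /* tangent-space view direction (xy) */
                   float scale,   /* artist-tuned depth scale          */
                   float bias)    /* artist-tuned bias                 */
{
    float h = height * scale + bias;          /* remap the height   */
    vec2 shifted = { uv.x + h * view_ts.x,    /* nudge the lookup   */
                     uv.y + h * view_ts.y };  /* along the view dir */
    return shifted;
}

Sample the color/normal maps at the shifted coordinates instead of the
originals, and flat surfaces pick up apparent relief.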

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat and the fact that both AA and AF
could be better. It'll be interesting to see if the R420 from ATI can
compete. Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds
 

John Reynolds

Skippy said:
$7 million, actually...

And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!

John
 

Destroy

John Reynolds said:
And how much did nVidia pay Activision/id for the Doom 3 deal? I've heard
4-5 million. Gwarsch! Carmack has sold out to nVidia!! He's deh devil!!

He also seems to be in bed with Intel. His engines always run better on
non-AMD systems.
 

John Lewis

John Reynolds said:
Your post would be. . .hmmm, what's the word. . .more legit if you weren't
coming off as an nVidia fanboy flaming away at Valve, John. Newell simply
voiced what every developer knew about the FX parts: they sucked at running
DX9 code at floating-point precision. Hell, these NV40 previews show that
more clearly than anything else. And what do Carmack and Newell have in
common? Their companies' new engines both required special code paths to
get good performance out of FX boards. Think about that, John. Oh, and as
for Far Cry, whether those new screenshots require SM 3.0 support is still
up in the air. I've heard they were created using offset mapping, not
vertex texturing; that was written by Democoder, the guy who got that
Unreal 3 engine movie and some Far Cry shots yesterday (he's a regular
poster at B3D).

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat

The NV40 GPU consumes ~25 watts more than the NV35 or R350.
The whole board consumes a max of 110 watts. Compare the
Prescott 3.4 CPU @ 103 watts max (Northwood 3.4, 89 watts).

and the fact that both AA and AF
could be better.

In what way... please be specific...
It'll be interesting to see if the R420 from ATI can
compete

Let's hope that they have a VPU on board that is competitive with
that in the NV40. For professional video applications, that feature
is almost as important as the graphics-engine features.

Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds

aka John "Ati fanboy now, past 3dfx and nVidia fanboy" Reynolds.

John Lewis
 

Derek Baker

John Reynolds said:
Your post would be. . .hmmm, what's the word. . .more legit if you weren't
coming off as an nVidia fanboy flaming away at Valve, John. Newell simply
voiced what every developer knew about the FX parts: they sucked at running
DX9 code at floating-point precision. Hell, these NV40 previews show that
more clearly than anything else. And what do Carmack and Newell have in
common? Their companies' new engines both required special code paths to
get good performance out of FX boards. Think about that, John. Oh, and as
for Far Cry, whether those new screenshots require SM 3.0 support is still
up in the air. I've heard they were created using offset mapping, not
vertex texturing; that was written by Democoder, the guy who got that
Unreal 3 engine movie and some Far Cry shots yesterday (he's a regular
poster at B3D).

Anyways, the 6800U looks like a very impressive part. The only real
negatives are the power consumption/heat and the fact that both AA and AF
could be better. It'll be interesting to see if the R420 from ATI can
compete. Anthony "Reverend" Tan just quoted Tim Sweeney on B3D's board and
Tim said, about R420, that "It rocks!". This next generation is definitely
going to be much more interesting than last year's, that's for sure.

John "fanboys suck" Reynolds

Got a link for that Tan comment?
 

John Reynolds

John Lewis said:
The NV40 GPU consumes ~25 watts more than the NV35 or R350.
The whole board consumes a max of 110 watts. Compare the
Prescott 3.4 CPU @ 103 watts max (Northwood 3.4, 89 watts).

I read so many previews yesterday that I don't remember which one, but one
of them did show the 6800U hitting around 200 watts under load.
In what way... please be specific...

The AA is limited to 4x for multi-sampling, and it lacks gamma correction
and programmable patterns. It's an improvement over previous nVidia parts,
but it still lags behind ATI's AA. And the AF is now angle-dependent like
ATI's, which is an intentional step down in quality.
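
To be concrete about the gamma correction bit: if the hardware averages
the sRGB-encoded samples directly, edge pixels come out too dark; a
gamma-correct resolve decodes to linear light, averages, then re-encodes.
A toy C sketch (the 2.2 exponent is the usual textbook approximation, not
any particular chip's behavior):

#include <math.h>

/* Gamma-correct resolve of n sRGB-encoded AA samples for one channel.
 * Averaging the encoded values directly is what non-gamma-corrected
 * hardware does, and it darkens antialiased edges. */
float resolve_gamma_correct(const float *srgb, int n)
{
    float sum = 0.0f;
    for (int i = 0; i < n; i++)
        sum += powf(srgb[i], 2.2f);           /* decode to linear light */
    return powf(sum / (float)n, 1.0f / 2.2f); /* re-encode to sRGB      */
}
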
Let's hope that they have a VPU on board that is competitive with
that in the NV40. For professional video applications, that feature
is almost as important as the graphics-engine features.



aka John "Ati fanboy now, past 3dfx and nVidia fanboy" Reynolds.

I used to be a 3dfx fanboy years ago. I'll never make that mistake again.
Though I've been running ATI hardware since Sept. of '02 when the 9700 Pro
came out, I'll switch to a 6800U in a heartbeat if I think it's the better
part (gotta wait for those R420 previews).

John
 

John Lewis

John Reynolds said:
I used to be a 3dfx fanboy years ago. I'll never make that mistake again.
Though I've been running ATI hardware since Sept. of '02 when the 9700 Pro
came out, I'll switch to a 6800U in a heartbeat if I think it's the better
part (gotta wait for those R420 previews).

The sooner the better.... for Ati. Hopefully they will be able to
stick with the April 26 introduction, and retail availability will
be close to nVidia's. Vicious competition is always great for the
customer's pocketbook.

John Lewis
 

John Lewis


Quote from the above link.....
-------------------------------------------------------------------------------

Seriously though, making a solid decision about the NV40 would have to
depend on the answer to "I wonder how it compares to the R420?". On
its own, in Tim Sweeney's words:

Tim Sweeney, to Rev via email prior to lifting of embargo, wrote:
It rocks!

--------------------------------------------------------------------------------

From the comparison-question context, surely Tim is referring to the
NV40, not the R420, when he says "It rocks"???


John Lewis
 

John Lewis

John Reynolds said:
Kids, don't let middle-aged people post online. I misread that freakin'
post. . .Sweeney is actually referring to NV40, not R420. Double d'oh!!!

John

You are forgiven. Very easy to do when you lose your reading
glasses....

So, better start saving now for the 6800U??? And a power supply.
Might as well just get a new case and power supply of the style
with the fan (hopefully quiet) in the middle of the cover and
close to the 6800U.........

John Lewis
 

teqguy

ec said:
That's because AMD sucks ass.

I'll take SSE2 over 3DNow! any day....

I mean, come on people... give it up... 3DNow!'s optimizations are now
INCLUDED on most GPUs.

3DNow! only aided older cards that didn't have the bandwidth to match
the processor.

Now the GPUs are outperforming the CPU, so it's entirely unnecessary.

AMD should definitely update this instruction set... if they did, they'd
be god.

Imagine a processor with Pixel Shader 3.0, bump mapping, vertex
shading... drool
 

Andrew

Destroy said:
He also seems to be in bed with Intel. His engines always run better on
non-AMD systems.

Because SSE is mysteriously disabled in Q3-engined games when run on
AMD CPUs.
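
For anyone who wants to see how that sort of thing can happen: the
honest check is the SSE feature bit from CPUID, and additionally gating
it on the vendor string is exactly the kind of logic that would
"mysteriously" turn SSE off on AMD. A purely illustrative C sketch
(using GCC's cpuid.h; no claim this resembles id's actual detection
code):

#include <cpuid.h>
#include <stdio.h>
#include <string.h>

static int sse_enabled(void)
{
    unsigned int eax, ebx, ecx, edx;
    char vendor[13];

    if (!__get_cpuid(0, &eax, &ebx, &ecx, &edx))
        return 0;
    memcpy(vendor + 0, &ebx, 4);   /* vendor string is packed into */
    memcpy(vendor + 4, &edx, 4);   /* EBX, EDX, ECX in that order  */
    memcpy(vendor + 8, &ecx, 4);
    vendor[12] = '\0';

    __get_cpuid(1, &eax, &ebx, &ecx, &edx);
    int has_sse = (edx >> 25) & 1; /* CPUID.1:EDX bit 25 = SSE */

    /* The right test is just has_sse; the extra vendor check below is
     * the hypothetical bug that disables SSE on AMD parts. */
    return has_sse && strcmp(vendor, "GenuineIntel") == 0;
}

int main(void)
{
    printf("SSE path: %s\n", sse_enabled() ? "on" : "off");
    return 0;
}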
 

NightSky 421

John Lewis said:
"This is the biggest generation-to-generation performance leap that we
have ever seen with a new GPU," said Jen-Hsun Huang, president and CEO
of NVIDIA.


I hope so, since nVidia's reputation depends on it. That said, I have
read what Tech Report had to say about the 6800 Ultra, and it looks as
though nVidia has really smartened up with their technology this time
around. I'm still left with a bad taste from the way their PR department
handled the NV30 series, though. I also look forward to seeing
comparative reviews once both nVidia and ATI have all of their new cards
out.
 

NightSky 421

John Lewis said:
You are forgiven. Very easy to do when you lose your reading
glasses....


LOL, I know the feeling as well, except that my glasses are far thicker
than just ordinary reading glasses. :)

So, better start saving now for the 6800U??? And a power supply.
Might as well just get a new case and power supply of the style
with the fan (hopefully quiet) in the middle of the cover and
close to the 6800U.........


Personally, I'll probably wait for the product refresh and grab a
first-gen R420 or NV40 (probably R420) when they drop sharply in price.
It looks like I'll need a new power supply for it too, though! I'm just
glad that I already have a decent case with good cooling (Lian-Li
PC-60).
 

zmike6

ec said:
That's because AMD sucks ass.

Yeah, that Athlon64's really a dog, compared to the highly successful
and desirable Prescwatt, and the easily-obtainable Emergency
^H^H^H^H^H^H^ Extreme Edition. It's not like most benchmarks show
the Athlon64 to have better performance *and* lower power
consumption and heat. And who would want a CPU that already
supports the next generation of operating systems, when you could get
an Intel 32-bit-only CPU?
 
