NV40 ~ GeForce 6800 specs


NV55

the following is ALL quote:


http://frankenstein.evilgeniuslabs.com/~pszuch/nv40/news.html


Tuesday, April 13, 2004

NVIDIA GeForce 6800 GPU family officially announced — Cormac @ 17:00
It's time to officially introduce the new GPU generation from NVIDIA
and shed light on its architecture and features.

So, the GeForce 6800 GPU family, codenamed NV40, today officially
entered the distribution stage. Initially it will include two chips,
GeForce 6800 Ultra and GeForce 6800, with the same architecture.


These are the key innovations introduced in NVIDIA's new products:

*16-pipeline superscalar architecture with 6 vertex units, GDDR3
support and true 32-bit (FP32) pipelines
*PCI Express x16, AGP 8x support
*222 million transistors
*400MHz core clock
*Chips made by IBM
*0.13µm process


*40x40mm FCBGA (flip-chip ball grid array) package
*ForceWare 60+ series
*Supports 256-bit GDDR3 with over 550MHz (1.1GHz DDR) clock rates
*NVIDIA CineFX 3.0 supporting Pixel Shader 3.0, Vertex Shader 3.0;
real-time Displacement Mapping and Tone Mapping; up to 16
textures/pass, 16-bit and 32-bit FP formats, sRGB textures, DirectX
and S3TC compression; 32bpp, 64bpp and 128bpp rendering; lots of new
visual effects
*NVIDIA HPDR (High-Precision Dynamic-Range) on OpenEXR technology
supporting FP filtering, texturing, blending and AA
*Intellisample 3.0 for extended 16xAA, improved compression
performance; HCT (High-resolution compression), new lossless
compression algorithms for colors, textures and Z buffer in all modes,
including hi-res high-frequency, fast Z buffer clear
*NVIDIA UltraShadow II for 4 times the performance in highly shadowed
games (e.g. Doom III) compared to older GPUs

*Extended temperature monitoring and management features
*Extended display and video output features, including int.
videoprocessor, hardware MPEG decoder, WMV9 accelerator, adaptive
deinterlacing, video signal scaling and filtering, int. NTSC/PAL
decoder (up to 1024x768), Macrovision copy protection; DVD/HDTV to
MPEG2 decoding at up to 1920x1080i; dual int. 400MHz RAMDAC for up to
2048x1536 @ 85Hz; 2 x DVO for external TMDS transmitters and TV
decoders; Microsoft Video Mixing Renderer (VMR); VIP 1.1 (video
input); NVIDIA nView
*NVIDIA Digital Vibrance Control (DVC) 3.0 for color and image clarity
management
*Supports Windows XP/ME/2000/9X; MacOS, Linux
*Supports the latest DirectX 9.0, OpenGL 1.5


http://frankenstein.evilgeniuslabs.com/~pszuch/nv40/news_files/nv3.png



We have only just received a GeForce 6800 sample, so it's still too early
to speak about GPU power consumption. Still, a giant core with 222 million
transistors implies a healthy appetite for power: NVIDIA recommends that
testers use 480W or higher power supplies. By the way, GeForce 6800 Ultra
reference cards will occupy two standard slots. However, this isn't
obligatory for all vendors, so we might see single-slot models as well.

Well, having seen the GPU, we now have to wait a bit for its test
results. Please be patient; we are going to publish the corresponding
article in the near future.

To close this news item, I'll mention the NVIDIA partners that will support
the new release with products based on it. They are Albatron, AOpen, ASUSTeK
Computer, Chaintech, Gainward, Leadtek Research, MSI, Palit
Microsystems, PNY Technologies, Prolink Computer, Shuttle and XFX
Technologies.


http://frankenstein.evilgeniuslabs.com/~pszuch/nv40/news.html






quote:

"8 shader units per pipeline and 16 pipelines..."

http://www.beyond3d.com/forum/viewtopic.php?t=11484
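
A quick back-of-the-envelope check of the quoted figures (a little C++
sketch only, using the numbers from the article above: 16 pixel pipelines
at 400MHz and a 256-bit GDDR3 bus at 1.1GHz effective):

#include <stdio.h>

int main() {
    const double core_mhz  = 400.0;    /* quoted core clock */
    const int    pipelines = 16;       /* quoted pixel pipelines */
    const double mem_mts   = 1100.0;   /* 550MHz GDDR3 -> 1.1 GT/s effective */
    const int    bus_bits  = 256;      /* quoted memory bus width */

    /* theoretical pixel fill rate = pipelines * core clock */
    double fill_mpix = core_mhz * pipelines;              /* ~6400 Mpixels/s */

    /* theoretical memory bandwidth = bus width (bytes) * transfer rate */
    double bw_gbs = (bus_bits / 8.0) * mem_mts / 1000.0;  /* ~35.2 GB/s */

    printf("fill rate : %.0f Mpixels/s\n", fill_mpix);
    printf("bandwidth : %.1f GB/s\n", bw_gbs);
    return 0;
}

That works out to roughly 6.4 Gpixels/s of raw fill and about 35 GB/s of
local memory bandwidth; numbers worth keeping in mind for the AGP/PCI
Express bandwidth discussion further down the thread.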
 

Shep©

On 13 Apr 2004 10:51:07 -0700 As truth resonates honesty
the following is ALL quote:


[snip]
*PCI Express x16, AGP 8x support

Looks like new motherboards required?

[rest of quoted article snipped]



--
Free Windows/PC help,
http://www.geocities.com/sheppola/trouble.html
email shepATpartyheld.de
Free songs to download and,"BURN" :O)
http://www.soundclick.com/bands/8/nomessiahsmusic.htm
 

teqguy

K said:
If there is AGP 8x support, why would you need a new motherboard?

K






Because most well-known manufacturers will eventually stop carrying AGP
cards altogether.



If you're into business at all, you know that it's more cost effective
to produce one version of a product than two... unless you're Microsoft
or Donald Trump (aka God).




The voltages on PCI-E and AGP are entirely different, so different
components (such as resistors) must be used.


In order to avoid confusion between the two, you'd have to run two
different production lines, have twice as many labs, and pay for two
types of packaging, manuals, etc.




Having one version of a product cuts down on confusion and returns,
which helps both consumers and retail sales.
 

Mr. Grinch

Because most well-known manufacturers will eventually stop carrying AGP
cards altogether.

The thing is, we've had AGP slots out for over 5 years now, and yet you
still find vendors making PCI video cards. So I wouldn't be too worried
about any lack of AGP video cards for some time to come. They'll be around
long enough to follow any of the current motherboards into obsolescence, at
which point you wouldn't want to be buying a video card or any other
upgrade for them anyways.
 

K

Because most well-known manufacturers will eventually stop carrying AGP
cards altogether.

Eventually, yes, but AGP will be with us well into next year. DDR2 will
replace DDR1. Socket 939 will replace Socket 940, Socket T will replace
Socket 478, BTX will eventually replace ATX, the list goes on in the
never-ending upgrade cycle.

Having one version of a product cuts down on confusion and returns,
which helps both consumers and retail sales.

Absolutely, and I'm sure that the likes of ATI and Nvidia as
well as the motherboard makers will push us to PCI Express as soon as they
can. But it would be suicide for one of them to bring out a new card and
only cater for those who are prepared to buy new motherboards. It's just
that the poster I replied to implied there would be an immediate need to
replace your motherboard, which is clearly not the case.

I have a gut feeling that PCI Express will do very little for performance,
just like AGP before it. Nothing can substitute lots of fast RAM on the
videocard to prevent shipping textures across to the much
slower system RAM. You could have the fastest interface imaginable for
your vid card; it would do little to make up for the bottleneck that
is your main memory.
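
To put some rough numbers on that point (a little C++ sketch; the bandwidth
figures are the usual theoretical peaks, and the 64MB per-frame texture
working set is just a made-up illustration):

#include <stdio.h>

int main() {
    const double working_set_mb = 64.0;    /* hypothetical per-frame texture traffic */

    /* theoretical peak bandwidths in MB/s */
    const double local_gddr3 = 35200.0;    /* 256-bit GDDR3 @ 1.1 GT/s, on the card */
    const double agp8x       = 2133.0;     /* AGP 8x */
    const double pcie_x16    = 4000.0;     /* PCI Express x16, per direction */

    printf("local VRAM : %5.1f ms\n", working_set_mb / local_gddr3 * 1000.0);
    printf("AGP 8x     : %5.1f ms\n", working_set_mb / agp8x * 1000.0);
    printf("PCIe x16   : %5.1f ms\n", working_set_mb / pcie_x16 * 1000.0);
    return 0;
}

Roughly 1.8ms from local VRAM versus about 30ms over AGP 8x and about 16ms
over PCI Express x16. At 60fps a whole frame is only 16.7ms, so anything
that has to cross the bus every frame hurts no matter which bus it is -
which is K's point.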


K
 

Shep©

If there is AGP 8x support, why would you need a new motherboard?

K

Because it's my understanding that although the new protocol/cards
support AGP 8X, this is merely a data-rate comparison and the new cards
will only fit a "PCI-Express" slot, not an AGP one.
http://www.pcstats.com/articleview.cfm?articleID=1087

HTH :)




--
Free Windows/PC help,
http://www.geocities.com/sheppola/trouble.html
email shepATpartyheld.de
Free songs to download and,"BURN" :O)
http://www.soundclick.com/bands/8/nomessiahsmusic.htm
 

NightSky 421

NV55 said:
the following is ALL quote:


Regardless of whether someone wants the new high-end nVidia or ATI product,
I've read that a person had better have a monster power supply and excellent
case cooling before even considering such cards. I also wonder how loud
the fans on these new cards are going to need to be. It'd be
interesting to see what they can do with regard to cooling and power
consumption on future video cards too - I see this as getting to be more
and more of a problem with time.
 

teqguy

NightSky said:
Regardless of whether someone wants the new high-end nVidia or ATI product,
I've read that a person had better have a monster power supply and
excellent case cooling before even considering such cards. I also
wonder how loud the fans on these new cards are going to need to be.
It'd be interesting to see what they can do with regard to cooling
and power consumption on future video cards too - I see this as
getting to be more and more of a problem with time.





The power consumption should stay below 15v.

The Geforce FX does NOT use the 12v rail, for anyone wondering.


All 4 pins are connected for potential usage, but the overall
consumption never raises above 5.5v so 17v is not neccessary.




Most companies are starting to push for water cooling. Gainward is one
of them, having announced that it will start shipping a version of
its cards with a waterblock in place of a conventional heatsink
and fan.



As far as the reference Nvidia cards go... I'm pretty sure we'll start
out with the dustbuster again... at least until someone can decide on a
more effective method.


Solid silver heatsink anyone? =P
 

teqguy

K said:
Eventually, yes, but AGP will be with us well into next year. DDR2
will replace DDR1. Socket 939 will replace Socket 940, Socket T will
replace Socket 478, BTX will eventually replace ATX, the list goes on
in the never-ending upgrade cycle.



Absolutely, and I'm sure that the likes of ATI and Nvidia as
well as the motherboard makers will push us to PCI Express as soon as
they can. But it would be suicide for one of them to bring out a new
card and only cater for those who are prepared to buy new
motherboards. It's just that the poster I replied to implied there
would be an immediate need to replace your motherboard, which is
clearly not the case.

I have a gut feeling that PCI Express will do very little for
performance, just like AGP before it. Nothing can substitute for lots of
fast RAM on the video card to avoid shipping textures across from the
much slower system RAM. You could have the fastest interface
imaginable for your vid card; it would do little to make up for the
bottleneck that is your main memory.


K






Current high-end graphics cards barely make use of an AGP 4x bus, let
alone an 8x bus.



The best possible optimization that could ever be made, would be to
start manufacturing motherboards with sockets for a GPU and either
sockets or slots for video memory.


This would allow motherboards to shrink while improving performance
and upgradability.


The price would increase, but it would be worth it.
 

joe smith

As far as the reference Nvidia cards go... I'm pretty sure we'll start
out with the dustbuster again... at least until someone can decide on a
more effective method.

That kind of sucks with the K8V, if anyone has taken notice of where the
firewire connector is on the motherboard.. it might fit perfectly tho, you
never know before you try.. :)

I heard that NV40 boards will have _two_ power connectors...? When RADEONs
came with just one, I thought that was already one too many, LOL, but since
it's inside the case, who cares at the end of the day. But two? Huh! 200+
million transistors sure suck some power.. but surely a 350 watt supply
with only 5 IDE devices connected should be enough? ;-)

It would suck to find out suddenly (from the smoke coming out of the PSU)
that, oh shit, 450-500 watts is required after all... though I find that
amazingly unlikely, but since someone else in this thread was concerned
about his PSU being sufficient, I had to ask. NV40 would rock for
programming, because it's the only way for quite a while to try out VS 3.0
and PS 3.0, if I am not mistaken? I read in this NG that ATI won't have
these in their new chip - why the hell not!?

Peace.
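
For what it's worth, checking for SM 3.0 at runtime is just a caps query in
Direct3D 9. A minimal C++ sketch (error handling omitted; link against
d3d9.lib; until NV40 ships, no consumer HAL device will report 3.0):

#include <d3d9.h>
#include <stdio.h>

int main() {
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);

    bool vs30 = caps.VertexShaderVersion >= D3DVS_VERSION(3, 0);
    bool ps30 = caps.PixelShaderVersion  >= D3DPS_VERSION(3, 0);

    printf("VS 3.0: %s   PS 3.0: %s\n", vs30 ? "yes" : "no",
                                        ps30 ? "yes" : "no");

    /* vertex texture fetch ("vertex samplers") additionally needs the driver
       to expose a texture format usable from the vertex shader, e.g. D3DFMT_R32F */

    d3d->Release();
    return 0;
}

In principle the reference rasterizer can be used to poke at new shader
models before the hardware arrives, although it is painfully slow.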
 

joe smith

Ah,

http://frankenstein.evilgeniuslabs.com/~pszuch/nv40/news.html

I see from the pictures (assuming they're not fakes ;) that the card should
fit reasonably into a "single" AGP (8x) slot, more or less.. that's nice, but
the best part about this debacle is the two DVI ports. That is the part I
like the most, as I'm currently using DVI + DB25 to two TFTs.

Looks like a winner to me compared to ATI. Performance alone isn't what
turns me on; the RADEON 9700 PRO through RADEON 9800 XT are plenty fast as
they come. The enhanced feature set is what turns me on. Especially these two:

- 3.0 shaders (vertex samplers will be SO cool)
- 32-bit precision for the whole pipeline from vertex to output fragment, v.
cool

The rest is yada yada yada.. but those two features are what 'do it', at
least for me from a coder's point of view. Extra performance is so
yesterday. ;-)
 

joe smith

No, really, I can't believe my eyes that after a two-year trip to the ATI
side I would again consider NV even a candidate, not to mention the #1
choice as a gfx card upgrade. Must suck to base your choices on brand name,
aka the Fanboy's Choice. Looking at the offerings, this is a no-brainer for me. :)
 

chrisv

teqguy said:
The power consumption should stay below 15v.

The Geforce FX does NOT use the 12v rail, for anyone wondering.

All 4 pins are connected for potential usage, but the overall
consumption never raises above 5.5v so 17v is not neccessary.

Surely you can't believe that we can take the advice of someone who
thinks that power "consumption" is measured in Volts. What you wrote
is complete drivel, sorry.
 

chrisv

teqguy said:
The best possible optimization that could ever be made, would be to
start manufacturing motherboards with sockets for a GPU and either
sockets or slots for video memory.


This would allow motherboards to shrink while improving performance
and upgradability.


The price would increase, but it would be worth it.

No it wouldn't.
 

Les

Shep© said:
Because it's my understanding that although the new protocol/cards
support AGP 8X, this is merely a data-rate comparison and the new cards
will only fit a "PCI-Express" slot, not an AGP one.
http://www.pcstats.com/articleview.cfm?articleID=1087

HTH :)





They're still releasing AGP 8x versions alongside PCI Express x16. I read
somewhere that nvidia is doing something with a bridging device while ATI is
making totally separate cards, i.e. the R420 is AGP 8x and the R423 is a
proper PCI Express x16 card. I cannot for the life of me remember where I
read it though, sorry. It *could* have been Anandtech.
 

Ar Q

NV55 said:
the following is ALL quote:


[snip]
*222 million transistors
*400MHz core clock
*Chips made by IBM
*0.13µm process

Isn't it time for NVidia to use a 0.09µm process? How could they put so
many features in while still using a 0.13µm process?
 

G

K said:
I have a gut feeling that PCI Express will do very little for performance,
just like AGP before it. Nothing can substitute for lots of fast RAM on the
video card to avoid shipping textures across from the much slower system
RAM. You could have the fastest interface imaginable for your vid card; it
would do little to make up for the bottleneck that is your main memory.


But what about for things that don't have textures at all?

PCI Express is not only bi-directional, but full duplex as well. The
NV40 might even use this to great effect, with its built-in
hardware-accelerated MPEG encoding/decoding plus "HDTV support" (which I
assume means it natively supports 1920x1080 and 1280x720 without having
to use Powerstrip). The lower-cost version should be sweet for
Shuttle-sized Media PCs that will finally be able to "tivo" HDTV.

I can also see the 16X slot being used in servers for other things
besides graphics. Maybe in a server you'd want your $20k SCSI RAID
Controller in it. Or in a cluster box a 10 gigabit NIC.

There's more to performance than just gaming. And there's more to PCI
Express than just the 16X slot, which will be used for graphics cards
initially. AGP was a hack, and (as others have said) it hit the wall
at "4X". PCI Express is a *VERY* well-thought-out bus that should be
a lot better than PCI, PCI-X, and AGP... not to mention things bolted
directly to the Northbridge. If it helps games a little in the
process, it's just gravy.
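
Rough numbers behind that (a little C++ sketch using PCI Express 1.x
figures; the 1920x1080 at 60 frames/s readback case is just a worst-case
illustration):

#include <stdio.h>

int main() {
    /* PCI Express 1.x: 2.5 Gbit/s per lane, 8b/10b coding -> ~250 MB/s
       usable per lane, per direction, and both directions run at once */
    const double lane_mbs  = 250.0;
    const double x16_mbs   = lane_mbs * 16.0;    /* ~4000 MB/s each way */
    const double agp8x_mbs = 2133.0;             /* AGP 8x peak, mostly downstream */

    /* hypothetical HDTV readback: 1920x1080, 32bpp, 60 frames/s */
    const double hdtv_mbs = 1920.0 * 1080.0 * 4.0 * 60.0 / 1e6;

    printf("PCIe x16 : %.0f MB/s per direction\n", x16_mbs);
    printf("AGP 8x   : %.0f MB/s peak\n", agp8x_mbs);
    printf("1080/60 readback needs ~%.0f MB/s\n", hdtv_mbs);
    return 0;
}

About 500 MB/s of readback fits comfortably in a x16 link's upstream
direction without stealing from downstream traffic, which is exactly where
AGP has always been weak.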
 

rms

The best possible optimization that could ever be made, would be to
No it wouldn't.

haha! I agree completely. Video cards have reached such complexity that
it's doubtful a single company could produce both the motherboard and the
graphics hardware successfully. Not to mention the question of
upgradeability, which is why we have PCI/AGP in the first place.

rms
 

rms

No, really, I can't believe my eyes that after a two-year trip to the ATI
side I would again consider NV even a candidate, not to mention the #1
choice as a gfx card upgrade. Must suck to base your choices on brand name,
aka the Fanboy's Choice. Looking at the offerings, this is a no-brainer for me. :)

pfft. You don't even know what the ATI offering is as yet, much less
are you able to buy a 6800 until well into next month.

rms
 
