NV40 ~ GeForce 6800 specs


teqguy

chrisv said:
>> I've read that a person better have a monster power supply and
>> excellent case cooling before even considering such cards. I also
>> wonder how loud the fans on these new cards are going to need to be.
>> It'd be interesting to see what they can do with regards to cooling
>> and power consumption on future video cards too - I see this as
>> getting to be more and more of a problem with time.

Surely you can't believe that we can take the advice of someone who
thinks that power "consumption" is measured in Volts. What you wrote
is complete drivel, sorry.







Uh, voltage is part of consumption... along with amperage....


Right now, this card draws somewhere between 32 and 35 amps.

The FX series ranges from 28 down to 23 amps, going from the 5950 to
the 5200.
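
Strictly speaking, amps alone don't tell you what the PSU has to deliver;
watts do, and watts are volts times amps summed over the rails feeding the
card. A minimal sketch of that arithmetic in Python, using made-up rail
currents for illustration rather than real 6800 figures:

    # Power (W) = volts x amps, summed across the supply rails feeding the card.
    # The rail currents below are hypothetical, for illustration only.
    rails = {"+3.3V": (3.3, 6.0), "+5V": (5.0, 4.0), "+12V": (12.0, 4.5)}
    total_watts = sum(volts * amps for volts, amps in rails.values())
    print(round(total_watts, 1))  # 19.8 + 20.0 + 54.0 = 93.8 W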






If you have nothing to contribute, shut up. What's drivel is your
obsessive need to critique everything anyone ever says.
 

teqguy

joe said:
That kind of sucks with the K8V, if anyone has taken notice of where the
FireWire connector is on the motherboard.. it might fit perfectly
tho, you never know before you try.. :)

I heard that NV40 boards will have two power connectors...? When
RADEONs came with just one, I thought that was already one too many,
LOL, but since it's inside the case, who cares at the end of the day.
But two? Huh! 200+ million transistors sure suck some power.. but
surely a 350 Watt supply with only 5 IDE devices connected should be
enough? ;-)

It would suck to find out suddenly (from the smoke coming out of the
PSU) that oh shit, it looks like 450-500 watts would be required after
all... I find that amazingly unlikely, but since someone else in this
thread was concerned about his PSU being sufficient, I had to ask. The
NV40 would rock for programming, because for quite a while it will be
the only way to try out VS 3.0 and PS 3.0, if I am not mistaken? I
read on this NG that ATI wouldn't have these in their new chip - why
the hell not!?

Peace.




The two power connectors will eventually come down to one... right now,
testing is simply showing that stability is better achieved using four
rails instead of two.
 

teqguy

G said:
But what about for things that don't have textures at all?

PCI Express is not only bi-directional, but full duplex as well. The
NV40 might even use this to great effect, with its built-in hardware
accelerated MPEG encoding/decoding plus "HDTV support" (which I assume
means it natively supports 1920x1080 and 1280x720 without having to
use Powerstrip). The lower-cost version should be sweet for Shuttle-sized
Media PCs that will finally be able to "tivo" HDTV.

I can also see the 16X slot being used in servers for other things
besides graphics. Maybe in a server you'd want your $20k SCSI RAID
Controller in it. Or in a cluster box a 10 gigabit NIC.

There's more to performance than just gaming. And there's more to PCI
Express than just the 16X slot which will be used for graphics cards
initially. AGP was a hack, and (as others have said) it hit the wall
at "4X". PCI Express is a VERY well thought out bus that should be
a lot better than PCI, PCI-X, and AGP... not to mention things bolted
directly to the Northbridge. If it helps games a little in the
process, it's just gravy.





Most MPEG encoding is processor dependent... I wish developers would
start making applications that let the graphics card do video encoding,
instead of dumping the work on the processor.




The bandwidth of AGP 2X can carry a high definition signal... so I
don't understand how you can expect PCI-Express to do it any better.





Last time I checked, an HD signal operates at 8Mb/s.... DVD @
2.5Mb/s... VCR @ 250Kb/s

PCI Express can potentially carry up to 4.3Gb/s... so do the math.
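
Taking the figures quoted above at face value (they are the numbers from
this thread, not authoritative specs), the headroom works out roughly like
this:

    # Rough headroom calculation using the bit rates quoted in this thread.
    hd_stream  = 8e6     # bits/s, the HD figure quoted above
    dvd_stream = 2.5e6   # bits/s
    pcie_x16   = 4.3e9   # bits/s, the PCI Express figure quoted above
    print(pcie_x16 / hd_stream)   # ~537: hundreds of HD streams' worth
    print(pcie_x16 / dvd_stream)  # ~1720 DVD streams' worth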





SCSI only operates at 320MB/s.

In a RAID 0 stripe, it's roughly 460MB/s.


So again... a lot more bandwidth than required.



And definitely a lot more expensive than using onboard SCSI.
 

joe smith

pfft. You don't even know what the ATI offering is as yet, much less
are you able to buy a 6800 until well into next month.

No, I do not. I wrote that the rumor is that ATI won't have 3.0-level
shaders.. I was commenting on a rumor; if that isn't true then the situation
is naturally entirely different. The confidentiality / NDA ends the 19th of
this month, so soon after that we should begin to see cards trickling onto
the shelves like always (I've just noticed a trend over the past 5-7 years,
could be wrong, but I wouldn't die if I had to wait even 2 months.. or 7..
or 3 years.. the stuff will get here sooner or later.. unless the world
explodes before that ;)=

Relax dude, you don't have to pfff; obviously any intelligent person knows
what you're saying.. I wasn't commenting on that, or claiming that the cards
will be here TOMORROW!!!! Or that ATI will definitely NOT have 3.0-spec
shaders. Now, if you want to argue that point, look up the person who posted
the RUMOR about that, then PFFFF his ass! Pfff... <- now that is for a valid
reason... heh :)
 

Tony DiMarzio

I'd have to agree. It looks like this guy is trying to masquerade anti-ATI
sentiment as nonchalance and "no-brainer" NVIDIA superiority. Sorry, but
your weak psychology is definitely not fooling me.
 

chrisv

teqguy said:
Uh, voltage is part of consumption... along with amperage....

That doesn't make what you said above sensible. It was senseless
drivel. Deal with it.
The amount of amperes this card will consume is right now between 32
and 35.

The FX series is 28 to 23, ranging from the 5950 to the 5200.

Better late than never, I guess.
If you have nothing to contribute, shut up.

If you're just going to post drivel, shut up.
What's drivel is your
obsessive need to critique everything anyone ever says.

Wrong again.
 

JLC

joe smith said:
Ah,

http://frankenstein.evilgeniuslabs.com/~pszuch/nv40/news.html

I see from the pictures (assuming they're not fakes ;) that the card should
fit reasonably into a "single" AGP (8x) slot, more or less.. that's nice, but
the best part about this whole debacle is the two DVI ports. That is the part
I like the most; I'm currently using DVI + DB25 to drive two TFTs.

It says right in that same article that the new cards will take two slots,
but it is possible for vendors to come out with single-slot cards.
I find it amazing that it says Nvidia recommended that their testers
use at least a 480W PS. That's going to be a very expensive upgrade for a lot
of people. And a lot of guys who think they have a 480W+ PS will find that
their cheap PS is not up to the task.
So the Ultra is gonna start at $499, plus say another $100 for a quality PS.
Wow, $599 just to play games that probably don't need a fraction of the power
the new card can deliver. Let's hope that Doom 3 runs great on this card.
Of course, by the time the game finally comes out this card will probably
cost $150. JLC
 

DaveL

I think Nvidia learned their lesson about that from the 5800U debacle. It
was ATI that stayed with the old standard and took the lead in performance.
Meanwhile, Nvidia was struggling with fab problems.

DaveL
 

JLC

They're still releasing AGP 8x versions alongside PCI Express x16. I read
somewhere that Nvidia is doing something with a bridging device while ATI is
making totally separate cards, i.e. the R420 is AGP 8x and the R423 is a
proper PCI Express x16 card. I cannot for the life of me remember where I
read it though, sorry. It *could* have been Anandtech.
Right on ATI's site it says that they are the only company making a
"True PCI Express card". It's right on their front page.
JLC
 

DaveL

Yeah. That's one odd feature I don't get. Those power connectors are tied
to the same source. I guess the wires can only carry so much. But what
about the traces on the power supply?

DaveL
 

joe smith

I'd have to agree. It looks like this guy is trying to masquerade anti-ATI
sentiment as nonchalance and "no-brainer" NVIDIA superiority. Sorry, but
your weak psychology is definitely not fooling me.

Wrong. I have a RADEON 9700 PRO in an Athlon64 3000+ box and it's fast enough
for everything. I'm a programmer, so I am very interested in the 3.0 shaders;
that is the only reason for me to upgrade at all. That's why I don't have a
RADEON 9800 XT: it only has increased speed, and that is something I am not
too thrilled about at this point because the 9700 PRO is very nice already.
How is that anti-ATI sentiment?

I repeat: I am interested in programming for the 3.0 shaders. If ATI won't
deliver, then I'll have to go back to NV-based products. The GeForce4 was the
last NV card I purchased; it's now in a Pentium4 2.4 GHz box running Linux
Mandrake 10.0 Community. It works great there. I have nothing against nVidia
as such. It's just a fact that until the recent rumors and reviews of the
GeForce 6800 on various websites, I didn't even think NV was worth crap
against ATI's latest offerings.

How is that anti-ATI sentiment? I'm entitled to be thrilled about 3.0
shaders as much as I want, and I can post how thrilled I am if I want.
And you can make claims about me, but you don't have to be right; in fact,
you are entirely mistaken and incorrect. Thanks for your time.
 

joe smith

use at least a 480W PS. That's going to be a very expensive upgrade for a
lot of people.

It sure will.
new card can deliver. Let's hope that Doom 3 runs great on this card.
Of course by the time the game finally comes out this card will probably
cost $150. JLC

That's a very good point; I was only speaking for myself. I don't play games
much at all; we do some Black Hawk Down and Warcraft III TFT multiplayer a
couple of times a week. For that, a pretty old card would suffice. It's the
work that I need the latest features for, and I won't even be paying for the
card myself anyway. :)
 

DaveL

It just occurred to me that it may be the traces on the 6800U that Nvidia is
worried about. Why not just use wider traces?
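
One likely reason, as a rough sketch: a trace's current capacity grows more
slowly than its cross-section. Using the commonly cited IPC-2221 rule of
thumb for external traces, and assuming 1 oz copper and a 10 degree C
temperature rise (my assumptions, for illustration only), you need
surprisingly wide copper to carry tens of amps, which is why boards lean on
power planes and extra connector pins instead:

    # Rough IPC-2221 rule of thumb for external PCB traces:
    #   I = 0.048 * dT**0.44 * A**0.725   (A in square mils, dT in deg C rise)
    # Purely illustrative; assumes 1 oz copper (~1.4 mil thick) at a 10 C rise.
    def max_current(width_mil, thickness_mil=1.4, dt=10.0, k=0.048):
        area = width_mil * thickness_mil
        return k * dt**0.44 * area**0.725

    print(round(max_current(50), 1))    # ~2.9 A for a 50 mil trace
    print(round(max_current(100), 1))   # ~4.8 A for a 100 mil trace
    print(round(max_current(280), 1))   # ~10 A needs roughly a 280 mil trace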

DaveL
 

K

There's more to performance than just gaming. And there's more to PCI
Express than just the 16X slot which will be used for graphics cards
initially. AGP was a hack, and (as others have said) it hit the wall
at "4X". PCI Express is a *VERY* well thought out bus that should be
a lot better than PCI, PCI-X, and AGP... not to mention things bolted
directly to the Northbridge. If it helps games a little in the
process, it's just gravy.

I was only talking about what PCI Express will do for graphics, which I
think will be very little. Of course it is going to be great for
applications such as RAID, 1Gb and 10Gb Ethernet, etc. PCI has served us
well, but it's time to move on. There are lots of good reasons for PCI
Express, but not gfx.

I wish that Aureal was still around. One of the problems with the Vortex 2,
and more so with the Vortex 3, was that A3D was very heavy on the PCI bus
with all the positional information it had to share with the CPU. PCI Express
would have gone really well with the Vortex 3. But those bastards at Craptive
sent Aureal under with a wave of malicious litigation, and now the tech is
sitting in a vault somewhere. Now we can only dream of what could have been...

K
 

Mr. Grinch

JLC said:
It says right in that same article that the new cards will take two
slots, but it is possible for vendors to come out with single-slot
cards. I find it amazing that it says Nvidia recommended that
their testers use at least a 480W PS. That's going to be a very
expensive upgrade for a lot of people. And a lot of guys who think
they have a 480W+ PS will find that their cheap PS is not up to the
task. So the Ultra is gonna start at $499, plus say another $100 for a
quality PS. Wow, $599 just to play games that probably don't need a
fraction of the power the new card can deliver. Let's hope that Doom 3
runs great on this card. Of course, by the time the game finally
comes out this card will probably cost $150. JLC

I _really_ want a new system right now. I mean, I'm running a dual P3-800
with Ti4200 video, and it just doesn't cut it for today's games. But the
game I know I want is Doom 3, and who knows when it will be out. When it
does come out, it's anyone's guess what will be the best video card for the
game. There will be the fastest, then there will be the best price /
performance cards, a little slower, a lot cheaper, etc. I'm just going to
have to wait until the game comes out if I don't want to spend too much
money and want to be really sure, making a decision based on real
benchmarks of production code and production hardware.

But I hate waiting! My current setup is killing me! I'm sure it's not
the Ti4200's fault, it's a great card; I'm just too CPU limited. But
again, it will be interesting to see which CPU / video card combo does Doom
3 the best. More waiting!

Argh!
 

G

teqguy said:
The bandwidth of AGP 2X can carry a high definition signal... so I
don't understand how you can expect PCI-Express to do it any better.

Nope. AGP's upstream bandwidth is only half-duplex. It's not the
bandwidth that's the problem.

Here's an article that explains it in detail (with further links to
PCI Express info as well): "PCI Express and HD Video: Marriage Made in
Heaven?"

http://www.extremetech.com/article2/0,1558,1533061,00.asp
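
To put rough numbers on why the direction of the traffic matters: the hard
case isn't a compressed broadcast stream but uncompressed HD frames
travelling upstream to the CPU for encoding or editing, which is exactly
where AGP is weakest. A back-of-the-envelope sketch, assuming 1080-line video
at 30 frames/s and 2 bytes per pixel (these assumptions are mine, not taken
from the article):

    # Back-of-the-envelope upstream bandwidth for uncompressed HD video.
    # Assumes 1920x1080 at 30 frames/s and 2 bytes/pixel (YUV 4:2:2).
    width, height, fps, bytes_per_pixel = 1920, 1080, 30, 2
    bytes_per_second = width * height * fps * bytes_per_pixel
    print(bytes_per_second / 1e6)      # ~124 MB/s
    print(bytes_per_second * 8 / 1e9)  # ~1.0 Gb/s of sustained upstream traffic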

SCSI only operates at 320MB/s.
In a RAID 0 stripe, it's roughly 460MB/s.
So again... a lot more bandwidth than required.

That's not the point. SCSI controllers don't sit in the AGP slot. If
you're switching to comparing PCI Express with PCI/PCI-X then you have
to switch to talking about total bandwidth in the whole system.
Besides, SCSI is up to 640MB/s.
And definitely a lot more expensive than using onboard SCSI.

Being onboard has nothing to do with it either. The onboard controller
has to be connected somehow. It's on some bus or another even if it's
not sitting in a slot.
 

Mark Leuck

chrisv said:
Better late then never, I guess.


If you're just going to post drivel, shut up.


Wrong again.

If I recall, you are the same chrisv who stated over and over in
alt.computer.storage that IBM never had a problem with its last batch of
Deathstar hard drives.

Ignore the troll, folks; he knows not what he says.
 

Ricardo Delazy

Most MPEG encoding is processor dependent... I wish developers would
start making applications that let the graphics card do video encoding,
instead of dumping the work on the processor.

I'm pretty sure I read somewhere that the (new & improved) Prescotty
processor has been given a special hard-wired instruction set which is
dedicated to encoding video, so that should speed things up somewhat.

I remember reading an article over a year ago in which Intel gave a
demo of a future-release CPU that was apparently running 3 full-screen
HD videos simultaneously, rotating on a 3D cube. The processor prototype
was not specified, but it may have been a Tejas, as it was rated at 5GHz.

Ricardo Delazy
 
