Nvidia panics, scraps NV50 ?????

Xenon

I take this as just a rumor, for now.

http://www.theinquirer.net/?article=20034
Nvidia's NV50 canned

New things to come


By Fuad Abazovic: Friday 03 December 2004, 13:57

WE ARE told that Nvidia's next gen chip NV50 has been canned as well as the
NV48 chip we reported on earlier in the week. I guess both were not fitting
well into Nvidia's picture.
We don't have any idea as yet what led to such a decision, but Nvidia does
apparently think it's now time to make its next breakthrough chip.

All we know is that Nvidia made a huge u-turn or a right turn in its roadmap
as Intel describes it, and we don't yet know where that leads.

We also know that Nvidia is very dedicated to winning the graphics fight
and to moving away from the deadly embrace with ATI. In the current
generation, it's
not important what you get as both Nvidia and ATI cards are performing
almost identically.

Nvidia still strongly believes that SLI is something that will become very
important in the future. µ



____________________________________________________________________________



Hmmm. Maybe Nvidia has seen ATI's R520 and/or the more advanced R500 (for
Xbox 2?) and knows that ATI's next-next-gen graphics part, the R600, will
be a killer. The NV50 would have been facing the R600 in the early-2006
timeframe, so maybe Nvidia is changing its roadmap. Maybe Nvidia is
accelerating the NV60 design, or will simply ship what would have been the
NV55 as the NV50, or whatever. Remember, both Nvidia and ATI have at least
3 GPU design teams each and are always working on several generations of
chips at the same time.

Nvidia has, so far, been shut out of the next-gen game consoles. ATI has 2
of the 3 locked up (MS, Nintendo), and Sony has, so far, done its graphics
in-house. Nvidia needs some breakout technology or a breakout deal (maybe
Sony?) to follow up the successful NV4x / GeForce 6 series.

It'll be fun to see what Nvidia does; they're a survivor. Remember the
disasters that were the NV1 (Diamond Edge 3D card) and the NV2 (for Sega)
in 1995-1996? They bounced back very nicely with the Riva 128 (NV3), then
the TNT (NV4), TNT2 (NV5), and GeForce (NV10).
 
Sham B

If true, it can only mean good things for us: if Nvidia can bring out an
ATI killer, ATI will kick back! Or could it be that ATI is doing to Nvidia
what Nvidia did to 3DFX? Only time will tell!


nVidia *is* 3DFX when it comes to its design staff - it absorbed a lot of
3DFX people during the takeover.

I'm sure nVidia's problems have the same cause as 3DFX's - the large
amount of heat their designs generate, and the increasing cost that this
imposes. At the moment, high-end ATI and nVidia cards are well matched,
and the only real comparison for non-fanboys is price. My opinion is that
nVidia can compete on performance but not on cost per unit. This is
exactly what happened to 3DFX - they could create cards that were as
fast, but they were huge, had high chip counts, and they ran hot.

It looks like nVidia know this and are going back to the drawing board to
design out the heat bottleneck. IMO, you won't get faster versions of
nVidia chips because of this, just a more cost-effective base design.

S
 
Kokoro

In alt.comp.periphs.videocards.nvidia, Sham B ordered an army of
hamsters to type:
It looks like nVidia know this and are going back to the drawing board
to design out the heat bottleneck. IMO, you won't get faster versions
of nVidia chips because of this, just a more cost-effective base design.

S


If Nvidia have scrapped the NV50, then they are doing more than going back
to the drawing board. It must mean another design already exists that they
are going to switch to. Who is to say a new design won't be faster and
cooler at the same time?
 
J. Clarke

Kokoro said:
In alt.comp.periphs.videocards.nvidia, Sham B ordered an army of
hamsters to type:

If Nvidia have scrapped the NV50, then they are doing more than going back
to the drawing board. It must mean another design already exists that they
are going to switch to.

Either that or the design is so badly hosed at this point that they've given
up trying to fix it.
 
HockeyTownUSA

Nvidia still strongly believes that SLI is something that will become very
important in the future. µ

I agree with this, but only if the two cards are inexpensive enough for the
mass market to be able to buy two. Voodoo 2 SLI was a great idea, but it
only appealed to a select few due to the price. It wasn't until a couple of
years later that more people invested in it because prices had dropped
enough, but by that time there were single-card solutions that were just as
powerful.

I do hope SLI comes back in a more robust and affordable format. I mean,
hell, I just spent $500 on a video card; why wouldn't I spend $300 x 2 for
something nearly twice as fast? But I'm not willing to spend $500 x 2. What
would be really nice with PCI-E is to make it scalable, so you get better
performance by adding more cards. Of course, why would you need 3 or 4
cards when you can run at 1600x1200 max detail with two? But we've said it
before, and I'm sure we'll run into bottlenecks again eventually. Parallel
processing never made a lot of sense due to the minimal performance
increase, unless you run apps or games specifically written to take
advantage of it. But SLI should work....
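
To put rough numbers on that "$300 x 2" argument, here is a minimal Python
sketch. The prices, frame rates, and the assumed 85% contribution of the
second card are illustrative guesses, not benchmark data:

def sli_fps(single_fps, efficiency=0.85):
    # Real-world SLI rarely doubles performance; assume the second card
    # adds only `efficiency` of a full card's throughput.
    return single_fps * (1 + efficiency)

setups = {
    "one $500 card":  (500, 60.0),            # (price in $, fps)
    "two $300 cards": (600, sli_fps(45.0)),   # mid-range pair in SLI
}

for name, (price, fps) in setups.items():
    print(f"{name}: {fps:.0f} fps, ${price / fps:.2f} per fps")

Under those assumptions the cheaper pair actually wins on dollars per
frame, which is exactly the scenario described above.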
 
J. Clarke

HockeyTownUSA said:
I agree with this, but only if the two cards are inexpensive enough for
the mass market to be able to buy two. Voodoo 2 SLI was a great idea, but
it only appealed to a select few due to the price. It wasn't until a
couple of years later that more people invested in it because prices had
dropped enough, but by that time there were single-card solutions that
were just as powerful.

I do hope SLI comes back in a more robust and affordable format. I mean,
hell, I just spent $500 on a video card; why wouldn't I spend $300 x 2
for something nearly twice as fast? But I'm not willing to spend $500 x
2. What would be really nice with PCI-E is to make it scalable, so you
get better performance by adding more cards. Of course, why would you
need 3 or 4 cards when you can run at 1600x1200 max detail with two? But
we've said it before, and I'm sure we'll run into bottlenecks again
eventually. Parallel processing never made a lot of sense due to the
minimal performance increase, unless you run apps or games specifically
written to take advantage of it. But SLI should work....

I think you're going to find that you not only pay full boat for the video
boards but also pay a premium for a motherboard with the necessary slots.
And the performance doesn't look to be anything like 2X.
 
Nicholas Buenk

Sham B said:
nVidia *is* 3DFX when it comes to its design staff - it absorbed a lot of
3DFX people during the takeover.

I'm sure nVidia's problems have the same cause as 3DFX's - the large
amount of heat their designs generate, and the increasing cost that this
imposes. At the moment, high-end ATI and nVidia cards are well matched,
and the only real comparison for non-fanboys is price. My opinion is that
nVidia can compete on performance but not on cost per unit. This is
exactly what happened to 3DFX - they could create cards that were as
fast, but they were huge, had high chip counts, and they ran hot.

It looks like nVidia know this and are going back to the drawing board to
design out the heat bottleneck. IMO, you won't get faster versions of
nVidia chips because of this, just a more cost-effective base design.

The Nvidia 6800 is currently outselling the ATI X800 by a fair margin.
They are making a good profit on the 6800, which they can put into a new
core. However, ATI is outselling Nvidia in the low-end and mid-range
cards, as the 9700 Pro family is still paying off big compared to the
GeForce FX.
But I think Nvidia made a big comeback with the 6800, and their
willingness to abandon a core shows their determination to beat ATI and
deliver a superior product.
 
BelPowerslave

Does this mean it won't be your next user handle as you continue to spam
the **** out of Usenet?

Bel

--

Whip Ass Gaming: http://www.whipassgaming.com/

"Suddenly aware of my presence, the Elder Gods transformed me into their
servant and gave me a new purpose: To prevent the Dragon King from merging the
realms."

- Scorpion, Mortal Kombat: Deception
 
Bass

But I think Nvidia made a big comeback with the 6800, and their
willingness to abandon a core shows their determination to beat ATI and
deliver a superior product.

Superior? Nothing beats the X800 XT Platinum.
 
Scotter

Depends on what game/benchmark you are using.
In this Tom's Hardware article, I see a few benchmarks where the 6800
Ultra, and sometimes even the 6800 GT, beats the X800 XT PE: Call of Duty,
Doom 3, FarCry, The Sims 2, and Flight Simulator 2004. In many other games
all three of these cards are neck and neck, or the X800 XT PE is BARELY
fastest.

Call of Duty
http://graphics.tomshardware.com/graphic/20041123/sli-performance-13.html

Doom 3
http://graphics.tomshardware.com/graphic/20041123/sli-performance-15.html
http://graphics.tomshardware.com/graphic/20041123/sli-performance-16.html

FarCry
http://graphics.tomshardware.com/graphic/20041123/sli-performance-17.html

Sims2
http://graphics.tomshardware.com/graphic/20041123/sli-performance-19.html

Flight Simulator 2004
http://graphics.tomshardware.com/graphic/20041123/sli-performance-21.html
 
John Lewis

I take this as just a rumor, for now.

http://www.theinquirer.net/?article=20034
Nvidia's NV50 canned

New things to come


By Fuad Abazovic: Friday 03 December 2004, 13:57

Appropriate name, can be simply abbreviated to FUD.....
WE ARE told that Nvidia's next gen chip NV50 has been canned as well as the
NV48 chip we reported on earlier in the week. I guess both were not fitting
well into Nvidia's picture.
We don't have any idea as yet what led to such a decision, but Nvidia does
apparently think it's now time to make its next breakthrough chip.

All we know is that Nvidia made a huge u-turn or a right turn in its roadmap
as Intel describes it, and we don't yet know where that leads.

We also know that Nvidia is very dedicated to winning the graphics fight
and to moving away from the deadly embrace with ATI. In the current
generation, it's
not important what you get as both Nvidia and ATI cards are performing
almost identically.

DX9.0c vs DX9.0b. Yes, very obviously identical............
Nvidia still strongly believes that SLI is something that will become very
                                                          ^^^^
                                                          has

important in the future. µ
                 ^^^^^^
                 present.

Obviously totally oblivious to the slew of SLI-enabled motherboard
announcements. And an interesting cross-license agreement between
nVidia and Intel, not all terms of which have been publicly announced.

Yuk. Please pass me the Alka-Seltzer.........

Twit reporter.

The Inquirer == the tech "News of the World" (UK)
             == the "National Enquirer" (US)
             == the tech news for morons.......

John Lewis
 
assaarpa

these cards are neck and neck, or the X800 XT PE is BARELY fastest.

Pretty sad if it is barely 20% faster (if that!) at over twice the price.
I'd get a single 6800 GT or X800; huhu, both break 60 frames per second
with ease in these benchmarks. Beyond that it's not going to bring any
value for the investment as far as these games are concerned!

And to add salt to the wounds, in the game that has the lowest performance
(FS 2004) a single X800 is faster than the >2x-the-price SLI solution from
NV. ;)
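
To put that value point in numbers, a minimal Python sketch; the prices
and frame rates are illustrative guesses (not benchmark figures), and
anything past a 60 fps target is treated as worthless:

# Diminishing returns past the target frame rate. All prices and fps
# numbers below are assumptions for illustration only.
TARGET_FPS = 60

cards = {
    "single 6800 GT": (400, 70),   # (price in $, fps)
    "single X800":    (420, 72),
    "6800 SLI rig":   (900, 84),   # ~20% faster at over twice the price
}

for name, (price, fps) in cards.items():
    usable = min(fps, TARGET_FPS)          # fps beyond the target is wasted
    print(f"{name}: {fps} fps raw, ${price / usable:.2f} per usable fps")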


... That said, I got a 6800 GT myself because of a programming fetish for
3.0 shaders. The card is:

- noisy
- big (the card in fact covers two SATA connectors on the K8V Deluxe;
that's right, only 2 usable SATA connectors are left, the Promise
controller ones)
- large YUV overlays don't seem to work (try those high-definition DVD
demo videos from microsoft.com at 1080i.. completely **** up; the 720p
ones work smoothly)

Besides these small grievances, guess what, I still wouldn't switch!
Simple reason: 3.0 shaders... argh... :)
 
Nicholas Buenk

assaarpa said:
Pretty sad if it is barely 20% faster (if that!) at over twice the price.
I'd get a single 6800 GT or X800; huhu, both break 60 frames per second
with ease in these benchmarks. Beyond that it's not going to bring any
value for the investment as far as these games are concerned!

And to add salt to the wounds, in the game that has the lowest performance
(FS 2004) a single X800 is faster than the >2x-the-price SLI solution from
NV. ;)


... That said, I got a 6800 GT myself because of a programming fetish for
3.0 shaders. The card is:

I have one myself.

- noisy

You can buy a heatsink with a quieter fan separately.

- big (the card in fact covers two SATA connectors on the K8V Deluxe;
that's right, only 2 usable SATA connectors are left, the Promise
controller ones)

Blame your motherboard. My Gigabyte K8NS Pro does not have that problem.

- large YUV overlays don't seem to work (try those high-definition DVD
demo videos from microsoft.com at 1080i.. completely **** up; the 720p
ones work smoothly)

I have noticed no problems with them, except that the card doesn't do
decode acceleration as the specs say it does. Perhaps that is your
problem: you don't have the CPU power to make up for the total lack of
decode acceleration (less even than what an FX 5900 does). What CPU do
you have? You really need at least a 3400+ to watch 1080p with this
card.
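
For what it's worth, the raw pixel arithmetic backs that up. A quick
Python sketch, assuming 30 frames per second for both clips:

def pixel_rate(width, height, fps):
    # Pixels the CPU must push per second with no decode help from the GPU.
    return width * height * fps

p720  = pixel_rate(1280, 720, 30)    # the 720p demo clips
p1080 = pixel_rate(1920, 1080, 30)   # the 1080-line demo clips

print(f"720p : {p720 / 1e6:.1f} Mpixels/s")
print(f"1080 : {p1080 / 1e6:.1f} Mpixels/s ({p1080 / p720:.2f}x the work)")

That's 2.25x the work of 720p before the codec even gets involved, which
is why the CPU matters so much here.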
Besides these small grievances, guess what, I still wouldn't switch!
Simple reason: 3.0 shaders... argh... :)

They are no big deal really; they just give a speed boost. A feature that
is genuinely valuable is HDR, which offers far more realistic lighting
and is perhaps the biggest change to lighting effects since pixel shader
2.0, but it slows down rendering by about 50%, so it has limited
usefulness....
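
That ~50% figure lines up with simple bandwidth arithmetic. A minimal
sketch, assuming the renderer is bandwidth-bound and using the standard
framebuffer format sizes:

# An FP16 render target (RGBA16F) is 8 bytes per pixel versus 4 for the
# usual RGBA8, so a bandwidth-bound renderer moves roughly twice the
# framebuffer traffic per frame when HDR is on.
BYTES_RGBA8   = 4   # 8-bit integer per channel
BYTES_RGBA16F = 8   # 16-bit float per channel, used for HDR

ratio = BYTES_RGBA16F / BYTES_RGBA8
print(f"framebuffer traffic: {ratio:.1f}x")
print(f"bandwidth-bound fps: ~{100 / ratio:.0f}% of the non-HDR frame rate")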
 
