Why did 3dfx, the Voodoo vid card maker, die?

Mudfish (Co30)

Mojo said:
Hi guys
Now that I may be looking at buying nForce4 SLI, let's go back in time to my
first big 3D card buy... the 3dfx Voodoo2. I bought two to use for
SLI, and they were not cheap back then. Those Voodoos rocked. The best
and fastest on the market. How can a company go from being number
1 to out of business in just a few years? Anyone remember the
history of 3dfx and what happened to them?
Still have my Voodoo2 SLI in my PC just for the hell of
it... Funny how low it benchmarks now... heheheh
Thanks



3dfx died so that dual graphics card technology
could be born again in Nvidia's nForce4 chipset
and GeForce 6600 cards.

The engineering expertise Nvidia got from 3dfx has evolved and
is now driving their leading-edge cards. The "SLI" designator
is a salute to those who led the way back when Voodoo ruled
the world.

(I also still have my V2 SLI rig).
<{{ MudFish (Co30){('>
www.Co30.com
"Careful with that Axe Eugene."
 
vellu

Voodoo's approach only solved the problem of the GPU stalling while writing
to "slow" frame buffer memory. Nvidia's SLI recognises that the Voodoo
architecture results in both cards performing the same front-end geometry
calculations, which is a waste. Nvidia's idea is to look at the pipeline as a
whole and allow the second GPU to support the first GPU in keeping the
pipeline running at full potential. Evidently memory speed is not the
problem it was!

This also means that the two GPUs don't have to be matched. So in theory
you can buy one SLI card, and then add a second later which, following
improvements in technology, will be cheaper and faster. The old one will
contribute to keeping the faster one working at full potential.

I suppose most people won't care that this isn't Scan-Line Interleave.
For them, SLI just means "two are better than one".
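
For what it's worth, here's a rough sketch (plain Python, with made-up frame
sizes and timings, not either vendor's actual code) of the difference: the
3dfx-style approach always splits the work 50/50 by giving each card every
other scan line, while a split-frame scheme can move the boundary toward
whichever card is rendering faster, so even a mismatched pair stays busy.

# Rough illustration only: how two cards might divide one frame's scan lines.
FRAME_HEIGHT = 480  # arbitrary example frame height

def scanline_interleave(height):
    """3dfx-style SLI: each card takes every other scan line (fixed 50/50 split)."""
    return {
        "card0": list(range(0, height, 2)),  # even lines
        "card1": list(range(1, height, 2)),  # odd lines
    }

def split_frame(height, card0_ms, card1_ms):
    """Split-frame idea: divide the frame into two bands and bias the split
    toward the card that rendered its last band faster, so an older, slower
    card still helps without holding the whole pipeline back."""
    split = int(height * card1_ms / (card0_ms + card1_ms))
    return {
        "card0": list(range(0, split)),       # top band (faster card gets more lines)
        "card1": list(range(split, height)),  # bottom band
    }

sli = scanline_interleave(FRAME_HEIGHT)
sfr = split_frame(FRAME_HEIGHT, card0_ms=8.0, card1_ms=12.0)  # card1 is the slower one
print(len(sli["card0"]), len(sli["card1"]))  # 240 240
print(len(sfr["card0"]), len(sfr["card1"]))  # 288 192

(Nvidia's SLI can also alternate whole frames between the cards; either way
the point is that the division of work adapts instead of being fixed at
every other line.)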

Plus NVIDIA's implementation is transparent to applications. That is,
games don't need to support it directly. If two cards exist, both are
automatically used. Don't remember if Voodoo's solution needed support
or not. Of course, I'm sure, when SLI rigs become more common, games can
and probably will be designed to use enhanced graphics when SLI is detected.
 
Andrew

Plus NVIDIA's implementation is transparent to applications. That is,
games don't need to support it directly. If two cards exist, both are
automatically used. Don't remember if Voodoo's solution needed support
or not. Of course, I'm sure, when SLI rigs become more common, games can
and probably will be designed to use enhanced graphics when SLI is detected.

V2 SLI didn't need game support, it too was transparent.
 
Tim

Whatever...I use ATI now but I have never been as pleased as I was with
any of my 3dfx products.

My Voodoo 3 was the most reliable, compatible card I ever owned, and the
drivers were superb. Most of the advantages touted by NVidia at the time (32-bit
color, onboard T&L, etc.) usually involved a performance hit or weren't
implemented well (or even at all; most games of the late nineties didn't use
on-board T&L anyway) - useful features, someday, but blown way out of
proportion for their time. They were actually more useful to NVidia's
marketing department.

Also, 3dfx 2D quality was superior to most of the early NVidia-based cards,
whose image quality declined at higher resolutions.
 
Nerdillius Maximus

Mojo said:
Hi guys


Now that I may be looking at buying nForce4 SLI, let's go back in time to my
first big 3D card buy... the 3dfx Voodoo2. I bought two to use for
SLI, and they were not cheap back then. Those Voodoos rocked. The best
and fastest on the market. How can a company go from being number
1 to out of business in just a few years?

The rise and fall of empires. Makes for good bathroom reading... It's
happening on a macro scale. Check yer six...


Anyone remember the
history of 3dfx and what happened to them?

Several classic FUs:

1) Voodoo Rush (Harvey Fong, then at Hercules, had a whole skivvie-load of
hatemail for that one...)

2) Banshee, another top seller (NOT!)

3) Being late with V5, positioning it against the GF2

4) Top-heavy management

5) Buying STB and using them as sole point of prod/distro, effectively
factoring everyone else out of the picture, who then went to Nvidia, and the
rest is history. Of course, it resulted in the death of one of the better
OEMs out there, while 3dfx was busy lining up both barrels on both big toes
and blowing them into pulp as a final salute. STB was the prom queen in a
world of hinge-toothed beyatches, prolly the only one without pigeon-toes, a
fishy smell, and a big nose-pimple like Diamond or Creative (or *anyone* who
had S3 in the sack)...note that this is not to imply STB was the prettiest
bowlegged ginch at the meat market, just the least skanky FWTW...then they
both show up late to the dance underdressed with V5 and no makeup...

5.5) The proprietary API. Now don't get me wrong, Glide was (is) easy to
use, just about any decent codemonkey can whip something off in short time,
but it had its obvious limitations, the hardware being foremost. OpenGL and
DirectX are the two biggies; it's been that way since DX7, when M$ started
getting serious about their old RenderMorphics API and stealing ideas from
OpenGL thru Fahrenheit, same way M$ always does: by offering cooperation and
then co-opting ideas and incorporating them in their IP so it looks like the
originator of the idea has no leg to stand on. They got this business model
from the Japanese, I'd imagine, who practice it on their less-loved rivals.
With Microsoft, who has no such restraint, the less-loved rival is *everyone
else*, including you. It is wise to use the long-handled spoon when supping
with devils and demons of their ilk.

But enough about who we really love to hate, and vice versa. This is about
Glide being predestined to failure. Which, incidentally, it was, in no small
part due to a certain aforementioned competitor, but also in that it was
essentially a convenient (and effective at the time) way to mask
noncompliance with either other API, whose future was and is set in stone.
Resistance really was futile. It was a foregone conclusion Glide would be
killed off someday. Expecting M$ not to advance their agenda of global
proliferation is like standing in front of a moving train.

3dfx pulled out one rabbit and overworked it into starvation, so the poor
bugger was a bonerack when it got pulled out of Hats #4, 5, and 6, and it
couldn't drag that creaky wagon uphill for another season. So they got
walked over by Nvidia, who simply had a faster product, a more complete
OpenGL feature set (good Linux support too...), etc. They were off their
game, so they skinned Mr. Bunny, sold the meat, lined their pockets with
fur, and gave Nvidia a much-needed lesson in display filtering quality among
other things. Then ATI finally pulled their thumb out in time to shovel
ashes on 3dfx's grave and bite Nvidia in the ass, lest they sit around too
long. And it's been back-and-forth ever since (except with OpenGL and Linux,
where I have yet to see ATI reach Nvidia's level of support and challenge
their dominance), complete with "Star-Bellied Sneetches" fanboiism, "leaked"
memos, dirty driver pocket pool, payola to da Futuremark (the Standard &
Poor's of the graphics world, with all that implies... can you say
"protection racket"?), the whole shebang. Just another day of business as
usual.

6) Insufficient clock speed: Feature limitations could have been forgiven
(no EMBM, no trilinear, no T&L, etc.) if they had managed to get 200+ MHz
yields. The "w00t factor" might have pulled them out of the fire even if
they were a little late to the party. V5 @ 200+ MHz in its day? W00t! Any
questions?

7) Bulky design: "Scalable Architecture" my left one, like anyone but someone
with four or more of 'em on their pro-level card would care. It's all better
off on one chip, within practical limitations.


There's still driver development going on for the V5...


While we're at it, let's pick another paperweight out of the
sack...Rendition! If they'd ever got the silicon right, the drivers might
have been a little better, methinks...

Now I'se gots to go start helpin' fixin' some vittles for this here...uhh...
"extended nuclear family"...happy turkey-day, y'all...
 
GMAN

Whatever...I use ATI now but I have never been as pleased as I was with any
of my 3dfx products. I still have a Voodoo 2, 3, 4, and 5 on the shelf that I
cannot use.
I also have the Matrox m2 on the shelf.

Why not? I put my old Voodoo 3 in an old Pentium III system that would not run
Unreal Tournament worth crap with any of the ATI Radeon or Nvidia cards I had
lying around, but slapped in that old Voodoo 3, found some 3rd-party XP
drivers and voila, smooth as silk, and I even got Unreal Tournament 2004
running nicely on the P3 800.
 
BeingAnonymousMakesMeObnoxious

Nerdillus, that was one of the best posts I've ever read on Usenet.
 
Nerdillius Maximus

BeingAnonymousMakesMeObnoxious said:
Nerdillus, that was one of the best posts I've ever read on Usenet.

aww, shucks! Now I have to stake my head down so it doesn't float away and
look like a Doom 3 cacodemon on crack! Really, you should try
comp.sys.ibm.pc.hardware.chips...a few IBM engineers and other very
knowledgeable folk in there...I am but a humble Shoeshine Boy, all things
considered ;-)...

But thanks, nonetheless, I appreciate the compliment, and it won't go to my
head...I'm just glad my off-color metaphors weren't too offensive, nor
misconstrued as a misogynistic mindset...
 
Just Askin'

Why not? I put my old Voodoo 3 in an old Pentium III system that would not run
Unreal Tournament worth crap with any of the ATI Radeon or Nvidia cards I had
lying around, but slapped in that old Voodoo 3, found some 3rd-party XP
drivers and voila, smooth as silk, and I even got Unreal Tournament 2004
running nicely on the P3 800.

Yes, funny how the Voodoo3 runs the original UT so well.

It's because they spent a lot of time on the Glide code. I always
remember 'upgrading' to nVidia and wondering WTF happened to UT.

nVidia is all about fancy claims and marketing, but it will be their
undoing.
 
GMAN

Yes, funny how the Voodoo3 runs the original UT so well.

It's because they spent a lot of time on the Glide code. I always
remember 'upgrading' to nVidia and wondering WTF happened to UT.

nVidia is all about fancy claims and marketing, but it will be their
undoing.
Just going by the fact that the next Xbox and the Nintendo GameCube systems
will be using ATI, I definitely agree.
 
NightSky 421

Mojo said:
Hi guys


Now that I may be looking at buying nForce4 SLI, let's go back in time to my
first big 3D card buy... the 3dfx Voodoo2. I bought two to use for
SLI, and they were not cheap back then. Those Voodoos rocked. The best
and fastest on the market. How can a company go from being number
1 to out of business in just a few years? Anyone remember the
history of 3dfx and what happened to them?


You've already gotten a lot of responses, so I won't try to add to what has
been said about 3dfx's demise. What I wanted to say is that I never owned a
3dfx video card until about a year ago! A buddy of mine sold me an older
system last year and offered to include a Voodoo3 3000 AGP 16MB card for
free, so I was intrigued and accepted. I have a bunch of old games from
when 3dfx was in their heyday and it blew me away to see how much better
3dfx was with those games than any video card I had been running in the same
time period! In my testing, I used the final driver set that 3dfx
officially released, which I think came out in the first week of December
2000. It's too bad what happened to 3dfx, but they did a lot to
revolutionize the PC gaming industry.
 
Roy Coorne

GMAN wrote:

....
Just going by the fact that the next Xbox and the Nintendo GameCube systems will be using ATI,
....

But why?
Because NVidia cannot deliver the ordered quantities in time?
Because ATI offers better conditions?
But why?

Roy <who keeps his Voodoo 3 2000 PCI as an antique collectors' item>
 
GMAN

GMAN wrote:

....

....

But why?
Because NVidia cannot deliver the ordered quantities in time?
Because ATI offers better conditions?
But why?

Roy <who keeps his Voodoo 3 2000 PCI as an antique collectors' item>

They offered a "better" product.
 
Son of Blahguy

GMAN said:
They offered a "better" product.

"Better" why? Did they undercut nvidia by offering a similar product for
less?
It could just be MS's way of playing nvidia and ATI against each other so
they don't become too strong in their own right.
Maybe ATI has better DirectX support atm?
 
