AIB Companies To Adopt XGI Volari GPUs?


graphics processing unit

While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect
both of the current leaders - hopefully in a positive way for the end user.
God knows we could use some more competition here.

Personally, I am most excited about the Volari V8 Duo - the first *consumer*
graphics card configuration to sport twin Graphics Processing Units.

Now, here's the article on the topic:

http://www.xbitlabs.com/news/video/display/20030923124528.html
-----------------------------------------------------------------
ASUS, ABIT, Gigabyte, Club3D to Adopt XGI Volari GPUs?

by Anton Shilov
09/23/2003 | 12:46 PM

There are rumours going around the Computex Taipei 2003 exhibition in Taiwan
that a number of graphics card makers are seriously considering
manufacturing graphics cards powered by XGI Volari graphics processors.
The list of companies includes tier-one manufacturers, even though none of
the firms mentioned has commented officially.

As we managed to find out, ASUSTeK Computer, ABIT, Gigabyte Technology, CP
Technology and Club3D plan to support XGI's attempt to successfully enter
the graphics card market this year by adopting its Volari V5 and V8 GPUs.

Everybody in the graphics card market is very interested in a third provider
of GPUs, since the fierce competition between today's leaders NVIDIA and ATI
is not only exhausting for the chip companies but also has a negative impact
on their add-in-board partners. Furthermore, a market with only two GPU
companies of practically equal resources could give rise to a cartel that
totally controls the graphics processor market. Even though it is
practically impossible for a new player to enter the market, AIB companies
want to give XGI a try. If XGI manages to be competitive, everyone will
benefit.

Note that this information is totally unofficial and no formal decisions
concerning actual graphics cards have been announced yet.
 

Thomas

graphics said:
Personally, I am most excited about the Volari V8 Duo - the first
*consumer* graphics card configuration to sport twin Graphics
Processing Units.

Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had
twin GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas
 

graphics processing unit

Thomas said:
Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had
twin GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas

Bwuhahahahahaha....

I guess you didn't notice that I said *graphics processing unit* and not
graphics accelerator or graphics chip. Neither the ATI Rage Fury MAXX nor
the Voodoo 5 5500 used GPUs with on-chip geometry processing (T&L) ;)
 

Thomas

graphics said:
I guess you didn't notice that I said *graphics processing unit* and
not graphics accelerator or graphics chip. Neither the ATI Rage Fury
MAXX nor the Voodoo 5 5500 used GPUs with on-chip geometry processing
(T&L) ;)

The name 'GPU' was simply an invention of NVidia. For me, it's just another
name, not another 'thing', hehe. There were many more hardware-related
things added to the 'GPU' that didn't change the name, so for me it's all
the same thing, from the Hercules chip to the ATI 9800 ;-) Just added more
features and speed... But at least I see what you mean now ;-)

Thomas
 

calypso

In comp.sys.ibm.pc.hardware.video, graphics processing unit said:
Personally, I am most excited about the Volari V8 Duo - the first *consumer*
graphics card configuration to sport twin Graphics Processing Units.


The Voodoo5 5500 in my machine has got 2 VSA-100 units... If that isn't a
GPU, then what is it? Drivers are working properly under any Windows OS
(right now using Windows 2000 Pro)...

There is one thing that nobody will beat soon... :)) Voodoo5 6000... Or, to
put it another way - 4 chips on one board... ;)

But, shhhhhh... ;)) I screwed up one chip, so it isn't working properly... :)))


And the ATI Rage Fury MAXX had 2 Rage 128 Pro chips (IIRC)... But,
problematic drivers...



--
"Ruzans li mlijekoo podmazuje ?" upita Fataa drka Zidovu povracu.
"Nisam ja nikog bombardiro !" rece coravaco mirise "Ja samo pudingu pozdravlju naklonjenm !"
By runf

Damir Lukic, (e-mail address removed)
a member of hr.comp.hardver FAQ-team
 

Thomas

The Voodoo5 5500 in my machine has got 2 VSA-100 units... If that isn't a
GPU, then what is it? Drivers are working properly under any Windows OS
(right now using Windows 2000 Pro)...
And the ATI Rage Fury MAXX had 2 Rage 128 Pro chips (IIRC)... But,
problematic drivers...

I know, and said so three hours ago, hahaha

Thomas
 

Bratboy

Thomas said:
Hahahahahahaha...
Quite funny.

Don't you remember the ATI Rage Fury MAXX, or the Voodoo 5 5500? Both had
twin GPUs...

If you don't know what you're talking about, stop posting ;-)

Thomas

Well, not to mention the new 9800 dual-chip cards I read about recently
that someone's making.
 

Andy Cunningham

Sapphire built a 9800 MAXX with dual 9800 Pros. It didn't work, but until I
see the Volari working I don't think that matters for this comparison :)
 

Tony Hill

While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect
both of the current leaders - hopefully in a positive way for the end user.
God knows we could use some more competition here.

"I'll believe it when I see it". There have been a LOT of graphics
cards that were supposed to be the next big thing to come along. S3
has done it a handful of times (and again just recently with Delta
Chrome), Matrox has done it, BitBoys did it several times without ever
having a product, and now we've got XGI. So far none of these cards
have managed to compete very effectively with the low-end chips from
ATI or nVidia, let alone their high-end stuff.

The real key is in getting decent drivers. This is why nVidia took
over the graphics world, not by their hardware. nVidia managed to get
fast and *stable* drivers out for all of their products while 3dfx and
ATI were floundering with buggy drivers that were missing features and
had either very poor performance or, at best, uneven performance.
ATI has since learned from their mistakes and really improved the
quality of their drivers, but they are about the only one.

Right now there are three players in the graphics market: ATI, nVidia
and Intel (with Intel actually being the largest supplier). Most of
the world's computer users do VERY well with integrated graphics, and
have absolutely ZERO reason to buy an add-in card. That just leaves
an extremely small market at the very high end and a decent-sized but
very low-margin market in the mid range. If XGI wants to succeed,
they need to get a graphics card out for $100 that has stable drivers
and that can match or beat whatever nVidia and ATI are selling for
~$125 at the time (right now that would be the GeForceFX 5600 and the
Radeon 9600).

I ain't holding my breath. I'll be surprised if they ever get stable
drivers, let alone within 6 months of the card's release. And that's
just talking about Windows drivers; the situation is likely to be even
worse for their Linux drivers, if they even bother to make those at
all.

Personally, I am most excited about the Volari V8 Duo - the first
*consumer* graphics card configuration to sport twin Graphics
Processing Units.

I'm not. I doubt that it will manage to match a GeForceFX 5600 or ATI
Radeon 9600, yet it will likely cost a LOT more. It all comes back to
drivers, especially for a more complicated design with two graphics
processors.

Besides that, their claim of being the first consumer card with dual
GPUs is REALLY stretching things. They're taking a very narrow view
on just what it means to be a consumer card and what it takes to be
considered a GPU. Marketing at its best/worst here.
 

Radeon350

Tony Hill said:
"I'll believe it when I see it". There have been a LOT of graphics
cards that were supposed to be the next big thing to come along. S3
has done it a handful of times (and again just recently with Delta
Chrome), Matrox has done it, BitBoys did it several times without ever
having a product, and now we've got XGI. So far none of these cards
have managed to compete very effectively with the low-end chips from
ATI or nVidia, let alone their high-end stuff.

The real key is in getting decent drivers. This is why nVidia took
over the graphics world, not by their hardware. nVidia managed to get
fast and *stable* drives out for all of their products while 3dfx and
ATI were floundering with buggy drivers that were missing features and
having either very poor performance or, at best, uneven performance.
ATI has since learned from their mistakes and really improved the
quality of their drivers, but they are about the only one.

Right now there are three players in the graphics market, ATI, nVidia
and Intel (with Intel actually been the largest supplier). Most of
the world's computer users do VERY well with integrated graphics, and
have absolutely ZERO reason to buy an add-in card. That just leaves
an extremely small market at the very high-end and a decent sized but
very low-margin market in the mid range. If XGI wants to succeed,
they need to get a graphics card out for $100 that has stable drivers
and that can match or beat whatever nVidia and ATI are selling for
~$125 at the time (right now that would be the GeForceFX 5600 and the
Radeon 9600).

I ain't holding my breath. I'll be surprised if they ever get stable
drivers, let alone within the next 6 months of it's release. And
that's just talking about Windows drivers, the situation is likely to
be even worse for their Linux drivers if they even bother to make
those at all.


I'm not. I doubt that it will manage to match a GeForceFX 5600 or ATI
Radeon 9600, yet it will likely cost a LOT more. It all comes back to
drivers, especially for a more complicated design with two graphics
processors.

Besides that, their claim of being the first consumer card with dual
GPUs is REALLY stretching things. They're taking a very narrow view
on just what it means to be a consumer card and what it takes to be
considered a GPU. Marketing at its best/worst here.


I don't see why it is such a stretch. First of all, there are not many
companies that make consumer GPUs to begin with. They can be counted
on one hand, I believe. And as far as I am aware, none have released a
card with more than one GPU for consumer use. Yeah, there are dozens
of cards that use 2 or more GPUs, from a number of companies, for all
kinds of high-end, non-consumer applications. Many of them predate
Nvidia's NV10/GeForce256, which was the first working consumer GPU,
but *certainly* not the first-ever GPU - that is, a chip with T&L
on-chip.

Actually, I don't just use the term 'GPU' as Nvidia uses it. To myself
and to many who use graphics processors, something that takes the
geometry processing load OFF the CPU and puts it on the graphics chip -
that's a 'graphics processor', or graphics processing unit / GPU as
Nvidia coined it. The 3dfx Voodoo chips, including the VSA-100s used in
the Voodoo5 5500 and 6000, did NOT do that at all. Neither did any of
the pre-Radeon ATI chips, including the dual Rage Fury chips in the MAXX
card. And basically neither did any consumer 3D PC chip before the
GeForce256. Any graphics chip that lacks what used to be called
'geometry processing', what was commonly called T&L in late 1999 when
the GeForce came out, and what is now called vertex shading - if it
lacks that, it's usually considered a 3D accelerator or rasterizer
rather than a complete 'graphics processor' or GPU. At least that is
the way I have understood things for a long time.
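
To make that concrete, here's a rough, purely illustrative sketch in C++
(not any real driver or card API - the structs and function names are made
up) of the per-vertex work a rasterizer-only card leaves on the host CPU.
A chip with on-chip T&L takes untransformed vertices plus the matrices and
runs this same loop in its own geometry hardware instead:

// Illustrative only: the transform-and-light loop the host CPU had to run
// for a rasterizer-only card (Voodoo, Rage 128, etc.) before handing
// screen-space triangles to the chip. A GeForce-class GPU does this
// per-vertex work on-chip.
#include <cstdio>

struct Vec3   { float x, y, z; };
struct Mat4   { float m[4][4]; };            // row-major, w assumed to be 1
struct Vertex { Vec3 pos; Vec3 normal; float lit; };

Vec3 transform(const Mat4& M, const Vec3& p) {
    return { M.m[0][0]*p.x + M.m[0][1]*p.y + M.m[0][2]*p.z + M.m[0][3],
             M.m[1][0]*p.x + M.m[1][1]*p.y + M.m[1][2]*p.z + M.m[1][3],
             M.m[2][0]*p.x + M.m[2][1]*p.y + M.m[2][2]*p.z + M.m[2][3] };
}

// CPU-side "T&L": transform every vertex and compute one diffuse term.
// Cycles spent here are cycles not available for game logic, AI, physics...
void transform_and_light(Vertex* v, int count,
                         const Mat4& modelview, const Vec3& lightDir) {
    for (int i = 0; i < count; ++i) {
        v[i].pos = transform(modelview, v[i].pos);
        float ndotl = v[i].normal.x * lightDir.x +
                      v[i].normal.y * lightDir.y +
                      v[i].normal.z * lightDir.z;
        v[i].lit = ndotl > 0.0f ? ndotl : 0.0f;
        // Only now would this vertex go to the rasterizer for triangle
        // setup, texturing and pixel fill.
    }
}

int main() {
    Mat4 identity = {{{1,0,0,0},{0,1,0,0},{0,0,1,0},{0,0,0,1}}};
    Vertex tri[3] = { {{0,0,0},{0,0,1},0},
                      {{1,0,0},{0,0,1},0},
                      {{0,1,0},{0,0,1},0} };
    transform_and_light(tri, 3, identity, {0,0,1});
    std::printf("first vertex lit = %.2f\n", tri[0].lit);
}

Multiply that loop by every vertex in every frame and that's the work the
GeForce256 moved off the CPU and onto the graphics chip.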

On the other hand, I suppose one can argue that any graphics chip, be
it 2D or 3D, is a 'GPU' - anything from a 1990 VGA chip, to a Voodoo 1,
to the Graphics Synthesizer in the PS2. However, it is common practice
in the graphics industry to differentiate between a rasterizer and a
complete graphics processor with geometry & lighting (now vertex
shading) on board.

So therefore, I do not find XGI's marketing outrageous in claiming the
first dual-GPU card for consumer use. Of course, they *will* have to
bring Volari to market, and it will have to work - in other words,
"believe it when we see it" still applies. But the specific claim of
having the first dual-GPU card is not a stretch in and of itself, in
my book :)
 

Andy Cunningham

The real key is in getting decent drivers. This is why nVidia took
over the graphics world, not by their hardware. nVidia managed to get
fast and *stable* drivers out for all of their products while 3dfx and
ATI were floundering with buggy drivers that were missing features and
had either very poor performance or, at best, uneven performance.
ATI has since learned from their mistakes and really improved the
quality of their drivers, but they are about the only one.

3DFX drivers were excellent - I never had to fiddle with anything to get my
Voodoo 3 working with any game. The reason nVidia took over was that
GeForce performance was so far above V3 performance, although there was
nothing wrong with the drivers either. 3DFX's attempts to get back on par
in performance never went anywhere, while nVidia raised the bar again by a
huge amount with the GF2.
 

Tony Hill

I don't see why it is such a stretch. First of all, there are not many
companies that make consumer GPUs to begin with. They can be counted
on one hand, I believe.

There are six of them: nVidia, Intel and ATI are far and away the
leaders, with Matrox, S3/VIA and SiS following. XGI is a combo team
of SiS and the old Trident crew.
And as far as I am aware, none have released a card with more than one
GPU for consumer use. Yeah, there are dozens of cards that use 2 or
more GPUs, from a number of companies, for all kinds of high-end,
non-consumer applications. Many of them predate Nvidia's
NV10/GeForce256, which was the first working consumer GPU, but
*certainly* not the first-ever GPU - that is, a chip with T&L on-chip.

There is a lot of tricky wording going around with just what makes a
graphics chipset a "GPU" and what makes it just a video chipset.
Actually, I don't just use the term 'GPU' as Nvidia uses it. To myself
and to many who use graphics processors, something that takes the
geometry processing load OFF the CPU and puts it on the graphics chip -
that's a 'graphics processor', or graphics processing unit / GPU as
Nvidia coined it. The 3dfx Voodoo chips, including the VSA-100s used in
the Voodoo5 5500 and 6000, did NOT do that at all. Neither did any of
the pre-Radeon ATI chips, including the dual Rage Fury chips in the MAXX
card. And basically neither did any consumer 3D PC chip before the
GeForce256.

All of these chips, starting way back with the Matrox Millennium,
offloaded some of the graphics processing work to the video card. It
wasn't nearly as cut-and-dried as people (or, more to the point,
nVidia's marketing department) like to make it out to be. Even with
geometry processing it was by no means an all-in-one sort of deal;
different chips have taken over different stages of the geometry
processing.
Any graphics chip that lacks what used to be called 'geometry
processing', what was commonly called T&L in late 1999 when the GeForce
came out, and what is now called vertex shading - if it lacks that,
it's usually considered a 3D accelerator or rasterizer rather than a
complete 'graphics processor' or GPU. At least that is the way I have
understood things for a long time.

You're understanding the marketing terms just fine, though marketing
paints a much more black-and-white picture of things than the real
world does.

In short, it all comes down to where you choose to put the mark of
what defines a GPU. Personally, I really don't care one way or the
other. I'm an engineer and a computer user. I'm interested in the
technical details (for curiosity's sake) and the price/performance
that it offers (from a "will I buy it" point of view). Everything else
is all just marketing, and doesn't interest me much.
 

Tony Hill

3DFX drivers were excellent - I never had to fiddle with anything to get my
Voodoo 3 working with any game.

By the time that the Voodoo 3 came out it was rapidly becoming too
late for 3DFX. They really blew it with crappy drivers on their
Voodoo Rush chipset and then the Voodoo Banshee after that. 3DFX's
drivers also always tended to offer quite poor performance unless you
happened to be playing a Glide game or one that would work with their
"Mini-GL" driver (i.e. their Quake driver).

FWIW the "buggy driver" syndrome was more a problem for ATI. 3DFX had
more problems with poor driver performance in OpenGL and DirectX, as
well as some missing features (though the latter was one part
hardware, one part software).
The reason nVidia took over was that GeForce performance was so far
above V3 performance, although there was nothing wrong with the drivers
either. 3DFX's attempts to get back on par in performance never went
anywhere, while nVidia raised the bar again by a huge amount with the
GF2.

By that point in time their drivers might not have been buggy, but
they were often offering rather poor performance as compared to what
the hardware was theoretically capable of.
 

Radeon350

The Voodoo5 5500 in my machine has got 2 VSA-100 units... If that isn't a
GPU, then what is it? Drivers are working properly under any Windows OS
(right now using Windows 2000 Pro)...

There is one thing that nobody will beat soon... :)) Voodoo5 6000... Or, to
put it another way - 4 chips on one board... ;)

But, shhhhhh... ;)) I screwed up one chip, so it isn't working properly... :)))


And the ATI Rage Fury MAXX had 2 Rage 128 Pro chips (IIRC)... But,
problematic drivers...


OK, this post is sort of for you, and for Tony, or anyone who doesn't
really draw the line between a rasterizer / 3D accelerator like the
3dfx Voodoo 1/2/3, Banshee, VSA-100, PowerVR Series 1/2/3, Riva 128,
TNT1/2, Rage 128, Rage Fury etc., and a full-on 'graphics processor'
or GPU or polygon processor chipset (GeForce 1-4, GFFX, all the
Radeons, the Lockheed Real3D series, 3DLabs GLINT+Delta, Evans &
Sutherland RealIMAGE, 3DLabs Wildcat, etc.).

What I am posting below is a very good (IMHO) post from 1996 from a
guy who explained the differences (and made a distinction) between
Voodoo Graphics or similar consumer 3D accelerators/rasterizers of the
time, and full 3D polygon processors (the equivalent of today's GPUs)
with geometry engines/processors - like Lockheed's non-consumer
Real3D/100, which was a true 'graphics processor' chipset (not the
horrible consumer Intel/R3D i740 used in Starfighter cards, which had
not yet been revealed in 1996). At that time, there were NO consumer
PC 3D chips with geometry processing / T&L. In other words, there were
no consumer GPUs in 1996 - not until 1999's GeForce256.

This post really points out the differences quite well. Alright,
without further rambling on my part, here is the post:

http://groups.google.com/[email protected]&oe=UTF-8&output=gplain

"First, let me start off by saying I am going to be buying a Voodoo
card. For low end comsumer grade flight sims and such, the Voodoo
looks like about the best thing available. Second, I am not
necessarily responding to just you, because there seems to be a hell
of a lot of confusion about Lockheed Martin's graphics accelerators. I
have been seeing posts all over the place confusing the R3D/100 with
the AGP/INTEL project that L.M. is working on. The R3D/100 is *NOT*
the chipset that is being developed for the AGP/INTEL partnership.

However, since your inference is that the Voodoo is faster than the
R3D/100, I have to say that you are totally dead wrong. While the
specs say that the Voodoo is *capable* of rendering a higher number of
pixels per second, or the same number of polygons per second as the
R3D/100, the specs fail to mention that these are not real world
performance figures any you probably will not ever see the kind of
performance that 3Dfx claims to be able to acheive. This does *not*
mean that the Voodoo is not a good (its great actually) card, just
that the game based 3D accelerator companies (all of them) don't tell
you the whole story.

The Voodoo uses a polygon raster processor. This accelerates line and
polygon drawing, rendering, and texture mapping, but does not
accelerate geometry processing (i.e. vertex transformation like rotate
and scale). Geometry processing is left to the CPU on the Voodoo, as
on every other consumer (read: game) grade 3D accelerator. Because the
CPU must handle the geometry transforms and such, you will never see
anything near what 3Dfx, Rendition, or any of the other manufacturers
claim until CPUs get significantly faster (by at least an order of
magnitude). The 3D accelerator actually has to wait for the CPU to
finish processing before it can do its thing.

I have yet to see any of the manufacturers post what CPU was plugged
into their accelerator, and what percentage of CPU bandwidth was being
used to produce the numbers that they claim. You can bet that if it
was done on a Pentium 200, the only task the CPU was handling was
rendering the 3D model that they were benchmarking. For a game,
rendering is only part of the CPU load. The CPU has to handle flight
modelling, enemy AI, environmental variables, weapons modelling,
damage modelling, sound, etc, etc.

The R3D includes both the raster accelerator (see above) and a 100
MFLOP geometry processing engine. Read that last line again. All
geometry processing is offloaded from the system CPU onto the R3D's
floating-point processor, allowing the CPU to handle more important
tasks. The Voodoo does not have this, and if it were to add a geometry
processor, you would have to more than double the price of the card.

The R3D also allows for up to 8 MB of texture memory (handled by a
separate texture processor), which allows not only 24-bit texture maps
(RGB) but also 32-bit maps (RGBA), the additional 8 bits being used for
256-level transparency (alpha). An additional 10 MB can be used for
frame buffer memory, and 5 MB more for depth buffering.

There are pages and pages of specs on the R3D/100 that show that, in
the end, it is a better card than the Voodoo and other consumer
accelerator cards, but I guess the correct question is: for what? If
the models in your scene are fairly low-detail (as almost all games
are - even the real CPU pigs like Back to Baghdad), then the R3D would
be of little added benefit over something like the Voodoo. However,
when you are doing scenes where the polys are 2x+ more than your
typical 3D game, the R3D really shines. The R3D is and always was
designed for mid- to high-end professional-type applications, where
the R3D/1000 (much, much faster than the 100) would be too expensive,
or just plain overkill. I've seen the 1000 and I have to say that it
rocks! I had to wipe the drool from my chin after seeing it at
SIGGRAPH. (We're talking military-grade simulation equipment there,
boys, both in performance and price!)

Now then, as I mentioned before, I'm going to be buying the Voodoo for
my home system, where I would mostly be playing games. But I am
looking at the R3D for use in professional 3D applications. More
comparable 3D accelerators would not be of the Voodoo or Rendition
genre, but more along the lines of high-end GLINT-based boards
containing Delta geometry accelerator chips (and I don't mean the
low-end game-based GLINT chips, or even the Permedia for that matter),
or possibly the next line from Symmetric (Glyder series), or
Intergraph's new professional accelerator series."

[unquote]


Ahem, I apologize for making a really huge deal out of this. I am not
trying to be anal or trying to flame anyone, just pointing out
something that is quite significant IMHO, and significant to most
people that work with 3D graphics (I don't myself). I feel that
person's post is right in line with my thinking as far as making a
distinction between rasterizers / 3D accelerators, which only tackle
part of the rendering pipeline (leaving the rest for the CPU), and full
polygon processors with geometry & lighting onboard, aka 'GPUs'.
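
As a purely back-of-the-envelope reading of that 100 MFLOP geometry engine
figure (my own rough arithmetic, not numbers from the 1996 post), here's
what it works out to in vertices per second if you assume a plain 4x4
transform plus one diffuse light per vertex:

// Rough estimate only; the per-vertex FLOP counts are assumptions.
#include <cstdio>

int main() {
    const double engine_flops  = 100e6;    // the quoted 100 MFLOPS
    const double transform_ops = 16 + 12;  // 4x4 matrix * vector: 16 mul + 12 add
    const double lighting_ops  = 30;       // assumed cost of one diffuse light
    const double per_vertex    = transform_ops + lighting_ops;

    // roughly 1.7 million transformed-and-lit vertices per second
    std::printf("%.1f million vertices/second\n",
                engine_flops / per_vertex / 1e6);
}

Even with generous assumptions that's only a couple of million vertices per
second - modest by later standards, but it's work the host CPU no longer
has to do.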
 

Larry Roberts

Actually, I don't just use the term 'GPU' as Nvidia uses it. To myself
and to many who use graphics processors, something that takes the
geometry processing load OFF the CPU and puts it on the graphics chip -
that's a 'graphics processor', or graphics processing unit / GPU as
Nvidia coined it. The 3dfx Voodoo chips, including the VSA-100s used in
the Voodoo5 5500 and 6000, did NOT do that at all. Neither did any of
the pre-Radeon ATI chips, including the dual Rage Fury chips in the MAXX
card. And basically neither did any consumer 3D PC chip before the
GeForce256. Any graphics chip that lacks what used to be called
'geometry processing', what was commonly called T&L in late 1999 when
the GeForce came out, and what is now called vertex shading - if it
lacks that, it's usually considered a 3D accelerator or rasterizer
rather than a complete 'graphics processor' or GPU. At least that is
the way I have understood things for a long time.

Well. By your explanation, a GPU does all processing itself.
My old GF2 card didn't support the DX 8 hardware shaders, so I guess
it stopped being a GPU. Now I have an actual GPU card (GF3), but I
guess since it doesn't support DX 9 hardware shaders, I can't call it
a GPU either.
 

Larry Roberts

By the time that the Voodoo 3 came out it was rapidly becoming too
late for 3DFX. They really blew it with crappy drivers on their
Voodoo Rush chipset and then the Voodoo Banshee after that.

I don't think the Rush and Banshee helped kill 3DFX. The Banshee was
understood to be entry-level performance compared to the Voodoo 2. I
bought a Banshee myself, and found it to be a nice, cheap upgrade from
my previous 8MB Verite 2200 + 4MB Voodoo 1. I could only get about
28fps in Q2 at 640x480 using the Verite 2200, about 34fps with the
Voodoo 1, and a whopping 50fps with the Banshee. I was so happy then.
Now we complain if we can't get 100fps. :)
 

Mark Leuck

Larry Roberts said:
I don't think the Rush and Banshee helped kill 3DFX. The Banshee was
understood to be entry-level performance compared to the Voodoo 2. I
bought a Banshee myself, and found it to be a nice, cheap upgrade from
my previous 8MB Verite 2200 + 4MB Voodoo 1. I could only get about
28fps in Q2 at 640x480 using the Verite 2200, about 34fps with the
Voodoo 1, and a whopping 50fps with the Banshee. I was so happy then.
Now we complain if we can't get 100fps. :)

The Banshee did help kill 3dfx because 3dfx was in such a hurry to release
the all-in-one 3D card that they took people away from the Rampage project
to work on it.
 

calypso

Look... GPU stands for Graphics Processing Unit, right?

The acronym says it's a unit that processes graphics... So, looking at it
that way, all the 2D GPUs back in the time of Hercules, CGA, EGA, VGA,
blah blah, up to the newest GPUs are the same thing... Units that have only
one thing to do - process graphics...

You can now talk about high-performance SGI GPUs, all the stuff you
mentioned, and yes, all of these are GPUs, just like all the stuff I
mentioned...

But if you say 3D GPU only, then it's another thing to discuss... Looking
at it that way, Voodoo 1 and 2 weren't true GPUs, but 3D-only chips (which
they were, in fact)... ;)


EOD...

--

Damir Lukic, (e-mail address removed)
a member of hr.comp.hardver FAQ-team
 

Thomas

Radeon350 said:
OK, this post is sort of for you, and for Tony, or anyone who doesn't
really draw the line between a rasterizer / 3D accelerator like the
3dfx Voodoo 1/2/3, Banshee, VSA-100, PowerVR Series 1/2/3, Riva 128,
TNT1/2, Rage 128, Rage Fury etc., and a full-on 'graphics processor'
or GPU or polygon processor chipset (GeForce 1-4, GFFX, all the
Radeons, the Lockheed Real3D series, 3DLabs GLINT+Delta, Evans &
Sutherland RealIMAGE, 3DLabs Wildcat, etc.).

What I am posting below is a very good (IMHO) post from 1996 from a
guy who explained the differences (and made a distinction) between
Voodoo Graphics or similar consumer 3D accelerators/rasterizers of the
time, and full 3D polygon processors (the equivalent of today's GPUs)
with geometry engines/processors - like Lockheed's non-consumer
Real3D/100, which was a true 'graphics processor' chipset (not the
horrible consumer Intel/R3D i740 used in Starfighter cards, which had
not yet been revealed in 1996). At that time, there were NO consumer
PC 3D chips with geometry processing / T&L. In other words, there were
no consumer GPUs in 1996 - not until 1999's GeForce256.

Yes, sure, the name GPU was invented back then. It was a 'revolution' in 3D
cards. The 'GPU' has more capabilities and hardware support than the
previous generations of video cards.

*BUT*, there have been many, many more revolutions, like for instance the
pixel shader. The DirectX 8 compliant cards are the first ones capable of
doing this. Great. But they didn't come up with a new name, like PSGPU, or
whatever. It's just that NVidia chose to change the name of the graphics
chip to GPU. For me, it's nonsense to claim that it's a special thing that
the Volari is the first dual-GPU video card, since you're really just
referring to a dual video-chip card. It's the card with the latest
generation of video chips that has been launched, and since it's the most
recent card, there's nothing special in that.

Well, this doesn't really lead anywhere ;-) I think we all know what we all
mean, so no point in arguing about names ;-)
 
