ATI to acquire XGI?


PowerPC 603e

Yes, this could very well be, in part, to do with Xbox2, because XGI's partner
is SiS, who is providing Xbox2's south bridge (part of Xbox2's chipset).


http://www.beyond3d.com/forum/viewtopic.php?t=10671

QUOTE

"Lob this in the rumour mill for the time being.... "

Quote:
__________________________________________________________
ATI TECHNOLOGIES INC. (ATYT)
May seek to merge with or acquire a graphics unit of Taiwan's Silicon
Integrated Systems Corp., according to a report in the Economic Daily News.
__________________________________________________________

Quote:
"[Edit] I said "lob this in the rumour mill" because this is copied from a
poster over on the Yahoo finance message boards, which isn't really the
bastion of news sources!! However, it looks like the message is copied and
another separate poster posted the same thing in a very close timeframe.

What we do know: ATI and SiS are going to be working together for the XBox2
and ATI recently filed to sell off a bunch of their stock in order to raise
$500M.

What's not clear is whether this is for XGI (they are no longer SiS's
graphics division; they are only partially owned by SiS), or whether it's SiS
directly - SiS have a lot of chipset experience that ATI are no doubt keen
to get access to."

My 2 cents:

Good for ATI if they do get XGI, but not so good for users, because it
reduces competition in the industry.

ATI, SiS, and IBM are all providing key pieces of Xbox2.
 

The Black Wibble

PowerPC 603e said:
Yes, this could very well be, in part, to do with Xbox2, because XGI's partner
is SiS, who is providing Xbox2's south bridge (part of Xbox2's chipset).

Volari. Hahahahaha! Hahahahahahahahahahahah! HAHAHAHAHHAHAHAHAHAHA!

[...]
My 2 cents:

Good for ATI if they do get XGI, but not so good for users, because it
reduces competition in the industry.

Volari, man! Hahahahaha! HAHAHAHAHA!
ATI, SiS, and IBM are all providing key pieces of Xbox2.

Tony.
 

Dark Avenger

The Black Wibble said:
PowerPC 603e said:
Yes, this could very well be, in part, to do with Xbox2, because XGI's partner
is SiS, who is providing Xbox2's south bridge (part of Xbox2's chipset).

Volari. Hahahahaha! Hahahahahahahahahahahah! HAHAHAHAHHAHAHAHAHAHA!

[...]
My 2 cents:

Good for ATI if they do get XGI, but not so good for users, because it
reduces competition in the industry.

Volari, man! Hahahahaha! HAHAHAHAHA!
ATI, SiS, and IBM are all providing key pieces of Xbox2.

Tony.

Well, though you sound a little hysterical, you are right.

XGI's top product, costing around 350 euros at least, has definite faults
in its design! For instance, only a very slow memory bus between
both GPUs! The memory bus from each GPU to its own little memory chips
is fast, but... to then have a very narrow memory bus between both
GPUs... killing!

Also, to have no built-in compression, that sounds like the
Parhelia... I mean, we live in 2004; no compression built into your
card is so 1998! A little fast Z culling, Z compression and color
compression and you can pump much more over your memory buses!

And then the GPUs, fair to be said, the GPUs do their work... maybe
the output is not yet of good quality, but... they do their work! And the
dual-GPU cards are basically as fast as their... one-GPU cards. Again,
because of that missing compression and the very narrow memory bus between
both GPUs!

So basically XGI has a lot of work to do; this board never should have
left the drawing board. This is definitely a sign of a board being
PUSHED onto the market way too early!

Now hopefully XGI has had a lot of time to think... and has been
looking at improving their designs... and adding compression!
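
(To put a number on that compression point, here is a toy C sketch. It is
purely my own illustration with made-up pixel data; real GPU color/Z
compression is tile-based hardware, and nothing below is XGI's or anyone
else's actual scheme. It just shows how even naive run-length coding shrinks
what a scanline costs on the memory bus.)

    /* Toy illustration of framebuffer color compression: run-length
     * code one scanline of 32-bit pixels and compare the bytes that
     * would have to cross the memory bus. Made-up data, not any
     * vendor's real compression scheme. */
    #include <stdio.h>
    #include <stdint.h>

    #define WIDTH 640

    /* Encode as (count, pixel) pairs; return the encoded size in bytes. */
    static size_t rle_size(const uint32_t *px, size_t n)
    {
        size_t out = 0, i = 0;
        while (i < n) {
            size_t run = 1;
            while (i + run < n && px[i + run] == px[i] && run < 255)
                run++;
            out += 1 + sizeof(uint32_t);  /* one count byte + one pixel */
            i += run;
        }
        return out;
    }

    int main(void)
    {
        uint32_t scanline[WIDTH];
        /* typical game scanline: long flat runs of sky and wall color */
        for (int i = 0; i < WIDTH; i++)
            scanline[i] = (i < 400) ? 0xFF87CEEBu : 0xFF333333u;

        size_t raw = WIDTH * sizeof(uint32_t);
        size_t packed = rle_size(scanline, WIDTH);
        printf("raw: %zu bytes, compressed: %zu bytes (%.1fx less bus traffic)\n",
               raw, packed, (double)raw / packed);
        return 0;
    }

(On a mostly flat scanline like that, the same physical bus carries only a
small fraction of the bytes, which is exactly the bandwidth a card without
compression leaves on the table.)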
 

Dirk Dreidoppel

Well, though you sound a little hysterical, you are right.
XGI's top product, costing around 350 euros at least, has definite faults
in its design! For instance, only a very slow memory bus between
both GPUs! The memory bus from each GPU to its own little memory chips
is fast, but... to then have a very narrow memory bus between both
GPUs... killing!

Also, to have no built-in compression, that sounds like the
Parhelia... I mean, we live in 2004; no compression built into your
card is so 1998! A little fast Z culling, Z compression and color
compression and you can pump much more over your memory buses!

And then the GPUs, fair to be said, the GPUs do their work... maybe
the output is not yet of good quality, but... they do their work! And the
dual-GPU cards are basically as fast as their... one-GPU cards. Again,
because of that missing compression and the very narrow memory bus between
both GPUs!

So basically XGI has a lot of work to do; this board never should have
left the drawing board. This is definitely a sign of a board being
PUSHED onto the market way too early!

Now hopefully XGI has had a lot of time to think... and has been
looking at improving their designs... and adding compression!

Anyone yet thinking what could be the result if ATI were to acquire XGI?
Like ATI building dual-chip cards with next-generation GPUs that are as
great in their time as their current ones? And the shortcomings of XGI's
original design supposedly fixed? A card that actually delivers nearly
twice the power of a single-chip one, while the single-chip one is still up
to par with or even superior to any Nvidia offering? That could well
destroy NV and leave ATI with a monopoly. And as much as I dislike NV, an
ATI monopoly would hurt everyone, both in pricing and in the speed of further
development. Maybe NV could counter that if they were able to make use of
the 3dfx SLI tech, which they own. But that's just a maybe, and it surely
won't work with the GF line of GPUs. The time to develop a new SLI-capable
GPU could possibly make NV miss yet another product cycle.
 

Eric Pobirs

Anyone yet thinking what could be the result if ATI were to acquire XGI?
Like ATI building dual-chip cards with next-generation GPUs that are as
great in their time as their current ones? And the shortcomings of XGI's
original design supposedly fixed? A card that actually delivers nearly
twice the power of a single-chip one, while the single-chip one is still up
to par with or even superior to any Nvidia offering? That could well
destroy NV and leave ATI with a monopoly. And as much as I dislike NV, an
ATI monopoly would hurt everyone, both in pricing and in the speed of further
development. Maybe NV could counter that if they were able to make use of
the 3dfx SLI tech, which they own. But that's just a maybe, and it surely
won't work with the GF line of GPUs. The time to develop a new SLI-capable
GPU could possibly make NV miss yet another product cycle.
http://www.tomshardware.com/graphic/19991230/

ATI already did this previously with the Rage Fury MAXX. It was a
stopgap measure while they were getting the first-generation Radeon
prepared. (One of NVIDIA's customers created a dual-GPU card a long time ago,
but it was never released.) This is not something they or any other video
chip company would need help from XGI to do. This sort of thing is really
fairly trivial, as video rendering lends itself very well to being divvied up
among multiple processors. In fact, most current video chips perform multiple
parallel operations internally. Doing it with separate chips is little
different, but for the performance losses from communication between the chips
for coordination.
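
(For what it's worth, here is a minimal C sketch of why the divvying-up is
nearly trivial: scan-line interleaving, the split 3dfx's SLI made famous,
with two software stand-ins for the chips. Each "chip" fills its own rows
without ever consulting the other; the only coordination left is merging the
finished frame, which is where the real-world losses come from.)

    /* Sketch of scan-line interleaving between two "chips": chip 0
     * draws the even rows, chip 1 the odd rows. Software stand-ins
     * only; real SLI hardware merges the rows at scan-out. */
    #include <stdio.h>
    #include <stdint.h>

    #define W 8
    #define H 8

    static uint8_t fb[H][W];    /* shared framebuffer */

    /* Pretend GPU: shades every row whose parity matches its id. */
    static void chip_render(int id)
    {
        for (int y = id; y < H; y += 2)        /* interleave by row parity */
            for (int x = 0; x < W; x++)
                fb[y][x] = (uint8_t)(id + 1);  /* mark which chip drew it */
    }

    int main(void)
    {
        chip_render(0);  /* even scanlines; could run on one chip...  */
        chip_render(1);  /* ...while the other handles the odd ones   */

        for (int y = 0; y < H; y++) {          /* show the interleave */
            for (int x = 0; x < W; x++)
                printf("%d", fb[y][x]);
            printf("\n");
        }
        return 0;
    }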
 

Dirk Dreidoppel

ATI already did this previously with the Rage Fury MAXX. It was a
stopgap measure while they were getting the first-generation Radeon
prepared. (One of NVIDIA's customers created a dual-GPU card a long time ago,
but it was never released.) This is not something they or any other video
chip company would need help from XGI to do. This sort of thing is really
fairly trivial, as video rendering lends itself very well to being divvied up
among multiple processors. In fact, most current video chips perform multiple
parallel operations internally. Doing it with separate chips is little
different, but for the performance losses from communication between the chips
for coordination.

Well, the MAXX wasn't overly effective. The only ones with decent multichip
solutions to date were 3dfx.
 

Dark Avenger

Anyone yet thinking what could be the result if ATI were to acquire XGI?
Like ATI building dual-chip cards with next-generation GPUs that are as
great in their time as their current ones? And the shortcomings of XGI's
original design supposedly fixed? A card that actually delivers nearly
twice the power of a single-chip one, while the single-chip one is still up
to par with or even superior to any Nvidia offering? That could well
destroy NV and leave ATI with a monopoly. And as much as I dislike NV, an
ATI monopoly would hurt everyone, both in pricing and in the speed of further
development. Maybe NV could counter that if they were able to make use of
the 3dfx SLI tech, which they own. But that's just a maybe, and it surely
won't work with the GF line of GPUs. The time to develop a new SLI-capable
GPU could possibly make NV miss yet another product cycle.

Pff, well... I don't think ATI should go dual-chip... I guess 3dfx and
XGI have now shown that unless you truly put considerable work into
getting the performance maxed on such cards, they are just too expensive!

The problem: how to get both GPUs to work well AND to work well
together.

Now, if XGI rethinks its position... and does like S3 (VIA)...
sneaks up from the mainstream cards! That would be smart. Right now they are
overhyped and way too slow; the card is simply not ready. It's just bad
news...
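
(The "work well together" cost is easy to put a rough number on. Below is a
back-of-envelope C sketch using Amdahl's law; the sync fractions are invented
for illustration, not measured on any Volari or any other card.)

    /* Amdahl's law for a two-chip card: if a fraction s of frame time
     * must serialize on the inter-GPU bus (sync, shared data, frame
     * merge), the speedup over one chip is 1 / (s + (1 - s) / 2).
     * The fractions below are made up for illustration. */
    #include <stdio.h>

    int main(void)
    {
        double sync_frac[] = { 0.05, 0.20, 0.50 };

        for (int i = 0; i < 3; i++) {
            double s = sync_frac[i];
            double speedup = 1.0 / (s + (1.0 - s) / 2.0);  /* 2 chips */
            printf("%2.0f%% of frame time on the inter-GPU bus -> %.2fx\n",
                   s * 100.0, speedup);
        }
        return 0;
    }

(With these toy numbers, 20% serialization already cuts the second chip's
benefit to about 1.67x, and at 50% you are down to 1.33x, heading straight
for the "dual card barely faster than the single card" territory complained
about above.)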
 

The Black Wibble

Dark Avenger said:
"The Black Wibble" <[email protected]> wrote in message

Well, though you sound a little hysterical, you are right.

XGI's top product, costing around 350 euros at least, has definite faults
in its design! For instance, only a very slow memory bus between
both GPUs! The memory bus from each GPU to its own little memory chips
is fast, but... to then have a very narrow memory bus between both
GPUs... killing!

Well, exactly... It makes the other GPU a waste of space.
Also, to have no built-in compression, that sounds like the
Parhelia... I mean, we live in 2004; no compression built into your
card is so 1998! A little fast Z culling, Z compression and color
compression and you can pump much more over your memory buses!

And then the GPUs, fair to be said, the GPUs do their work... maybe
the output is not yet of good quality, but... they do their work! And the
dual-GPU cards are basically as fast as their... one-GPU cards. Again,
because of that missing compression and the very narrow memory bus between
both GPUs!

So basically XGI has a lot of work to do; this board never should have
left the drawing board. This is definitely a sign of a board being
PUSHED onto the market way too early!

You'd think one of XGI's engineers would have said something early on. "Hey, guys... Let's be realistic.
This is a ~crap~ design, and so we should redo this, this, and this before we reach the point of no return."
But no! Evidently, there have been more than a few blonde moments during the development process, because the
result is an extremely flawed product. The blatant cheats in the driver code have only compounded the shame.
Now hopefully XGI has had a lot of time to think... and has been
looking at improving their designs... and adding compression!

I'm not so sure. If you had shares in XGI, would you keep them?

Tony.
 

The Black Wibble

Dirk Dreidoppel said:
Anyone yet thinking what could be the result if ATI were to acquire XGI?
Like ATI building dual-chip cards with next-generation GPUs that are as
great in their time as their current ones? And the shortcomings of XGI's
original design supposedly fixed? A card that actually delivers nearly
twice the power of a single-chip one, while the single-chip one is still up
to par with or even superior to any Nvidia offering? That could well
destroy NV and leave ATI with a monopoly. And as much as I dislike NV, an

If ATI wanted to build a dual-GPU card to destroy nVidia's monopoly, why would ATI seek to acquire XGI in
order to do it? The dual-processor Volari is a complete disaster in its design. It would make more sense to
acquire my girlfriend to draw the GPU-to-GPU interconnections with her lipstick.
ATI monopoly would hurt everyone, both in pricing and in the speed of further
development. Maybe NV could counter that if they were able to make use of
the 3dfx SLI tech, which they own. But that's just a maybe, and it surely
won't work with the GF line of GPUs. The time to develop a new SLI-capable
GPU could possibly make NV miss yet another product cycle.

I suppose having two powerful PCI-X cards running in SLI mode is kind of drool-worthy, but there is no hint of
nVidia moving towards that end. They would be aiming for big improvements in the single-GPU, single-card design
as per usual. nVidia's CEO said "if we're not a lot more than 2 times faster I'm going to be very disappointed"
in referring to the NV4x cards. I don't think there is any enthusiasm for SLI, either from nVidia or from
hardcore gamers.

Tony.
 

Eric Pobirs

Dirk Dreidoppel said:
Well, the MAXX wasn't overly effective. The only ones with decent multichip
solutions to date were 3dfx.

No surprise there. The ATI chips of that generation weren't very
competitive individually, and thus doubling them up was only limited
compensation. Much the same can be said for XGI's product, which is why I
always regard such designs as suspect. Unless they can deliver the dual-chip
design at a remarkably low price, it is simply a dead giveaway that the
company behind the product hasn't got the goods to compete.

The question wasn't whether it was a good idea, only whether ATI needed to
acquire technology from another company to do it. The answer is that neither
they nor any other company worth mentioning would find this more than a
trivial task.

It should be noted that the multichip designs from 3dfx weren't much to
write home about either. (Unless you were in the arcade business like
Quantum, which was the reason the SLI and later multichip AGP boards were
originally created; there's much less price sensitivity in that application.)
Remember, the company released those products while it was in its death
spiral. I was at the Comdex press conference where they introduced that chip
generation. (8:00 AM at the Venetian with no coffee until afterwards!) When
they went on for over an hour talking about what a swell company they were,
but not about any new product, I got a very bad feeling. I'd been to a lot of
these events, and this was shaping up into the sort of behavior displayed by
companies that have lost their direction. By the time it was over, it was
apparent that the new boards were months away (almost six months, it turned
out), and the demos were so vague they gave us hardly anything to really
write about.

They did give out a really nice laptop backpack that I still use, so it
wasn't a complete waste of time.
 

John Lewis

Yes, this could very well be, in part, to do with Xbox2, because XGI's partner
is SiS, who is providing Xbox2's south bridge (part of Xbox2's chipset).


http://www.beyond3d.com/forum/viewtopic.php?t=10671
QUOTE

"Lob this in the rumour mill for the time being.... "

Quote:
__________________________________________________________
ATI TECHNOLOGIES INC. (ATYT)
May seek to merge with or acquire a graphics unit of Taiwan's Silicon
Integrated Systems Corp., according to a report in the Economic Daily News.
__________________________________________________________

Quote:
"[Edit] I said "lob this in the rumour mill" because this is copied from a
poster over on the Yahoo finance message boards, which isn't really the
bastion of news sources!! However, it looks like the message is copied and
another separate poster posted the same thing in a very close timeframe.

What we do know: ATI and SiS are going to be working together for the XBox2
and ATI recently filed to sell off a bunch of their stock in order to raise
$500M.

What's not clear is whether this is for XGI (they are no longer SiS's
graphics division; they are only partially owned by SiS), or whether it's SiS
directly - SiS have a lot of chipset experience that ATI are no doubt keen
to get access to."

May it be a long and happy marriage........ ATI and XGI deserve each
other. By the time Microsoft finishes with them, they won't be able to
afford a divorce.

John Lewis
 

J. Clarke

The Black Wibble said:
Dark Avenger said:
"The Black Wibble" <[email protected]> wrote in message
news:[email protected]... [...]
Good for ATI if they do get XGI, but not so good for users, because it
reduces competition in the industry.

Volari, man! Hahahahaha! HAHAHAHAHA!

ATI, SiS, and IBM are all providing key pieces of Xbox2.

Tony.

Well, though you sound a little hysterical, you are right.

XGI's top product, costing around 350 euros at least, has definite faults
in its design! For instance, only a very slow memory bus between
both GPUs! The memory bus from each GPU to its own little memory chips
is fast, but... to then have a very narrow memory bus between both
GPUs... killing!

Well, exactly... It makes the other GPU a waste of space.
Also, to have no built-in compression, that sounds like the
Parhelia... I mean, we live in 2004; no compression built into your
card is so 1998! A little fast Z culling, Z compression and color
compression and you can pump much more over your memory buses!

And then the GPUs, fair to be said, the GPUs do their work... maybe
the output is not yet of good quality, but... they do their work! And the
dual-GPU cards are basically as fast as their... one-GPU cards. Again,
because of that missing compression and the very narrow memory bus between
both GPUs!

So basically XGI has a lot of work to do; this board never should have
left the drawing board. This is definitely a sign of a board being
PUSHED onto the market way too early!

You'd think one of XGI's engineers would have said something early on.
"Hey, guys... Let's be realistic. This is a ~crap~ design, and so we
should redo this, this, and this before we reach the point of no return."
But no! Evidently, there have been more than a few blonde moments during
the development process, because the result is an extremely flawed product.
The blatant cheats in the driver code have only compounded the shame.

You've clearly never worked for the kind of PHB who shoots the messenger.
Perhaps one of the engineers _did_ say that, and after the others saw what
happened to him they decided to keep mum. Not saying that's what happened
in this case, but I've seen it happen often enough that it wouldn't surprise
me.
 

Dirk Dreidoppel

It should be noted that the multichip designs from 3dfx weren't much to
write home about either. (Unless you were in the arcade business like
Quantum, which was the reason the SLI and later multichip AGP boards were
originally created; there's much less price sensitivity in that application.)
Remember, the company released those products while it was in its death
spiral. I was at the Comdex press conference where they introduced that chip
generation. (8:00 AM at the Venetian with no coffee until afterwards!) When
they went on for over an hour talking about what a swell company they were,
but not about any new product, I got a very bad feeling. I'd been to a lot of
these events, and this was shaping up into the sort of behavior displayed by
companies that have lost their direction. By the time it was over, it was
apparent that the new boards were months away (almost six months, it turned
out), and the demos were so vague they gave us hardly anything to really
write about.

To be fair, the VSA-100 chip wasn't that bad at all. I bought my Voodoo 5
5500 at the end of 2000, just when 3dfx went down for good. Even without
driver updates (except for fan-made ones) this card played everything I
wanted it to until 2002. And even in 2003 it still ran most titles, though
the lack of T&L and of OpenGL 1.2 and higher support started to hurt. I still
waited until Christmas 2003 before I got my Radeon. That old V5 even played
UT2K3 well enough. Granted, it wasn't Rampage, which might have saved 3dfx
if it had been out in 2001 as planned.
 

John Lewis

To be fair, the VSA-100 chip wasn't that bad at all. I bought my Voodoo 5
5500 at the end of 2000, just when 3dfx went down for good. Even without
driver updates (except for fan-made ones) this card played everything I
wanted it to until 2002. And even in 2003 it still ran most titles, though
the lack of T&L and of OpenGL 1.2 and higher support started to hurt. I still
waited until Christmas 2003 before I got my Radeon. That old V5 even played
UT2K3 well enough. Granted, it wasn't Rampage, which might have saved 3dfx
if it had been out in 2001 as planned.

Agreed.

I have two machines on a KB/mouse/monitor/audio switch.

One has a 1.1GHz Celeron with a V5 5500 and WinMe, which I use for email,
news, web browsing, and my legacy DOS and Glide game titles. With the
1.1GHz CPU/V5 5500, it runs these titles like a dream.

The other machine has a 3.06/533 P4 with a GF FX5900 and WinXP, which
I use for all heavy-duty apps, including video editing and current game
titles.

John Lewis
 
