Intel buying Nvidia rumours begin

Sebastian Kaliszewski

Tony said:
Wow, that would be an extremely stupid move by Intel if they did
that... almost as stupid as AMD buying ATI!

Intel buying NVidia being a stupid move is understandable -- the companies
are 'incompatible' (completely different 'cultures'), so the result might be
just wasted resources, with AMD/ATI gaining a near monopoly in a few years in
the higher (and more profitable) part of the GPU market (leaving the
bottom-feeder segment to nIntel, which Intel has anyway without becoming
nIntel).

But why is buying ATI stupid?


rgds
 
Yousuf Khan

Tony said:
Wow, that would be an extremely stupid move by Intel if they did
that... almost as stupid as AMD buying ATI!

It'll never get past the anti-trust regulators anyway. AMD buying ATI to get
into a market where it previously had 0% market share is one thing. But
Intel buying Nvidia to go from roughly 60% market share to somewhere around
80% is unlikely to ever pass muster.

The anti-trust regulators in the US might be pussies when it comes to
enforcing anti-trust rules against Intel's existing CPU monopoly, but
there's no way that they are going to approve the creation of a new
monopoly.

Yousuf Khan
 
Yousuf Khan

Sebastian said:
But why is buying ATI stupid?

The idea has grown on me. I myself had initial concerns about the price
that AMD is paying for ATI. Paying $5.4B after previous announcements of
expenditures of $2.5B (Dresden) and $3.0B (New York) seemed a bit nutty.
However, it's not as if AMD is buying a dotcom-bubble "potential value"
company here; it's buying a company with real revenues and profits. Those
revenues will help AMD pay down the cost of the purchase itself.

Yousuf Khan
 
Tony Hill

Intel buying NVidia being a stupid move is understandable -- the companies
are 'incompatible' (completely different 'cultures'), so the result might be
just wasted resources, with AMD/ATI gaining a near monopoly in a few years in
the higher (and more profitable) part of the GPU market (leaving the
bottom-feeder segment to nIntel, which Intel has anyway without becoming
nIntel).

But why is buying ATI stupid?

I've mentioned this a few times in the past. Basically I think there
are a few things. First, AMD is spending a LOT of money to get into a
very low-profit business. Second, they are endangering their
relationship with their #1 partner nVidia, who supplies more mainboard
chipsets and video chipsets than any other company for systems with
AMD CPUs. Third, they will likely be left at a disadvantage when
trying to sell video chipsets for systems using Intel motherboard
chipsets (probably about 60% of the video card market). Similarly,
Intel could well pull AMD/ATI's license to produce chipsets for Intel
processors.

But finally, and most importantly, loss of focus. The main reason, as I
see it, that AMD has done so well in the past few years and Intel so
poorly is focus. Intel has LOTS of other businesses in addition to
their CPUs, virtually every one of which is losing money.

Besides which, as you rightly say, Intel and nVidia's corporate
cultures likely wouldn't mesh that well. Similarly I'm not sure that
ATI and AMD's corporate cultures would mesh all that well either. All
in all, I see a lot of downsides and very few (if any) upsides to the
deal.
 
Sebastian Kaliszewski

Playing devil's advocate here... ;)
I've mentioned this a few times in the past. Basically I think there
are a few things. First, AMD is spending a LOT of money to get into a
very low-profit business.

AMD's primary business was low profit for them most of the time (except the
last two years) ;)
Second, they are endangering their
relationship with their #1 partner nVidia who supplies more mainboard
chipsets and video chipsets than any other company for systems with
AMD CPUs.

As long as AMD plays it nice, NVidia doesn't have much room left. Intel is
known for screwing their chipset (and other) partners more than once.
Telling AMD to f**k off is not in their interest.

Third, they will likely be left at a disadvantage when
trying to sell video chipsets for systems using Intel motherboard
chipsets (probably about 60% of the video card market).

AMD/ATI combined is stronger here than ATI alone. Plus AMD has various IP
cross-licensing agreements with Intel.

Similarly,
Intel could well pull AMD/ATI's license to produce chipsets for Intel
processors.

It's not so easy. And AMD has much more money for lawyers than ATI alone
(and past performance indicates that AMD is willing to use that).
But finally, and most importantly, loss of focus. The main reason, as I
see it, that AMD has done so well in the past few years and Intel so
poorly is focus. Intel has LOTS of other businesses in addition to
their CPUs, virtually every one of which is losing money.

Well, maybe they foresee a change of focus in the market. Look at it this
way -- CPUs are less and less important for PC performance. With stuff like
physics coprocessors entering the arena, the importance of the CPU as the
key performance component only decreases.

In Austria, in the first half of the 20th century, there was a company
which held an almost total monopoly on the production of horse-drawn
wagons. They even had various government aids like high import taxes on
foreign products. They were so big that they had their own ironworks
producing only for them. Then the 1950s came, and it was all kaput. The
market had vanished. They don't exist anymore, of course.

CPUs are the business which made both Intel and AMD significant. But will
that business be able to keep those companies going in the future (with all
their R&D costs and expenses)?
Besides which, as you rightly say, Intel and nVidia's corporate
cultures likely wouldn't mesh that well.

That's almost a given.
Similarly I'm not sure that
ATI and AMD's corporate cultures would mesh all that well either. All
in all, I see a lot of downsides and very few (if any) upsides to the
deal.

The main upsides are:
* Ability to create a Centrino counterpart.
* Better ability to play in a commoditised market.
* Better ability to play in the middle of the market, where the bread &
butter of the desktop PC is -- high-performance integrated AMD/ATI solutions
for stuff like media center PCs, with quite good playability of games and
such. Embedded graphics on a coherent HyperTransport link might enable
performance unseen in the embedded arena.
* AMD is one of the few companies in the world which have the appropriate
in-house know-how as well as state-of-the-art software & hardware for
high-performance IC design & development. If used properly, that ability
could translate into a significant improvement in the combined company's
GPU designs.
* The combined company finally has the whole platform in their hands -- look
how long it took to get decent chipsets for the K7/K8 platforms. VIA and
their chipsets, which for a few years (since the 586B southbridge) were
f***ing up data in multi-harddrive systems without even acknowledging the
problem (only releasing driver updates which never fully got rid of it),
can't be taken seriously. Now when AMD wants to change something they can
just do it, and the chipsets will be there (as they'll make their own).


rgds
 
chrisv

Sebastian said:
Well, maybe they foresee a change of focus in the market. Look at it this
way -- CPUs are less and less important for PC performance. With stuff like
physics coprocessors entering the arena, the importance of the CPU as the
key performance component only decreases.

Well, just when you think that CPUs are less important, along comes
M$ with "Vista" to bring your machine to its knees...
 
Sebastian Kaliszewski

chrisv said:
Well, just when you think that CPUs are less important, along comes
M$ with "Vista" to bring your machine to its knees...

But give "Vista" a half-decent GPU and enough RAM and it will fly even
without a Core 2 Duo.

The GPU is the key here... And look what AMD bought ;)
Then look at whether Intel's GPUs are half-decent or not, while even ATI's
cheap solutions can be regarded as such. :)


rgds
 
George Macdonald

I've mentioned this a few times in the past. Basically I think there
are a few things. First, AMD is spending a LOT of money to get into a
very low-profit business. Second, they are endangering their
relationship with their #1 partner nVidia, who supplies more mainboard
chipsets and video chipsets than any other company for systems with
AMD CPUs. Third, they will likely be left at a disadvantage when
trying to sell video chipsets for systems using Intel motherboard
chipsets (probably about 60% of the video card market). Similarly,
Intel could well pull AMD/ATI's license to produce chipsets for Intel
processors.

But finally, and most importantly, loss of focus. The main reason, as I
see it, that AMD has done so well in the past few years and Intel so
poorly is focus. Intel has LOTS of other businesses in addition to
their CPUs, virtually every one of which is losing money.

Besides which, as you rightly say, Intel and nVidia's corporate
cultures likely wouldn't mesh that well. Similarly I'm not sure that
ATI and AMD's corporate cultures would mesh all that well either. All
in all, I see a lot of downsides and very few (if any) upsides to the
deal.

I understand your caution here and agree with some of it... *depending* on
how things play out in the various current "markets" and how they evolve.
There's an interesting article here
http://www.edn.com/article/CA6262535.html on the "plateau" in GPU
processing as we know it: "pixel/vertex/triangle growth". The current
artificial layering/segmentation of the GPU market has gotten ridiculous -
just look at all the hacks to turn a GeForce into a Quadro, which actually
worked until nVidia decided to add a "fix"(??).

From the above article, the GPU is going to change dramatically in the
relatively near future... with more custom logic, a more CPU-like layout
and that's certainly an area where AMD(/IBM)'s bleeding edge process
technology can make a difference. Without this evolution, GPUs are hitting
the wall of what can be done with standard(-ish) cells and, it would
appear, have nowhere to go; i.e. your "low-profit business" profile gets
lower and lower. :)

I can see some *potential* synergies here. The AMD/nVidia relationship is
a difficult one to predict as far as future viability: as long as the
proposed Torrenza initiative is an external HT link, it could work and
nVidia can still compete with ATi in the AMD space... users can have their
choice; if it goes on-die, the whole game changes but that's a ways in the
future. With quad core dies on the horizon, how long before heterogeneous
multi-core chips?

It could be that the downside for GPU designers otherwise is, err,
oblivion. ;-)
 
Yousuf Khan

Tony said:
I've mentioned this a few times in the past. Basically I think there
are a few things. First, AMD is spending a LOT of money to get into a
very low-profit business. Second, they are endangering their
relationship with their #1 partner nVidia, who supplies more mainboard
chipsets and video chipsets than any other company for systems with
AMD CPUs. Third, they will likely be left at a disadvantage when
trying to sell video chipsets for systems using Intel motherboard
chipsets (probably about 60% of the video card market). Similarly,
Intel could well pull AMD/ATI's license to produce chipsets for Intel
processors.

I think AMD & ATI have already assumed that the Intel I/O chipset
business is going away. As for video chipsets, if we're talking about
video cards, then there's nothing Intel could do to stop AMD video cards
from being installed on their systems; it goes through PCI-E slots
anyway (i.e. an open standard, just as Intel wanted it to be). If we're
talking about motherboard-based non-integrated video chipsets, that's
not a huge market anyway, and it too goes through PCI-E connections.
In other words, there's not a lot Intel could do to stop AMD video
devices from working in their systems. The danger is the other way
around: there's a lot that AMD could do (if they wanted to) to cripple
video on Intel systems.

As for Nvidia, they'll have to just hold their noses and accept it. It's
either that, or deal with Intel.

As for AMD spending a lot of money on a low-margin business, they're
not likely buying it for the existing businesses (though they won't mind
the revenue from them); they're likely buying it for the future products
that will come out of it.
But finally, and most importantly, loss of focus. The main reason, as I
see it, that AMD has done so well in the past few years and Intel so
poorly is focus. Intel has LOTS of other businesses in addition to
their CPUs, virtually every one of which is losing money.

If what the next version of Windows bogs down these days is the GPU
rather than the CPU, then AMD is entering the right business at the
right time. There are bound to be a lot of video upgrades coming because
of this.
Besides which, as you rightly say, Intel and nVidia's corporate
cultures likely wouldn't mesh that well. Similarly I'm not sure that
ATI and AMD's corporate cultures would mesh all that well either. All
in all, I see a lot of downsides and very few (if any) upsides to the
deal.

ATI was a sad sack by itself, rocked by scandals, having to run just to
stay in place. The ATI employees are probably cheering for this as much as
the shareholders. And since AMD is buying them out for their technology
and engineers, that should mean AMD will keep them happy.

Yousuf Khan
 
George Macdonald

Well, just when you think that CPUs are less important, along comes
M$ with "Vista" to bring your machine to its knees...

.... and new video cards are needed to support DirectX 10, and they don't
exist yet.
 
Tony Hill

Playing devil's advocate here... ;)


AMD's primary business was low profit for them most of the time (except the
last two years) ;)

Prop up one low-profit business with a second low-profit business?
As long as AMD plays it nice, NVidia doesn't have much room left. Intel is
known for screwing their chipset (and other) partners more than once.
Telling AMD to f**k off is not in their interest.

I don't see them directly telling nVidia to screw off, but if they use
this new combined AMD-ATI to try and gain a competitive advantage in
the PC graphics market, then they are basically telling nVidia to
screw off. If they aren't doing this to gain a competitive advantage
then what was the point in the first place?
AMD/ATI combined is stronger here than ATI alone. Plus AMD has various IP
cross-licensing agreements with Intel.

AMD has some cross-licensing agreements, but none of them are likely
to cover video chipsets. Besides, it's not a matter of being legally
allowed to sell the chips that is the worry, it's about getting
pre-release info and help. Right now ATI and nVidia both get access
to Intel's chipsets LONG before they are released so that they can
develop video cards that will work with these chipsets. If Intel
stops providing these early chipsets and support in getting the cards
to work with them, ATI could be left at a serious disadvantage to
nVidia.
It's not so easy. And AMD has much more money for lawyers than ATI alone
(and past performance indicates that AMD is willing to use that)

AMD is explicitly forbidden from using Intel's processor bus
technologies. This is part of a long-standing agreement dating back
to the early 90's in an effort to prevent AMD from selling processors
that will work in the same motherboards as Intel chips. It isn't much
of a stretch at all to think this could apply to chipsets as well as
processors.

Right now ATI has a license that grants them the right to build
chipsets for Intel processors, but that license is definitely going to
be full of limitations. Intel HAS pulled companies' licenses in the
past. ServerWorks is a prime example: Intel all but terminated
their license after they were bought out by Broadcom. I *FULLY*
expect to see history repeat itself here, and like it or not, Intel
has every legal right to do so.
Well, maybe they foresee a change of focus in the market. Look at it this
way -- CPUs are less and less important for PC performance. With stuff like
physics coprocessors entering the arena, the importance of the CPU as the
key performance component only decreases.

Independent physics co-processors are a lost cause. If the technology
proves useful (somewhat questionable) then they'll get integrated into
a CPU. Having separate chips to handle these sorts of math things has
proven to be a bad idea.

Video, on the other hand, is a different story. Integrating video
onto the CPU has proven to be excruciatingly difficult. The problems
are two-fold: first, GPUs have a LOT of transistors, even more than
CPUs. Second, GPUs need HUGE amounts of memory bandwidth, while CPUs
need LOTS of memory and flexible memory configurations. While a GPU
can get by very nicely with 512MB of memory soldered onto a board,
that just isn't an option for a CPU. A server might need 64GB of
memory, while a desktop might only need 1GB. Both scenarios would
benefit little from the huge bandwidth offered but would suffer very
badly from the increased cost of the faster memory.
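To put rough numbers on that bandwidth gap, here's a back-of-envelope
sketch in Python. The figures (a 256-bit GDDR3 card at ~1400 MT/s versus
dual-channel DDR2-800 DIMMs feeding a CPU) are my own period-typical
assumptions, not numbers taken from this thread.

# Back-of-envelope peak memory bandwidth comparison (assumed figures).

def peak_bandwidth_gbs(bus_width_bits: int, transfers_mts: float) -> float:
    """Peak bandwidth in GB/s = bus width in bytes * effective transfer rate."""
    return (bus_width_bits / 8) * transfers_mts * 1e6 / 1e9

# Assumed: a high-end card with 256-bit GDDR3 at ~1400 MT/s effective,
# soldered right next to the GPU.
gpu_bw = peak_bandwidth_gbs(256, 1400)   # ~44.8 GB/s

# Assumed: a desktop CPU fed by dual-channel DDR2-800 (2 x 64-bit channels).
cpu_bw = peak_bandwidth_gbs(128, 800)    # ~12.8 GB/s

print(f"GPU local memory : ~{gpu_bw:.1f} GB/s")
print(f"CPU system memory: ~{cpu_bw:.1f} GB/s")
print(f"Ratio            : ~{gpu_bw / cpu_bw:.1f}x")

Roughly a 3-4x gap on bandwidth alone, before you even get to the capacity
and configuration flexibility a server needs.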
In Austria, in the first half of the 20th century, there was a company
which held an almost total monopoly on the production of horse-drawn
wagons. They even had various government aids like high import taxes on
foreign products. They were so big that they had their own ironworks
producing only for them. Then the 1950s came, and it was all kaput. The
market had vanished. They don't exist anymore, of course.

CPUs are the business which made both Intel and AMD significant. But will
that business be able to keep those companies going in the future (with all
their R&D costs and expenses)?

It's a business that has huge R&D and capital (if you own fabs) costs,
combined with low profit margins. Not an easy business to succeed in.
The end result is that we're left with little more than companies that
focus ONLY on building CPUs. Look at companies like Hitachi, TI,
Motorola, Digital/Compaq/HP, etc. etc. All used to build high-end
CPUs but they either got out of the market or spun that division off
on its own. Sun and Fujitsu are still struggling at it, but mostly
failing. The only exception to the above is IBM, who are the
exception to most rules in the computer world.
That's almost a given.


The main upsides are:
* Ability to create a Centrino counterpart.

That requires marketing much more than any technology, and marketing
is an area that neither AMD nor ATI is hugely strong at. Both are
out-marketed by their main rivals (Intel and nVidia respectively).
* Better ability to play in a commoditised market.

Commoditized for even less profit?
* Better ability to play in the middle of the market, where the bread &
butter of the desktop PC is -- high-performance integrated AMD/ATI solutions
for stuff like media center PCs, with quite good playability of games and
such. Embedded graphics on a coherent HyperTransport link might enable
performance unseen in the embedded arena.

Embedded graphics performance is 99% memory bandwidth, 1% everything
else. Unless you plan on dropping memory on the motherboard to
connect to your video card, you're actually better off with the
external memory controller as Intel does things. Ohh, and ATI already
tried an AMD Hypertransport compatible chipset with memory on the
system board... it was a miserable failure due to costs (motherboards
are the one part of the PC where profit margins are worse than CPUs
and GPUs).
* AMD is one of the few companies in the world which have the appropriate
in-house know-how as well as state-of-the-art software & hardware for
high-performance IC design & development. If used properly, that ability
could translate into a significant improvement in the combined company's
GPU designs.

I'll grant this as the one real advantage of the whole deal, combined
with AMD having their own fabs.
* The combined company finally has the whole platform in their hands -- look
how long it took to get decent chipsets for the K7/K8 platforms. VIA and
their chipsets, which for a few years (since the 586B southbridge) were
f***ing up data in multi-harddrive systems without even acknowledging the
problem (only releasing driver updates which never fully got rid of it),
can't be taken seriously. Now when AMD wants to change something they can
just do it, and the chipsets will be there (as they'll make their own).

Honestly, I think that this "advantage" doesn't really exist. When
the Athlon64 was released there were plenty of chipsets immediately
available. VIA was still screwing the pooch as usual, but nVidia was
there right from the get-go. AMD has done their own chipsets in the
past and they've proven to be inferior to nVidia's solutions for the
most part. This goes to show that just because AMD and ATI would be
one company, they aren't necessarily going to be any better at making
chipsets.

Besides, with AMD's current processors the real magic in chipsets is
in the "extra" stuff, ie PCI-Express, network chips, SATA, audio, etc.
The memory hangs off the processor so it's out of the equation and the
processor connects to the rest of the system by Hypertransport, which
is an open standard and relatively easy to implement. What this means
is that having knowledge of the CPU doesn't really buy you much of
anything when building the chipset and vice versa.
 
Tony Hill

I think AMD & ATI have already assumed that the Intel I/O chipset
business is going away. As for video chipsets, if we're talking about
video cards, then there's nothing Intel could do to stop AMD video cards
from being installed on their systems; it goes through PCI-E slots
anyway (i.e. an open standard, just as Intel wanted it to be).

The funny thing about standards is that they aren't. Especially when
Intel is involved. You can't just take a PCI-E card from one system
and expect that it will work, without fail, in another system. You
need to write new drivers and do extensive testing to make sure that
the thing will work *properly*, and that's where ATI's video chipset
business will suffer. If nVidia gets a 4-6 month lead on testing
their video cards with Intel's latest and greatest chipsets it will
pretty much kill ATI's chances of competing at the high-end.
 
Mark A

Tony Hill said:
The funny thing about standards is that they aren't. Especially when
Intel is involved. You can't just take a PCI-E card from one system
and expect that it will work, without fail, in another system. You
need to write new drivers and do extensive testing to make sure that
the thing will work *properly*, and that's where ATI's video chipset
business will suffer. If nVidia gets a 4-6 month lead on testing
their video cards with Intel's latest and greatest chipsets it will
pretty much kill ATI's chances of competing at the high-end.

Perhaps AMD/ATI doesn't really care that much about the high-end Intel
motherboard video market.

Intel is already the largest video chip maker in the world (by number of
units anyway) with their motherboards that have on-board video. Apparently
AMD feels that they need to be in the same market.
 
Sebastian Kaliszewski

Tony said:
Prop up one low-profit business with a second low-profit business?

Those businesses combined might be higher profit (if the combination is
executed properly).

I don't see them directly telling nVidia to screw off, but if they use
this new combined AMD-ATI to try and gain a competitive advantage in
the PC graphics market, then they are basically telling nVidia to
screw off. If they aren't doing this to gain a competitive advantage
then what was the point in the first place?

Things are the other way around. It's nVidia which has little choice but to
work with them. As long as AMD plays it nice.

AMD has some cross-licensing agreements, but none of them are likely
to cover video chipsets.


That's not a problem for AMD. ATI has enough IP in that area. AMD brings IP
cross-licensing in areas outside core GPU design, and AMD has a rich patent
portfolio (they used to generate more patents than Intel some years -- both
companies are comparable in that area).
Besides, it's not a matter of being legally
allowed to sell the chips that is the worry, it's about getting
pre-release info and help. Right now ATI and nVidia both get access
to Intel's chipsets LONG before they are released so that they can
develop video cards that will work with these chipsets. If Intel
stops providing these early chipsets and support in getting the cards
to work with them, ATI could be left at a serious disadvantage to
nVidia.

First, if ATI cards are better than nVidia's, then Intel would be shooting
itself in the foot by making problems. Second, Intel must be careful now
with the current anti-competitive case brought by AMD against them.

AMD is explicitly forbidden from using Intel's processor bus
technologies. This is part of a long-standing agreement dating back
to the early 90's in an effort to prevent AMD from selling processors
that will work in the same motherboards as Intel chips.

Is this deal still effective? AMD has since signed other agreements with Intel.
It isn't much
of a stretch at all to think this could apply to chipsets as well as
processors.

Right now ATI has a license that grants them the right to build
chipsets for Intel processors, but that license is definitely going to
be full of limitations. Intel HAS pulled companies licenses in the
past. Serverworks is a prime example here, Intel all but terminated
their license after they were bought out by Broadcom. I *FULLY*
expect to see history repeat itself here, and like it or not, Intel
has every legal right to do so.

First, I doubt their legal right to do so. It's just an interface, and AMD
has rights to use the patents covering it (via the cross-licensing
agreement). It might be a hard fight for Intel.

Second, the situation is different now that AMD has stronger relationships
with first-tier system producers -- pissing off major customers to fight off
some chipset competition (an area of tiny margins) is plain stupid, even if
you're Intel.

Independant physics co-processors are a lost cause. If the technology
proves useful (somewhat questionable) then they'll get integrated into
a CPU. Having separate chips to handle these sorts of math things has
proven to be a bad idea.

They'd sooner get integrated into a GPU. The similarity is much stronger there.

Video, on the other hand, is a different story. Integrating video
onto the CPU has proven to be excruciatingly difficult.

First of all, those trying the integration either had poor GPU experience
(Intel, Cyrix), poor CPU experience, or both. That's the first and most
important problem.

The problems
are two-fold: first, GPUs have a LOT of transistors, even more than
CPUs.

CPU vendors have enough transistors to put 4 CPU cores with cache and stuff
onto a single die.
Second, GPUs need HUGE amounts of memory bandwidth, while CPUs
need LOTS of memory and flexible memory configurations. While a GPU
can get by very nicely with 512MB of memory soldered onto a board,
that just isn't an option for a CPU. A server might need 64GB of
memory, while a desktop might only need 1GB. Both scenarios would
benefit little from the huge bandwidth offered but would suffer very
badly from the increased cost of the faster memory.

There are solutions to those problems as well.

First -- put the CPU onto a PCB module (as in the good old PII and early
PIII & Athlon times) and put RAM there, treating it as a graphics buffer and
L3/L4 cache. Such local RAM is way faster than pluggable motherboard RAM.

Second -- for stuff in the middle of the market (where, incidentally, the
vast majority of revenue lies), just use motherboard RAM.
It's a business that has huge R&D and capital (if you own fabs) costs,
combined with low profit margins. Not an easy business to succeed in.
The end result is that we're left with little more than companies that
focus ONLY on building CPUs. Look at companies like Hitachi, TI,
Motorola, Digital/Compaq/HP, etc. etc. All used to build high-end
CPUs but they either got out of the market or spun that division off
on its own. Sun and Fujitsu are still struggling at it, but mostly
failing. The only exception to the above is IBM, who are the
exception to most rules in the computer world.

The explanation is quite simple. Only AMD & Intel produce high-end CPUs
which have a big enough market to keep them going. Between them they now
have 99% of the desktop market and the vast majority of the server market.
All the rest, like DEC/Compaq/HP, Motorola, Hitachi, etc., never even
approached such a position.

That requires marketing much more than any technology, and marketing
is an area that neither AMD nor ATI is hugely strong at. Both are
out-marketed by their main rivals (Intel and nVidia respectively).

Well, when nVidia had a just slightly inferior product line, they
immediately lost the market-leader position to ATI, despite their marketing
push. Now ATI, with AMD's aid, stands a chance of gaining that lead again.

Commoditized for even less profit?

They have no choice but to adapt to changing conditions. If the PC market
commoditises, they must be prepared or die.

Embedded graphics performance is 99% memory bandwidth, 1% everything
else.

Well, nVidia's embedded solutions for K7 were rather good and they used
standard DDR SDRAM DIMM modules.
Unless you plan on dropping memory on the motherboard to
connect to your video card, you're actually better off with the
external memory controller as Intel does things.

Why? Putting a second memory channel on a coherent-HT-connected GPU might
allow really good performance (for embedded stuff, that is). And HT 2.0 is
going to be somewhat faster than the current one.
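For scale, here's a rough sketch (in Python, with my own assumed link
widths and clocks, not figures from the thread) of what a 16-bit
HyperTransport link delivers:

# Rough peak bandwidth of a 16-bit HyperTransport link (assumed figures).
# HT is double-pumped, so transfers/s = 2 * link clock.

def ht_gbs_per_direction(link_clock_mhz: float, width_bits: int = 16) -> float:
    """Peak GB/s in one direction of a double-pumped HT link."""
    return (width_bits / 8) * (2 * link_clock_mhz) * 1e6 / 1e9

ht1x = ht_gbs_per_direction(800)    # ~3.2 GB/s each way at an 800 MHz link clock
ht20 = ht_gbs_per_direction(1400)   # ~5.6 GB/s each way at a 1.4 GHz link clock

print(f"HT 1.x: ~{ht1x:.1f} GB/s per direction ({2 * ht1x:.1f} GB/s aggregate)")
print(f"HT 2.0: ~{ht20:.1f} GB/s per direction ({2 * ht20:.1f} GB/s aggregate)")

That's in the same ballpark as a single DDR2-800 channel (~6.4 GB/s), which
is why a GPU hanging off a coherent HT link would still want its own local
memory channel rather than pulling every texture across the link.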

Ohh, and ATI already
tried an AMD Hypertransport compatible chipset with memory on the
system board... it was a miserable failure due to costs (motherboards
are the one part of the PC where profit margins are worse than CPUs
and GPUs).

Just use standard DRAM modules put into mobo slots.

I'll grant this as the one real advantage of the whole deal, combined
with AMD having their own fabs.




Honestly, I think that this "advantage" doesn't really exist. When
the Athlon64 was released there were plenty of chipsets immediately
available.

As AMD delayed their product relative to the planned release, chipset
vendors had enough time.
VIA was still screwing the pooch as usual, but nVidia was
there right from the get-go. AMD has done their own chipsets in the
past and they've proven to be inferior to nVidia's solutions for the
most part. This goes to show that just because AMD and ATI would be
one company, they aren't necessarily going to be any better at making
chipsets.

ATI can make chipsets (they do it for Intel systems now).

Besides, with AMD's current processors the real magic in chipsets is
in the "extra" stuff, ie PCI-Express, network chips, SATA, audio, etc.
The memory hangs off the processor so it's out of the equation and the
processor connects to the rest of the system by Hypertransport, which
is an open standard and relatively easy to implement. What this means
is that having knowledge of the CPU doesn't really buy you much of
anything when building the chipset and vice versa.

Exactly. And ATI more or less knows all that other needed stuff.


rgds
 
Keith

Tony Hill wrote:



That's not a problem for AMD. ATI has enough IP in that area. AMD brings IP
cross-licensing in areas outside core GPU design, and AMD has a rich patent
portfolio (they used to generate more patents than Intel some years -- both
companies are comparable in that area).

More in a year than Intel? I did a quick search:

Company    Total Issued    2006 YTD Issued
ATI                 677                 64
AMD                8713                341
INTC              12781               1543
 
YKhan

The funny thing about standards is that they aren't. Especially when
Intel is involved. You can't just take a PCI-E card from one system
and expect that it will work, without fail, in another system. You
need to write new drivers and do extensive testing to make sure that
the thing will work *properly*, and that's where ATI's video chipset
business will suffer. If nVidia gets a 4-6 month lead on testing
their video cards with Intel's latest and greatest chipsets it will
pretty much kill ATI's chances of competing at the high-end.

If Intel did that, then it's simply a matter of ATI showing a killer
new video card going like stink on an AMD system, but loping around on
the Intel competition. The gamers will assume it's because Intel didn't
give ATI access to their systems, and end up going with an AMD system,
just to make sure they get their full performance whether they go with
ATI or Nvidia graphics. It's not in Intel's interest to let this
happen.

So I doubt that Intel would do something like this; there is simply not
enough worthwhile intellectual property in an I/O chipset worth protecting
to get yourself into such a competitive disadvantage.

Yousuf Khan
 
YKhan

More in a year than Intel? I did a quick search:

Company    Total Issued    2006 YTD Issued
ATI                 677                 64
AMD                8713                341
INTC              12781               1543

Not this year, but there were a few years in a row back in the early
2000's and late 1990's, when AMD did indeed generate more IP than
Intel. It was reported here too back then, if anybody can find the old
articles.

Yousuf Khan
 
George Macdonald

Prop up one low-profit business with a second low-profit business?


I don't see them directly telling nVidia to screw off, but if they use
this new combined AMD-ATI to try and gain a competitive advantage in
the PC graphics market, then they are basically telling nVidia to
screw off. If they aren't doing this to gain a competitive advantage
then what was the point in the first place?

I can't help thinking that AMD has been told, by the likes of Mikey himself,
that to play in the mass market, i.e. have Dell et al. as their customers,
they need the credibility of an in-house chipset & integrated video...
something like Intel's low-end and G-series chipset stuff. Just look at
the Dell AMD systems for one reason why: the nForce NI does not work(!!!), so
we have a Broadcom NI, i.e. wasted $$, and I have to assume that Dell spent
quite a bit of time qualifying HDDs which work with nForce SATA II. The
nForce 410/430 61xx also has issues with some DVI and DVI-HDMI devices.
AMD has some cross-licensing agreements, but none of them are likely
to cover video chipsets. Besides, it's not a matter of being legally
allowed to sell the chips that is the worry, it's about getting
pre-release info and help. Right now ATI and nVidia both get access
to Intel's chipsets LONG before they are released so that they can
develop video cards that will work with these chipsets. If Intel
stops providing these early chipsets and support in getting the cards
to work with them, ATI could be left at a serious disadvantage to
nVidia.

Conversely, with AMD systems, nVidia doesn't need intimate knowledge of
proprietary bus design elements - sure they have to be able to do drivers
which program the North Bridge components in the AMD CPU but even that's
gotten simpler with PCI-e and the disappearance of GART from the NB; the
rest is fairly well cast and is not going to change much with new CPUs.
AMD is explicitly forbidden from using Intel's processor bus
technologies. This is part of a long-standing agreement dating back
to the early 90's in an effort to prevent AMD from selling processors
that will work in the same motherboards as Intel chips. It isn't much
of a stretch at all to think this could apply to chipsets as well as
processors.

Any previous patent agreements were superseded by the 2001 cross-license:
http://contracts.corporate.findlaw.com/agreements/amd/intel.license.2001.01.01.html
- with all the obliteration it's difficult to say what's in the current
arrangements, but I think it's safe to assume they do not include the
Intel FSB.
Right now ATI has a license that grants them the right to build
chipsets for Intel processors, but that license is definitely going to
be full of limitations. Intel HAS pulled companies' licenses in the
past. ServerWorks is a prime example: Intel all but terminated
their license after they were bought out by Broadcom. I *FULLY*
expect to see history repeat itself here, and like it or not, Intel
has every legal right to do so.

Sure, any rights which ATi acquired would not be assignable in a case such
as this merger; OTOH, ATi was only filling in the bottom end of the
Intel integrated graphics chipset market because Intel found itself with
insufficient low-end chipset production. I'm not sure that Intel could
find anyone else to trust for such a collaboration... not nVidia, IMO.
Independant physics co-processors are a lost cause. If the technology
proves useful (somewhat questionable) then they'll get integrated into
a CPU. Having separate chips to handle these sorts of math things has
proven to be a bad idea.

And yet, that's precisely what IBM is talking about right now with Power 6:
http://www.edn.com/article/CA6379673.html - a decimal floating point
processor!! Who woulda thunk it?

Embedded graphics performance is 99% memory bandwidth, 1% everything
else. Unless you plan on dropping memory on the motherboard to
connect to your video card, you're actually better off with the
external memory controller as Intel does things. Ohh, and ATI already
tried an AMD Hypertransport compatible chipset with memory on the
system board... it was a miserable failure due to costs (motherboards
are the one part of the PC where profit margins are worse than CPUs
and GPUs).

Hmmm, I don't recall ATi doing an AMD64 system with memory off a chipset
attached through HT - did it not get out the door? ISTR ALi or SiS talking
about such a thing but didn't pay much attention since it was obviously a
non-starter.
I'll grant this as the one real advantage of the whole deal, combined
with AMD having their own fabs.


Honestly I think that this "advantage" is doesn't really exist. When
the Athlon64 was released there were plenty of chipsets immediately
available. VIA was still screwing the pooch as usual, but nVidia was
there right from the get-go. AMD has done their own chipsets in the
past and they've proven to be inferior to nVidia's solutions for the
most part. This goes to show that just because AMD and ATI would be
one company, they aren't necessarily going to be any better at making
chipsets.

As I recall, nVidia was a bit of a laggard on Athlon64 - the nForce3 150,
which nobody wanted, then nForce3 which was tolerable with some flaws, and
nForce4 which has more flaws again. I have to say that for a non-esoteric
business-class system, I've had more trouble with nVidia chipsets than with
any VIA-based system.
Besides, with AMD's current processors the real magic in chipsets is
in the "extra" stuff, ie PCI-Express, network chips, SATA, audio, etc.
The memory hangs off the processor so it's out of the equation and the
processor connects to the rest of the system by Hypertransport, which
is an open standard and relatively easy to implement. What this means
is that having knowledge of the CPU doesn't really buy you much of
anything when building the chiset and vice versa.

But it's the "extra stuff" where nVidia is falling down: while ATi is no
better with "support" for their Express and Crossfire chipsets, they do not
want to hear from users, so hardware flaws never get properly reported and
never get fixed. It's >2 years now and they can't make their network
interface work right - the offloading is utterly F/U; nForce3 has been
abandoned as far as driver updates are concerned... absolutely nothing
since 3Q04! Now nForce4 is to be abandoned with nForce5 coming out and the
network interface is still F/U. Driver updates are a joke - they just
don't work and often screw the system up instead of fixing it.

Then there's SATA II which has been a fiasco; Intel has taken some stick
too but the number of HDDs which just don't work with nForce4 is a disgrace
and apparently nForce5 is no better. If I'm ranting a bit here it's
because I just pulled an all-nighter, and the next day, getting a nForce4
system to quit falling over. I'm pissed with nVidia!
 
