My lack of enthusiasm for AMD


Yousuf Khan

Robert said:
By all appearances, Intel's operations are driven by margin. The only
sound way to decide between an MCM and a single chip quad-core design
that I know of is the difference in margin. If you know of arguments
to the contrary, I'd love to hear them, if you can say.

The decision to go with a dual-die design over a single-die one is
driven by development cost, not so much by production cost. The
argument is made that a single-die quad-core is a larger die than a
dual-core die, so you get fewer potential defects and higher yields
with dual-cores, but I think it's a specious argument. All of the
single-die quad-cores would be made on separate wafers from the
single-die dual-cores, so the quad-core wafers would be optimized for
yield differently than the dual-core wafers.
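For reference, the textbook version of the yield argument goes like
this, under a simple Poisson defect model; the model choice and all
the numbers below are my own illustration, not anything Intel has
published:

import math

# Simple Poisson defect model: P(die good) = exp(-defect_density * area).
# Defect density and die areas are made-up, illustrative numbers.
D = 0.5                          # defects per cm^2
A_dual = 1.4                     # cm^2, dual-core die
A_quad = 2.8                     # cm^2, single-die quad-core, twice the area

y_dual = math.exp(-D * A_dual)   # ~50% of dual-core dice come out good
y_quad = math.exp(-D * A_quad)   # ~25% of the big quad-core dice do

# A dual-die quad-core pairs two dice that were already tested good,
# so its effective yield per package tracks y_dual, not y_dual squared.
print(f"dual-core die yield:   {y_dual:.0%}")
print(f"single-die quad yield: {y_quad:.0%}")

That's the arithmetic people cite; my point above is that it ignores
the separate per-wafer yield optimization.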

But the real reason for going with a dual-die quad-core (i.e. two
dual-cores) is time-to-market. They can simply boast having quad-cores
first. However, I don't think the results will stand up to the test of
time. AMD is already talking about 40% better performance for its
single-die quad-core over Intel's dual-die quad-core.
I'll agree that the memory controller on the die is a balance of
technical and business considerations. One question is: do the watts
you free up by moving the memory controller off the die compensate for
the added latency you incur in doing so?

If anything, you save watts by putting the memory controller within
the CPU die. The CPU is usually built on a smaller, more
power-efficient manufacturing process than a chipset. So not only do
you get lower latency, you also get lower power consumption. Win-win
in every case.
On the other hand, there are powerful business reasons for keeping
the CPU and memory controller separate, one of them being that Intel
has said it really doesn't want to be in the memory controller
business, another being that memory controllers can be engineered,
changed, and specialized separately from the CPU. I've always


How can Intel claim that it doesn't want to be in the memory controller
business, when the main function of its chipsets is to be a memory
controller?

Yousuf Khan
 

Yousuf Khan

You assume that market share numbers reflect acceptance of current
designs. That's like assuming that the current through a capacitor or
inductor is proportional to the voltage across it. The article cites
*design wins*, not sales to Best Buy and CompUSA.

Well, it's hard to create reports based on *future* market share
numbers. As for citing *design wins*, the article doesn't even put
numbers on those; it's just a feeling some analyst has.

Besides, Intel has been selling its AMD-killer products in quantity
for well over six months already. The only thing that's happened so
far is that AMD has managed to post some of its biggest market share
numbers *ever*. And it did that while still selling a product mix that
was mostly 90nm. AMD is now fully on 65nm, so it's going to see big
manufacturing cost savings from here on. And its next-generation
products are just a few months away, so the performance advantage
Intel had will soon be erased.

Who knows when Intel threw in the towel on Pentium 4, or knew that it
would have to. Your assumption is that Intel decided, say around 2003
Fall IDF, that P4 was a lame duck. Many of us might have agreed, but
there is just no way to know what Intel thought or planned
internally. I sure wish I knew, but I don't.

Intel had no choice but to sell a Pentium 4. P4 was the only product
that had the 64-bit extensions already grafted on by that point. They
couldn't yet implement them on the Pentium M/Core 1 products, as those
products were so heavily optimized for low power and mobility that the
extensions would probably have added power consumption to their
delicately balanced architectures. It basically took Intel 3 years to
add x64 to the Pentium M, and the result was eventually called Core 2.
Alpha had the controller on the die. Somewhere I have an email from
an alpha advocate about alpha's low latency... something like 75 ns,
if I recall correctly. That email is at least six years old. The
downside of NetBurst's low IPC was known, as it is known that making
frequency the overriding consideration was a marketing ploy. Intel
could have changed the x86 architecture at any time and knew the
advantages, but didn't, because it wanted Itanium to take over the
64-bit market. Itanium, not x86, was Intel's next-gen processor. If
you want to argue that Itanium was a big mistake, you'd find many,
even in the corner offices of Intel, who'd agree with you.

I don't think anybody is arguing about Itanium; it's now universally
considered a big mistake. Coverage of it has dwindled to nothing these
days.

Yousuf Khan
 

Yousuf Khan

In that sense, yes. What position will AMD find itself in (and,
indeed, is in already)? Pretty much the same position it found itself
in pre-Opteron: ruinous price competition with Intel. Whether the
analysts quoted in the article got it right or not, they agree with my
perception of the playing field: if AMD can't pull another ace out of
its sleeve, Intel will simply grind it into the ground. Nobody has
"permanent" market share, but companies and products have positions
that are hard to attack. AMD had such a position. The temporary (and
significant) advantage that AMD got from its Itanium-killer strategy
has evaporated, thus putting the game back on rules by which Intel
will win.

AMD's strategy was never about being an "Itanium-killer". Itanium pretty
much killed itself. AMD's strategy was simply "the next generation of x86".

As for AMD having to compete in a ruinous price competition against
Intel, yes, it's having to do that right now. But that's simply because
Intel got its next-generation products out the door just a little ahead
of AMD. It took Intel at least 3 years to come up with a design, Core
2, that could compete against K8, and it caught AMD just before its own
next-generation products came out. It takes 3 years to design a new
architecture, so a gap of only a few months between the competitors'
introductions of new products is not a big deal. What was really
ruinous for Intel was having to wait those 3 years for Core 2 to
finally be ready. Intel had to stop major development on Netburst
products and start from scratch. AMD is not going to have to wait that
long for its own answer to Core 2 to be ready.
Maybe one of these days we'll know the real story: how much of AMD's
market share was Intel being more cautious about sales tactics and how
much was because AMD simply had the better product. I tend to believe
that it is the latter.

The answer is already pretty evident. K8 came out in 2003 and got very
little attention for at least a year. Then in March 2005, AMD launched
its anti-trust lawsuit against Intel (after AMD got a favorable ruling
from the Japanese anti-trust authorities), and within one quarter its
market share was going up, starting a string of consecutive up
quarters that has continued up to and including this past quarter.

Basically, the major technological improvements were all introduced in
2003, but the market share only started going up after the lawsuit was
filed. The lawsuit evidently looked airtight to AMD and to Intel's OEM
customers, so the OEMs no longer feared Intel, while Intel feared
getting caught. The lawsuit opened the doorway, not the technology.
Now that Intel's monetary threats are neutralized, it just competes on
technological merits.
As to Intel marketing, I continue to be impressed by its
aggressiveness. Core 2 Duo is everywhere you look right now. AMD
just doesn't have the resources.

Intel has always advertised, and it has always advertised more than
AMD. But now it can't use its advertising money as a threat to keep
OEMs from using AMD.

Intel's ads didn't seem to prevent AMD from taking over 50% of the
retail desktop and laptop markets this past year. People aren't
impressed by the Intel ads anymore; they just buy on price nowadays.
For the IBM'ers here who don't think that marketing and perception are
important, consider IBM falling all over itself to rush out news of
its own high-K dielectric on exactly the same day as Intel. Marketing
and perception are important, independent of technical details. That
is to say: name brands *do* matter. Every computer I've seen
advertised recently is very explicit about what kind of processor is
inside.

I think that IBM announcement had more to do with AMD than with IBM
itself. Normally, IBM couldn't care less what Intel is announcing,
since its chips don't usually compete against Intel all that directly.
I think the frantic scramble to announce as soon as possible after
Intel did was because IBM now has technology partners that do directly
compete against Intel. Intel decided to announce on Friday night. In
the past, I'm sure IBM would've been content to wait till Monday
morning (or even afternoon) to put out its own press release -- if it
were doing this on its own. But it got its release out later that same
Friday night, soon enough that by Monday morning the headlines that
would otherwise have talked only about Intel all had to be corrected to
mention IBM's breakthrough as well. AMD got mentioned prominently
alongside IBM -- thus my feeling is that the mad scramble by IBM was
all AMD's making.

Yousuf Khan
 

Yousuf Khan

chrisv said:
Yousuf Khan wrote:
Eh? The "Intel Inside" marketing campaign is somehow
anti-competitive? Seems like quite fair brand marketing, to me...

That's the public face of "Intel Inside". In public, all we ever see is
the tv ads and the jingle. The private face of it was all about threats
and coercion.

Whatever the legalities are, I'll still take exception to your
description of a marketing campaign as "threats and coercion".

The allegations are that Intel used the "Inside" campaign as an Al
Capone-style baseball bat against OEMs to make sure they didn't use
products from its competitors.

Are you saying that this alleged market-share decline coincided with
the cessation of the "Intel Inside" marketing campaign? Or that it
coincided with the instigation of the anti-trust lawsuits? Because I
believe that to be entirely false, in either case. Has not AMD been
around 20% for some years, now?

Yup, that's exactly what I'm saying.

AMD was at or over 20% market share briefly back in 2000, and then it
went down to 15% for several years. It didn't get back up over 20% until
after it started its lawsuit. Not even the introduction of K8 in 2003
moved its market share much, despite its obvious technical superiority
to anything else at that time. The market share moved only after the
March 2005 lawsuit was filed. Intel had to hide the baseball bat. All
kinds of design wins followed for AMD at that point.

"Intel Inside" was still successful as an advertising campaign, so why
replace it? Because there was likely too many skeletons in that closet.
When they started "Leap Ahead", they could likely say that they no
longer do what they used to do with "Inside".

Yousuf Khan
 

Robert Myers

The decision to go with a dual-die design over a single-die one is
driven by development cost, not so much by production cost. The
argument is made that a single-die quad-core is a larger die than a
dual-core die, so you get fewer potential defects and higher yields
with dual-cores, but I think it's a specious argument. All of the
single-die quad-cores would be made on separate wafers from the
single-die dual-cores, so the quad-core wafers would be optimized for
yield differently than the dual-core wafers.

But the real reason for going with a dual-die quad-core (i.e. two
dual-cores) is time-to-market. They can simply boast having quad-cores
first. However, I don't think the results will stand up to the test of
time. AMD is already talking about 40% better performance for its
single-die quad-core over Intel's dual-die quad-core.
This issue has been discussed on comp.arch. I tend to believe the
answers that I got there. Feel free to get your information wherever
you want.
If anything, you save watts by putting the memory controller within
the CPU die. The CPU is usually built on a smaller, more
power-efficient manufacturing process than a chipset. So not only do
you get lower latency, you also get lower power consumption. Win-win
in every case.
The transistors committed to the memory controller have to be moved
to the die. That raises the watts on the die. Through reduced latency,
you could perhaps save on prefetch logic, but that's the only way I
know to balance the costs. Perhaps there are other savings I'm not
aware of. I'm always happy to be educated.
How can Intel claim that it doesn't want to be in the memory controller
business, when the main function of its chipsets is to be a memory
controller?
We've discussed this here. Intel *has* said contradictory things on
the subject.

Robert.
 

Robert Myers

I don't think anybody is arguing about Itanium; it's now universally
considered a big mistake. Coverage of it has dwindled to nothing these
days.

Where you look, perhaps.

Robert.
 
F

Felger Carbon

Robert Myers said:
We've discussed this here. Intel *has* said contradictory things on
the subject.

And done contradictory things. Remember Timna, with its on-die memory
controller?
 

Robert Myers

And done contradictory things. Remember Timna, with its on-die memory
controller?

Although it was never more than press release ware.

http://www.pcworld.com/article/id,18726-page,1/article.html

<quote>

Today, discrete processors, chip sets, and motherboards are costing
less, which makes them more attractive to vendors than an integrated
product, he says.

Plus, vendors have been saying they prefer the design flexibility of
individual parts, he says. Low-cost parts such as Intel's Celeron
processors and 810e chip sets give them more options, he says.

</quote>

What a shock. Intel's decisions are driven by non-technical
considerations.

And what rock have you been hiding under, Felger?

Robert.
 
G

gaffo

Yousuf said:
AMD's strategy was never about being an "Itanium-killer".


Yes it was!!

That is why they promoted the 64-bit extensions, while Intel publicly
stated it had no intention to offer 64-bit extensions initially, and
only reluctantly offered its own version two years later when Itanic
did not miraculously replace all the x86 desktop boxes all over the
world.

By 2007 (now) Itanic was supposed to be as fully entrenched as x86 is,
and in fact its total replacement!! Remember, the Merced project had
been around since 1993 or so.

conjecture below:

I suspect that is why the P-II/P-III was the last decent x86 offered
by Intel until the Core (which is essentially a P-III), and why the
P-IV had such a shitty x87 FPU. It was to be a "stopgap" chip until
Itanic could replace it. (This is all just conjecture.)




Itanium pretty much killed itself.

Well, yes, but not really relevant, since the Athlon 64 was indeed
built to offer an alternative to the Itanic and so, in effect, compete
against it.

In general terms, AMD offered to prevent x86 extinction by providing
a good chip that ran 64-bit software for servers.

Intel wanted the opposite, since any extension to x86 would cut into
its plans for Itanic's total takeover of all PCs.

Itanic ended up being so doggy and impractical for most uses (and had
no software)... that it was non-viable as an x86 replacement for
general use.


MS had similar plans to replace DOS-based Windows 95/98 with NT,
which 2000/XP/Vista are. So MS did it; Intel did not.




AMD's strategy was simply "the next generation of x86".

No -- much more than that. The original Athlon, which came out in '99,
sure -- back then the P-3 was the competitor. But with the Athlon 64
the plan was to prevent x86 extinction by opening up a future for x86
in the server space immediately and on the desktop in years to come.

Intel wanted desperately to KILL OFF x86 and have Itanic take its
place. Core is the first REAL attempt by Intel to offer a quality x86
chip. Looks like they finally accept that Itanic was a money pit and a
failure. I expect that chip to utterly disappear in a couple more
years.







As for AMD having to compete in a ruinous price competition against
Intel, yes, it's having to do that right now. But that's simply
because Intel got its next-generation products out the door just a
little ahead of AMD. It took Intel at least 3 years to come up with a
design, Core 2, that could compete against K8, and it caught AMD just
before its own next-generation products came out. It takes 3 years to
design a new architecture, so a gap of only a few months between the
competitors' introductions of new products is not a big deal.




One full year ahead.



What was really ruinous for Intel was having to wait those 3 years
for Core 2 to finally be ready. Intel had to stop major development
on Netburst products and start from scratch.



Core is not a from-the-ground-up design. Intel did not work on this
one for 3 years; they wasted 2 years in denial over Itanic. The Core
is basically an improved P-3/P-M (mobile). They lucked out in that the
Israeli team was working on mobile Pentium-3 designs.

Had they not had that base to work from, Core would not be out yet
(IMO).



AMD is not going to have to wait that long for its own answer to
Core 2 to be ready.




The AMD chip is only a tweaked Athlon; it will maybe be on par with
Core.

All this talk about a better FPU is silly (some irrelevant SSE-3/4
crap, I'm sure); the Athlon already has a powerful classic FPU. It
needs work on its integer speed. Maybe a better branch predictor?







The answer is already pretty evident. K8 came out in 2003 and got
very little attention for at least a year. Then in March 2005, AMD
launched its anti-trust lawsuit against Intel (after AMD got a
favorable ruling from the Japanese anti-trust authorities), and
within one quarter its market share was going up, starting a string
of consecutive up quarters that has continued up to and including
this past quarter.

Basically, the major technological improvements were all introduced
in 2003, but the market share only started going up after the lawsuit
was filed. The lawsuit evidently looked airtight to AMD and to
Intel's OEM customers, so the OEMs no longer feared Intel, while
Intel feared getting caught. The lawsuit opened the doorway, not the
technology. Now that Intel's monetary threats are neutralized, it
just competes on technological merits.


Intel has always advertised, and it has always advertised more than
AMD. But now it can't use its advertising money as a threat to keep
OEMs from using AMD.

Intel's ads didn't seem to prevent AMD from taking over 50% of the
retail desktop and laptop markets this past year. People aren't
impressed by the Intel ads anymore; they just buy on price nowadays.


I think that IBM announcement had more to do with AMD than with IBM
itself. Normally, IBM couldn't care less what Intel is announcing,
since its chips don't usually compete against Intel all that
directly. I think the frantic scramble to announce as soon as
possible after Intel did was because IBM now has technology partners
that do directly compete against Intel.




yep, my thoughts too.



Intel decided to announce on Friday night. In the past, I'm sure IBM
would've been content to wait till Monday morning (or even afternoon)
to put out its own press release -- if it were doing this on its own.
But it got its release out later that same Friday night, soon enough
that by Monday morning the headlines that would otherwise have talked
only about Intel all had to be corrected to mention IBM's
breakthrough as well. AMD got mentioned prominently alongside IBM --
thus my feeling is that the mad scramble by IBM was all AMD's making.

Yousuf Khan



 

gaffo

Robert said:
You assume that market share numbers reflect acceptance of current
designs. That's like assuming that the current through a capacitor or
inductor is proportional to the voltage across it. The article cites
*design wins*, not sales to Best Buy and CompUSA.

Who knows when Intel threw in the towel on Pentium 4, or knew that it
would have to. Your assumption is that Intel decided, say around 2003
Fall IDF, that P4 was a lame duck.



I think 2004 or 2005 even.


Many of us might have agreed, but
there is just no way to know what Intel thought or planned
internally. I sure wish I knew, but I don't.

Alpha had the controller on the die. Somewhere I have an email from
an alpha advocate about alpha's low latency... something like 75 ns,
if I recall correctly. That email is at least six years old. The
downside of NetBurst's low IPC was known, as it is known that making
frequency the overriding consideration was a marketing ploy. Intel
could have changed the x86 architecture at any time and knew the
advantages, but didn't, because it wanted Itanium to take over the
64-bit market.

Exactly!!



Itanium, not x86, was Intel's next-gen processor.



yep. Merced. billions and billions down the drain.


If you want to argue that Itanium was a big mistake, you'd find many,
even in the corner offices of Intel, who'd agree with you.

Robert.



 

chrisv

gaffo said:
Core is not a from-the-ground-up design. Intel did not work on this
one for 3 years; they wasted 2 years in denial over Itanic. The Core
is basically an improved P-3/P-M (mobile). They lucked out in that the
Israeli team was working on mobile Pentium-3 designs.

Had they not had that base to work from, Core would not be out yet
(IMO).

Agree with pretty much everything you said, except I'd say that "luck"
has nothing to do with any of this. I think even the most dull-witted
Intel manager realized that the Netburst design wasn't right for
mobile applications. Thus the more-efficient Pentium M was
commissioned.
 

gaffo

chrisv said:
Agree with pretty much everything you said, except I'd say that "luck"
has nothing to do with any of this. I think even the most dull-witted
Intel manager realized that the Netburst design wasn't right for
mobile applications. Thus the more-efficient Pentium M was
commissioned.




I agree; from what little I know, the P-4 is a power-drain chip. So
the Israeli team stayed with the P-3 and worked with it.

It was luck that the P-4 sucked on laptops and that there was the P3/M
to fall back on, letting Intel offer a decent x86 for use from server
through laptop.

IMO, the koolaid over Itanic flowed so freely that Intel indeed
"lucked out" in having the P-M to fall back on, saving themselves two
years in making the Core chip when reality showed itself.




 

gaffo

chrisv wrote:

Has not AMD been
around 20% for some years, now?

20 percent +/- 5 for 15 yrs.


At AMD's low point, 1995/96 (K5, before they bought NexGen's Nx686
chips/company), they had 12 percent??

Cyrix (5x86/M1) had maybe 5 percent??

Winchip (GO WINCHIP!! WinChip 1 = dog; WinChip 2 not dog) maybe 1
percent. (Still got mine (the Win-2) -- somewhere). IDT (Integrated
Device Technology) bought them out around 1997?/98?

... we also had Rise (forgot the chip name -- actually a good chip,
equal to Intel's offering at the time). (WTF happened to Rise (the
company) anyway??)

Winchip had a shitload more sales than Rise, and that is not saying
much. Rise was the ultimate "niche player".

Kinda sad, since all the alternatives at the time offered
equal-quality alternatives to Intel/AMD and are now gone ;-/.


STMicroelectronics was to make an x86 chip but cancelled at the last
minute.


and a couple yrs later we had Transmeta.

...

At the time AMD really offered only the doggy K5. I remember. The
Cyrix, Intel, and NexGen 5x86-class chips all ran circles around the
AMD K5.

There was a reason AMD (which had fabs) had a market share only a tad
above Cyrix (a fabless company).

Cyrix in general offered a better product for several years, from the
486 through the K5/5x86 era.

AMD pulled ahead when they bought NexGen, took the Nx686, and renamed
it the K6.

Even then, the WinChip 2 was the K6's equal in all respects and cost
half the price!!... and was backward compatible with old mobos (it
took an old motherboard and a higher voltage).

...

Now the K7/K8/etc. is tops, but that is thanks to the Alpha guy
(forget the name?) whom AMD hired.

IMO silicon is topped out. We will no longer see speed increases of
factors of 2 or 3 as we used to.

Core/K8 is as good as it gets. Make it smaller... and raise the
clock... even that will hit a limit... then no more speed increases
worth mentioning.

 

Robert Redelmeier

Robert Myers said:
<quote> Plus, vendors have been saying they prefer the design
flexibility of individual parts, he says. Low-cost parts
such as Intel's Celeron processors and 810e chip sets give
them more options, he says. </quote>
What a shock. Intel's decisions are driven by non-technical
considerations.

There's also the small matter of yield: a process that yields
80% of 150 mm2 dice will only yield ~41% of 600 mm2 dice. Only
if you start with a process capable of 95% yield at 150mm2
could you hope for 80% at 600.
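The arithmetic follows from the usual exponential defect model, where
per-die yield falls off exponentially with area, so quadrupling the
area raises the yield to the fourth power. A quick check (the model
is the standard textbook one, nothing vendor-specific):

# yield(A) = exp(-D * A)  =>  yield(4 * A) = yield(A) ** 4
print(f"{0.80 ** (600 / 150):.0%}")   # 80% at 150 mm2 -> ~41% at 600 mm2
print(f"{0.95 ** (600 / 150):.0%}")   # 95% at 150 mm2 -> ~81% at 600 mm2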

Hmm ... I wonder if inflating yield isn't the true reason for
ECC on cache. It hides both soft errors and hard defects.

-- Robert
 

Tony Hill

And done contradictory things. Remember Timna, with its on-die memory
controller?

If anything, I think Timna is part of the explanation for WHY Intel
has chosen not to integrate a memory controller. The main reason Timna
was canned was the integrated memory controller. The chip was supposed
to be a low-cost solution, but they integrated an RDRAM memory
controller, and RDRAM memory was just WAY too expensive to cut it in
the low-cost market. A wrong choice on the memory controller meant
that the chip was going to be a complete failure in its target market.

While it might seem logical that Intel could have just redesigned only
the memory controller portion of the chip to use DDR SDRAM instead, I
think Intel took this as a sign that integrated memory controller =
inflexible and bad (and perhaps in the case of Timna they were
right?).

In any case, I still think that at some point Intel is going to bring
their memory controller on-die. Their common Xeon/Itanium bus for next
year (or whenever it actually arrives.. if it arrives) will pretty
much require moving at least SOME of the memory controller on-die,
though they might still have a sort of generic memory bus with
interchangeable technology-specific controllers hanging off it.
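In software terms, that decoupling would look something like the
sketch below (purely hypothetical; the class names and interfaces are
invented for illustration, not anything Intel has described):

from abc import ABC, abstractmethod

class MemController(ABC):
    """Technology-specific controller hanging off a generic bus."""
    @abstractmethod
    def read(self, addr: int) -> int: ...

class DDRController(MemController):
    def read(self, addr: int) -> int:
        return 0  # stand-in for a DDR SDRAM transaction

class RDRAMController(MemController):
    def read(self, addr: int) -> int:
        return 0  # stand-in for an RDRAM transaction (Timna's bad bet)

class CPUDie:
    # The core only ever talks to the generic interface, so swapping
    # memory technologies means swapping the controller behind the
    # bus, not respinning the whole die -- the flexibility Timna
    # lacked.
    def __init__(self, ctrl: MemController) -> None:
        self.ctrl = ctrl

cpu = CPUDie(DDRController())
cpu.ctrl.read(0x1000)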
 

Tony Hill

gaffo wrote:

20 percent +/- 5 for 15 yrs.


At AMD's low point, 1995/96 (K5, before they bought NexGen's Nx686
chips/company), they had 12 percent??

If you're talking installed base, maybe. However, if you're talking
new unit sales, AMD bottomed out at about 4% just before the release
of the K6 (early '97).
Cyrix (5x86/M1) had maybe 5 percent??

Just before the release of the K6, Cyrix was actually outselling AMD
(at least if you count the Cyrix-branded and IBM-branded chips
together).
Winchip (GO WINCHIP!! WinChip 1 = dog; WinChip 2 not dog) maybe 1
percent. (Still got mine (the Win-2) -- somewhere). IDT (Integrated
Device Technology) bought them out around 1997?/98?

The first Centaur Winchip came out in Oct. '97, according to
www.sandpile.org. At some point I think they might have managed to
creep over 1% of sales around early '99 or thereabouts. I rather liked
this chip: VERY simple and cheap, but it offered performance that was
not altogether terrible for the time.

The Winchip design is essentially still around in the form of the VIA
C-series chips, though obviously they've diverged quite a bit since
VIA bought 'em out many moons ago.
... we also had Rise (forgot the chip name -- actually a good chip,
equal to Intel's offering at the time). (WTF happened to Rise (the
company) anyway??)

Winchip had a shitload more sales than Rise, and that is not saying
much. Rise was the ultimate "niche player".

I'm not sure that Rise ever really sold any meaningful quantities of
chips.
Kinda sad, since all the alternatives at the time offered
equal-quality alternatives to Intel/AMD and are now gone ;-/.

None of the alternatives ever matched Intel in terms of performance,
and they only matched or beat AMD at times when AMD was really
struggling (i.e. the K5 days). Cyrix and IDT/Centaur/Winchip both got
bought out by VIA, and that's the only real third player left in x86
chips, but they're really targeting some specific niches. They still
have the VERY low-cost design from Winchip, so apparently they might
still be making money with their 1% market share, but that's about it.
and a couple yrs later we had Transmeta.

Transmeta's design was a really odd-ball way of doing things that had
a lot of people scratching their heads from the get-go. I don't think
I went so far as to say that it wouldn't work out well, but I
definitely didn't have very high hopes for their idea. It turns out
that my low hopes were overly optimistic. It wasn't just an odd-ball
way of doing things; it was a REALLY bad way of doing things! At best
they were matching the performance of VIA chips while using 4x as many
transistors and costing at least 4x as much to build!

Transmeta now seems to have taken the Rambus approach to business and
has stopped concentrating on engineering in order to focus its efforts
on litigation and questionable patents.
At the time AMD really offered only the doggy K5. I remember. The
Cyrix, Intel, and NexGen 5x86-class chips all ran circles around the
AMD K5.

There was a reason AMD (which had fabs) had a market share only a tad
above Cyrix (a fabless company).

Cyrix in general offered a better product for several years, from the
486 through the K5/5x86 era.

AMD pulled ahead when they bought NexGen, took the Nx686, and renamed
it the K6.

Even then, the WinChip 2 was the K6's equal in all respects and cost
half the price!!... and was backward compatible with old mobos (it
took an old motherboard and a higher voltage).

The Winchip 2 was a K6 equal but two years too late! By the time IDT
had much volume of their 200MHz Winchip, AMD was selling their 350MHz
K6-2. Even then, the Winchip 2 could only match the performance of the
K6 on integer workloads; the floating-point performance was weaker
than the already so-so K6 FP.

I did like the design of the chip because it was SO cheap. It was
kind of elegant in its simplicity, but it was never anywhere close to
the performance level that AMD or Intel offered at the time. What it
DID offer was a really good low-cost part that was a drop-in
replacement for MANY old motherboards. At the time they were selling
this chip, there were lots of people with older Pentium 75-133MHz
boards who could drop one of these Winchips in for a VERY noticeable
improvement in performance for only ~$50. It was an EXCELLENT deal for
these people, and I recommended it highly for this exact reason.
However, for anyone building a new system from scratch it made no
sense at all. Only a few dollars more got you a system based on an AMD
K6-2 or Intel Celeron (which had cache by this time) with MUCH better
performance.
Now the K7/K8/etc. is tops, but that is thanks to the Alpha guy
(forget the name?) whom AMD hired.

AMD, like Intel, hired several old Alpha guys.
IMO silicon is topped out. We will no longer see speed increases of
factors of 2 or 3 as we used to.

Speed increases haven't actually slowed down that much, though they
aren't necessarily as obvious anymore because most chips are "fast
enough" for so many applications. However, if you look at a plot of
something like SPEC CPU2000 (and especially CPU_rate) vs. time, you'll
see that performance is still doubling every 24 months or so. Pretty
impressive.
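For scale, doubling every 24 months works out to roughly 41% compound
growth per year (just arithmetic on that 24-month figure):

# 2x every 24 months => per-year growth factor of 2 ** (12 / 24)
print(f"{2 ** (12 / 24) - 1:.0%} per year")   # ~41%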
Core/K8 is as good as it gets. Make it smaller... and raise the
clock... even that will hit a limit... then no more speed increases
worth mentioning.

I'm glad I don't share your pessimism about computer chip
development! I still expect noticeable performance improvements for at
least another 5-10 years. Beyond that it's too tough to predict;
however, you can bet that neither AMD nor Intel is going to just sit
back and give up on things. New performance improvements might not
come in the same obvious ways that we have seen in the past, but they
aren't going to stop.
 
