Why do Dell, Gateway, etc. never choose AMD?

nobody

Hi,
My name is Bob Russels. I am a recent graduate of Harvard
Business School, and I have decided to start my own computer
manufacturing business and go head to head with Dell. We
would like some input from the public about what improvements could be
made in the PC business. We welcome any suggestions or
complaints. Every single suggestion will be looked at and evaluated.
Please send any information to (e-mail address removed).

That's easy. Outsource manufacturing to China, support to India. Use
the 'efficiencies' extracted from it to beef up sales and marketing.
Come up with an easily recognizable dude as an ad personality. Oh,
yes, avoid AMD like the plague - this will greatly enhance your
relationship with INTC. Voila - you are DULL Computer Co.
 
Tony Hill

5 different chips of what? Chipsets, CPUs, or video chipsets?

Processors. Dell only sells 5 different video cards for all of their
Dimension line, so they only need to stock 5 different parts to
satisfy all their video-card needs. With AMD-based systems they would
probably need at least 3 or 4 parts JUST for the CPU alone to satisfy
the various speed grades for the processor. When trying to minimize
the number of parts on hand (which is goal #1 at Dell), this is not a
good thing.
 
Tony Hill

That's easy. Outsource manufacturing to China, support to India. Use
the 'efficiencies' extracted from it to beef up sales and marketing.
Come up with an easily recognizable dude as an ad personality. Oh,
yes, avoid AMD like the plague - this will greatly enhance your
relationship with INTC. Voila - you are DULL Computer Co.

This pretty much describes HPaq, Dell, Gateway/eMachines, IBM/Lenovo
and... umm... whoever the heck else makes computers these days.


That being said, if Bob is really serious about starting a computer
company, he REALLY needs to find a niche where Dell and HP just don't
compete.

One niche which definitely is an option is a sort of "luxury"
computer. Think something like a Lexus or a BMW of computers. The
trick here is providing some really high-end quality and really good
support for the very tiny percentage of customers that are willing to
pay for it. The trouble with this niche is that a lot of companies
are already there. Alienware seems to be making a go of it reasonably
well, and others are definitely trying.

Another niche is for the very small and quiet boxes, ideally in
cool-looking cases. Not necessarily the fastest machines, but
something that artsy types will want on their desk (assuming they
don't want a Mac :> ). Again, this is a niche that already has some
competitors, though it may still be an option.

Basically the answer here is, to quote the most ridiculously overused
catchphrase in business-speak, to think outside the box. If you try
to go head-to-head against Dell you're going to get creamed. You
don't see new companies starting up with the intention of building
cars for your regular driver for the simple reason that the market is
already oversaturated. The same goes for computers, but profit margins
are even thinner here. If you want to build computers, you need to
offer something that no one else (or hardly anyone) is offering.


If I were sufficiently insane to try starting my own computer company,
I would go after the small, low-noise, funky-case, Apple-wannabe
market. There is definitely the demand for such designs and the
technology exists; it's just a matter of putting two and two together
and getting a foothold before Dell and HPaq clobber this market. Of
course, I wouldn't do this because I'm not insane... but maybe that's
just me.


Ohh.. and make sure that you offer Linux as a fully supported option
and do things like making Firefox/Thunderbird the default
browser/e-mail program. This will help get armies of Slashdot geeks
to do some marketing for you.
 
Scott Alfter

^^^^^^ ...err Windows? ;-)

So you're telling me there isn't a difference? It's all cheap crap?
(as are all the rest in this class, BTW).

That would not be an unreasonable inference. :)

(There was a time that I refused to work on Gateway hardware because it was
too flaky and unpredictable, compared to either (1) other major-brand
equipment (IBM, HP, etc...hell, even Packard Bell was less troublesome IME)
or (2) generic screwdriver-shop equipment. The same prohibition extended to
Dell, too, for much the same reason.)


 
YKhan

Tony said:
Processors. Dell only sells 5 different video cards for all of their
Dimension line, so they only need to stock 5 different parts to
satisfy all their video-card needs. With AMD-based systems they would
probably need at least 3 or 4 parts JUST for the CPU alone to satisfy
the various speed grades for the processor. When trying to minimize
the number of parts on hand (which is goal #1 at Dell), this is not a
good thing.

I'm not getting this: why would AMD-based systems require /any/
different video cards than Intel-based ones?

Yousuf Khan
 
YKhan

AJ said:
You're treading on religious grounds now. I made similar statements
(that reducing the number of configurations and suppliers is key and
that's a major reason why I personally work only with Intel HW) in here
awhile back and got raked over the coals. It really riles up the AMD
marketeers.

I wouldn't call them religious arguments; it's just a very logical
question to ask. Is Dell refusing to use AMD processors because of
inventory reasons, or "other" reasons (i.e. incentives from Intel)?
Even Tony had to admit after originally writing the above statement
that Dell says one thing and does another. So minimizing inventory is
not the real reason why Dell does it. And as for the question of
whether any individual, such as yourself, saves any money by
standardizing on Intel parts, the answer is that you're not
receiving any "Intel Inside" funding. :)

Yousuf Khan
 
Yousuf Khan

John said:
How were they differentiated before? (Other than brand name of course.)

Well, that's easy, before they were two completely separate companies,
and they competed directly against each other.

Yousuf Khan
 
AJ

YKhan said:
I wouldn't call them religious arguments; it's just a very logical
question to ask. Is Dell refusing to use AMD processors because of
inventory reasons, or "other" reasons (i.e. incentives from Intel)?
Even Tony had to admit after originally writing the above statement
that Dell says one thing and does another. So minimizing inventory is
not the real reason why Dell does it.

It's all the complexity, not just inventory. The more suppliers and parts,
the worse it gets.
And as for the question of
whether any individual, such as yourself, saves any money by
standardizing on Intel parts, the answer is that you're not
receiving any "Intel Inside" funding. :)

Well, I'm thinking ahead to a day when I may resell PCs I build in
higher volume (though the issues I have with the transitional
technologies currently in vogue have put a damper on such ideas for me).

AJ
 
keith

Well, that's easy, before they were two completely separate companies,
and they competed directly against each other.

Perhaps it's not unlike all the Coke brands. The intention is to compete
against oneself and gain shelf space.
 
Yousuf Khan

AJ said:
It's all the complexity, not just inventory. The more suppliers and parts,
the worse it gets.

Actually, no, no it isn't. When I was talking about minimizing
inventory, I was referring to this issue of minimizing complexity as
well. As I was mentioning to Tony, there was a time when AMD made
processors fully socket-compatible with Intel's. Even back then Dell
refused to sell AMD processors with the excuse that there wasn't enough
difference between them to bother with it. Now there's a huge difference
between them, and it still doesn't want to sell them. Now the excuse is
that AMD doesn't produce in high enough volume. AMD will address that
issue, and Dell will come up with another excuse after that.

However, I know for a fact that Dell does sell AMD-based systems, right
from desktops all the way up to laptops and servers. You just have to
ask (forcefully). A friend of mine was working in the IT department of a
medical instruments company, and its CEO was a huge Dell fan and a huge
AMD fan. Did he have to make a choice between one or the other? Nope,
Dell made desktops and notebooks with AMD processors in them for him.
And you hear similar stories from various websites from time to time. So
it's not that Dell couldn't make these models, it just doesn't want to
be caught making these models. By whom? By Intel.
Well, I'm thinking ahead to a day when I may resell PCs I build in
higher volume (though the issues I have with the transitional
technologies currently in vogue have put a damper on such ideas for me).

Without making thousands of systems a year, you're not likely to become
an Intel Inside recipient. However, I've seen that at least two of my
local computer stores in Ottawa, Canada are recipients of the AMD
Premier Partner certification. I think that gives them access to the
latest AMD parts and discounts on the older parts. They're the only two
stores I know of that are selling Athlon 64s, whereas everybody else is
selling Athlon XPs and Semprons.

Yousuf Khan
 
AJ

Yousuf Khan said:
AJ said:
It's all the complexity, not just inventory. The more suppliers and parts,
the worse it gets.

Actually, no, no it isn't. [irrelevant stuff omitted]

It is a generalization that, in general, holds true. Intel/AMD has nothing to
do with it.
Without making thousands of systems a year, you're not likely to become an Intel Inside recipient.

Not a concern for me. By "volume", I didn't mean thousands or even hundreds
of systems per year. Just enough to remain in control of the configurations and
setups and not have to recommend and then *live with* supporting Dells (or
god forbid those pesky HPs!).

AJ
 
Tony Hill

I'm not getting this: why would AMD-based systems require /any/
different video cards than Intel-based ones?

Huh? They wouldn't, they need processors! 3 or 4 PROCESSORS! That
is 3 or 4 extra parts, all of which are CPUs that would only be sold
in one (or a small number) of their systems. Dell's business is all
about minimizing the total number of parts that they need; I was just
using the video cards for comparison purposes. A single AMD-based
system would require that Dell stock 4 or 5 extra parts for the CPU
alone, plus at least one more for the motherboard. This is on top of
the additional bits and pieces that are common among all of Dell's
systems.

The point I'm getting at here is that Dell thrives on the minimization
of choice, making all of their systems as identical as possible. To
keep even just a single extra different part in stock complicates
things. Sometimes their hand is forced and they do need to offer
choices to customers, because their marketing is based on the
exact opposite of their manufacturing reality, i.e. they advertise
maximum choice.

For the moment at least, customers have not forced Dell's hand into
using AMD processors; few enough of the people buying Dell systems
would go elsewhere just for an AMD processor that Dell can avoid this.
The video cards I mentioned above serve only as a sort of
counter-example. With video cards their hand HAS been forced. I'm
sure that Dell would LOVE to supply only a single video card for all
their systems, or at least use just a single manufacturer. If it weren't
for the fact that enough customers buying Dells DEMAND
either an ATI card or an nVidia card, Dell would probably drop one of
those brands and go exclusively with the other. Actually, if they
could I'm sure that the Dell ideal would be to go exclusively with
integrated video and forget video cards altogether.


The real point to take away from this whole argument is that AMD will
NEVER win over Dell based on price. If they want to get Dell selling
their processors, they need to offer a product that customers demand
from Dell strongly enough that they'll go elsewhere for it. With the
Opteron AMD has such a product for servers, and I'm sure that this has
cost Dell some sales. On the desktop front, though, most people buying
Dell desktops either do so for businesses (where ease of maintenance is
the #1 priority, and while processors don't do anything here, minimizing
motherboard drivers helps), or they are buying consumer-grade systems
where the type of processor in there probably isn't a very big worry
as long as the cost is OK.
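
To put rough numbers on the inventory argument, here's a quick
back-of-the-envelope sketch in Python. All the part counts are
illustrative guesses (only the "5 video cards" figure comes from the
discussion above), so treat this as arithmetic, not as Dell's actual
SKU list:

# Toy model of Dell-style SKU minimization. Part counts are
# illustrative assumptions, not actual Dell inventory data.
intel_only = {
    "video_cards": 5,       # per the post: 5 cards cover the Dimension line
    "intel_cpus": 5,        # assumed: a few speed grades, shared line-wide
    "motherboards": 3,      # assumed: a few boards cover all models
}

# Extra parts needed to offer even ONE AMD-based model.
amd_addition = {
    "amd_cpus": 4,          # 3 or 4 speed grades stocked just for it
    "amd_motherboards": 1,  # at least one AMD-specific board
}

base = sum(intel_only.values())
with_amd = base + sum(amd_addition.values())
print(f"Intel-only parts on hand: {base}")
print(f"With one AMD model: {with_amd} (+{with_amd - base} parts)")

The absolute numbers are made up, but the ratio is the point: a single
extra model family inflates the distinct-parts count far faster than it
inflates sales.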
 
Yousuf Khan

Tony said:
Huh? They wouldn't, they need processors! 3 or 4 PROCESSORS! That
is 3 or 4 extra parts, all of which are CPUs that would only be sold
in one (or a small number) of their systems. Dell's business is all
about minimizing the total number of parts that they need; I was just
using the video cards for comparison purposes. A single AMD-based
system would require that Dell stock 4 or 5 extra parts for the CPU
alone, plus at least one more for the motherboard. This is on top of
the additional bits and pieces that are common among all of Dell's
systems.

Yeah, but they'd need to stock those extra parts every time they add
another Intel product line too, for example, Pentium M, Pentium 4 Socket
478, Pentium 4 Socket 775, P4EE, Celeron, Itanium, and their various
speed grades. Actually in the case of most of these mail-order PC
places, I've never seen them stocking much more than two speeds of the
same processor, usually the top two most common speed grades.

In fact, I think AMD produced the Socket 754 Sempron just for this case,
to minimize the number of parts needed to be stocked, at the insistence
of HP.
The real point to take away from this whole argument is that AMD will
NEVER win over Dell based on price. If they want to get Dell selling
their processors, they need to offer a product that customers demand
from Dell strongly enough that they'll go elsewhere for it. With the
Opteron AMD has such a product for servers, and I'm sure that this has
cost Dell some sales. On the desktop front, though, most people buying
Dell desktops either do so for businesses (where ease of maintenance is
the #1 priority, and while processors don't do anything here, minimizing
motherboard drivers helps), or they are buying consumer-grade systems
where the type of processor in there probably isn't a very big worry
as long as the cost is OK.

I don't think there's much doubt anymore that Dell will have to start
selling Opteron very soon. Intel screwed up its 90nm manufacturing
process by not going with SOI, and Prescott was supposed to mask this
misstep by going fast even without it (e.g. 50% extra pipeline stages).
This worked for Intel about 4 years ago when it introduced Willamette,
which masked their last screwed-up manufacturing process, 180nm
without copper. But even Prescott couldn't bypass the laws of physics
this time, and Intel's bacon is now cooked, because Intel won't be able
to design a completely new processor for at least 3 years. Opteron is
the only option for the foreseeable future. For all of Intel's vaunted
manufacturing skills, it really bets on the wrong horse quite often.

But strangely, IBM seems to think that the Xeon is a better
cost/performance part than Opteron.
Susan Whitney, general manager of IBM's X86 server business, said Xeon's lower cost was the key reason most of Big Blue's industry standard servers run on Intel's chip rather than AMD's offering.

FT.com / Industries / IT - Companies get bit between their teeth
http://news.ft.com/cms/s/79cb46d0-575d-11d9-a8db-00000e2511c8.html

I thought that Opterons cost less than Xeons?

Yousuf Khan
 
chrisv

Yousuf Khan said:
I don't think there's much doubt anymore that Dell will have to start
selling Opteron very soon. Intel screwed up its 90nm manufacturing
process by not going with SOI, and Prescott was supposed to mask this
misstep by going fast even without it (e.g. 50% extra pipeline stages).
This worked for Intel about 4 years ago when it introduced Willamette,
which masked their last screwed-up manufacturing process, 180nm
without copper. But even Prescott couldn't bypass the laws of physics
this time, and Intel's bacon is now cooked, because Intel won't be able
to design a completely new processor for at least 3 years. Opteron is
the only option for the foreseeable future. For all of Intel's vaunted
manufacturing skills, it really bets on the wrong horse quite often.

Well, it's easy to view things from afar, and criticize Intel's
"screwed-up" decision regarding technologies. I'm sure they do their
homework before making any critical decision such as whether or not to
use SOI, and they know as much about these technologies as anyone on
the planet. It's their livelihood.

IMO, the only clearly wrong-headed decision was the one to pursue high
clock speeds as the path to highest performance. Hindsight is always
20/20, but it seems unlikely that they could not have seen that this
strategy would fail, as it automatically makes any given
performance level more difficult to achieve. It raises the question
"why did they do it?", and I'm afraid the only plausible answer is that
Intel's marketing demanded that they have the fastest-clocking product
on the market.
 
Robert Redelmeier

chrisv said:
I'm sure they do their homework before making any critical
decision such as whether or not to use SOI, and they know
as much about these technologies as anyone on the planet.
It's their livelihood.

Yes, but Intel also has considerations other than pure
technical merit. Market size and ability to deliver rate
very high. What happens if they only revamp one fab for
SOI, then that market takes off and they can't supply Dell?
Egg all over their faces. Intel "enjoys" a premium reputation
that is also a straitjacket.
IMO, the only clearly wrong-headed decision was the one to
pursue high clock-speeds as the path to highest performance.

I don't think Intel had any choice. We were all supposed
to be running IA64 processors (Itanium) by now. The iP7
(Pentium4) is a chip that was never supposed to exist.
An emergency stop-gap when it appeared that IA64 might
be a bit slow in market adoption.
they do it", and I'm afraid the only plausible answer
is that Intel's marketing demanded that they have the
fastest-clocking product on the market.

I would have agreed with you 6 months ago. But now even
Intel is using model numbers rather than GHz, at least for
their budget Celeron line.

I don't doubt that high clocks were favored by Intel marketing
& Itanium camps. So they pipelined and overclocked the iP5
into the P7. And they've finally been able to make timing
on some sections, so have "new" features.

-- Robert
 
Tony Hill

Yeah, but they'd need to stock those extra parts every time they add
another Intel product line too, for example, Pentium M, Pentium 4 Socket
478, Pentium 4 Socket 775, P4EE, Celeron, Itanium, and their various
speed grades. Actually in the case of most of these mail-order PC
places, I've never seen them stocking much more than two speeds of the
same processor, usually the top two most common speed grades.

Yes, but they can share those chips between multiple product lines for
the most part. The same Socket 478 P4 chips and Celeron chips were
used in all of their Dimension and OptiPlex lines with only small
variations. But you do have a point here about the socket change to
Socket 775 being a bad thing for them, and I suspect that they'll try
to switch over as many products as possible at once to minimize the
impact of this.

As for the other chips, the P4EE is a bit of an odd-ball, but they
sell in sufficiently expensive machines that it's probably worth it
for bragging rights if nothing else. They don't need to stock any
Pentium-M chips since they don't make laptops (they just sell the
final product with their name branded on top). As for the Itanium...
err.. well, there's no real explaining that one. My only guess
here is that Intel is providing a SIGNIFICANT portion of the funding
(like all of it) for Dell's Itanium systems, because they just do NOT
fit the Dell model at all.
In fact, I think AMD produced the Socket 754 Sempron just for this case,
to minimize the number of parts needed to be stocked, at the insistence
of HP.

Probably. AMD is definitely learning and it's paying off for them
with their HPaq deal. HP sells quite a large number of AMD-based
systems, including some in their commercial-grade systems.

In fact, one could probably argue that Dell would theoretically be
better off going exclusively with AMD chips than they are with only
Intel chips. However this really just isn't an option for a variety
of reasons.
I don't think there's much doubt anymore that Dell will have to start
selling Opteron very soon. Intel screwed up its 90nm manufacturing
process by not going with SOI, and Prescott was supposed to mask this
misstep by going fast even without it (e.g. 50% extra pipeline stages).
This worked for Intel about 4 years ago when it introduced Willamette,
which masked their last screwed-up manufacturing process, 180nm
without copper. But even Prescott couldn't bypass the laws of physics
this time, and Intel's bacon is now cooked, because Intel won't be able
to design a completely new processor for at least 3 years. Opteron is
the only option for the foreseeable future. For all of Intel's vaunted
manufacturing skills, it really bets on the wrong horse quite often.

It certainly seems that way for the moment, though these things can
change. I suspect that once we start seeing dual-core chips coming to
market, the advantage for the Opteron is going to become even more
apparent. Combine that with the fact that AMD might have up to a
6-month head start on dual-core chips, and Dell could be in a rather
poor position if they don't sell Opteron systems.

On the flip side though, AMD's 90nm process isn't exactly making
waves. Sure they got the power consumption down nicely, but so did
Intel if you look at it from a per-transistor perspective (which
doesn't help much when you more than double the number of
transistors). However the clock speed of AMD's 90nm parts is still
LOWER than that of their 130nm parts. I figured that this would
change fairly quickly as they ramped up production, but thus far it
hasn't.
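
To see why a better per-transistor number can still mean a hotter chip,
here's a tiny worked example in Python. The transistor counts are only
roughly Northwood- and Prescott-class, and the wattage and the 40%
improvement are made-up round numbers:

# Illustrative figures only -- not measured Intel or AMD data.
old_transistors = 55e6     # roughly a Northwood-class count
new_transistors = 125e6    # roughly a Prescott-class count (>2x)
old_power = 70.0           # hypothetical total watts, old chip

per_t_old = old_power / old_transistors
per_t_new = 0.6 * per_t_old   # assume 40% less power per transistor

new_power = per_t_new * new_transistors
print(f"Old chip: {old_power:.0f} W total")
print(f"New chip: {new_power:.0f} W total")   # ~95 W: hotter overall

Cut per-transistor power by 40%, more than double the transistors, and
total power still climbs by about a third.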
But strangely, IBM seems to think that the Xeon is a better
cost/performance part than Opteron.


FT.com / Industries / IT - Companies get bit between their teeth
http://news.ft.com/cms/s/79cb46d0-575d-11d9-a8db-00000e2511c8.html

I thought that Opterons cost less than Xeons?

Remember that the price you or I would pay for the chip might not have
any relation to what IBM or Dell pays for them.

FWIW I just specced out some sample servers to compare the
near-identical Xeon-based e336 and Opteron-based e326. I configured
the systems as a bare-bones setup with 1GB of RAM and either dual
3.6GHz Xeons or dual Opteron 250 (2.4GHz) chips and no hard drive or
extra hardware. The result was that the dual-Xeon server cost $4478
and the dual-Opteron server cost $4418, not exactly a significant
difference. Of course, when you add a hard drive the Xeon system
becomes more expensive, but that is because it uses up to 4
hot-swappable drives of the laptop-style 2.5" form factor while the
Opteron uses only 2 drives in a more standard 3.5" form factor.

Anyway, long story short, the price looks to be pretty much the same.
Performance will vary from one application to the next, but they are
likely to be fairly close. As mentioned above though, I think that
the dual-core Opteron could throw a real monkey wrench into this
argument, especially given that dual-core Xeons might end up having to
clock WELL down in order to hit a viable power consumption number and
fit into a 2-socket 1U system.
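
For what it's worth, the arithmetic on those two quoted configurations
works out like this (prices exactly as above):

# Prices from the two sample IBM configurations quoted above.
xeon = 4478     # dual 3.6GHz Xeon e336, 1GB RAM, no disk
opteron = 4418  # dual Opteron 250 e326, 1GB RAM, no disk

diff = xeon - opteron
print(f"Xeon premium: ${diff} ({100 * diff / opteron:.1f}%)")  # $60 (1.4%)

A 1.4% gap is well inside configuration noise, which backs up the
"pretty much the same" conclusion.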
 
Yousuf Khan

chrisv said:
Well, it's easy to view things from afar, and criticize Intel's
"screwed-up" decision regarding technologies. I'm sure they do their
homework before making any critical decision such as whether or not to
use SOI, and they know as much about these technologies as anyone on
the planet. It's their livelihood.

IMO, the only clearly wrong-headed decision was the one to pursue high
clock speeds as the path to highest performance. Hindsight is always
20/20, but it seems unlikely that they could not have seen that this
strategy would fail, as it automatically makes any given
performance level more difficult to achieve. It raises the question
"why did they do it?", and I'm afraid the only plausible answer is that
Intel's marketing demanded that they have the fastest-clocking product
on the market.

Well, I don't want to blame this all on the evil marketeers of Intel;
engineering managers can be right up there amongst the best managers in
the world when it comes to making wrong decisions. I can definitely see
the managers making decisions not to adopt certain technology simply
because they wanted to get to the next miniaturization node first, damn
the consequences.

Yes, it's their livelihood to know the best technology, but that doesn't
mean they are being listened to by their own managers.

Yousuf Khan
 
Yousuf Khan

Tony said:
Probably. AMD is definitely learning and it's paying off for them
with their HPaq deal. HP sells quite a large number of AMD-based
systems, including some in their commercial-grade systems.

In fact, one could probably argue that Dell would theoretically be
better off going exclusively with AMD chips than they are with only
Intel chips. However this really just isn't an option for a variety
of reasons.

Yeah, thinking back to the Pentium/K6 days, Dell's pricing model
would've really benefited from being able to continue selling Socket 7
processors for an additional year or so until Socket 370 became more
commonplace. Probably at that point they could've continued to sell
Socket 7s for another year beyond that, just as AMD did.
It certainly seems that way for the moment, though these things can
change. I suspect that once we start seeing dual-core chips coming to
market, the advantage for the Opteron is going to become even more
apparent. Combine that with the fact that AMD might have up to a
6-month head start on dual-core chips, and Dell could be in a rather
poor position if they don't sell Opteron systems.

I doubt that six months is enough of a head start for Dell to start
feeling any kind of pain. It'll probably take six months just for
dual-cores to become popular.
On the flip side though, AMD's 90nm process isn't exactly making
waves. Sure they got the power consumption down nicely, but so did
Intel if you look at it from a per-transistor perspective (which
doesn't help much when you more than double the number of
transistors). However the clock speed of AMD's 90nm parts is still
LOWER than that of their 130nm parts. I figured that this would
change fairly quickly as they ramped up production, but thus far it
hasn't.

They seem to be trying to keep power consumption contained. That might
be the smart thing to start concerning themselves with these days, i.e.
power requirements rather than speed. The 130nm parts stay within the
89W envelope, while the 90nm parts fit under the 67W envelope.
Remember that the price you or I would pay for the chip might not have
any relation to what IBM or Dell pays for them.

FWIW I just specced out some sample servers to compare the
near-identical Xeon-based e336 and Opteron-based e326. I configured
the systems as a bare-bones setup with 1GB of RAM and either dual
3.6GHz Xeons or dual Opteron 250 (2.4GHz) chips and no hard drive or
extra hardware. The result was that the dual-Xeon server cost $4478
and the dual-Opteron server cost $4418, not exactly a significant
difference. Of course, when you add a hard drive the Xeon system
becomes more expensive, but that is because it uses up to 4
hot-swappable drives of the laptop-style 2.5" form factor while the
Opteron uses only 2 drives in a more standard 3.5" form factor.

Do the dual-core Xeon servers that IBM makes use its own Summit chipset?
Perhaps it's trying to make more sales of the Summit chipset rather than
of the Xeons themselves?

Yousuf Khan
 
keith

Well, I don't want to blame this all on the evil marketeers of Intel;
engineering managers can be right up there amongst the best managers in
the world when it comes to making wrong decisions. I can definitely see
the managers making decisions not to adopt certain technology simply
because they wanted to get to the next miniaturization node first, damn
the consequences.

Even if those consequences make the product not suitable for the market
(I'm thinking power here, but also P4 architecture)?
Yes, it's their livelihood to know the best technology, but that doesn't
mean they are being listened to by their own managers.

That's how companies fail! Like George (I think it was the G-man) said,
Intel is trapped by their own success. They're also trapped by the silly
mistake of running through the iceberg field at full speed with an
"unsinkable" chip. They've dug a very deep hole that AMD has done a
taking advantage of. *THIS* is a marketing/executive mistake not seen
since DEC was torched by its insiders.
 
