Intel COO signals willingness to go with AMD64!!

George Macdonald

On Thu, 29 Jan 2004 05:02:48 -0500, George Macdonald


<pure speculation>

Huh? Sorry you can't brush it off so easily. It was fairly widely
reported in the specialist press.
By the standards of a company like IBM or AMD, $46 million is cheap
for a major technology play, and one does wonder about how things are
being done at IBM these days.

It was a major technology fix - AMD backed the wrong horse for Cu/SOI and
couldn't make the bloody thing work. Whatever its value to IBM, it was a
considerable *added* expense to AMD, who were drowning in red ink, for a
project which was 3 months past due delivery with no viable route to
completion.
Possible real incentives for IBM:

Volume for its East Fishkill line.

Different item - this was an initial payment to put out a fire, before and
over and above the technology exchange agreement they struck later. BTW I
haven't seen where IBM is acting as a foundry for AMD???
Tactical/strategic move whose real target is Intel.

Hmmm - whatever... it seems to be working if so.<shrug>
Some manager needed $46 million to hit his revenue targets.

Windfalls are always nice. I doubt that the figure was arrived at by such
cynical means.
Even the possibility that the third might be the real reason should be
enough to make you think twice about owning IBM stock, unless you
think someone with more strategic vision can mount a hostile takeover
and stop IBM from becoming an overpriced job shopper.

"Hostile takeover" of whom? The fact that the collaboration has been so
successful for both AMD and IBM gives the lie to your third option IMO. As
for IBM being overpriced, when you have the kind of IP portfolio they own,
you're in the catbird seat. The industry, Intel excepted as far as is known,
is beating a path to their door.

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
George Macdonald

Nothing specific except anecdotal evidence that customers are clamoring for
Opterons. Various articles have noted as much, without being too specific
either. For example this article:

http://www.techworld.com/news/index.cfm?fuseaction=displaynews&NewsID=943

It mentions:

"HP didn't have any choice," says James Governor, principal analyst at
research firm RedMonk. "Any market-driven organisation didn't have any
choice. If HP were making its decisions based on religious arguments, then
it wouldn't go anywhere near AMD. But if it's basing it on market reality,
it's doing the right thing."

So it seems that pretty much the customer base alone is telling these companies
to go with Opteron. That was also the case for the first major OEM Opteron
server from IBM last year -- they did it because their customers asked them
to.

Yeah but I need more proof.:) I want to hear that it's fitting into the
Xeon sector... e.g. some corporation with a huge SAP commitment "needs it".
Some of the names mentioned as buyers have seemed to be specialist tech
apps.

It was also a bit disappointing to see that Newisys couldn't hack it on its
own, even with seed money from AMD, and had to be taken over by Sanmina.

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
David Schwartz

If there's no 64-bit software market yet, then why did Intel make the
Itanium?


There are a huge number of reasons, but I assure you none of them
involve creating a processor on which the massive amount of existing 64-bit
software can execute.

DS
 
Yousuf Khan

George Macdonald said:
Yeah but I need more proof.:) I want to hear that it's fitting into the
Xeon sector... e.g. some corporation with a huge SAP commitment "needs it".
Some of the names mentioned as buyers have seemed to be specialist tech
apps.

Well, you gotta assume that with over 10,000 Opteron servers sold in one
quarter alone, some of them must be going to the average joe corporate user
and not just to specialist shops.
It was also a bit disappointing to see that Newisys couldn't hack it on its
own, even with seed money from AMD, and had to be taken over by Sanmina.

Sanmina probably wouldn't have invested in it if it didn't think there was
quite a bit of money to be made back on it. Sanmina was acting as Newisys's
manufacturing partner before anyway, so it knew how much money could be made
from this business.

Yousuf Khan
 
Tony Hill

This is a very interesting thread. I guess Intel miscalculated the
need for low-end 64-bit systems for home and small business users. I
wonder if the introduction of Apple's low-end 64-bit systems is also
pushing Intel?

I doubt it. Apple has approximately 2% of the worldwide PC market
share. There are more PCs with Intel chips in them sold in two weeks
than what Apple ships in a year.

Besides, Apple is a PC maker, Intel is a chip manufacturer. They
really aren't competing against one another.
I'm sure the main focus now is Opteron, but these
PowerPC systems by Apple really look nice also. The benchmarks on the
Apple site look unreal, but you never know. The benchmarks really blow
the Opteron away. Whatever.

The benchmarks ARE unreal, or at the very least they are very
carefully hand-picked. The PowerPC 970 (the 'G5' in Apple-speak) is a
perfectly good chip and it's pretty evenly matched to today's Opteron
and P4 processors. There are also a few areas where it really excels
as compared to the P4/Opteron because it's a very different design.
By the same notion, there are a few areas where the PPC 970 really
stinks it up as compared to the P4/Opteron. For the most part though,
performance is similar.

Of course, Apple focuses largely on those few applications where the
PPC 970 does very well. They also manage to get some EXTRAORDINARILY BAD
scores for PCs on tests where others score MUCH better using the same
software and hardware. Their results for SPEC tests are the most
obvious example, where Apple managed scores about 30-50% lower than
what other people have managed using the same hardware and same
compiler.

All just a part of the Steve Jobs Reality Distortion Field (tm).
 
Robert Myers

Huh? Sorry you can't brush it off so easily. It was fairly widely
reported in the specialist press.

Oh dear. You have to be _really_ careful when posting. I took the
$46 million as fact and was pondering what might be going on at IBM.
I wanted to acknowledge that what _I_ was engaged in was pure
speculation.
It was a major technology fix - AMD backed the wrong horse for Cu/SOI and
couldn't make the bloody thing work. Whatever its value to IBM, it was a
considerable *added* expense to AMD, who were drowning in red ink, for a
project which was 3 months past due delivery with no viable route to
completion.
That it was a significant and painful expense to AMD is hereby fully
acknowledged.
Different item - this was an initial payment to put out a fire, before and
over and above the technology exchange agreement they struck later. BTW I
haven't seen where IBM is acting as a foundry for AMD???
google

IBM AMD foundry "East Fishkill"

for a veritable cornucopia of speculation, including the possibility
that IBM has a stake in the Dresden facility. I can barely keep track
of it, and I don't know what the latest and most "authoritative"
speculation is.
Hmmm - whatever... it seems to be working if so.<shrug>

For a version of my exact logic, plucked from the sea of speculation
produced by the search proposed above, see

http://www.internetnews.com/infra/article.php/1566591

<begin quote>

To compete with an 800-pound gorilla like Intel (Quote, Chart), you
have to be a 800-pound gorilla - or at least join forces with one.

That's the thinking at Advanced Micro Devices (AMD) (Quote, Chart),
which Wednesday said it is teaming up with IBM (Quote, Chart) to
jointly develop chip-making technologies for use in future
high-performance products.

<snip>

IBM said it is more than happy to lend a hand in making chips with
AMD. Big Blue's microprocessor division has also had a bone to pick
with Intel with the advent of Xeon and 64-bit Itanium processors
making bigger and bigger waves in high-end systems.

<snip>.

All of the saber rattling is an attempt to get the attention of Intel,
which is focused on three major process transitions - 130nm
lithography, 300mm wafers and copper interconnects.

Deutsche Bank Securities analyst Ben Lynch says Intel's competition
will have to deal with the No. 1 chipmaker's improved performance as
its top-line has slowed.

"A key element of Intel's continued competitive advantage has been
ongoing leadership in semiconductor process technology," said Lynch.
"We expect the company's relative scale and aggressive investment
budget to allow Intel to retain its technology competitive advantage
over most, if not all, of its peers."

<end quote>

Intel can and will spend extravagantly on process, because it
understands that leadership in process technology is essential to
staying king of the hill. IBM has incredible depth of talent in
process, but insufficient volume on its own to stay there. AMD has to
go out of house for process, and if it really wants to compete with
Intel, there has to be some place for it to go that can compete
with Intel. Cooperation in depth between AMD and IBM seems like a
natural for both of them.
Windfalls are always nice. I doubt that the figure was arrived at by such
cynical means.
IBM microelectronics has been losing money. It just got "merged" with
the systems division, although the heads of both divisions are somehow
supposed to keep their jobs and their titles (?). The pressure on IBM
microelectronics to generate revenue has been intense. So intense, I
was speculating, that it is not inconceivable that it took mere money
in return for something of incredible value. The old IBM would have
laughed in your face had you proposed such a deal and told you not to
let the door hit you on the ass on your way out.

If IBM _didn't_ get more than just money out of such an opportunity,
it was a grotesque mistake on their part. As it is, I think IBM did
get more than just money out of the deal, but neither AMD nor IBM is
going to be forthcoming as to just what.
"Hostile takeover" of whom? The fact that the collaboration has been so
successful for both AMD and IBM gives the lie to your third option IMO. As
for IBM being overpriced, when you have the kind of IP portfolio they own,
you're in the catbird seat. The industry, Intel excepted as far as is known,
is beating a path to their door.

That being the case, what you get by building a one-year chart with
IBM, INTC, and AMD at www.bloomberg.com suggests that whoever is
running IBM is doing a lousy job of realizing value for stockholders.

I don't follow capital markets closely enough to know who does these
things, or how, these days, but if IBM is sitting in the catbird seat and
the best it can do in the equity markets with its current management
is what it has been doing, then it sounds like leveraged buyout time
to me.

RM
 
Dale Pontius

The article most certainly does suggest that an analyst read things that
way. Whether Otellini meant that is another matter - Itanium for the
desktop does not "fit" either - an Iteleron??<shrug>
Maybe you've hit the nail on the head.

For the enlightened, the "Celeron" moniker is usually an insult, so
maybe Intel wants to "fit" their X86-64 into the Itanium line under
some sort of "Celeron-X" badge. Of course they wouldn't want to admit
it's an insult. If they really have to adopt X86-64, I wouldn't be at
all surprised to see some sort of spin applied that could power a
major city. (or preferably, a Space Elevator)

Dale Pontius
 
Yousuf Khan

David Schwartz said:
There are a huge number of reasons, but I assure you none of them
involve creating a processor on which the massive amount of existing 64-bit
software can execute.

Here's some more information that seems to have come out as of today about
this:

http://news.com.com/2100-1006-5150336.html

Apparently CT is part of a new two-letter acronym system at Intel. HT is
Hyperthreading, VT is Vanderpool, and LT is LaGrande. The article somehow
manages to forget to define what the CT means exactly. Why not YT?

Apparently Intel will be showcasing this CT at its IDF in Feb.

I'm now convinced that the recent announcement by HP of wanting to adopt
"Opteron-like" chips in the future is somehow related to all of these other
rumours. HP never said it wanted Opteron itself, just Opteron-like. Either
HP went to Intel and said that they better come up with a 64-bit x86 fast,
because we can't keep ignoring our customers. Or Intel said to HP, we're
going to make an announcement about a 64-bit x86 soon, so you can go ahead
and announce something to your customers about your desire to implement it
now.

Yousuf Khan
 
David Schwartz

Here's some more information that seems to have come out as of today about
this:

http://news.com.com/2100-1006-5150336.html


Interesting. Intel knows these announcements will hurt the Itanium
platform. Perhaps Intel is giving up on Itanium, leaving just Xeon to
compete with the Opteron. In that case, this announcement makes perfect
sense. Anyone on the fence between Xeon and Opteron is now more likely to
either wait or choose the Xeon.

I think more than anything, these types of announcements are calculated
to try to slow down the adoption of the Opteron. If there's going to be
something new "any day now", you don't want to take any radical steps today.

DS
 
Yousuf Khan

David Schwartz said:
Interesting. Intel knows these announcements will hurt the Itanium
platform. Perhaps Intel is giving up on Itanium, leaving just Xeon to
compete with the Opteron. In that case, this announcement makes perfect
sense. Anyone on the fence between Xeon and Opteron is now more likely to
either wait or choose the Xeon.

I think more than anything, these types of announcements are calculated
to try to slow down the adoption of the Opteron. If there's going to be
something new "any day now", you don't want to take any radical steps
today.

Maybe; hard to say if that will work, though. Opteron is already here now, and
CT won't be around for a few more quarters yet.

Yousuf Khan
 
George Macdonald

Oh dear. You have to be _really_ careful when posting. I took the
$46 million as fact and was pondering what might be going on at IBM.
I wanted to acknowledge that what _I_ was engaged in was pure
speculation.

And I see you're continuing in the same vein.:)

google

IBM AMD foundry "East Fishkill"

for a veritable cornucopia of speculation, including the possibility
that IBM has a stake in the Dresden facility. I can barely keep track
of it, and I don't know what the latest and most "authoritative"
speculation is.

I believe AMD is still constrained to some limited percentage (30%?) of x86
chips it can sub-contract to foundries. The interesting Q here, of course, is
whether x86-64 qualifies - the x86-32 part I'm sure would. As you note it *is*
complicated, but it was my impression that maybe IBM was "participating" in
the 2nd Dresden fab - I'd not heard anything about the current one.
For a version of my exact logic, plucked from the sea of speculation
produced by the search proposed above, see

http://www.internetnews.com/infra/article.php/1566591

<begin quote>

To compete with an 800-pound gorilla like Intel (Quote, Chart), you
have to be a 800-pound gorilla - or at least join forces with one.

That's the thinking at Advanced Micro Devices (AMD) (Quote, Chart),
which Wednesday said it is teaming up with IBM (Quote, Chart) to
jointly develop chip-making technologies for use in future
high-performance products.

<snip>

IBM said it is more than happy to lend a hand in making chips with
AMD. Big Blue's microprocessor division has also had a bone to pick
with Intel with the advent of Xeon and 64-bit Itanium processors
making bigger and bigger waves in high-end systems.

<snip>.

All of the saber rattling is an attempt to get the attention of Intel,
which is focused on three major process transitions - 130nm
lithography, 300mm wafers and copper interconnects.

This is where I have trouble with such reports - speculation and
prognostication are transformed into bare, hard facts.
Deutsche Bank Securities analyst Ben Lynch says Intel's competition
will have to deal with the No. 1 chipmaker's improved performance as
its top-line has slowed.

"A key element of Intel's continued competitive advantage has been
ongoing leadership in semiconductor process technology," said Lynch.
"We expect the company's relative scale and aggressive investment
budget to allow Intel to retain its technology competitive advantage
over most, if not all, of its peers."

<end quote>

Intel can and will spend extravagantly on process, because it
understands that leadership in process technology is essential to
staying king of the hill. IBM has incredible depth of talent in
process, but insufficient volume on its own to stay there. AMD has to
go out of house for process, and if it really wants to compete with
Intel, there has to be some place for it to go that can compete
with Intel. Cooperation in depth between AMD and IBM seems like a
natural for both of them.

Hmmm, so is Ben Lynch an Intel cheerleader or basher? It seems to count
for more in the world of uhh, anal...ysts than in the real world. To say that
IBM has to go "out of house" when they are the premier (sole?) IP holder
for leading-edge processes like Cu/SOI seems err, incongruous. They have
foundry agreements with others like Chartered, but some of the motivation
there is tactical and strategic... not to discount volume completely.
IBM microelectronics has been losing money. It just got "merged" with
the systems division, although the heads of both divisions are somehow
supposed to keep their jobs and their titles (?). The pressure on IBM
microelectronics to generate revenue has been intense. So intense, I
was speculating, that it is not inconceivable that it took mere money
in return for something of incredible value. The old IBM would have
laughed in your face had you proposed such a deal and told you not to
let the door hit you on the ass on your way out.

If IBM _didn't_ get more than just money out of such an opportunity,
it was a grotesque mistake on their part. As it is, I think IBM did
get more than just money out of the deal, but neither AMD nor IBM is
going to be forthcoming as to just what.

There are, no doubt, many intangibles involved... such as getting to
practise/prove their process skills on another ambitious implementation
like the Opteron. IBM has also shown itself more willing than some others
to trade IP through specific cross-licensing - not sure how things stand
with broad cross-licensing these days.

Then again, if the division which sells Xeon servers were not so keen to bend
over for a couple of million (talk about small potatoes) of Intel ad seed
money, the microelectronics division might have a better chance of a big
score. Obviously the time is now as ripe as it ever will be, but they'd
rather play the courtesan role for nickels & dimes.

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
Goose

http://news.com.com/2100-1006-5150336.html

Apparently CT is part of a new two-letter acronym system at Intel. HT is
Hyperthreading, VT is Vanderpool, and LT is LaGrande. The article somehow
manages to forget to define what the CT means exactly. Why not YT?

Concede-to-amd Technology?
Caught-with-our-pants-down Technology?
Catch-up Technology?
Cough-amd-cough Technology?
Change-course Technology?
....
 
Keith R. Williams

Concede-to-amd Technology?
Caught-with-our-pants-down Technology?
Catch-up Technology?
Cough-amd-cough Technology?
Change-course Technology?
...

Can Too?
 
CJT

Goose said:
Concede-to-amd Technology?
Caught-with-our-pants-down Technology?
Catch-up Technology?
Cough-amd-cough Technology?
Change-course Technology?
...
Copy-amd Technology?
Cut-itanium-off-at-the-knees Technology?
Close-encounters-of-the-amd-kind Technology?

<g>
 
geno_cyber

In a literal reading yes you can make that point. However, the significance
of this statement seems to require a little bit more than a literal reading.
It requires a political reading. If Otellini were just referring to any old
Windows 64-bit software, then he would have been referring to Itanium, but
he never mentioned Itanium. I've never seen an Intel executive miss an
opportunity to promote Itanium, IA64, or whatever when referring to 64-bit
software.

IA64 core inside desktop P4 CPUs would be the best bet for Intel to go 64bit.
x86-64 from Intel is the worst thing they could have done, ever. Not only will Intel look weak
following the AMD-proposed 64-bit pseudo/hybrid route of extending the ancient x86 architecture, but
it would mean that IA64 might become a dead project, not because its architecture is worse but
because customers are retarded enough to promote an x86-64 hybrid instead.
If Intel had followed customers' wishes with the P4, it would have reverted to the Pentium III
architecture, since the first P4 releases had lower IPC and overall performance... and that would
have been a huge mistake for Intel, one which would have let AMD conquer the market.

Only if Intel reveals a hidden IA64 core inside IA32 sooner than expected (there were unofficial
roadmaps floating around the 'net some months ago showing IA64 going to the desktop by 2006/2007
with dual-core CPUs... so a hybrid IA32/IA64 architecture was probably on Intel's schedule long ago,
to let the desktop market move to 64-bit...) will it have done the right thing; otherwise it will be
a big mistake in the next few years, unless Intel plans to trash the x86-64 hybrid anyway and move
to IA64. But if it's going to be vice versa, then x86-64 will be the worst thing Intel could have
done to itself.

The IA64 EPIC architecture is much better than any hybrid x86-64 thing, and it will be in the long
run. It's time to stop the x86 extensions: x86 is an old architecture, something new was needed and
is still needed, and IA64 is still the answer, even if it's taking engineers/designers/coders too
much time to debug and improve it. x86 won't last forever, and it's already not an efficient
architecture even though it's heavily used.
 
Yousuf Khan

IA64 core inside desktop P4 CPUs would be the best bet for Intel to go
64bit.

There was never any serious proposal to do something like that; the IA32 and
IA64 groups are separate within Intel. Besides, integrating an IA64 core
would have made them woefully inefficient, doing neither IA64 nor IA32 right,
and at the same time making the processor much more expensive. Worst
of all worlds. Of course, if there were one company that could survive a
worst-of-all-worlds scenario, it would be Intel.
x86-64 from Intel is the worst thing they could have done, ever. Not only will Intel look weak
following the AMD-proposed 64-bit pseudo/hybrid route of extending the ancient x86 architecture, but
it would mean that IA64 might become a dead project, not because its architecture is worse but
because customers are retarded enough to promote an x86-64 hybrid instead.

Yes definitely, IA64 is now basically dead. Thank goodness.
Only if Intel reveals a hidden IA64 core inside IA32 sooner than expected (there were unofficial
roadmaps floating around the 'net some months ago showing IA64 going to the desktop by 2006/2007
with dual-core CPUs... so a hybrid IA32/IA64 architecture was probably on Intel's schedule long ago,
to let the desktop market move to 64-bit...)

I don't think there were ever any roadmaps for dual-core IA32/IA64
processors. There were roadmaps in the early days suggesting that IA64 would
become the desktop standard, but it would've achieved IA32 compatibility
only through the brain-dead internal x86 emulator, not with a separate IA32
core.
The IA64 EPIC architecture is much better than any hybrid x86-64 thing, and it will be in the long
run. It's time to stop the x86 extensions: x86 is an old architecture, something new was needed and
is still needed, and IA64 is still the answer, even if it's taking engineers/designers/coders too
much time to debug and improve it. x86 won't last forever, and it's already not an efficient
architecture even though it's heavily used.

IA64 was a silly architecture, all along. Most people saw that, and finally
Intel had to admit to it.

Yousuf Khan
 
Tony Hill

IA64 core inside desktop P4 CPUs would be the best bet for Intel to go 64bit.

IA64 performs like crap unless you feed it a LOT of cache and memory
bandwidth. You can't easily just stick that core inside a desktop
chip at 90nm and get remotely decent performance. A high-end Xeon
where a 300-400mm^2 die is acceptable, maybe, but not on a P4. Not at
130nm and not even at 90nm, though it might start getting practical at
65nm production. Still it would put Intel at a MAJOR cost
disadvantage as compared to their competition.
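
(A rough back-of-envelope illustration of that scaling argument, under the idealized assumption that die area shrinks with the square of the feature size: a ~400mm^2 design at 130nm would come in around 400 x (90/130)^2 ≈ 190mm^2 at 90nm, and around 400 x (65/130)^2 ≈ 100mm^2 at 65nm - roughly desktop-die territory. Real-world shrinks fall short of the ideal, so treat those as order-of-magnitude numbers only.)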
x86-64 from Intel is the worst thing they could have done, ever. Not only will Intel look weak
following the AMD-proposed 64-bit pseudo/hybrid route of extending the ancient x86 architecture, but
it would mean that IA64 might become a dead project, not because its architecture is worse but
because customers are retarded enough to promote an x86-64 hybrid instead.

Ok, now you're obviously just trolling. Please tell me WHAT is wrong
with the AMD64 instruction set - some concrete problems with it that
are really hurting performance and for which solutions aren't either
already implemented or in the works.
If Intel had followed customers' wishes with the P4, it would have reverted to the Pentium III
architecture, since the first P4 releases had lower IPC and overall performance... and that would
have been a huge mistake for Intel, one which would have let AMD conquer the market.

Customers want the processor that runs their software the best at the
lowest cost. The P4 quickly became faster and cheaper than the PIII,
even if the first revision or two weren't much of an improvement.

The problem with Itanium is, of course, that it doesn't run customers'
software. Having a great whiz-bang piece of hardware is completely
useless if it doesn't run your software. The software is what
matters, and it's there for IA32, not for IA64.
Only if Intel reveals a hidden IA64 core inside IA32 sooner than expected (there were unofficial
roadmaps floating around the 'net some months ago showing IA64 going to the desktop by 2006/2007
with dual-core CPUs

The roadmaps from a few months ago show IA64 hitting the desktop in
2006/2007. The roadmaps from a couple years ago show IA64 hitting the
desktop in 2004/2005. The roadmaps from a few years before that show
IA64 hitting the desktop in 2002/2003. The date is consistently
getting pushed back, not pushed forward.
... so a hybrid IA32/IA64 architecture was probably on Intel's schedule long
ago, to let the desktop market move to 64-bit...)

Yes, it was; that's why the Itanium2 is a hybrid IA32/IA64 chip. But
guess what, its performance stinks in IA32 mode! So much so that
they mostly do software translation instead, because the performance
that way is better.
The IA64 EPIC architecture is much better than any hybrid x86-64 thing,

Please give me a GOOD reason why IA64 is better than AMD64 - a reason
sufficient for me to give up my installed base of dozens of software
applications to move to a new, expensive and poorly supported
platform.
and it will be in the long run. It's time to stop the x86 extensions: x86 is an old architecture,
something new was needed and is still needed

WHY is it needed? Please list some problems you see with AMD64. Keep
in mind that with AMD64 you now have 16 GPRs (which are true GPRs
now, not the pseudo-GPRs of IA32) and about a hundred rename
registers. You've got a flat-register-based FPU with 16 registers of
128 bits apiece. If Intel implements an AMD64 chip they will
presumably make use of their trace cache, thus pretty much eliminating
any problems with tricky x86 decoders.

So what problems are left with x86 that are not already fixed or being
fixed in the near future?
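
As an illustrative aside (not something from the thread, and purely a sketch): one practical payoff of those extra registers shows up in how compilers handle ordinary C. The toy function below is hypothetical; built with "gcc -O2 -m32" it receives all four arguments on the stack under the classic IA32 calling convention and the compiler has to juggle a handful of GPRs, while built with "gcc -O2 -m64" under the SysV AMD64 ABI the arguments arrive in registers (rdi, rsi, rdx and xmm0), and r8-r15 plus xmm8-xmm15 give the register allocator far more room before it has to spill to memory.

#include <stddef.h>

/* Toy example only: a small hot loop of the kind that benefits from the
 * extra AMD64 integer and SSE registers once the compiler unrolls it. */
double weighted_sum(const double *values, const long *weights,
                    size_t n, double scale)
{
    double total = 0.0;
    size_t i;

    for (i = 0; i < n; i++)
        total += values[i] * (double)weights[i];

    return total * scale;
}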
 
Robert Myers

And I see you're continuing in the same vein.:)

Pathetic, isn't it? And I don't even get paid to write this garbage.

Not to worry. As soon as I finish my MBA at <wildly overrated local
college for morons>, I'll be joining a large consulting firm here, and
all the practice I've had writing useless garbage will begin to pay
off. :).

RM
 
Yousuf Khan

Tony Hill said:
WHY is it needed? Please list some problems you see with AMD64. Keep
in mind that with AMD64 you now have 16 GPRs (which are true GPRs
now, not the pseudo-GPRs of IA32) and about a hundred rename
registers. You've got a flat-register-based FPU with 16 registers of
128 bits apiece. If Intel implements an AMD64 chip they will
presumably make use of their trace cache, thus pretty much eliminating
any problems with tricky x86 decoders.

So what problems are left with x86 that are not already fixed or being
fixed in the near future?

The AMD64 extensions were a long-overdue tidying up of the x86 instruction
set. However, even with a non-tidy instruction set like x86-32, it was
still doing its job just fine.

Yousuf Khan
 
Jan Panteltje

Concede-to-amd Technology?
Caught-with-our-pants-down Technology?
Catch-up Technology?
Cough-amd-cough Technology?
Change-course Technology?
LOL
Crisis Technology?
 
