The death of non-x86 is now at hand?


jack

: http://www.theinquirer.net/?article=14038

Oh Yousuf, THANK you for that link. This is all just too funny. I mean,
tooooo funny. Allow me, please, to quote the third paragraph:

[Start quote]
In 1981 IBM announced the 5150 PC. It was the machine set to
revolutionise business computing but it had a major design flaw. For
some inexplicable reason IBM chose to use the 8088 processor in that
first PC. It was a choice that bordered on the bizarre. The 8088 was the
bottom of the line of the 8086 series of processors from Intel and most
experts agreed it was one of the worst processor designs on the market.
Its memory management has been described as "brain damaged" and register
allocation for data was like a game of Russian roulette.
[End quote]

I feel like I'm having a major case of deja vu! Way back in 1983, I was
working at Zilog in Cupertino, California, and I remember the head of
the Engineering department saying almost exactly the same thing (can't
remember his name). I mean, this is almost word-for-word (the part
about "brain damaged" memory management and register allocation =
Russian roulette)! The reason I remember it is that I was SO struck
by his comments (he was ex-Intel), and I couldn't believe he actually
confided in me, a fresh out-of-college puke.

Man, what a trip down memory lane to read this article. As I said,
tooooooo funny!

Best regards,

Jack

--
 

Yousuf Khan

Carlo Razzeto said:
"Yousuf Khan" <[email protected]> wrote in message

Interesting article. I honestly don't think they're too far off
base... I wouldn't be surprised to see the vast majority of diversity
in CPU architecture disappear over the next few years.

I can't fault them for any flaws in logic either. It makes sense that x86
descendants will take over the world, especially as they get expanded and
cleaned up through the natural evolutionary process. They've also taken the
time to explain what the remaining advantages of the proprietary
architectures over x86 were, and how those are now mostly disappearing too.

I think one of the main driving influences behind trying to prevent x86 from
taking over was that Intel would have too much control over the standard.
But as has now been demonstrated, alternative companies like AMD can also
drive standards in the x86 field, so there is room for evolution without
being locked into a single vendor.

If Intel and AMD and the rest of the x86 field are smart, they will set up a
consortium or a committee to drive x86 development, much like Sparc
International does for Sparc, MIPS International does for MIPS, or Arm
Holdings does for ARM. The time is right to turn x86 from a de facto standard
into a true de jure standard.

Yousuf Khan
 

Yousuf Khan

George Macdonald said:
So now we have Sun, post-Ed (Zander that is). This could kill them off....
or?? I guess they could always buy up Gateway.<guffaw>

What do you mean? I thought Sun is claiming Opteron to be their saviour?

Yousuf Khan
 

Yousuf Khan

jack said:
I feel like I'm having a major case of deja vu! Way back in 1983, I was
working at Zilog in Cupertino, California, and I remember the head of
the Engineering department saying almost exactly the same thing (can't
remember his name). I mean, this is almost word-for-word (the part
about "brain damaged" memory management and register allocation =
Russian roulette)! The reason I remember it is that I was SO struck
by his comments (he was ex-Intel), and I couldn't believe he actually
confided in me, a fresh out-of-college puke.

Man, what a trip down memory lane to read this article. As I said,
tooooooo funny!

Well, I'm sure everyone was saying the same things about the 8086 memory
management scheme back then; it was a common sentiment.

However, Intel did eventually make the segment mechanism genuinely useful
when it introduced the Protected Mode of operation.

That said, here's another quote from the article:

<quote>
Now it might seem that 8086 series had nothing going for it at all. Here was
a 16bit processor that was little more than a kludged up 8bit processor and
so bad that almost nobody loved it. But that turned out to be an advantage.
Where programmers on competing processors were happy to use assembly
language, getting anywhere with an 8086 meant a decent compiler was
essential. Compiler technology came on in leaps and bounds.
</quote>

I don't know if I agree with this. I don't think there were necessarily any
more assembly language programmers for other architectures than there were
for the 8086 at the time. I think the general transition towards compilers
was ongoing anyway, whether x86 spurred it or not. In fact, I'd hazard a
guess that there were more x86 assembly programmers than for any other
architecture, simply because of the sheer volume of x86 hardware sold.

Yousuf Khan
 

Rob Stow

Yousuf said:
What do you mean? I thought Sun is claiming Opteron to be their saviour?

No. Sun is merely seeing 10K Opteron server sales per quarter -
and growing - and they have decided that they want to be part of
that market. Sun is a big enough company that even if they had had
*all* of that market it wouldn't have saved them from bleeding
red ink.
 

Yousuf Khan

Rob Stow said:
No. Sun is merely seeing 10K Opteron server sales per quarter -
and growing - and they have decided that they want to be part of
that market. Sun is a big enough company that even if they had had
*all* of that market it wouldn't have saved them from bleeding
red ink.

I don't know, the real money is in selling support contracts to customers,
which is an ongoing revenue stream. The initial cost of the servers isn't
where the profits are.

Yousuf Khan
 

Robert Redelmeier

In comp.sys.ibm.pc.hardware.chips Yousuf Khan said:
I can't fault them for any flaws in logic either. It makes sense that x86
descendants will take over the world, especially as they get expanded and
cleaned up through the natural evolutionary process. They've also taken the
time to explain what the remaining advantages of the proprietary
architectures over x86 were, and how those are now mostly disappearing too.

And AMD's x86-64 is almost certainly the way forward for
64-bit. Not because it's better than Alpha (it isn't)
or Itanium (it is), but because it's cheaper.

64-bit just isn't so compelling that it _has_ to go forward.
No killer 64-bit apps. 32-bit from 16 was a h3ll of a lot
more compelling, and it still took a long time, at least
from Microsoft.

So the kludgy extension of x86 to 64 bits will dominate, because
it will give cheap and powerful 64-bit capabilities on the
back of mostly 32-bit needs.

-- Robert
 

Yousuf Khan

Mike Tomlinson said:
"At least there will be less fundamental change in the industry a decade
from now, the x86-128 will be much less of an upset."

Aaaaarrrghhh! -- already talking of x86-128..

It's probably not going to be necessary until 2050 though.

Somebody pointed out that if you apply that highly convenient and highly
overused Moore's Law, where you take whatever computer metric is convenient
and double it every year-and-a-half, then every additional bit on top of
32-bit is another doubling of memory: 33 bits addresses double what 32 bits
does, 34 bits double what 33 does, and so on. So going from 32-bit to 64-bit
is 32 doublings, or about 50 years, before the full 64-bit address space is
completely used up.
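
Just to put rough numbers on that (my own back-of-envelope figures, not from
the article), here's a trivial C snippet assuming one doubling every 18
months:

#include <stdio.h>

int main(void)
{
    /* One extra address bit per Moore's Law doubling, assuming
       (as above) one doubling every year-and-a-half. */
    int doublings = 64 - 32;         /* bits to grow from 32-bit to 64-bit */
    double years = doublings * 1.5;  /* 18 months per doubling */
    printf("%d doublings x 1.5 years = %.0f years\n", doublings, years);
    return 0;
}

That works out to roughly 48 years, which is where the "about 50 years" and
circa-2050 figures come from.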

Yousuf Khan
 

Jan Panteltje

It's probably not going to be necessary until 2050 though.

Somebody pointed out that if you apply that highly convenient and highly
overused Moore's Law, where you take whatever computer metric is convenient
and double it every year-and-a-half, then every additional bit on top of
32-bit is another doubling of memory: 33 bits addresses double what 32 bits
does, 34 bits double what 33 does, and so on. So going from 32-bit to 64-bit
is 32 doublings, or about 50 years, before the full 64-bit address space is
completely used up.

Yousuf Khan
LOL, so we go 65-bit :)
 

David Schwartz

It's probably not going to be necessary until 2050 though.
Somebody pointed out that if you apply that highly convenient and highly
overused Moore's Law, where you take whatever computer metric is convenient
and double it every year-and-a-half, then every additional bit on top of
32-bit is another doubling of memory: 33 bits addresses double what 32 bits
does, 34 bits double what 33 does, and so on. So going from 32-bit to 64-bit
is 32 doublings, or about 50 years, before the full 64-bit address space is
completely used up.


This is one way to look at it, but there's another. If
you're trying to perform certain types of computations, say multiplying two
large numbers, a 32-bit processor will be able to do it faster than a 16-bit
processor at the same instruction rate.

If there's no other way to make processors keep getting faster, adding
more bits will allow them to perform at least some types of computations
more rapidly. If we have to go to 128 bits to do this, we will. And that
could happen long before 2050.

However, I don't think this really applies, because the percentage of
typical computations that benefit from the extra bits is very small. Few
computations fit in 8 bits, so going to 16 bits helped almost everything.
Many computations didn't fit in 16 bits, so going to 32 bits helped a lot.
Few computations don't fit in 32 bits, so going to 64 bits won't speed up
computation in general by very much (10%?). Almost everything fits in 64
bits, so going to 128 bits for computational speed will likely be a
non-starter.

Possibly notable exceptions include encryption.
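
To make the narrow-versus-wide multiply point concrete, here's a small C
sketch of my own (not from the article): the same 32x32-bit product done the
way a machine limited to 16-bit limbs has to do it, versus a single native
wide multiply.

#include <stdint.h>
#include <stdio.h>

/* Multiply two 32-bit values into a 64-bit product using only 16-bit
   limbs, roughly the way a narrower processor has to do it. */
static uint64_t mul32_via_16bit_limbs(uint32_t a, uint32_t b)
{
    uint32_t a_lo = a & 0xFFFF, a_hi = a >> 16;
    uint32_t b_lo = b & 0xFFFF, b_hi = b >> 16;

    uint64_t lo  = (uint64_t)a_lo * b_lo;                         /* 2^0 term  */
    uint64_t mid = (uint64_t)a_lo * b_hi + (uint64_t)a_hi * b_lo; /* 2^16 term */
    uint64_t hi  = (uint64_t)a_hi * b_hi;                         /* 2^32 term */

    return lo + (mid << 16) + (hi << 32);
}

int main(void)
{
    uint32_t a = 0xDEADBEEF, b = 0x12345678;

    uint64_t narrow = mul32_via_16bit_limbs(a, b);  /* four multiplies + adds */
    uint64_t wide   = (uint64_t)a * b;              /* one wide multiply      */

    printf("narrow = %llx, wide = %llx\n",
           (unsigned long long)narrow, (unsigned long long)wide);
    return 0;
}

Same answer either way, but the narrow version needs four multiplies plus
shifts and adds. That's the speed argument for wider registers; whether it
matters for typical code is exactly the objection above.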

DS
 

Yousuf Khan

David Schwartz said:
This is one way to look at it, but there's another way to look at it. If
you're trying to perform certain types of computations, say multiply two
large numbers, a 32-bit processor will be able to do it faster than a 16-bit
processor at the same instruction rate.

If there's no other way to make processors keep getting faster, adding
more bits will allow them to perform at least some types of computations
more rapidly. If we have to go to 128 bits to do this, we will. And that
could happen long before 2050.

I don't think there is much of an outcry for very large integer calculations
yet. Most of the call for 64-bit is about large address calculations. In
fact, in AMD64's Long Mode, the default address size is 64-bit, but the
default operand size is 32-bit. That means by default you'll be using
32-bit registers unless you explicitly ask for 64-bit registers when doing
your integer calculations.
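
There's a loose software-level parallel to this (my own illustration,
assuming a typical LP64 compiler on AMD64, such as GCC on Linux): plain int
stays 32 bits and only long/pointer types use the full 64-bit registers, so
ordinary integer code keeps running on 32-bit operations by default.

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int      counter = 42;        /* 32-bit by default: compiled to 32-bit ops   */
    int64_t  big     = 42;        /* explicitly 64-bit: needs the wide registers */
    void    *where   = &counter;  /* pointers are 64-bit, matching the 64-bit
                                     default address size in long mode           */

    printf("int: %zu bytes, int64_t: %zu bytes, pointer: %zu bytes\n",
           sizeof counter, sizeof big, sizeof where);
    return 0;
}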

As a matter of fact, I think if you're going to be doing calculations beyond
32-bit integers, most people would rather switch over to floating point
at that point.

Yousuf Khan
 

Guest

Yousuf> If Intel and AMD and the rest of the x86 field are smart,
Yousuf> they will set up a consortium or a committee to drive x86
Yousuf> development, much like Sparc International does for Sparc,
Yousuf> MIPS International does for MIPS, or Arm Holdings does for
Yousuf> ARM. The time is right to turn x86 from a de facto standard
Yousuf> into a true de jure standard.

That will be tough to pull off. We can all hope.

Alan
 

Carlo Razzeto

Yousuf Khan said:
If Intel and AMD and the rest of the x86 field are smart, they will set up a
consortium or a committee to drive x86 development, much like Sparc
International does for Sparc, MIPS International does for MIPS, or Arm
Holdings does for ARM. The time is right to turn x86 from a de facto standard
into a true de jure standard.

Yousuf Khan


I would love to see this happen too... Unfortunately, for the many
things Intel is, it is also a very greedy corporation. Let's not
forget that one of the big driving motivations behind IA-64 was so Intel
could once again have (almost) complete ownership of a CPU ISA, with no
third-party licences, and then force that down on the average Joe so
they could take another step towards becoming a true CPU monopoly.

Carlo
 

George Macdonald

What do you mean? I thought Sun is claiming Opteron to be their saviour?

Sorry, I should have made that clearer - the "or??" was meant to suggest the
alternative. I can see it going either way, but surely it reduces the
perceived value of their hardware offerings... so can they pull in new
customers??

Rgds, George Macdonald

"Just because they're paranoid doesn't mean you're not psychotic" - Who, me??
 
