What does '64 bit' mean? Lame question, but hear me out :)


keith

Yeah, it's become common usage to refer to 16 bits as a "word", but
originally the "word size" of a CPU meant the width of its data and/or
address registers. The terminology kind of ossified in the 16-bit
days, hence the usage of "word" == 16 bits has stuck...

Only in the x86 world. In the world of 'z's and PPCs a "word" is still
32 bits.
 

George Macdonald

Well, actually the whole idea of DLLs is outdated in .NET, isn't it? The
idea of .NET was to create a framework that is independent of
architecture (albeit mostly limited to Microsoft operating systems). So
a program, once compiled, doesn't care if it's on a 32-bit processor or a
64-bit one, or even care whether it's running on an x86-compatible processor
for that matter. There is no dependence on bittedness or instruction set.

Huh? They call that "compiled" nowadays?
 

George Macdonald

Only in the x86 world. In the world of 'z's and PPCs a "word" is still
32 bits.

How much is this "16-bit word" definition due to M$'s pollution of the
computer vocabulary?... not sure how things stand in the Unix world at
present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
words over the years that I've worked with. I've always thought of the
word size as the integer register width.
 

Kai Harrekilde-Petersen

George Macdonald said:
How much is this "16-bit word" definition due to M$'s pollution of the
computer vocabulary?...

I don't think we can blame this one on Microsoft; if my memory serves
me right, Intel defined the 'word' as a 16-bit unit for the
assembler.
not sure how things stand in the Unix world at
present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
words over the years that I've worked with. I've always thought of the
word size as the integer register width.

Yeah, but originally the 8086/8088 was a 16 bit CPU. The 80386
extended that to 32 bit (EAX and friends), and now there are 64 bit
versions as well. IMHO keeping the definition of a "word" fixed
regardless of the implementation of the architecture is the Right
Thing(tm) - otherwise a lot of programs would crash when recompiled
for 32/64 bit machines.

Regards,


Kai
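As a rough illustration of the convention Kai describes, the sketch below
keeps "word" pinned at 16 bits no matter how wide the build target gets; the
WORD/DWORD/QWORD names follow the x86/Windows usage, and the <stdint.h>
typedefs are only there so the example compiles anywhere:

    /* Hedged sketch of the fixed "word" convention on x86, not code from
       the thread. WORD stays 16 bits on 16-, 32- and 64-bit targets. */
    #include <stdint.h>
    #include <stdio.h>

    typedef uint16_t WORD;   /* "word"        = 16 bits, always */
    typedef uint32_t DWORD;  /* "double word" = 32 bits         */
    typedef uint64_t QWORD;  /* "quad word"   = 64 bits         */

    int main(void)
    {
        /* Code that assumes sizeof(WORD) == 2 keeps working when it is
           rebuilt for a wider machine - which is the point being made. */
        printf("WORD=%zu DWORD=%zu QWORD=%zu bytes\n",
               sizeof(WORD), sizeof(DWORD), sizeof(QWORD));
        return 0;
    }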
 

George Macdonald

I don't think we can blame this one on Microsoft; if my memory serves
me right, Intel defined the 'word' as a 16-bit unit for the
assembler.

Intel was not the first to build a computer with an extended
instruction/addressing/register set with some legacy backwards
compatibility.
Yeah, but originally the 8086/8088 was a 16 bit CPU. The 80386
extended that to 32 bit (EAX and friends), and now there are 64 bit
versions as well. IMHO keeping the definition of a "word" fixed
regardless of the implementation of the architecture is the Right
Thing(tm) - otherwise a lot of programs would crash when recompiled
for 32/64 bit machines.

"Implementation of the architecture" is the key here though and viewing all
the different x86s as a single entity is a gross error from my POV. For
the 80386, you simply needed a different compiler and linker from what was
used for 8088/86... just as you need a different compiler for AMD64/EM64T.
The fact that the instruction set sytax and mnemonics is familiar is
irrelevant - they are all really different computers.
 

Bob Niland

George Macdonald said:
... we've had computers with 16, 24, 32, 36, 60, 64 bit
words over the years that I've worked with.

12- and 18-bit too, as I recall. And I worked with an ISA
whose direct address space was 19 bits.
I've always thought of the word size as the integer
register width.

Works for me, but ..

The world has generally agreed that a "byte" is 8 bits,
although not always, historically.

My impression is that "word" has never had an agreed
meaning beyond the pages of any particular ISA's manuals.
It's less meaningful than an audio amplifier "watt" was
back in the heady days before the FTC stepped in (not
that they actually fully resolved the matter).

Customer: "What does '64-bit' mean?"
Marketing Dude: "What would you like it to mean?"
 

keith

Huh? They call that "compiled" nowadays?

Sure "they" do. Haven't you heard of a Java "compiler". DotNet is their
answer after being smacked shitless in court for trying to jijack Java.
 

keith

That language is at least as old as Pascal, isn't it? One spoke of
compiling to p-code... no?

Not all Pascal compilers output P-code. Borland captured the market with
a real compiler and a workable development platform for *cheap*.
 

keith

How much is this "16-bit word" definition due to M$'s pollution of the
computer vocabulary?... not sure how things stand in the Unix world at
present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
words over the years that I've worked with. I've always thought of the
word size as the integer register width.

That's the classical definition (as I've noted earlier in this thread).
I'm sure you've missed a bunch too. The fact is that anyone
assuming any results from size_of(word) is simply asking for a rude
awakening.
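That "rude awakening" is easy to demonstrate. The hedged sketch below prints
what one particular compiler happens to think the basic C types are, then
uses the <stdint.h> fixed-width types that are the safe choice whenever the
exact width matters; the printed numbers are, of course, platform-dependent:

    /* Illustration only: nothing portable follows from assuming what a
       "word" (or int, or long) is on the machine at hand. */
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        printf("short=%zu int=%zu long=%zu long long=%zu void*=%zu\n",
               sizeof(short), sizeof(int), sizeof(long),
               sizeof(long long), sizeof(void *));

        /* LP64 Unix, LLP64 Windows and 32-bit targets all answer
           differently; when the width matters, say it explicitly. */
        uint16_t u16 = 0xFFFF;
        uint64_t u64 = UINT64_MAX;
        printf("uint16_t=%zu uint64_t=%zu bytes\n", sizeof u16, sizeof u64);
        return 0;
    }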
 

Yousuf Khan

George said:
Huh? They call that "compiled" nowadays?

Well, it's compiled into a byte-code of some sort, just not machine
code. It's just like Java, only Microsoft-oriented.

Yousuf Khan
 

Yousuf Khan

George said:
How much is this "16-bit word" definition due to M$'s pollution of the
computer vocabulary?... not sure how things stand in the Unix world at
present... but yes we've had computers with 16, 24, 32, 36, 60, 64 bit
words over the years that I've worked with. I've always thought of the
word size as the integer register width.

Well, we've got the bits, the nibbles, the bytes, the words, etc. The first
three are completely standardized values (remember the nibble? It's
4 bits, in case you don't). Everything after the word is nebulous,
but thank god they didn't decide to create a new bit-size term
based around human language, like the clause or the sentence! We already
have the paragraph and the page, and that's more than enough.

BTW, in the Unix world, these days they always preface /word/ with an
actual bit-size description, such as "32-bit word" or "64-bit word".

Yousuf Khan

 

George Macdonald

Well, it's compiled into a byte-code of some sort, just not machine
code. It's just like Java, only Microsoft-oriented.

It's just not real code and its source is not real software. :) This
abuse of blurring the difference is going too far. What's the point of
faster and faster processors if they just get burdened with more and more
indirection? Neither Java, nor any other language, *has* to produce
interpretive object code.

Such languages have their place and reasons for use -- from security to
laziness, or just toy applications -- but to suggest that DLLs, which
already have the burden of symbolic runtime linkage, are now "outdated" is
scary.
 

George Macdonald

That language is at least as old as Pascal, isn't it? One spoke of
compiling to p-code... no?

Pseudo-code and interpretive execution go back much further than Pascal -
many proprietary languages existed as such. I've worked on a couple of
"compilers" which produced interpretive code myself, and even the end user
knew the importance of the difference - IOW if they wanted to do real work,
then a p-code Pascal was the wrong choice... same with Basic. I guess I'm
objecting more to the notion that it can replace real machine code... i.e.
"whole idea of DLLs is outdated".
 

George Macdonald

Well, we've got the bits, the nibbles, the bytes, the words, etc. The first
three are completely standardized values (remember the nibble? It's
4 bits, in case you don't). Everything after the word is nebulous,
but thank god they didn't decide to create a new bit-size term
based around human language, like the clause or the sentence! We already
have the paragraph and the page, and that's more than enough.

There was also the dibit, which I've never been sure how to pronounce:) and
the "movement" to use octet instead of byte seems to be gaining strength,
especially in Europe (French revisionism?:))... remembering that the
first computers I used had 6-bit bytes. I don't recall what Univac called
their 9-bit field... "quarter-word"??
BTW, in the Unix world, these days they always preface /word/ with an
actual bit-size description, such as "32-bit word" or "64-bit word".

Which is how it should be... but I'd hope it doesn't use "word" for a 16-bit
field on, say, an Athlon64. ;-)

As I recall, IBM introduced the concept of a variable-sized word with the
System/360s, but they have always been considered to have a 32-bit word size
- that's the size of the integer registers and the most efficient working
unit of integer data.
 

keith

Well, we've got the bits, the nibbles, the bytes, the words, etc. The first
three are completely standardized values (remember the nibble? It's
4 bits, in case you don't).

Actually it's spelled "nybble". ;-) "Byte" does *not* mean 8 bits.
It's the size of a character. Just because character = 8 bits for all
machines we care to remember doesn't change the meaning of "byte". The
correct term for a general eight-bit entity is "octet".
 

Robert Myers

Pseudo-code and interpretive execution go back much further than Pascal -
many proprietary languages existed as such. I've worked on a couple of
"compilers" which produced interpretive code myself, and even the end user
knew the importance of the difference - IOW if they wanted to do real work,
then a p-code Pascal was the wrong choice... same with Basic. I guess I'm
objecting more to the notion that it can replace real machine code... i.e.
"whole idea of DLLs is outdated".

"The whole idea of DLLs is outdated" sounds really attractive. It's
also a train that's been coming down the track for a long time, if
it's the same idea as virtualized architecture.

I wouldn't include tokenized Basic source, but I guess there's a good
bit of old mainframe code running on a virtual machine. Anybody
venture a guess as to how much?

I've kind of lost track of the .NET thing. It's better than Java, I
gather, and there is an open source version, mono, which is attractive
enough for open source types to work under the proprietary gunsight of
Microsoft.

Big-endian, little-endian, 64-bit, 32-bit. Yuk. Bring on the virtual
machines.

Except for us number-crunching types, I guess, but more and more number
crunching takes place in an interpreted environment like Matlab,
anyway.

RM
 

keith

"The whole idea of DLLs is outdated" sounds really attractive. It's
also a train that's been coming down the track for a long time, if
it's the same idea as virtualized architecture.

Well, that's one way of getting rid of DLL-Hell.
I wouldn't include tokenized Basic source, but I guess there's a good
bit of old mainframe code running on a virtual machine. Anybody
venture a guess as to how much?

All of it? ...and not only the "old" stuff. Mainframes have been
virtualized for decades. ...though perhaps in a slightly different
meaning of "virtualized".

Looking at it another way, I'd propose that most modern processors
are virtualized, including x86. The P4/Athlon (and many before) don't
execute the x86 ISA natively; rather, they "interpret" it to a RISCish
processor underneath.
I've kind of lost track of the .NET thing. It's better than Java, I
gather, and there is an open source version, mono, which is attractive
enough for open source types to work under the proprietary gunsight of
Microsoft.

I don't see it as "better" in any meaning of the word. Java's purpose in
life is to divorce the application from the processor and OS. I can't
see how .net is "better" at this. If platform independence isn't wanted,
why would anyone use Java?
Big-endian, little-endian, 64-bit, 32-bit. Yuk. Bring on the virtual
machines.

They are. You still have to decide on a data format.
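A small sketch of that "data format" point: even with everything virtualized,
bytes that leave a process need an agreed order. Below, a 32-bit value is
packed big-endian by hand so the result is the same on any host (the
put_be32 helper is just an illustrative name, not an existing API):

    /* Illustration: pick a wire byte order and stick to it, regardless
       of what the host CPU does natively. */
    #include <stdint.h>
    #include <stdio.h>

    static void put_be32(uint8_t out[4], uint32_t v)
    {
        out[0] = (uint8_t)(v >> 24);
        out[1] = (uint8_t)(v >> 16);
        out[2] = (uint8_t)(v >> 8);
        out[3] = (uint8_t)v;
    }

    int main(void)
    {
        uint32_t v = 0x11223344u;
        uint8_t wire[4];
        put_be32(wire, v);

        const uint8_t *p = (const uint8_t *)&v;  /* native in-memory layout */
        printf("first byte in memory: 0x%02X (0x44 on a little-endian host)\n",
               p[0]);
        printf("wire bytes: %02X %02X %02X %02X\n",
               wire[0], wire[1], wire[2], wire[3]);
        return 0;
    }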
 

GSV Three Minds in a Can

Bitstring <[email protected]>, from the wonderful
person keith said:
That's the classical definition (as I've noted earlier in this thread).
I'm sure you've missed a bunch too.

ISTR some PDPs (7s? 15s?) had 12-bit words. Atlas/Titan mainframes were 48,
again IIRC... it's a heck of a long time ago. [No, please don't kick the
Mercury delay-line memory tank.... Arrghhh.]
The fact is that anyone
assuming any results from size_of(word) is simply asking for a rude
awakening.

Indeed. Even sizeof(char) was not guaranteed on all machines. We
remember 5-track Flexowriters too. 8>.
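For what it's worth, modern ISO C pins part of this down: sizeof(char) is 1
by definition, but the number of bits in that unit is CHAR_BIT from
<limits.h> - 8 on anything mainstream, though DSPs with 16- or 32-bit chars
do exist. A couple of lines show both:

    /* Illustration: the char is the unit; CHAR_BIT says how wide it is. */
    #include <limits.h>
    #include <stdio.h>

    int main(void)
    {
        printf("sizeof(char) = %zu, CHAR_BIT = %d\n", sizeof(char), CHAR_BIT);
        return 0;
    }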
 
