Hello Jon, you made some very interesting points; my thoughts are below:
Jon Skeet said:
But I don't agree with the idea that it *is* wrong. As I've said
before, using a base of zero makes various things much easier. I gave
an example before of mapping (x, y) to a single index. That is exactly
what happens with multi-dimensional arrays. It's just a nice
mathematical fact that 0+x=x and 0*x=0. It's not useful having 1*x=x
for arrays, whereas the previous two properties *are* useful.
Good points; you're probably right in this case.
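To make your mapping point concrete, here's a minimal C# sketch (my own
illustration, not code from either of our posts) of the usual row-major
formula index = y * width + x:

using System;

class ZeroBasedMapping
{
    static void Main()
    {
        const int width = 4, height = 3;
        int[] flat = new int[width * height];

        // With zero-based indices the mapping is simply index = y * width + x:
        // row 0 starts at 0 * width = 0, and the first element of any row
        // sits at y * width + 0 = y * width -- the two properties you mention.
        for (int y = 0; y < height; y++)
            for (int x = 0; x < width; x++)
                flat[y * width + x] = y * 10 + x;

        Console.WriteLine(flat[2 * width + 3]); // prints 23, the (x=3, y=2) element

        // A one-based scheme would need correction terms everywhere:
        // index = (y - 1) * width + (x - 1) + 1.
    }
}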
Jon Skeet said:
Would you also suggest that byte values should go from 1-256 instead of
0-255 as well? (Actually, you never answered my point about null value
types in the first post you made, which suggested that you believed a
byte should have 257 possible values instead of 256.)
Yes, that would be foolish (although SQL Server does exactly that for
all their data types). If I really had to implement this, I would
implement some sort of low-level mapping hash table (probably on the
heap and stack for reference and value types). I haven't given the
implementation much thought.
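For what it's worth, here's a rough sketch of the kind of mapping layer
I had in mind. The OneBasedArray name and the dictionary backing are
hypothetical, just enough to show the idea, not a design I'd seriously
propose:

using System;
using System.Collections.Generic;

// Hypothetical sketch: a one-based index layer backed by a hash table.
class OneBasedArray<T>
{
    private readonly Dictionary<int, T> map = new Dictionary<int, T>();
    public int Length { get; }

    public OneBasedArray(int length) => Length = length;

    public T this[int index]
    {
        get
        {
            CheckBounds(index);
            return map.TryGetValue(index, out T value) ? value : default(T);
        }
        set
        {
            CheckBounds(index);
            map[index] = value;
        }
    }

    // Valid indices run from 1 to Length inclusive.
    private void CheckBounds(int index)
    {
        if (index < 1 || index > Length)
            throw new IndexOutOfRangeException();
    }
}

class Demo
{
    static void Main()
    {
        var a = new OneBasedArray<string>(3);
        a[1] = "first";          // the first element lives at index 1
        Console.WriteLine(a[1]); // prints "first"
        // a[0] would throw: there is no index 0 in this scheme
    }
}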
Jon Skeet said:
Computing is based more on maths than on shopping lists. Ask a
mathematician where the origin is and he'll tell you it's at (0,0) (or
whatever, depending on the dimensions) - not (1,1).
As someone from a strong mathematical background, I would honestly say
this argument doesn't hold. For example, simplify the Cartesian
coordinate system down to just an x-axis. In your example, x=0
signifies the starting point. Well, that intuitively describes an array
with nothing in it. When x=1, it abstractly describes an array with
one element, and so on. In your example, an array with one element
starts at x=0, so where does an array with nothing in it lie? At x=-1?
Please do tell.
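To spell out the distinction I'm drawing (a small illustration of my
own, not something from your post): a length counts elements the way
people count apples, while an index merely labels a position inside the
array:

using System;

class CountsVersusIndices
{
    static void Main()
    {
        int[] nothing = new int[0]; // Length == 0: an array with nothing in it
        int[] one     = new int[1]; // Length == 1: an array with one element

        // Counts read the way humans count: zero elements, one element, ...
        Console.WriteLine(nothing.Length); // 0
        Console.WriteLine(one.Length);     // 1

        // Indices in C#, by contrast, start at 0: the single element of
        // 'one' lives at index 0, and 'nothing' has no valid index at all.
        Console.WriteLine(one[0]);
    }
}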
Jon Skeet said:
If we're talking about mistakes that are far too late to remedy, it
would probably be more accurate to say that mankind makes the mistake
of being 1-based in its counting.
This statement of yours shows me that the seeds of this "zero-based"
propaganda have far-reaching effects on your psyche. All of mankind
(that I know of) uses numbers to count things (and for other purposes,
of course, such as math). The number "one" represents, for example,
one apple. The number "two" represents, for example, two stamps. You
have basically just told me that humankind got it wrong and that zero
should represent "one apple".
Here's where I find your statement very interesting, though. I would
argue that mankind could have chosen either counting system. We could
have chosen to start with 1, or zero, or probably even 12! What's
important is the consistency of the mathematics and the formulas that
have been so carefully constructed over the past 2000 years. For the
same reason, there is no natural radix in mathematics; base 10 is
really a human convenience. Anyway, my point is that the choice is
arbitrary and that the decision has already been made. My original post
wasn't arguing that humans got the number wrong in math and science
(as there is no philosophically correct way); it was that humans got
the number wrong in computers. If humans had chosen to start at 12,
then I would argue that computers should start at 12 (not 11).
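As a quick aside to illustrate the radix point (again my own example,
nothing from the thread): the same quantity spells differently in every
base, which is exactly why no base is mathematically privileged. The
ToBase helper below is hypothetical, since Convert.ToString only
handles bases 2, 8, 10, and 16:

using System;
using System.Text;

class RadixIsConvention
{
    // Convert a non-negative value to its digit string in the given base.
    static string ToBase(int value, int radix)
    {
        const string digits = "0123456789AB"; // enough symbols for base 12
        if (value == 0) return "0";
        var sb = new StringBuilder();
        while (value > 0)
        {
            sb.Insert(0, digits[value % radix]);
            value /= radix;
        }
        return sb.ToString();
    }

    static void Main()
    {
        // One quantity, three spellings; only the human convention differs.
        Console.WriteLine(ToBase(144, 10)); // "144"
        Console.WriteLine(ToBase(144, 2));  // "10010000"
        Console.WriteLine(ToBase(144, 12)); // "100"
    }
}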
-Dave