Yukon and .NET 2.0 Wishlist

Guest

You can still set your fantasy coordinate in the first position if you
want to refer to it as an ITEM in a collection; if not, you can say
"hey, my first coordinate is (x, y) from the ORIGIN" - it's still the
first bloomin' coordinate.
 
Jon Skeet [C# MVP]

> Since we are at the HIGH level, why should we care about the
> implementation at the compiler level and runtime? That's why we have
> those, so WE DON'T HAVE TO CARE.

Do you really think you don't have to care what the lowest value of a
byte is?

My point is that a lot of things in life are 0-based. Lists may not be,
but not everything in life is a list, and a lot of computer science is
based on maths which is more naturally 0-based - so choosing a 0-based
system for the whole of computing was not a mistake, IMO.
 
Guest

Sure, that can stay that way for interop, but at the high level, let the
compiler take care of it; that's why we invent tools: to save us the hassle.
 
Jon Skeet [C# MVP]

> Sure, that can stay that way for interop, but at the high level, let the
> compiler take care of it; that's why we invent tools: to save us the hassle.

... and add hassle for people who are still writing things to do with
computer science (or anything else which is naturally 0-based).
 
malcolm

Hello Jon, you made some very interesting points; my thoughts are below:

Jon Skeet said:
> But I don't agree with the idea that it *is* wrong. As I've said
> before, using a base of zero makes various things much easier. I gave
> an example before of mapping (x, y) to a single index. That is exactly
> what happens with multi-dimensional arrays. It's just a nice
> mathematical fact that 0+x=x and 0*x=0. It's not useful having 1*x=x
> for arrays, whereas the previous two properties *are* useful.

Good points - you're probably right in this case.
> Would you also suggest that byte values should go from 1-256 instead of
> 0-255 as well? (Actually, you never answered my point about null value
> types in the first post you made, which suggested that you believed a
> byte should have 257 possible values instead of 256.)

Yes, this would be foolish (although SQL Server does exactly that for
all its data types). If I had to really implement this I would
implement some sort of low-level mapping hash table (probably on the
heap and stack for ref and val types). I didn't give the implementation
much thought.
> Computing is based more on maths than on shopping lists. Ask a
> mathematician where the origin is and he'll tell you it's at (0,0) (or
> whatever, depending on the dimensions) - not (1,1).

As someone from a strong mathematical background, I would honestly say
this argument doesn't hold. For example, simplify the Cartesian
coordinate system by only having an x-axis. In your example, x=0
signifies the starting point. Well, this intuitively describes an array
with "nothing" in it. When x=1, it abstractly describes an array
with "one" element, etc. In your example, an array with one
element starts at x=0; well, where does an array with nothing in it
lie? At x=-1? Please do tell.
> If we're talking about mistakes that are far too late to remedy, it
> would probably be more accurate to say that mankind makes the mistake
> of being 1-based in its counting.

This statement you made shows me that the seeds of this "zero-based"
propaganda have far-reaching effects in your psyche. All mankind (that
I know of) uses numbers to count things (and of course for other things,
such as math, etc.). The number "one" represents, for example, one
apple. The number "two" represents, for example, two stamps. You
basically just told me that humankind got it wrong and zero should
represent "one apple".

Here's where I find your statement very interesting, though. I would
argue that mankind could have chosen either counting system. We could
have chosen to start with 1, or zero, or probably even 12! What's
important is the consistency of the mathematics and the formulas that
have been so carefully constructed over the past 2000 years. For the
same reason, there is no natural radix (base) in mathematics.
Base 10 is really a human convenience. Anyway, my point is that it's
arbitrary and that decision has already been made. My original post
wasn't arguing that humans got the number wrong in math and science
(as there is no philosophically correct way); it was that humans got
the number wrong in computers. If humans had chosen to start at 12,
then I would argue that computers should start at 12 (not 11).

-Dave
 
Jon Skeet [C# MVP]

> Yes, this would be foolish (although SQL Server does exactly that for
> all its data types). If I had to really implement this I would
> implement some sort of low-level mapping hash table (probably on the
> heap and stack for ref and val types). I didn't give the implementation
> much thought.

But issues which come up in the lowest-level implementations repeat
themselves. For instance, say you're trying to display a grid of
elements. The rectangle for element (x, y) is
(x*boxWidth, y*boxHeight) + (origin) in a 0-based system. If everything
is 1-based, you need:

((x-1)*boxWidth+1, (y-1)*boxHeight+1) + (origin) - and what is the
natural origin in a 1-based system?

That kind of issue (its general "shape") will come up at all levels -
from memory organisation, to drawing, to paging through data. Why allow
only those dealing at the lowest level to benefit from this
mathematical property?
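
To make the shape of that concrete, here is a minimal C# sketch of both
calculations (the box sizes are arbitrary, chosen purely for illustration):

using System;

class GridDemo
{
    const int BoxWidth = 20, BoxHeight = 10;

    static void Main()
    {
        // 0-based: the top-left corner of cell (x, y) is a plain
        // scale: (x * BoxWidth, y * BoxHeight), plus the origin.
        for (int y = 0; y < 2; y++)
            for (int x = 0; x < 2; x++)
                Console.WriteLine("0-based ({0},{1}) -> ({2},{3})",
                    x, y, x * BoxWidth, y * BoxHeight);

        // 1-based: every coordinate needs a -1 correction before
        // scaling (plus a +1 if the pixel grid itself is 1-based, as
        // in the formula above), and the natural origin is no longer
        // obvious.
        for (int y = 1; y <= 2; y++)
            for (int x = 1; x <= 2; x++)
                Console.WriteLine("1-based ({0},{1}) -> ({2},{3})",
                    x, y, (x - 1) * BoxWidth + 1, (y - 1) * BoxHeight + 1);
    }
}
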
> As someone from a strong mathematical background, I would honestly say
> this argument doesn't hold. For example, simplify the Cartesian
> coordinate system by only having an x-axis. In your example, x=0
> signifies the starting point. Well, this intuitively describes an array
> with "nothing" in it. When x=1, it abstractly describes an array
> with "one" element, etc. In your example, an array with one
> element starts at x=0; well, where does an array with nothing in it
> lie? At x=-1? Please do tell.

Any array has a range of [0, length). If the length is 0, that means
the range is [0,0) - which is an empty range.
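
In code, that half-open convention means the empty array needs no special
case at all - the standard loop simply executes zero times. A small sketch:

using System;

class EmptyRangeDemo
{
    static void PrintAll(int[] array)
    {
        // Valid indices form the half-open range [0, array.Length).
        // When Length == 0 that range is [0, 0), so the body never runs.
        for (int i = 0; i < array.Length; i++)
            Console.WriteLine(array[i]);
    }

    static void Main()
    {
        PrintAll(new int[] { 10, 20, 30 }); // three iterations: 0, 1, 2
        PrintAll(new int[0]);               // zero iterations: no special case
    }
}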

A good real-world example of this is a tape measure. Where is the first
centimetre on a tape measure? It starts at 0 and finishes at, but does
not include, the line for 1. A tape measure starting at 1 would be very
odd, surely - you'd have to subtract one from the end of whatever you
were measuring in order to get the actual length! This is what I mean
by 0 being a more natural starting point than 1.
> This statement you made shows me that the seeds of this "zero-based"
> propaganda have far-reaching effects in your psyche. All mankind (that
> I know of) uses numbers to count things (and of course for other things,
> such as math, etc.). The number "one" represents, for example, one
> apple. The number "two" represents, for example, two stamps. You
> basically just told me that humankind got it wrong and zero should
> represent "one apple".

Nope. I'm saying that if we started counting at zero rather than one,
starting with "no apples" before "one apple", then things would be more
consistent - and surely "no apples" is a more reasonable base state
anyway, isn't it? If you don't start with zero, what happens when you
take away the one and only apple?
> Here's where I find your statement very interesting, though. I would
> argue that mankind could have chosen either counting system. We could
> have chosen to start with 1, or zero, or probably even 12! What's
> important is the consistency of the mathematics and the formulas that
> have been so carefully constructed over the past 2000 years. For the
> same reason, there is no natural radix (base) in mathematics.

That kind of base is an entirely different matter. While a number
itself is the same in all bases, that doesn't mean that *starting*
counting at any point is equivalent. 0 has special properties which no
other numbers have, whatever base you express them in.
> Base 10 is really a human convenience. Anyway, my point is that it's
> arbitrary and that decision has already been made. My original post
> wasn't arguing that humans got the number wrong in math and science
> (as there is no philosophically correct way); it was that humans got
> the number wrong in computers. If humans had chosen to start at 12,
> then I would argue that computers should start at 12 (not 11).

Computers should start at 0 for the reasons I've given before - it's
not arbitrary, 0 has unique properties. It doesn't depend, IMO, on
where humans start counting - although to start counting at anything
other than 0 or 1 would be *very* strange. Again, this isn't the same
as what base a number is expressed in.
 
Arnold the Aardvark

> Again though, you're implying that because *some* real world examples
> are naturally 1-based, that *all* are, and I just don't believe that's
> true. See my example of mapping two dimensions to one, for instance,
> where 0-based arrays make things much simpler than 1-based, just
> because being able to add an array index to another (and multiply them
> etc.) works better in a 0-based system.

Indeed. Neither system is better overall. It's like asking which is the
best day to consider as the start of the week. Is it when you start work,
or start to play, the Sabbath, after the Sabbath, whose Sabbath? It's
mostly a matter of what you are used to or what suits your circumstances.

In Pascal you can set the range yourself when you declare the array,
so it can start at 0, 1, -123, or Blue (assuming Blue is a member of
an ordinal type). Now everyone is happy!

Better still, access array elements through iterators, which is much
nicer than the abstraction of an array index:

// Sorry, I don't know much C#.
#include <vector>

std::vector<TMyClass> MyVec;
// ... populate MyVec ...

for (std::vector<TMyClass>::iterator I = MyVec.begin();
     I != MyVec.end(); ++I)
{
    I->SomeMethod();
}
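
For comparison, a rough C# sketch of the same idea (TMyClass and
SomeMethod are just the placeholder names from the C++ snippet, not a
real API):

using System;
using System.Collections.Generic;

// Stand-in for the TMyClass placeholder in the C++ version.
class TMyClass
{
    public void SomeMethod() { Console.WriteLine("called"); }
}

class IterationDemo
{
    static void Main()
    {
        List<TMyClass> myVec = new List<TMyClass>();
        myVec.Add(new TMyClass());

        // foreach drives the enumerator for you - no index at all,
        // so no index base to argue about.
        foreach (TMyClass item in myVec)
        {
            item.SomeMethod();
        }
    }
}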

Personally I have a C/C++ background and prefer 0-based arrays.


Arnold the Aardvark
 
Guest

That's how they SHOULD have done it in C#, in my view: make the start
user-definable, then it's flexible for everybody however they want to
represent their data.

With this method they can still implement this change in later versions
while at the same time not breaking earlier versions; we can but hope to
have user-definable start boundaries for arrays.
 
Jon Skeet [C# MVP]

> That's how they SHOULD have done it in C#, in my view: make the start
> user-definable, then it's flexible for everybody however they want to
> represent their data.

However, if there's no "normal" convention used for all non-specialist
uses, you need to consult the documentation carefully every time a 3rd
party library takes or returns an array, just to find out its base. No
thanks.
> With this method they can still implement this change in later versions
> while at the same time not breaking earlier versions; we can but hope to
> have user-definable start boundaries for arrays.

You already can, if you're prepared to use the .NET Array type rather
than just C# arrays, which are more restrictive. If you cast the Array
to IList, you can even use an indexer on it to make it look like normal
array access.
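
A small sketch of what that looks like (the length and values are made
up purely for illustration):

using System;
using System.Collections;

class OneBasedArrayDemo
{
    static void Main()
    {
        // A one-dimensional int array with 10 elements whose first
        // index is 1 rather than 0.
        Array oneBased = Array.CreateInstance(
            typeof(int), new int[] { 10 }, new int[] { 1 });

        Console.WriteLine(oneBased.GetLowerBound(0)); // 1
        Console.WriteLine(oneBased.GetUpperBound(0)); // 10

        oneBased.SetValue(42, 1);                // the "first" element
        Console.WriteLine(oneBased.GetValue(1)); // 42

        // As described above, the non-generic IList indexer goes
        // through GetValue, so it honours the lower bound.
        IList asList = oneBased;
        Console.WriteLine(asList[1]); // 42
    }
}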
 
Guest

Yes, because consulting documentation for API calls you are making is a
BAD thing if you don't know how to call them.
 
Jon Skeet [C# MVP]

> Yes, because consulting documentation for API calls you are making is a
> BAD thing if you don't know how to call them.

Obviously you should consult documentation - but generally, after
calling something a few times you know how to use it. If you have to
*also* remember what base the array returned is, you're likely to have
to consult the documentation *every* time you use it. (This is similar
to my argument about why I'm glad the assignment operator isn't
overloadable in .NET.)
 
Arnold the Aardvark

> That's how they SHOULD have done it in C#, in my view: make the start
> user-definable, then it's flexible for everybody however they want to
> represent their data.
>
> With this method they can still implement this change in later versions
> while at the same time not breaking earlier versions; we can but hope to
> have user-definable start boundaries for arrays.

Well... maybe. I should have added that the efficiency of the underlying
implementation should be of concern to any serious developer. I suspect
that zero-based arrays are much simpler to implement. But I'm an old
dinosaur, I guess - who cares about efficiency when you have a zillion
MB and a CPU going at a zillion MHz? Seems like nobody... :-(


Arnold the Aardvark
 
