fred said:
. . . I only draw from it that I was right in the first
place. The normal chalkboard, or academic math convention has been
deviated from. . . .
True.
. . . Some have given reasons for it like "since it's digital and it
can't hold a reciprocal it has to be that way" or something along those
lines. No one seems to really have any good reason.
More subtle: some people consider -3 unambiguously a single token
representing the negation of 3, "negative three." From that they
generalize that -x unambiguously denotes the negation of x. Since
"negative three" raised to the second power is nine (+9), they figure
that -x^2 should work the same way, as (-x)^2.
It's not standard, but it's not unreasonable, and it can be used
consistently. That there's then no point to y = -x^2 as opposed to y = x^2
doesn't matter to these people.
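The difference between the two readings is easy to demonstrate in Python,
which follows the standard chalkboard convention (** binds tighter than
unary minus). The helper emulating Excel's reading is my own sketch for
illustration, not anything built into either product:

```python
# Standard convention, as in Python and FORTRAN: exponentiation binds
# tighter than unary minus, so -3**2 is parsed as -(3**2).
assert -3**2 == -9

# The Excel/COBOL reading treats -3 as a single token first,
# equivalent to parenthesizing the base:
assert (-3)**2 == 9

# A hypothetical helper mimicking Excel's '=-x^n' (the name and
# existence of this function are my assumption, for illustration only).
def excel_style_neg_pow(x, n):
    return (-x) ** n

assert excel_style_neg_pow(3, 2) == 9
```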
I'm sure it's as you say. The early coders just got it wrong and were
kind of stuck with it for legacy reasons. Sort of like the Y2K
situation a few years ago. Or the language implementers just didn't
feel it necessary to get the order right; it was enough to just explain
in the reference manuals 'their' way of doing things.
More complicated. FORTRAN got it right from the beginning. Not surprising
since it was written by mathematicians and engineers. COBOL seems to have
been the first language to get it wrong. It was designed by business people
and the sainted Grace Hopper (note: I don't use emoticons to denote
sarcasm). They seem not to have been much concerned with the standard math
convention.
Like it or not, COBOL has been around a very long time, and there's still a
very large code base written in it. Excel, like it or not, followed COBOL's
convention. There's a very large 'code base' written in Excel. There's a
perception that it'd cost too much money to fix that code, and when
confronted by cost, business people ask "is the change necessary?" The
answer to that, like it or not, is *no*.
From my own development experience it's obvious to me that the reason
many of these people are holding some position that "it's just fine",
"that's the way it is", "user emptor", is that they don't have a lot
of math experience. . . .
....
Wrong. They just have real world experience in which cost is usually the
sole deciding factor. It costs less to teach prophylactic use of parentheses
than to change/correct these languages and application development systems
and all the code that runs under them. Also, many have experience with
Microsoft, its software and its attitude towards their own design decisions.
THIS IS NOT A BUG. FOR GOOD OR ILL, IT WAS INTENTIONAL. When Microsoft does
something on purpose, it stays done. See if you can detect any contrition in
http://support.microsoft.com/default.aspx?scid=kb;en-us;132686
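The "prophylactic parentheses" fix is cheap precisely because a fully
parenthesized expression means the same thing under both conventions;
a minimal Python illustration (the same point holds for an Excel formula
written as =(-A1)^2 or =-(A1^2)):

```python
x = 3

# What standard math convention means by -x^2:
assert -(x**2) == -9

# What Excel computes for =-x^2:
assert (-x)**2 == 9

# Either parenthesized form is unambiguous regardless of which
# precedence convention the host language or spreadsheet uses.
```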
To repeat: bitching, whining and moaning about this won't change a darn
thing. You are free to spend as much time engaged in these activities as you
like, but they'll be unproductive. Many people have learned the hard way the
futility of complaining about what Microsoft does and has done. Apparently
many more will have to learn the same lesson the same way.
. . . It's somewhat understandable in a language, but Excel is more
of an always-available do-everything super-calculator, not what I'd
consider a language. It should conform.
....
Why would it be more important for Excel, an end-user product *intended* to
be used by nontechnical people most of whom give less than a rat's back end
what mathematicians think, to adhere more strictly to standard conventions?
Why would it be OK for COBOL, some SQL variants and a few scripting
languages to retain this nonstandard operator precedence? If it's
'necessary' for one, it should be necessary for the other. OTOH, if the
world can cope with those languages remaining as they are, it will keep
revolving about the sun even if Excel remains unchanged.
Here's a little truth. Most people don't remember the operator precedence
they were taught in school. When in doubt, they look it up or test it out.
Here's an awkward bit of speculation. Now that computers have become nearly
ubiquitous in OECD countries, the great untutored masses can calculate
on a daily basis. Perhaps this is a sign that most people either think the
existing convention isn't as useful as the Excel-like alternative or are
indifferent to the convention(s) employed in the software they use.
The world gets along just fine with driving on the left in some countries
and on the right in others. There's no obvious reason it can't also get
along with two broadly used operator precedence conventions as long as the
context governs the convention used. Current standard convention in
textbooks and articles, alternative used in 'business software'.
[In case you can't tell, I'm not being entirely serious in the last
paragraph, but other than a pathological need for rigid adherence to
orthodoxy, there's no necessary reason there must be one and only one
convention in use. It just means you need to check your convention before
using software. If there were one and only one notational form for inner
products, you all would have a stronger case for one and only one operator
precedence convention.]