Lew
Peter said:
Likewise, in Jon's reply to your "1+1=3" example, you are allowing your
prejudice about what that exact sequence of characters means in the real
world to affect your interpretation of what's going on in the computer.
While it's helpful for the compiler to be designed so that adding 1 to
itself results in 2, just as it does in the real world, some insane
compiler designer could in fact declare that his compiler will produce
3 when that operation is performed.
He could do so for any of a variety of reasons, including that one or
more of those symbols don't actually mean in his compiler what they mean
in the real world, or simply that he wants to be arbitrary. But
regardless, if he writes the compiler's specification stipulating that
that's what's supposed to happen, and the compiler does exactly that,
then the compiler is bug-free (at least with respect to that feature).
You may rightfully say that the actual design of the compiler is flawed
(or even "buggy"). But the software itself is operating exactly as it
was designed, and thus is NOT flawed or buggy. It's not a bug for that
compiler, with that design, to make 3 the result of adding 1 and 1 (or
whatever operation might actually be represented by the sequence of
characters "1+1").
A better example might be a system that produces "1 + 1 == 10". There is
clearly a domain of interpretation (binary output, for instance) for
which that is not "buggy" even in design, though there could also be
people who think that it is buggy, due to preconceived notions of what
"1 + 1" or "10" /should/ mean.
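
As a minimal sketch of that interpretation (the class name is invented
for illustration), a Java program that merely renders its result in
base 2 prints exactly that line, and is doing precisely what it was
designed to do:

public class BinaryOutputDemo {
    public static void main(String[] args) {
        int sum = 1 + 1;
        // Integer.toBinaryString(2) is "10", so under a base-2 reading
        // the statement "1 + 1 == 10" is simply true.
        System.out.println("1 + 1 == " + Integer.toBinaryString(sum));
    }
}

Whether that output counts as a bug depends on whether base-2 output is
what the specification calls for, not on what the characters suggest to
a reader who expects decimal.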