Peter said:
I am well-informed on the general subject of computer language design.
> The ideal computer language design involves reducing programmer effort
> without reducing program performance.
I think this definition of 'ideal' is faulty. There are many ideals in
computer languages: imperative languages are not ideal for querying
(that's why we have SQL), nor for pattern matching (that's why we have
regular expressions), etc.
"Programmer effort" can only be measured in terms of solving some
problem. Some languages are better at some classes of problems than
others. There is no single ideal.
Also, your statement precludes any reduction in programmer effort if it
reduces program performance. By that logic, we'd all be programming in
assembler. Programming languages are abstractions: they hide details
behind a conceptual framework. In fact, they are abstraction-building
abstractions, and different languages are suited to building different
kinds of abstraction: functional languages for functional abstractions,
object-oriented languages for object abstractions, logic languages for
relational abstractions, and so on.
> Since my suggestion reduces programmer effort, AND increases program
> performance, it is therefore an optimal improvement to the current
> design.
It also increases language complexity, which can increase programmer
effort.
I'm not strongly opposed to your suggestion at all, BTW - just want to
make that absolutely clear. I do think I'd hardly ever use it, though:
it's usually better to work with reference types, and to reserve value
types for value-oriented abstractions such as complex numbers, Point,
Matrix, int, decimal - that kind of thing.
In fact, I'd be more strongly in favour of explicitly immutable value
types, letting the CLR figure out whether a 'const &' calling convention
could be applied, because I think that would more closely reflect the
situations where value types are useful (in my experience).
-- Barry