I think I should clarify my position, before I'm buried under a
mountain of flame.
I, personally, would prefer to use explicit type specifiers. I'm
old-school, and you're right: the code is immediately more readable.
However, I have no illusions: "var" will quickly become the de facto way
to declare local variables. It's too tempting. You have to be
particularly pedantic (and I can be, when I want to) to keep typing even
the first few characters of MyLongTypeNameType so that IntelliSense
offers you the right choice, rather than simply typing "var".
Yes, the resulting code will become less readable, but I don't think it
will become horribly less readable, for several reasons.
First, there is the IDE to help me. I can mouse over anything and it
will show me its type. I use that feature occasionally now, as
sometimes I'm in the middle of some code, far (20 lines or so) from the
declaration of some variable... how do I know what type it is (if I
even care, see below)? I don't go looking for the declaration; I just
mouse over the name in question.
Second, I find that I often don't even care what type it is. Sure, if
it's a primitive numeric type then it matters in my calculations what
type it is, but then I, like the compiler, can immediately tell that
var abc = 5; // is an integer, and
var def = 1.0m; // is a decimal
Yes, it's prettier and easier to read to have the "int" and "decimal"
keywords there, but their absence isn't going to make the code
unreadable for me. I think I'll be able to adapt.
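To make that concrete, the literal's form alone tells both me and the compiler the type. (A small sketch; the variable names are just for illustration, the suffixes are standard C#:)
var a = 5;      // int
var b = 5L;     // long
var c = 1.0;    // double
var d = 1.0f;   // float
var e = 1.0m;   // decimal
var f = "text"; // string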
For reference types, why do I care what type it is? I can't remember a
bug in which I used the wrong reference type in the wrong place. If I
understand the problem domain, the type is almost always obvious from
context, because I'm no longer concerned about semantic correctness at
that level: the compiler is more and more taking over that worry.
Again, I can't remember the last time I was reading code searching for
the type of something. I'm usually reading code trying to determine
where the implementation is flawed, or the design is bad.
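To sketch what I mean (Customer is a made-up class, purely for illustration), the right-hand side usually names the type anyway, so spelling it out again on the left buys me very little:
var customers = new List<Customer>();       // obviously a List<Customer>
var byId = new Dictionary<int, Customer>(); // obviously a Dictionary<int, Customer>
foreach (var customer in customers)         // obviously a Customer each time through
    Console.WriteLine(customer);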
I'm not saying that having that type name there is useless. It _does_
aid readability. I'm just not sure that it aids it so much that taking
it away would make my job of reading code all that much more difficult.
As I said, I think I can adapt. (It looks as though I'm going to have
to, anyway... I doubt I can convince my colleagues to forgo the use of
"var"
This leaves printed code. Here, I think that there could be a problem.
Without IntelliSense to help out, Daniel is right: on a printed page,
figuring out what type a method returns could be very difficult.
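For example (GetData is a made-up method name, just for illustration):
var result = GetData(); // on a printed page, nothing says what GetData returns
In the IDE I'd simply mouse over "result"; on paper I'd have to go hunting for the method's signature.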
However, I see no reason why Visual Studio couldn't let you "turn on"
some sort of view that shows the types alongside "var" declarations. I'd
rather see help like that built into the tool than try to stop people
from using shorthand, which they will want to do in droves.
I've seen this steady progression over the years: the IDEs and tools
take over more and more of the grunt work of programming, becoming, in
the process, indispensable. I don't really mind it. I see it as a
natural evolution of the programming craft, just so long as the
languages are unambiguous and the tools help you clarify what's going
on.
Will I continue to type
double x = 0.5;
rather than
var x = 0.5;
Probably. I'm that sort of person.
I won't fault people who use "var", though.