Jon Skeet said:
Sure, it's possible to read VB.NET if you're really a C# programmer and
vice versa - but that doesn't make it easy to learn other things. If I
wanted a book to teach me German, I wouldn't buy one which is written
in French. Sure, I could probably get by, with a bit of work and the
help of a dictionary, but there'd be too many obstacles in my way.
Similarly, if I were to buy any more .NET books, I would choose ones
where the examples were in C# rather than VB.NET.
I'd question this analogy, Jon.
Natural languages, even of neighbors who can't stand each other like
the British and the French, carve the world up in different ways. They
have different semantics.
Therefore you cannot be certain that referents are the same. When the
Englishman refers to "the nation" he means something different from
the Frenchman's le pays. Why, even the shimmering moon refers to
something different, with different connotations inherited from
literature, from la lune.
But in .NET you can bet your sweet patootie that the things
underneath the words are the SAME, and have the same connotations, for
good or ill, to the .NET developer.
Was this not, big fella, the whole motivation behind .NET? Wasn't its
purpose to unite the warring tribes and bring upon us a world of
peace, love and understanding between users of different languages?
I concede my praxis derives from the relatively flatter object models
of VB object development. A current C# project in my day job is giving
me all sorts of kewl ideas for using more Inheritance, ideas which
scare Dan Appleman (no, Dan, I am NOT going to rewrite the compiler
and yes, Dan, Chapter 8 of 11 will be done this weekend). But, of
course, compilers have been written (whether generated using tools or
by hand) for years using delegation, and before that modules, and
before that One Big Main Routine that three people in the world
understood.
[The IBM 1401 compiler for FORTRAN ran in 63 phases on punched cards,
and executing it was like singing "99 Bottles of Beer on the Wall" at
summer camp. Two living people in the world understand it: one is
J. A. N. Lee of Virginia Tech, the author in 1969 of The Anatomy of a
Compiler, and the other might be me...still crazy after all these
years. But I digress.]
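The delegation style mentioned above, each compiler phase a small object that hands its output to the next, can be sketched in a few lines. This is an illustrative toy under invented names (the phase classes and the trivial "1 + 2 + 3" grammar are made up for the example), not anything from the book:

```python
# Toy compiler pipeline built by delegation: each phase is a small
# object, and the Pipeline delegates the work to each phase in turn.

class Lexer:
    def run(self, source):
        # Split "1 + 2 + 3" into a flat list of tokens.
        return source.split()

class Parser:
    def run(self, tokens):
        # Parse "n + n + ..." into a list of integers (a toy "AST").
        return [int(t) for t in tokens if t != "+"]

class CodeGen:
    def run(self, ast):
        # "Generate code": here, just evaluate the toy AST directly.
        return sum(ast)

class Pipeline:
    def __init__(self, *phases):
        self.phases = phases

    def compile(self, source):
        result = source
        for phase in self.phases:  # delegate to each phase in order
            result = phase.run(result)
        return result

compiler = Pipeline(Lexer(), Parser(), CodeGen())
print(compiler.compile("1 + 2 + 3"))  # prints 6
```

The point is that the pipeline is language-neutral: nothing in the delegation structure cares whether the phases are written in VB, C#, or anything else.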
The issue you fellows raise is genuine, but it is in part one of
perception: "Oh, that book uses VB for its examples; I had better put
it back on the shelf and go read Maxim." But I claim that my
deliberate choice of VB will be a more pervasive and all-embracing
message. It will say to the world, including VB programmers, that
compilers do not have to be written in any one language. It will
foreground the theory as applied, and not this or that language.
I am reminded in this connection of a Cambridge University Press book,
"Non-Numerical Programming in Fortran."
The title has, for a geek, a certain je ne sais quoi, a certain
frisson, and a surprise factor. This is because it carries high
information-theoretic content, being an unexpected message. You see on
the shelf book after book with expected titles, and this one sticks
out.
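The "surprise factor" can be made precise with Shannon's self-information, -log2(p): the less likely a message, the more bits of information it conveys when you actually see it. A minimal sketch, where the title probabilities are invented purely for illustration:

```python
import math

def surprisal_bits(p: float) -> float:
    """Shannon self-information of an event with probability p, in bits."""
    return -math.log2(p)

# Invented probabilities: how likely a given title is on the shelf.
expected_title = 0.5     # yet another "Learn X in 21 Days"
unexpected_title = 0.01  # "Non-Numerical Programming in Fortran"

print(surprisal_bits(expected_title))    # 1.0 bit
print(surprisal_bits(unexpected_title))  # ~6.64 bits
```

The unexpected title carries several times the information of the expected one, which is exactly why it sticks out on the shelf.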
"Hmm, let's see. GREAT HORNED TOAD it's a compiler book with examples
in Visual Basic. I must read further."
I used Non-Numerical Programming in Fortran to learn how to construct
a program that displayed numerical information with graphics
consisting of bar charts on the printer, and many other amusing
lessons. I'd say that the surprise factor, the je ne sais quoi, and
the great horned toad factor played a role in my use of the book.
Computing science is no more about computers than astronomy is about
telescopes, to quote the late hero computer scientist Edsger Dijkstra
(who passed away in August 2002). On the face of it this statement
sounds just wrong, like one of those marvelous but false European
ideas: zoos, Zeppelins, Gauloises, or the Schlieffen Plan. It sounds
like Kant's claim, at the beginning of the Groundwork of the
Metaphysics of Morals, that the only thing we can know to be good is a
good will, as if the road to hell were not paved with good intentions,
and all that rot.
But Dijkstra was right (as was Kant, whose thought influenced
Dijkstra, but that is another story...a can of worms). The bad
mistake is not being "abstract" and "academic"; it is confusing the
two levels, the concrete and the abstract.
For the same reason T. S. Eliot wrote "I gotta use words when I talk
to you," we have to use SOME "programming language" when we
communicate information about programming, even if it is pseudo-code
or ALGOL's old idea of a "publication language."
A common language would be the end of language (this may be why
Esperanto never got off the ground), and the reason, I am now
convinced by reading Derrida, is that manufacture of language is the
manufacture of difference. In computing, the symptom is the constant
way in which programs, even programs written in a shop's common
language, become different worlds. Different personalities alone don't
explain the phenomenon.
Derrida shows us how language always consists of a physical "trace",
whether the "trace" is ink on paper or bits on silicon, and our basic
delusion, which he shows confuses the hell out of us, is that there is
any exit from this situation. In computing science we dream of a
common language but the ultimate end of this goal is one software
program written by the government, and we resist this wakeup call for
the OBVIOUS reasons.
If VB still had its own runtime, it would be a poor choice, I think,
to use VB to write a compiler. But with respect to the "lifestyle
choice" of language, in .NET all the languages are "stars," and I
think we should "hang loose" about language choices.
Postscript: my draft of the book exercises iron self-control and,
except for occasional marginalia, is free of any philosophical
skylarking of the sort seen above.
Jon Skeet replied:
There are enough good .NET books on the market now that there's no
reason to go for one which puts a barrier in the way of learning, in
terms of the language the examples are written in.
Perhaps I should include an exercise in each chapter: "Rewrite the
examples in this chapter in C#."