How to use generics?


Christof Nordiek

I would, without hesitation. The only argument I've ever heard put forward
is, "Most people do it wrong, therefore we don't include it in our
language. Have you seen the Dreaded Diamond Diagram?".

This argument is so lame, and the feature so powerful, that I really
can't understand why there hasn't been a huge cry for it in both .NET and
Java.

Yes, that argument would be really lame (if it were the only one). If it were
the only reason, then there should also be no unsafe code in C#. A stronger
version of the argument is: it's difficult to use MI the "right" way.

A more important reason, I suppose, is that MI isn't easy (or is even
impossible) to implement in a way conformant with the OO concept of C# and
the CIL.
Some points:
What happens with base classes inherited in two ways? (While in C++ it is
possible to avoid this altogether, in C# it will happen in any
MI scenario, because all classes derive from object.)
Which constructor of such base classes should be called?
What happens if two direct base classes have different overloads of the
same method?
How should the fields be laid out? (This is only an implementation problem,
while the points above are also semantic problems.)

Maybe others could add more points to this.

Christof
 

Moty Michaely

Somehow I don't grasp one aspect of your statement: who are you (or
for that matter anyone else) to decide what usage of features is
called "abusive" and what usage is not? It's all in the eye of the
beholder; you say it yourself :


What might seem "bad" to you might seem "good" to someone else.
I've seen "abusive" usage - especially of MI and templates - in the
past that any "sane" programmer would call abusive, but when it was
investigated further the choice made perfect sense (and was often,
from that perspective, better than the straightforward approach).

Call me insane, but I would rather educate the people I hire in making sane
choices than remove the hammer from my toolbox. That's more a
discussion about "freedom" versus "self-protection" than a discussion
about "personal taste".

Cheers,
Stefan.

I agree with Stefan.

We can't avoid features just because developers don't know how to
develop.

Would you drop multithreading just because a developer can cause a
deadlock?

I think this approach is lame. It encourages the growth of unprofessional
developers and drives the professionals away from these languages...

Just as we know synchronization techniques and avoid dire consequences
when synchronizing, I think MI can be used cleverly. Developers who
abuse their code should be abused by their boss :)...

Moty
 

Peter Duniho

We can't avoid features just because developers don't know how to
develop.

Of course we can. Languages do it all the time. C# perhaps does it more
than other languages, but each language always puts some limitation on
what is allowed.
Would you drop multithreading just because a developer can cause a
deadlock?

That depends on how useful multi-threading is. Given that so much of .NET
is based on use of the thread pool and background worker threads, it would
be folly to toss out multi-threading. It's integral to the design of
.NET. But multiple inheritance is not.

I can't really comment on *why* C# is missing multiple inheritance, not
being privy to the decision-making that went into the design of C#.
However, I think that the approach C# has taken is pretty sensible. You
can still inherit multiple *interfaces*, which addresses all of the benefits
anyone might claim for multiple inheritance, while preventing some of the
basic "gotchas".

One of my biggest complaints with MI has been when I am trying to multiply
inherit objects that ultimately inherit from some very basic class, like
Object, that is intended to support fundamental object instance
management. In C#, you can have a single object that provides multiple
behaviors, but you don't have to worry about which of the
multiply-inherited classes will be used for the basic object foundation,
since there can be only one actual inherited base class.

As far as the complaint about "manually implementing it in 10 different
classes" goes, surely we all know how to use copy-and-paste here. It
should not be necessary to literally write 10 implementations of a shared
interface. You write the implementation once, contain an instance of that
implementation in each class you want to provide the interface, and then
wrap the implementation in stubs that can be copied and pasted
into each class inheriting the interface (and since the important stuff
actually lives in just one place, the usual big problem with copy-and-paste
doesn't apply...bugs still get fixed in just one place).

Is that more tedious than just inheriting the interface? Sure, I don't
disagree with that. But it really should not be a significant overhead.

And I definitely take issue with statements like "MI is like sight. If
you grew up with it, you miss it dearly. If you never had it, you cannot
imagine what it's like" and "if you write (proper) C++ for a couple of
years, half the code you will write will contain MI". Those statements
essentially dismiss, prejudicially, any argument not in favor of MI.

I have used multiple inheritance, I can imagine what it's like, and I am
happy to not have it in C#. Likewise, I've been writing C++ for much
longer than "a couple of years", and nowhere near "half the code" I write
has MI. In fact, I stopped using it altogether when I got sick and tired
of the headaches it caused (see above and Christof's post regarding base
classes being inherited multiple ways). The implication in the latter
quote above is that I haven't been writing "proper" C++ code; at best, the
statement is tautological, and at worst it's insulting to me and anyone
else who uses C++ without multiple inheritance.

Pete
 

Radek Cerny

I think you are missing the point. Code is simply an implementation of
design (and analysis), and if you analyse or model the problem space
(design-pattern recognition) with MI, you will be constrained by the
language.

Sadly, I grew up with MI and miss it dearly.

I learned to look for patterns that are orthogonal, create abstract base
classes for these patterns, and then combine them in appropriate ways. I
cannot do that any more in .NET. So I have to 'think' differently as an
analyst/designer.
 

Jon Skeet [C# MVP]

Radek Cerny said:
I think you are missing the point. Code is simply an implementation of
design (and analysis), and if you analyse or model the problem space
(design-pattern recognition) with MI, you will be constrained by the
language.

Sadly, I grew up with MI and miss it dearly.

So would a Ruby programmer who is used to dynamic behaviour be
justified in criticising C#'s static typing? I dare say they may miss
that behaviour dearly - but we can't (and shouldn't try to) include
*every* language's idioms into a single language.
I learned to look for patterns that are orthogonal, create abstract base
classes for these patterns, and then combine them in appropriate ways. I
cannot do that any more in .NET. So I have to 'think' differently as an
analyst/designer.

That shouldn't come as a surprise - the idiomatic way of doing things
is often different between different languages/platforms. Write code in
one language as if you were writing another and you'll probably get in
a mess.

(In this case, you can still look for those orthogonal patterns, but
use composition rather than inheritance, often.)
 

Radek Cerny

Jon,

sorry, but I still think you are missing the point. You are very
code-centric. Coding is just the simple mechanical translation of design
into machine-executable form. Coding (in the business application space) is
not clever. A business analyst/designer should be able to model the problem
space using clever things like design patterns. The world at large (IMO) is
multiply-inherited; that's how I see it - many orthogonal design patterns.
Code each one once and put it in the reusable library. Combine it with other
patterns as appropriate.

But not in .NET. "Over my dead body" I believe was Anders' final statement
several years ago.

Have you ever had the pleasure of developing in a graceful MI OO
environment? I have, and I consider MI like vision: if you grew up with it
and lose it, you miss it dearly; and if you never had it in the first place,
you don't know what the fuss is about.
 

Jon Skeet [C# MVP]

Radek Cerny said:
sorry, but I still think you are missing the point. You are very
code-centric. Coding is just the simple mechanical translation of design
into machine-executable form. Coding (in the business application space) is
not clever. A business analyst/designer should be able to model the problem
space using clever things like design patterns. The world at large (IMO) is
multiply-inherited; that's how I see it - many orthogonal design patterns.
Code each one once and put it in the reusable library. Combine it with other
patterns as appropriate.

I have a code-centric view because in my opinion if you don't take the
implementation platform into account when coming up with the design,
you often end up with code which isn't idiomatic and which is a pain to
maintain.

Would you really say that the target platform should have no impact on
design?

(I've also found that writing code by trying to model the real world as
exactly as possible is problematic. Work out what you need, and
design/model accordingly bearing in mind that the objects will only be
in a computer, and not in real life.)
But not in .NET. "Over my dead body" I believe was Anders' final statement
several years ago.

Have you ever had the pleasure of developing in a graceful MI OO
environment? I have, and I consider MI like vision: if you grew up with it
and lose it, you miss it dearly; and if you never had it in the first place,
you don't know what the fuss is about.

I've had the pain of having to try to debug multiply-inherited C++. I
know the C++ implementation of multiple inheritance isn't the best
around (although I wouldn't pretend to know the details). I know that
multiple inheritance adds to the complexity of not just any code that
uses it, but the language itself.

Again, if "missing it dearly" is enough of a criterion to justify a
feature's inclusion in a language, where are the simple dynamic
closures I love in Groovy? Where's duck typing? Where are optional
parameters? Where is pointer arithmetic? Where's the "with" statement?
Where are macros? All of these are dearly missed by some people.

(My personal bugbears are the problems with the switch statement and
the lack of decent, object-oriented enum support.)
 

Moty Michaely

On May 10, 8:43 pm, "Peter Duniho" <[email protected]>
wrote:

Dear Pete,
[...]

I have used multiple inheritance, I can imagine what it's like, and I am
happy to not have it in C#. Likewise, I've been writing C++ for much
longer than "a couple of years", and nowhere near "half the code" I write
has MI. In fact, I stopped using it altogether when I get sick and tired
of the headaches it caused (see above and Christof's post regarding base
classes being inherited multiple ways). The implication in the latter
quote above is that I haven't been writing "proper" C++ code; at best, the
statement is tautological, and at worst it's insulting to me and anyone
else who uses C++ without multiple inheritance.

I never said that those who are not using MI aren't writing proper
C++. What I meant was that the MI feature was dropped to help developers
avoid the headaches you mentioned.

Anyhow, I think that MI should be an idiom of C# as well.

That's my opinion anyway =).

Good day,
Moty
 

atlaste

I never said that those who are not using MI aren't writing proper
C++. What I meant was that the MI feature was dropped to help developers
avoid the headaches you mentioned.

[...]

Actually I have. And don't get me wrong, but I find it just as
insulting to say that having a preference for MI is wrong just because
some people get headaches using it. MI is more or less regarded as "bad
design" (by some people) because it's difficult for some developers to
maintain, thereby insulting the work I've done over the last I don't
know how many years. I get headaches because I have to wrap code in
interfaces when I know it could be solved by introducing an idiom, and
because I know that all those stupid interfaces and proxies that do
nothing besides being an interface or proxy are painful to maintain.

Statements like "over my dead body" from friend Anders himself only
contribute to that.

It should also be noted that the world is not this
black or white. People like Moty and me would like to see a solution
to a number of design/implementation problems. The tool that used to
solve these issues is MI. That's what's familiar, and therefore that's
what's preferred. Perhaps star-like constructions shouldn't be implemented at
all. Perhaps a new idiom is a better solution. I don't care how the
problems that arise from the lack of MI are solved; as a consumer I
merely want a proper solution.

Cheers,
Stefan.
 

Peter Duniho

I never said that those who are not using MI aren't writing proper
C++.

I never meant to imply that you did. Those quotes were from two other
posts, elsewhere in this thread. Sorry for any confusion.

Pete
 

Peter Duniho

Actually I have. And don't get me wrong, but I find it just as
insulting to say that having a preference for MI is wrong just because
some people get headaches using it.

While I can't speak for what Anders has or has not said, I haven't seen
any posts in this thread that make that accusation. There is a difference
between pointing out why a particular language may be missing a particular
feature, and saying that that particular feature is only used in code with
"bad design". The former has been said here, but the latter has not.

In any case, if you want to have a productive discussion, I don't think
that your goals are well-served by implying that those who prefer to avoid
MI are not writing "proper" code. Even if you have been insulted
elsewhere, I don't think you have been here, and "an eye for an eye" isn't
really going to help anyway.

Pete
 

atlaste

While I can't speak for what Anders has or has not said, I haven't seen
any posts in this thread that make that accusation. There is a difference
between pointing out why a particular language may be missing a particular
feature, and saying that that particular feature is only used in code with
"bad design". The former has been said here, but the latter has not.

In any case, if you want to have a productive discussion, I don't think
that your goals are well-served by implying that those who prefer to avoid
MI are not writing "proper" code. Even if you have been insulted
elsewhere, I don't think you have been here, and "an eye for an eye" isn't
really going to help anyway.

Pete

Okay, I think you got me wrong, so allow me to correct this. The point is
that it's all a matter of personal opinion. "Bad", "good", "proper"
are all words that say nothing if a context is missing. Coding
standards are there to define what's "right" and what's "wrong" and to
distinguish "proper" from "non-proper". I'm not insulted, and I
didn't mean to insult anyone for that matter (that's pointless in any
chat / discussion forum).

I'd like to point out that this isn't a productive discussion at all.
People ask for a reason, and once they get a reason, they get a
reaction that it's not a viable reason. Everything here is gut feeling
and emotion; some say not having MI doesn't feel good, other people
say they are quite happy with the way it works now. That's how I see
this and that's what I wanted to point out.

Perhaps I have chosen incorrect or inappropriate words, and if
that's the case then I'm truly sorry.

Cheers,
Stefan.
 

tjmadden1128

Again, if "missing it dearly" is enough of a criterion to justify a
feature's inclusion in a language, where are the simple dynamic
closures I love in Groovy? Where's duck typing? Where are optional
parameters? Where is pointer arithmetic? Where's the "with" statement?
Where are macros? All of these are dearly missed by some people.

(My personal bugbears are the problems with the switch statement and
the lack of decent, object-oriented enum support.)

Ooooh! How about function templates? That would have answered the OP's
problem.

Tim
 
