Frisky said:
I'm not seeing a huge difference, unless you mean how fast my code is, how
easy it is to read, and so on.
We clearly have different ideas about readability. For me, making it
very easy to see the intended usual code path makes things more
readable than interspersing that with checking return values.
I don't have to check for nulls at all. If I don't, it will still be at
least as fast or faster, and it will still throw.
You'll still throw *sometimes*. Not always. It depends how you use the
variable.
But, if I do check for null, I don't have to check everywhere. I only have
to check where it counts. And I should be judicious in this.
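For instance, a minimal sketch of that targeted check (Process is a
hypothetical method standing in for whatever the code actually does):

TextBox tb = control as TextBox;
if (tb != null)
{
    // Only act where it counts; elsewhere the null can simply pass through.
    Process(tb);
}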
Yup - you have to get it right, otherwise you could easily introduce a
bug. I prefer to let the runtime make the choice.
There is nothing nasty about that code using "as". It's short, easy to read,
and easy to understand. Actually, the code I showed made the example for
exceptions look ugly; a case in point where you do have to catch.
Which is very rare - you deliberately showed the worst possible case
for casting, and you're now asking me to compare that with the best
possible case for using "as" (where you don't even have to check the
result).
Also, you can't bubble everything up. Nor should you. The right object with
that responsibility should handle the exceptions it is responsible for.
Sometimes that means bubbling things up. Sometimes it means stop and pay
attention.
If you are referring to the single line of code that simply throws when
something goes wrong, I don't necessarily call that better. It comes at the
cost of increased complexity. I have now deferred that error to somewhere
else. But where? And, in how many places?
Almost everywhere. Exceptions should usually be thrown when something
truly unexpected has happened - *especially* in the case of casting. If
there *is* a reasonable chance that the cast would fail (eg if you've
loaded a type whose name has been provided by the user) then that *is*
a good place to use "as" instead.
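A sketch of that kind of situation (typeName and IPlugin are hypothetical
names here; the point is that failure is an expected outcome, not a bug):

// typeName comes from user input, so it may well not resolve
Type type = Type.GetType(typeName);
object instance = (type == null) ? null : Activator.CreateInstance(type);
IPlugin plugin = instance as IPlugin;
if (plugin == null)
{
    // Report the bad type name to the user - an expected case, not a bug
}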
No, really I have decided to ignore the condition as not being a problem.
Just skip this code if it's not relevant.
And the code will be faster. If you have a GUI that is firing tons of
exceptions, and those exceptions fire yet more messages that, guess what,
fire more exceptions, pretty soon you will be sucking wind.
But that just doesn't happen in reality. Unless you've got your code
hooked up very badly, you shouldn't be firing tons of exceptions
anyway. I've *never* suggested that exceptions should be thrown often
in terms of actual running code, just in terms of places in the code.
If you look at the execution time, even the check alone (admittedly over very
large numbers) is several orders of magnitude faster. And both will throw if
you do not check for null - speaking strictly of a comparison of casting to
"as". But if you don't throw, you just ignore, there is nothing left to do:
no stack to grab information on, no variables to create.
No, "as" won't throw if you check for null and then use the value
somewhere that null is a valid value, but possibly not the one that you
actually want.
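For example (panel is a hypothetical Panel; Tag is a real property that
happily accepts null):

TextBox tb = control as TextBox; // silently null if control is the wrong type
panel.Tag = tb;                  // no exception - Tag accepts null
// The failure only surfaces much later, far away from the bad conversion.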
But "as" is at least as fast, or faster than a cast. So, unless you mean to
cast, to get user conversions, or you have a value type, why used it? The
"as" is the straight up performance winner.
No, it's the performance equal unless you decide not to check it, at
which point I believe it's a bug trap, deferring the point at which you
find out something has gone wrong.
But you just said "as" is equally fast in the worst case. In all other
cases, it's faster. Did you look at that article on Code Project?
Yes. I actually *tried* it though. That article on Code Project left
out a lot of stuff (like code). Basically, unless you're using the
value from "as" in such a way that it *doesn't* show up errors
reasonably quickly, there's no real performance difference.
Only if you don't check the value.
You can still use exceptions.
And I will!
And while you may not find that the performance will pay off, it is better
to stick with high performance as a rule if you can.
Knuth disagrees.
And, I'm not saying you have to check for null if you use "as". You can
still stick with your current exception strategy.
But to get the exception, you have to check for null or risk making the
eventual error harder to understand.
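That check would look something like this sketch:

TextBox tb = control as TextBox;
if (tb == null)
{
    // Fail fast, close to the cause, just as the plain cast would
    throw new InvalidCastException("control is not a TextBox");
}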
Comparing Apples to Apples:
TextBox tb = (TextBox) control;
and:
TextBox tb = control as TextBox;
I find them equally readable.
So do I - but the latter will mask any errors until later, unless you
have the check.
I am not sure what you mean by this. ???
I only check for null where null values are not allowed. If a function takes
an object type, but does not allow nulls, it would assert. The code
providing the null value would be in error.
But null could be an acceptable value for the method call, but one
which you (as the caller) never want to provide in this case. You also
lose the ability to distinguish between the original value being null
(which may be valid) and the original value being a reference to an
object of the wrong type.
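To illustrate the distinction:

// With a cast, the two failure modes stay separate:
TextBox tb = (TextBox) control;   // wrong type -> InvalidCastException here;
                                  // a null control just gives a null tb
// With "as", they collapse into the same result:
TextBox tb2 = control as TextBox; // null whether control was null
                                  // or merely the wrong type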
Or, you rely on lots of reentrant code, or are handling large quantities of
business rules and data, and so on. And performance overall does matter to me.
I would rather set my design goals based on using the faster of two options
unless there is a compelling reason not to. I see no compelling reason to
use a cast over "as". In light of that, I see performance as a compelling
reason to use "as" over a cast.
Then we fundamentally disagree on what constitutes significant
performance. I believe that universally choosing "as" for performance
reasons when it *can* introduce bugs is a *really* bad idea.
I simply add that even with the null check included, it is still faster than
the cast. Not that you have to do it that way. You can stick with your own
exception strategy.
With the null check included, they're effectively the same speed in my
tests. Sometimes one version runs faster, sometimes the other.
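Roughly the kind of loop I timed - a sketch, to be run as a release build
outside the debugger (the iteration count is arbitrary):

using System;
using System.Diagnostics;
using System.Windows.Forms;

class CastBenchmark
{
    [STAThread]
    static void Main()
    {
        object control = new TextBox();
        const int Iterations = 100000000;

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            // "as" plus the null check - the variant under discussion
            TextBox tb = control as TextBox;
            if (tb == null)
            {
                throw new InvalidCastException();
            }
        }
        sw.Stop();
        Console.WriteLine("as + check: {0}ms", sw.ElapsedMilliseconds);

        sw = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            // Plain cast - throws on failure with no explicit check
            TextBox tb = (TextBox) control;
        }
        sw.Stop();
        Console.WriteLine("cast:       {0}ms", sw.ElapsedMilliseconds);
    }
}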
Ok, now you are speaking my language. But, I thought I had said this in the
beginning anyway. It depends: are you expecting to throw, or are you
expecting not to throw?
And I almost always want to throw in the very rare situation that I've
got an object of an unexpected type.
And in general, my design is based on the approach you describe here. If you
never have an error, you've won the battle before it started.
Indeed. And if you *do* have an error, chances are you're not going to
get to the same situation thousands of times, and the performance hit
will be absolutely tiny.
I use inheritance a lot. My arrays are not always so clean cut. But, if you
don't need to cast, you don't need "as" either.
I find that inheritance is overrated. I only use it occasionally within
my own classes. It's invaluable when it's really required, but the
design requirements for a class which is meant to be inherited from are
much higher than other classes - the interaction between methods needs
to be fully documented as part of the interface, and then the
implementation has to pretty much stay the same. I prefer composition
over inheritance most of the time.
For a single instruction on a 3.0 GHz box that's huge. At that rate, I would
only be able to execute 1000 instructions a second.
No, you'd be able to execute nearly 1,000,000 instructions a second -
that figure was in milliseconds, not seconds. (It had a typo though - I
meant ~0.01ms, at which point you can execute it roughly 100,000 times
a second). The important thing is, however, that you won't be doing it
100,000 times a second.
But, my tests indicated it was a higher hit than that.
But your tests appear to have been run under the debugger.
Well that depends. We usually build dynamic screens. And, the users can drag
and drop UI components from one panel to the other. We mimic a lot of what
VS 2003 does but for our application.
So you validate it once early on, and then you shouldn't need to do it
afterwards. Anyway, this sounds like a specialised situation which
wasn't indicated in the OP's post. I hope you'll agree that most UIs
*aren't* dynamic.
Depends on the elsewhere. But I don't tend to get values here for use
elsewhere. Unless elsewhere is another responsibility; but then again, if
it's not responsible, that must make me responsible, so I would need to do
the validation.
That's a whole chain of reasoning I don't need to worry about. I like
not having to think, getting the error checking for free. The more I
have to think, the more chance I have to get it wrong.
Quite true - our code is down-casting our special textbox classes to one of
the classes in their lineage.
public static implicit operator TextBox(SuperDuperTextBox control)
{
// conversion code
}
If SuperDuperTextBox is actually a TextBox already, that's not a valid
operator. However, in the case that the OP posted, the compiler isn't
going to know that the cell's value is actually a SuperDuperTextBox
anyway, so it can't use the user-defined conversion in the first place.
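Assuming SuperDuperTextBox really does derive from TextBox, the compiler
rejects the operator outright:

class SuperDuperTextBox : TextBox
{
    // error CS0553: user-defined conversions to or from a base class
    // are not allowed
    public static implicit operator TextBox(SuperDuperTextBox control)
    {
        return control;
    }
}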
I wish it had been that easy. But performance is not artificial. And, the
systems I have written do not suffer from these issues. Sure, I base the
things I do on history. Experience has been a great teacher for me. Hence my
points.
It sounds like experience *hasn't* been a great teacher though - it's
taught you to fear exceptions like they were the plague, bending code
out of shape to avoid throwing them, as in the case of the code you
showed for dealing with resources. The problem in that case wasn't that
you were throwing lots of exceptions, it was the *reason* you were
throwing lots of exceptions - namely that your resources were broken.
The fix for that isn't to ignore resources which should be there but
aren't, but to make sure that you only ask for what you need, and that
everything you need is present.
You've admitted that you've never run into a situation where decent
code has been significantly slowed down by exceptions, but you're still
coding as if they're about to slow your app down to a crawl at any
minute.
The code is not any shorter or longer - cast versus "as" - and not any more
clear. In fact, the hidden exception handling is who knows where. It's more
complex.
No, the exception handling is very predictable, whereas if you use
"as" it's somewhere separate from the change of reference type. It's
wherever it's first used - and *that's* "who knows where".
Did you forget? I thought you wanted to throw?
Which the cast does, if anything's wrong.
This is no different than forgetting to add the handler code. Or anything
else for that matter. You do step through your code in the debugger when you
write it, don't you?
Nope. I try to avoid using the debugger whenever possible. I write unit
tests instead. Much more reliable. I regard having to resort to the
debugger as a partial admission of failure in the first place - it
means my code isn't clear enough to be able to spot a problem just from
the place that the unit test fails, or (worse) my unit tests aren't
good enough to spot a problem which has cropped up later on.
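As a sketch of what I mean - an NUnit-style test where a failure points
straight at the problem (CreateCellValue is a hypothetical stand-in for the
real production code):

using System.Windows.Forms;
using NUnit.Framework;

[TestFixture]
public class CellTests
{
    // Hypothetical stand-in for whatever produces the cell's value
    private object CreateCellValue()
    {
        return new TextBox();
    }

    [Test]
    public void CellValueIsATextBox()
    {
        object value = CreateCellValue();
        // A failure names this test directly - no debugger session needed
        Assert.IsInstanceOfType(typeof(TextBox), value);
    }
}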
No disrespect. I just found that you took exception (no pun intended) to my
comments in several posts, yet steered clear of others. I just wondered if
you had something against me, or if you were after intellectual discussion.
You do seem adamant. But then again, so do I.
Like I said, I don't have time to read and respond to every post, but I can
point you in the direction of several occasions when I've had similar
"exceptions are hugely damaging to performance" discussions.