Idea: GC and IDisposable


Robert Simpson

Not true in the case of the CF. GC runs in the context of the thread that
made the allocation that caused GC to occur. All other managed threads in
the AppDomain are suspended during the entire collection cycle (hence my
major concerns about performance of this suggestion).

Fair enough. Still, since GC runs in a separate thread on the desktop, it's
not unreasonable to think that MS might change this design in the future for
the CF.

Robert
 

Guest

Fair enough. Still, since GC runs in a separate thread on the desktop,
it's not unreasonable to think that MS might change this design in the
future for the CF.

It's a very reasonable assumption - in fact I hope the CF GC becomes more
like the desktop as versions progress. Your argument is still quite valid
in that it points out flaws with the proposal.


--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
 

Jon Skeet [C# MVP]

He's talking about the CF Bitmap, explained in depth here:
http://blogs.msdn.com/scottholden/archive/2006/08/22/713056.aspx
And here:
http://blog.opennetcf.org/ctacke/PermaLink,guid,987041fc-2e13-4bab-930a-f79021225b74.aspx

And as I stated in the second link, I think it's a bug in the Bitmap (and
I've voiced the concern personally with Scott - not sure if CF 3.5 fixed
it). Altering the GC to fix an implementation bug would be just plain
silly.

Ah, right. I can't see how the OP's suggestion would help though, as it
sounds like the GC isn't running in this case anyway - it's failing to
allocate memory in the gwes.exe process.
 

Laura T.

Hilton said:
Because when the Bitmap gets GC'd, the GC will do a "if (o 'implements'
IDisposable && !o.Disposed) o.Dispose ()" and even though this Bitmap
object won't be Disposed deterministically, it will be Disposed at least
when/if an OutOfMemory exception gets thrown.

First:

Are you aware that there is no Disposed property in the IDisposable
interface:

// Summary:
//     Defines a method to release allocated unmanaged resources.
[ComVisible(true)]
public interface IDisposable
{
    // Summary:
    //     Performs application-defined tasks associated with freeing,
    //     releasing, or resetting unmanaged resources.
    void Dispose();
}

It would mean the runtime would have to record whether Dispose() was
called.. a big overhead, let me say it, for nothing.
The CLR already tracks finalizers; now it should double-track?

Second:

What if Dispose() throws an exception? Was the object disposed or not?
Do we need to finalize it or not?
Partially disposed objects are a nightmare.
Please recall, I'm not saying that my proposal is perfect and that we
should change our code to use it (it requires no code change in the apps).
All I'm saying is that if the GC adds the "if" as stated above, it would
act as some level of safety net and catch undisposed objects.

It can't add the "if", as I said before. To support it, yes, it would
require code changes.
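
To make that concrete, here is a hedged sketch of the kind of contract every
type would have to adopt before the GC could perform the proposed check. This
ITrackedDisposable interface is hypothetical - nothing like it exists in the BCL:

using System;

// Hypothetical (not in the BCL): for the GC to run
// "if (!o.Disposed) o.Dispose()", every disposable type would have to
// expose its disposal state - precisely the code change Laura means.
public interface ITrackedDisposable : IDisposable
{
    bool IsDisposed { get; }
}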
I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing.
I really like "using ()" and use it all the time, but there are some
objects whose lives cannot be bottled up in a few lines. One little bug
and whammo,

Well, that's the life of software engineers. One little bug (no finalizer
when there should be one) and..
You could blow up a nuclear power plant.. just a little bug.
The CLR offers very good protection. Better than any other runtime. But it
still can't protect you from yourself.
a LOT of memory potentially gets leaked. I cannot even count the times

Leaked? Not if you are talking about managed resources. Not in any case if
you have a finalizer,
and by finalizer I don't mean ~class() { // Gone for lunch };
people have posted here about being out of memory only to find that they
didn't know to Dispose a Bitmap object. The purist in you will jump up
and down and say "well they should have, they screwed up, and they didn't
call Dispose() - their fault.". Now if only we all lived in a perfect
world where one perfect engineer was solely responsible for one perfect
project... However, multiple engineers work on multiple projects and one
missing Dispose() can cost a company millions. Then when we get to
finding the

No, one missed finalizer can cost millions. One missed Dispose() could cost
a few bytes in a short time.
problem, we're back to finding the problem-causing 'malloc-free' we're all
so fond of. :|
Not really. There is no 'free' in C#.
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Hilton

Now, if you change the word Dispose to Finalize, you see that the framework
is already there.
Dispose() is "user mode" and Finalize() is "kernel mode".. which one do you
trust more?
Do you really need a double safety net?
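
For completeness, that existing safety net is the standard Dispose pattern -
here as a minimal sketch, with a hypothetical NativeHolder class standing in
for any type that owns unmanaged memory:

using System;
using System.Runtime.InteropServices;

// Minimal sketch of the standard Dispose pattern. NativeHolder is a
// hypothetical example; the finalizer is the framework's built-in
// safety net for a missed Dispose() call.
public class NativeHolder : IDisposable
{
    private IntPtr buffer = Marshal.AllocHGlobal(1024); // unmanaged memory
    private bool disposed;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this); // disposed deterministically, skip the net
    }

    protected virtual void Dispose(bool disposing)
    {
        if (disposed) return;
        if (buffer != IntPtr.Zero)
        {
            Marshal.FreeHGlobal(buffer);
            buffer = IntPtr.Zero;
        }
        disposed = true;
    }

    ~NativeHolder() // runs only if Dispose() was never called
    {
        Dispose(false);
    }
}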
 

Göran Andersson

Hilton said:
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Adding another safety net doesn't make any application more robust. It
might only keep the application running a bit longer despite its lack
of robustness.
 

Hilton

Laura said:
First:

Are you aware that there is no Disposed property in the IDisposable
interface:

Yes, that was pseudo code.

It would mean the runtime would have to record whether Dispose() was
called.. a big overhead, let me say it, for nothing.

One bit (at worst a byte) is a big overhead?

To support it, yes, it would require code changes.

Clearly I'm being misunderstood here - it would require no code changes to
the app.

Leaked? Not if you are talking managed resources. Not in any case if you
have a finalizer,
and by saying finaliser I don't mean ~class() { // Gone for lunch };

Does the Bitmap constructor create 'managed resources'? [hint: trick
question] OK, I'll tell you the answer: on the CF, new Bitmap (...) can,
in certain circumstances, create a memory leak if the app does not
Dispose() it.
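
For what it's worth, the deterministic fix on the app side is tiny - a
minimal sketch, assuming a System.Drawing.Bitmap and illustrative dimensions:

using System.Drawing;

class Demo
{
    static void Main()
    {
        // Dispose() runs at the closing brace, even if an exception is
        // thrown, so the native bitmap memory is released promptly.
        using (Bitmap bmp = new Bitmap(240, 320))
        {
            // ... draw on or read from bmp ...
        }
    }
}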

Hilton
 

Hilton

Brian said:
Nevermind the fact that adding *any* interface after the fact could be
a version breaking change. That's why the Framework Design Guidelines
book recommends that you to choose them wisely right from the get go.

And how can you decide today how you might optimize a class in 3 years'
time, when Microsoft releases some new (unmanaged) technology that you could
use? In the extreme case, you'd have to implement IDisposable on practically
every class, or just get lucky.

Hilton
 

Hilton

Jon said:
Ah, right. I can't see how the OP's suggestion would help though, as it
sounds like the GC isn't running in this case anyway - it's failing to
allocate memory in the gwes.exe process.

IIRC, the GC runs when an OutOfMemoryException is about to be thrown, so my
proposal would Dispose the Bitmaps perfectly and the app would never run out
of memory. Again, for the record, I'm not saying the app should expect
this. The engineer should call Dispose, etc etc etc.

BTW: The code is trivial to reproduce as per the link above: just keep
creating Bitmaps and you'll get the OOM exception.
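
A rough sketch of that repro (dimensions are illustrative; the point is
allocating native bitmap memory faster than the finalizer thread releases it):

using System.Drawing;

class OomRepro
{
    static void Main()
    {
        // Each Bitmap allocates native memory (held by gwes.exe on the
        // CF). With Dispose() deliberately omitted, the finalizer thread
        // can't keep up and an OutOfMemoryException is eventually thrown.
        while (true)
        {
            Bitmap bmp = new Bitmap(240, 320); // never disposed
        }
    }
}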

Hilton
 

Guest

One bit (at worst a byte) is a big overhead?

It's a lot more than that. It's space (likely more than a byte, as you have
to have some form of "handle" to associate with the object, so that's at
least 4 bytes, plus if it's a linked list you have a pointer to the next
item, and possibly the previous - now you're at 12). The larger issue is
performance. This will be slow. And yes, speed is important.
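
To put that arithmetic in code form, the per-object tracking record Chris is
costing out would look roughly like this (purely illustrative):

using System;

// Purely illustrative: one doubly-linked tracking node per object on a
// 32-bit CLR - a 4-byte handle plus two 4-byte links = 12 bytes,
// before any further bookkeeping.
struct DisposeTrackingNode
{
    public IntPtr ObjectHandle; // 4 bytes: "handle" to the tracked object
    public IntPtr Next;         // 4 bytes: link to the next node
    public IntPtr Prev;         // 4 bytes: link to the previous node
}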
Does the Bitmap constructor create 'managed resources'? [hint: trick
question] OK, I'll tell you the answer: on the CF, new Bitmap (...) can,
in certain circumstances, create a memory leak if the app does not
Dispose() it.

And this is the only case you have for your proposed "fix", and one which I
think is a bug in the Bitmap implementation. Fixing the Bitmap is a far
better solution than changing the GC.


--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
 

Guest

IIRC, the GC runs when an OutOfMemoryException is about to be thrown, so
my proposal would Dispose the Bitmaps perfectly and the app would never
run out of memory. Again, for the record, I'm not saying the app should
expect this. The engineer should call Dispose, etc etc etc.

The GC does run. The problem with a CF Bitmap is that the GC runs, but the
finalizer thread hasn't completed, so you can get a second OOM if you
attempt to allocate again before it's done. Scott and I both covered very
well why it happens and how to get around it without seeing an OOM. The GC
should *not* be trying to fix this issue.
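
For readers who haven't followed the links, the application-level workaround
is roughly this shape - a hedged sketch only; the linked posts give the
authoritative version:

using System;
using System.Drawing;

class BitmapHelper
{
    // Sketch only: on an allocation failure, force a collection, wait
    // for the finalizer thread to release queued native bitmap memory,
    // then retry once. The linked posts explain the details.
    static Bitmap CreateWithRetry(int width, int height)
    {
        try
        {
            return new Bitmap(width, height);
        }
        catch (OutOfMemoryException)
        {
            GC.Collect();
            GC.WaitForPendingFinalizers();
            return new Bitmap(width, height);
        }
    }
}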
BTW: The code is trivial to reproduce as per the link above: just keep
creating Bitmaps and you'll get the OOM exception.

Due to a bug in the Bitmap, not a fault in the GC. You keep trying to fix a
bug with a GC "workaround."


--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
 

Guest

And how can you decide today how you might optimize a class in 3 years'
time, when Microsoft releases some new (unmanaged) technology that you
could use? In the extreme case, you'd have to implement IDisposable on
practically every class, or just get lucky.

Or you subclass. That's what OOP is about. Adding/changing an interface is
the same in my book as changing the name of every method in the class along
with the class itself. It's a major breaking change.
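
In code, the alternative Chris describes looks something like this (all
names hypothetical):

using System;

// Rather than retrofitting IDisposable onto a shipped class - a change
// Chris regards as version-breaking - derive a new type that layers the
// interface on top. Widget and its native handle are hypothetical.
public class Widget
{
    // original class, shipped without IDisposable
}

public class DisposableWidget : Widget, IDisposable
{
    private IntPtr nativeHandle; // the newly acquired unmanaged state

    public void Dispose()
    {
        // release nativeHandle here, then suppress finalization
        nativeHandle = IntPtr.Zero;
        GC.SuppressFinalize(this);
    }
}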


--

Chris Tacke, Embedded MVP
OpenNETCF Consulting
Managed Code in an Embedded World
www.OpenNETCF.com
 

Hilton

Chris,
Does the Bitmap constructor create 'managed resources'? [hint: trick
question] OK, I'll tell you the answer: on the CF, new Bitmap (...) can,
in certain circumstances, create a memory leak if the app does not
Dispose() it.

And this is the only case you have for your proposed "fix", and one which
I think is a bug in the Bitmap implementation. Fixing the Bitmap is a far
better solution than changing the GC.

You might have missed my post from 9/25 in which I said: "Note: I'm just
using a hypothetical Bitmap class as an example here - let's not go off on a
tangent speaking about the actual Bitmap class." Also, if you look through
my posts, I continually agree that fixing the app is without doubt the
correct thing to do. Other than the usage of (the actual) Bitmap class, are
there really no apps anywhere in the C# world that fail to call Dispose when
they should?

Anyway, we've beaten this dead horse. Thanks for the input. I enjoyed the
thread, educational as always.

Hilton
 

Michael S

Hilton said:
Please recall, I'm not saying that my proposal is perfect and that we
should change our code to use it (it requires no code change in the apps).
All I'm saying is that if the GC adds the "if" as stated above, it would
act as some level of safety net and catch undisposed objects.

You are just inventing an automatic finalizer for finalizers.
Have you considered the runtime cost?
I think we're kidding ourselves to think that you, me, and all the other
.NET engineers in the world will always write code to do the right thing.
Nope.

I really like "using ()" and use it all the time, but there are some
objects whose lives cannot be bottled up in a few lines.

That is a truism for most code written.
One little bug and whammo, a LOT of memory potentially gets leaked.

You should try coding in Delphi and C++.
I cannot even count the times people have posted here about being out of
memory only to find that they didn't know to Dispose a Bitmap object. The
purist in you will jump up and down and say "well they should have, they
screwed up, and they didn't call Dispose() - their fault.".

No, not purists. Realists. And really - it's their fault! :)
Now if only we all lived in a perfect world where one perfect engineer was
solely responsible for one perfect project...

That would be a wonderful world.
However, multiple engineers work on multiple projects and one missing
Dispose() can cost a company millions.

A bad business decision can also cost a company millions.
Then when we get to finding the problem, we're back to finding the
problem-causing 'malloc-free' we're all so fond of. :|

That is why we use test-driven application development.
Also, remember it was Apollo 11 that landed on the moon, not Apollo 1.
Why not add a safety net to make a C# application more robust, less leaky,
stay up longer, etc? Is it a band aid? Absolutely.

Because for me it is like adding active armor to a car and encouraging
people to drive recklessly, as:
- Hey, if you crash into something, you just bounce off!

- Michael Starberg
 
