.NET Newbie and Garbage Collection

Guest

Hi,

I'm about to delve into the wonderful world of .NET and C# and I have a
couple of questions regarding Garbage Collection. I'm an ex-Delphi programmer
and I'm used to DIY GC.

1. Assume there are two .NET apps running and they are both consuming a
large amount of memory because of lots of objects that haven't been GC'd yet.
Another .NET app is started. If there is no memory available, will the GC
free memory from the first two apps to give to the third? Also if this third
app is a non-.NET app, will Windows cause the GC in the first two apps to
fire to free up some memory for the Operating System?

2. Why isn't there a way of manually freeing specific objects in .NET if I
want to? E.g. the small objects I create I'm happy for the GC to take care
of, but if I create a massive DataTable in memory and I'm finished with it,
why can't I free it up there and then? I.e. why can't I just do all my own GC
and let the .NET GC take care of the things I've forgotten about?

Thanks,

Larry
Cape Town
 
Hans Baumann

AFAIK, you can use the Dispose() method to release an object's resources
before it gets GC'd, and in other threads you can read that the GC is fast
enough to keep the amount of unused objects in memory to a minimum.

Hope this gives you a lead.
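
To illustrate, a minimal sketch (the DataTable example is mine, not from the
thread): Dispose() releases an object's resources deterministically, while
the memory itself is still reclaimed by the GC later.

using System;
using System.Data;

class DisposeSketch
{
    static void Main()
    {
        // DataTable implements IDisposable; 'using' guarantees Dispose()
        // runs as soon as the block exits.
        using (DataTable table = new DataTable("Example"))
        {
            table.Columns.Add("Id", typeof(int));
            table.Rows.Add(1);
            Console.WriteLine(table.Rows.Count);
        }
        // table.Dispose() has run; the object's memory is freed by the GC later.
    }
}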
 
Nick Hounsome

Larry said:
Hi,

I'm about to delve into the wonderful world of .NET and C# and I have a
couple of questions regarding Garbage Collection. I'm an ex-Delphi
programmer and I'm used to DIY GC.

1. Assume there are two .NET apps running and they are both consuming a
large amount of memory because of lots of objects that haven't been GC'd
yet. Another .NET app is started. If there is no memory available, will the
GC free memory from the first two apps to give to the third? Also if this
third app is a non-.NET app, will Windows cause the GC in the first two apps
to fire to free up some memory for the Operating System?

No.

P.S. I'm not sure whether or not the GC EVER gives memory back to the OS. It
is not its main purpose and it is certainly not required to do so.
It is not as easy to do as you might think, since it involves memory
compaction (which it can do) and complex thread interaction.
2. Why isn't there a way of manually freeing specific objects in .NET if I
want to? E.g. the small objects I create I'm happy for the GC to take care
of, but if I create a massive DataTable in memory and I'm finished with it,
why can't I free it up there and then? I.e. why can't I just do all my own
GC and let the .NET GC take care of the things I've forgotten about?

Because you might be wrong - the GC could never allow you to free something
that was actually still in use somewhere else, so it would have to do a
whole lot of work to check that the reference you gave it was the only one.
This would almost certainly make the whole process slower.

I think that you might be able to influence how the GC goes about its
business through config files but I don't know the details.
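
For what it's worth, a sketch of the kind of setting that can be changed
this way, assuming the standard <gcServer> and <gcConcurrent> runtime
elements in the application's .config file (check the documentation for
your runtime version):

<!-- app.config -->
<configuration>
  <runtime>
    <!-- Server GC trades memory for throughput; workstation GC is the default. -->
    <gcServer enabled="true" />
    <gcConcurrent enabled="false" />
  </runtime>
</configuration>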
 
xtopher.brandt

Jeffrey Richter's book "Applied .NET Framework Programming" has an
excellent chapter on Garbage Collection. The answers you've gotten to
both questions are correct. The book will give you the details on why.

Chris.
 
Michael D. Ober

The answer about whether applications return memory to the OS may be
incorrect, depending on how the .NET runtime handles the system memory
information structures and messages. Using these structures and messages, an
application can determine if the system as a whole is running low on memory.
I would hope that the .NET GC is polite to the system (and it seems to be,
from watching the VM size in Task Manager) and returns unused heap memory to
the OS.

In Windows CE, the message WM_HIBERNATE is sent to applications when the
system as a whole is running out of memory. In Win95 and later, the message
WM_COMPACTING can be sent to all top-level windows when the system thinks it
needs more memory. WM_COMPACTING is documented as being a compatibility
message for 16-bit versions of Windows, but it hasn't been deprecated in
32-bit Windows, which means that the system still sends it when memory
starts running low.
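
As an illustration only, here is a minimal sketch of how a WinForms app
could watch for WM_COMPACTING (0x0041 in winuser.h). Forcing a collection in
response is just one possible policy, not something the runtime is
documented to do:

using System;
using System.Windows.Forms;

public class MainForm : Form
{
    // WM_COMPACTING, from winuser.h.
    const int WM_COMPACTING = 0x0041;

    protected override void WndProc(ref Message m)
    {
        if (m.Msg == WM_COMPACTING)
        {
            // System-wide memory is low; release what we reasonably can.
            GC.Collect();
            GC.WaitForPendingFinalizers();
        }
        base.WndProc(ref m);
    }
}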

Mike Ober.
 
Nick Hounsome

Michael D. Ober said:
The answer about whether applications return memory to the OS may be
incorrect, depending on how the .NET runtime handles the system memory
information structures and messages. Using these structures and messages, an
application can determine if the system as a whole is running low on memory.
I would hope that the .NET GC is polite to the system (and it seems to be,
from watching the VM size in Task Manager) and returns unused heap memory to
the OS.

This is a very interesting problem, but the logic behind returning memory in
such circumstances is dubious:

As an analogy, consider the case of a sudden food shortage in your area.
Do you go and return some of your unused food stocks to the supermarket?
I don't think so! (unless you know that the shortage will be brief)

You can say that this is just selfishness but how is an app supposed to know
its relative importance to the user(s) of the system? [Only give it up to
higher priority apps?]

Computing theory suggests that if you needed X bytes of memory before then
you will probably need it again, so giving it up to a possibly less
important application, with no guarantee of being able to retrieve it again,
is not a sound decision.
 
Guest

Nick Hounsome said:
Computing theory suggests that if you needed X bytes of memory before then
you will probably need it again, so giving it up to a possibly less
important application, with no guarantee of being able to retrieve it again,
is not a sound decision.

LARRY:

This doesn't make sense, because your program doesn't need X bytes of
memory. For example, I create object A, do some work and then I'm finished
with it. I then create object B and do some work. The amount of memory
required for my program is not A + B, it's the maximum of A and B.

One thing that I'm still confused about is the fact that a program can
consume a large chunk of memory, even though most of that memory is no longer
being used. I ran some tests to simulate multiple .NET programs running
side by side to see what would happen. I ran a program (.NET V2) that gets
20,000 rows from a database, displays them in a grid and cleans everything up
(i.e. it's available for GC). I have a button that performs all these steps,
so I can run this process multiple times. I ran multiple instances of the
program and each instance maxed out at about 50 MB of RAM. When the available
RAM was low and I started a new instance of the app, the new instance started
stealing memory from the disk cache. To me this doesn't quite make sense. If,
in the future, every single app is written in .NET, surely running multiple
programs side by side will start producing memory problems?
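
Here's a rough console sketch of the kind of test I mean (the connection
string, the query, and the console harness are just placeholders, not the
actual test program):

using System;
using System.Data;
using System.Data.SqlClient;

class GcLoadTest
{
    static void Main()
    {
        for (int pass = 0; pass < 10; pass++)
        {
            DataTable table = new DataTable();
            using (SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT TOP 20000 * FROM SomeTable",                 // placeholder
                "Server=.;Database=Test;Integrated Security=true"))  // placeholder
            {
                adapter.Fill(table);
            }

            Console.WriteLine("pass {0}: {1} rows loaded", pass, table.Rows.Count);

            table.Dispose();
            table = null; // no live references remain; the rows are now garbage

            // true forces a collection first, so the number reflects live objects.
            Console.WriteLine("heap after cleanup: {0:N0} bytes",
                GC.GetTotalMemory(true));
        }
    }
}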


Cheers,

Larry.
 
Nick Hounsome

Larry said:
LARRY:

This doesn't make sense, because your program doesn't need X bytes of
memory. For example, I create object A, do some work and then I'm finished
with it. I then create object B and do some work. The amount of memory
required for my program is not A + B, it's the maximum of A and B.

Yes, but GC is supposed to smooth out your memory requirements and minimize
the time wasted by repeatedly allocating and returning memory to the OS.

If you have very little spare memory then you will spend a lot of time
waiting for the GC to find garbage and compact the remaining objects - this
is a time-consuming process and will also tend to cause page thrashing in
the virtual memory system, which is even worse - not just for your app but
for all the others.
One thing that I'm still confused about is the fact that a program can
consume a large chunk of memory, even though most of that memory is no
longer being used. I ran some tests to simulate multiple .NET programs
running side by side to see what would happen. I ran a program (.NET V2)
that gets 20,000 rows from a database, displays them in a grid and cleans
everything up (i.e. it's available for GC). I have a button that performs
all these steps, so I can run this process multiple times. I ran multiple
instances of the program and each instance maxed out at about 50 MB of RAM.

This is clearly the point at which the GC decides that enough is enough and
that it should start to consume more CPU rather than memory, as I described
above.

I believe that this number is configurable somehow.
When the available RAM was low and I started a new instance of the app, the
new instance started stealing memory from the disk cache. To me this doesn't
quite make sense. If, in the future, every single app is written in .NET,
surely running multiple programs side by side will start producing memory
problems?

You are correct in assuming that running a lot of GC apps will typically
require more memory than equivalent C++ apps for precisely the reasons that
I gave.

The solution is to run different apps in the same process so that they can
share a memory pool, just as you have in a typical web server. AppDomains
protect each app from the others, but the underlying runtime is common.
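
A minimal sketch of the idea ("OtherApp.exe" is a placeholder name, and
error handling is omitted):

using System;

class AppHost
{
    static void Main()
    {
        AppDomain domain = AppDomain.CreateDomain("GuestApp");
        try
        {
            // Runs OtherApp's Main() inside this process; both apps now
            // share one CLR and one managed heap.
            domain.ExecuteAssembly("OtherApp.exe");
        }
        finally
        {
            AppDomain.Unload(domain);
        }
    }
}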

Similar memory wastage problems can exist in C++ when you use
high-performance allocators for classes - the more of them you have, the
more memory will be wasted compared to using the common heap.
 
Guest

Nick Hounsome said:
The solution is to run different apps in the same process so that they can
share a memory pool, just as you have in a typical web server. AppDomains
protect each app from the others, but the underlying runtime is common.

Thanks for the info. In my example I was referring to a client rather than
a server environment. Would something like sharing the same process be
possible in a client environment?
 
Nick Hounsome

Larry said:
Thanks for the info. In my example I was referring to a client rather than
a server environment. Would something like sharing the same process be
possible in a client environment?

Yes - this is common for individual apps, e.g. all your Internet Explorer
windows are (usually) the same process [I don't think they are .NET, but the
principle holds].

Normally the app starts by looking for an existing instance (or it could be
a 'general' server exe for your suite of apps), and if it finds one it tells
it what is wanted; the existing app then runs the required app in a new
domain, whilst the instance that the user selected (by double-clicking in
Explorer, say) just exits. A sketch of the detection step is below.
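
A sketch of the first step only: detecting an existing instance. A named
mutex is one common way; how the new instance then hands its request to the
existing one (remoting, named pipes, window messages) is app-specific and
omitted here. The mutex name is a placeholder.

using System;
using System.Threading;

class SingleInstance
{
    static void Main(string[] args)
    {
        bool createdNew;
        using (Mutex mutex = new Mutex(true, "MyCompany.MySuite.Host", out createdNew))
        {
            if (!createdNew)
            {
                // An instance already exists: tell it what we want, then exit.
                // (Hand-off mechanism omitted; see note above.)
                return;
            }

            // We are the "server" instance: run apps in new AppDomains
            // as requests arrive.
            Console.WriteLine("Hosting instance running; press Enter to quit.");
            Console.ReadLine();
        }
    }
}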

The nature of this transaction does require the apps to know and trust the
"server", which limits them to a single company's suite of products in
practice.
 
Guest

Nick Hounsome said:
Yes - this is common for individual apps, e.g. all your Internet Explorer
windows are (usually) the same process [I don't think they are .NET, but the
principle holds].

Normally the app starts by looking for an existing instance (or it could be
a 'general' server exe for your suite of apps), and if it finds one it tells
it what is wanted; the existing app then runs the required app in a new
domain, whilst the instance that the user selected (by double-clicking in
Explorer, say) just exits.

The nature of this transaction does require the apps to know and trust the
"server", which limits them to a single company's suite of products in
practice.

Thanks Nick. Where can I find more detailed info about GC? Chris recommended
"Applied .NET Framework Programming". Do you know of any other good sources?

Cheers,

Larry.
 
Michael D. Ober

Nick Hounsome said:
Computing theory suggests that if you needed X bytes of memory before then
you will probably need it again, so giving it up to a possibly less
important application, with no guarantee of being able to retrieve it again,
is not a sound decision.
Actually, computing theory says absolutely nothing about the long-term
memory requirements of an application. What it says is that at any given
time there is a working set of memory that is required. The working set's
size changes as the application performs different tasks. For instance, an
application that handles matrix multiplication doesn't need to retain the
memory footprint for 100 x 100 matrices when it's multiplying 10 x 10
matrices.

It is true that applications without a compacting garbage collector tend to
grow in memory because their allocated memory gets fragmented, but
applications with a compacting garbage collector don't have to. If they do,
it's either because the application is creating a lot of long-term objects,
which is rare in a well-designed application, or the garbage collector isn't
working very well.

My empirical testing of the .NET 2.0 GC subsystem is that it is well written
and properly handles circular references to objects that can't be accessed
by the program anymore. What I haven't been able to tell is whether the .NET
2.0 GC intercepts and handles the WM_COMPACTING message (I doubt it does, as
this is a legacy message for 16-bit Windows) or checks the system
information structures to see if global memory is running low. If, after a
few GC runs, there is still a lot of excess memory allocated to the program
that the memory manager hasn't touched, it is probably safe to reduce the
application's memory footprint.
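
For reference, a minimal sketch of querying those system information
structures from .NET, via the Win32 GlobalMemoryStatusEx API (what an app
should do with the answer is a policy question left open here):

using System;
using System.Runtime.InteropServices;

class MemoryStatus
{
    [StructLayout(LayoutKind.Sequential)]
    struct MEMORYSTATUSEX
    {
        public uint dwLength;
        public uint dwMemoryLoad;          // % of physical memory in use
        public ulong ullTotalPhys;
        public ulong ullAvailPhys;
        public ulong ullTotalPageFile;
        public ulong ullAvailPageFile;
        public ulong ullTotalVirtual;
        public ulong ullAvailVirtual;
        public ulong ullAvailExtendedVirtual;
    }

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool GlobalMemoryStatusEx(ref MEMORYSTATUSEX buffer);

    static void Main()
    {
        MEMORYSTATUSEX status = new MEMORYSTATUSEX();
        status.dwLength = (uint)Marshal.SizeOf(typeof(MEMORYSTATUSEX));
        if (GlobalMemoryStatusEx(ref status))
        {
            Console.WriteLine("Memory load: {0}%", status.dwMemoryLoad);
            Console.WriteLine("Available physical: {0:N0} bytes", status.ullAvailPhys);
        }
    }
}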

Also remember that Windows itself will trim the physical memory footprint of
an application based on page usage and global system requirements.

Mike Ober.
 
