Mihai said:
Hi
I don't know if this is the right way to do it, but I'm continuing this
thread since it seems to be about a very similar problem (if I make a
mistake, please tell me; I'm just trying to get to the bottom of a very
serious problem I have).
It's mostly the same thing: a C# application whose memory consumption
grows past 200 MB. The problem is that after a while it crashes in a
very nasty way.
The application is quite complex, but the constant growth in memory usage
isn't at all justified. It responds slower and slower until, at some
point, it simply freezes.
Hope somebody has an idea
Thanks
Mihai
Hi,
My story: last year, I ran a memory stress test by simply creating a
bunch of small objects (8 bytes in size, that is, an integer plus the
implicit vmt pointer) and storing them in an array (500,000 elements).
When the array is full, a new array is created and populated, and so on.
This went well up to a certain memory load. Beyond that point, further
allocation began taking more and more time. You could also see lots of
hard-disk activity, which must have been swapping.
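For reference, the test above can be sketched roughly like this (in Java rather than C#, since the behavior is analogous on any garbage-collected runtime; the `Cell` class, round count, and timing printout are my own illustration, not Benoit's original code):

```java
import java.util.ArrayList;
import java.util.List;

// Small heap object: one int field plus the object header the runtime
// adds implicitly (the "vmt pointer" in the post).
class Cell {
    int value;
    Cell(int value) { this.value = value; }
}

public class StressTest {
    // Allocate `rounds` arrays of `size` Cells each, keeping every array
    // reachable so the garbage collector can reclaim nothing.
    static List<Cell[]> fill(int rounds, int size) {
        List<Cell[]> arrays = new ArrayList<>();
        for (int round = 0; round < rounds; round++) {
            long start = System.nanoTime();
            Cell[] chunk = new Cell[size];
            for (int i = 0; i < size; i++) {
                chunk[i] = new Cell(i);
            }
            arrays.add(chunk);
            long ms = (System.nanoTime() - start) / 1_000_000;
            long usedMb = (Runtime.getRuntime().totalMemory()
                    - Runtime.getRuntime().freeMemory()) / (1024 * 1024);
            // Per-round time grows sharply once the heap outgrows
            // free physical memory and swapping kicks in.
            System.out.printf("round %d: %d ms, ~%d MB used%n",
                    round, ms, usedMb);
        }
        return arrays;
    }

    public static void main(String[] args) {
        // 500,000 elements per array, as in the original test; the round
        // count is bounded so the demo terminates instead of exhausting RAM.
        fill(50, 500_000);
    }
}
```

Watching the per-round timings while the machine starts swapping reproduces the slowdown described below.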
My conclusion: everything goes well while there's enough free physical
memory. Little or no swapping is needed, the garbage collector doesn't
run very often, and allocation is pretty fast. But when free memory
becomes scarce (the OS is already swapping pages back and forth), the GC
runs more often. When the GC runs, it scans whatever it thinks is
necessary, causing even more page faults and making the application
choke. This doesn't happen in applications without a garbage collector
(pretty much every application before the .NET/Java/... era). With
'native' applications, pages that are swapped out of memory stay on disk
until the application references them. There's no garbage collector
that needs to scan (read: reload from the swap file) the *entire* memory
space for unreferenced objects.
So, your application's performance scales with the amount of free
physical memory. (IMHO)
Cheers,
Benoit.