.NET 1.1
Hi.
My C# .NET application has a memory leak - if I leave it running for hours
then the system ends up using 500MB+ of virtual memory, which is released
immediately when I close the app. Now, I thought this would be impossible
with automatic garbage collection, but it seems this is not the case.
The application is threaded and all work is done within spawned threads. No
thread runs for the lifetime of the app - they are started and ended as
required, so I thought that even if I did not explicitly release an object's
resources, GC would at some point do this for me. As far as I can see
though, I AM releasing all resources.
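To show what I mean by "releasing all resources", each worker thread follows
roughly this pattern (a simplified sketch - the class and method names are
placeholders, not the real ones from my app):

    using System;
    using System.Threading;

    // Placeholder for the kind of disposable resource a worker uses;
    // the real classes are different, this is just illustrative.
    class SomeResource : IDisposable
    {
        public void Process()
        {
            // ... the actual work happens here ...
        }

        public void Dispose()
        {
            // limited/unmanaged resources released here
        }
    }

    class Worker
    {
        // Each job spawns a short-lived thread, roughly like this.
        public static void RunJob()
        {
            Thread t = new Thread(new ThreadStart(DoWork));
            t.Start();
        }

        static void DoWork()
        {
            // Disposable objects are released deterministically via using,
            // so only plain managed objects are left for the GC to reclaim
            // once the thread exits.
            using (SomeResource res = new SomeResource())
            {
                res.Process();
            }
        }
    }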
Can someone give any advice on:
i) how this situation could occur in the first place,
ii) how to track down where the memory leak is occurring.
Any help is appreciated.
Thanks,
Mike