Memory leak

Sharon

Hi to all.
My program starts at about 7,000 KB of memory consumption.
After a few hours it gets to over 200 MB.
Running GC.Collect() periodically did not help, so it must be a leak.
How can I find where the memory is going?
Thanks, Sharon.
 
Richard A. Lowe

.NET/CLR will use as much memory as is available until external memory
pressure forces Windows to reduce the working set of the application. Try
it: do some other memory-intensive things while your app is idling and
you'll see the memory used by your app decrease.

Note that even if objects are collected inside the CLR, that does NOT mean
the working set (the physically allocated memory) will be reduced. There
are ways of doing this; however, they are not generally recommended and
require invoking Win32 API calls.
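
For completeness, here's a rough sketch of what that Win32 approach looks
like (untested, and again not something I'd generally recommend;
SetProcessWorkingSetSize is the call involved, and the class and method
names here are just for illustration):

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

static class WorkingSetTrimmer
{
    // SetProcessWorkingSetSize is the Win32 API that trims a process's
    // working set; trimmed pages come back on demand via page faults.
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess,
        IntPtr dwMinimumWorkingSetSize, IntPtr dwMaximumWorkingSetSize);

    public static void Trim()
    {
        // Passing -1 for both sizes asks Windows to trim the working set
        // as far as it can.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
            (IntPtr)(-1), (IntPtr)(-1));
    }
}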
 
Sharon

Thanks Richard.
I've tried it and .NET does behave as you say.
Still, 200MB memory consumption is alarming :)
At least it's not a leak.
Sharon.
 
John Wood

Actually, reducing the working set doesn't require a P/Invoke... just try
this:

System.Diagnostics.Process.GetCurrentProcess().MaxWorkingSet =
    System.Diagnostics.Process.GetCurrentProcess().MinWorkingSet;
 
Nicholas Paldino [.NET/C# MVP]

I find it interesting that this is "alarming" (a reaction most people have
when they see this behavior for the first time).

I mean, as long as the CLR gives back memory when the OS needs it, there
shouldn't be an issue. The CLR takes what it needs when memory is available,
and returns it when told to by something more authoritative (the OS). There
is no reason to have all those resources go to waste =)
 
John Wood

The biggest issue is with the users who aren't clever (or interested) enough
to understand memory management in the CLR, but who have been told by
someone what Task Manager is for.

I guess the problem is that it just *looks* bad...

 
Sharon

I still have no idea why my application needs 200 MB.
It should work with less.
I assume .NET increases the working set according to the application's needs.
I tried a tool called Memory Profiler, but it crashed.
Sharon.

 
Mihai Diac

Hi

I don't know if this is the way to do it, but I'm continuing this thread
since it seems to be about a very similar problem (if I'm making a mistake,
please tell me; I'm just trying to get to the bottom of a very serious
problem I have).

It's mostly the same thing: a C# application whose memory consumption grows
to over 200 MB. The problem is that after a while it crashes in a
very nasty way.

The application is quite complex, but the constant growth in memory usage
isn't at all justified. It responds slower and slower until at one moment
it just freezes.

Hope somebody has an idea
Thanks

Mihai
 
Sherif ElMetainy

Hello

Most probably something is still holding references to those objects. For
example, you may be adding objects to an ArrayList or Hashtable and not
removing them when they are no longer needed. I suggest using a memory
profiler to find out what types of objects are being allocated.
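
To give a concrete (made-up) example, a class like this will look exactly
like a leak even though the GC is working correctly, because the Hashtable
keeps every entry reachable:

using System.Collections;

class DocumentCache
{
    // The Hashtable holds strong references, so nothing added here can
    // ever be collected until it is explicitly removed.
    private Hashtable cache = new Hashtable();

    public void Add(string key, byte[] data)
    {
        cache[key] = data;      // grows without bound if Remove is never called
    }

    public void Remove(string key)
    {
        cache.Remove(key);      // forgetting calls like this is the classic "leak"
    }
}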

Best regards,
Sherif
 
Benoit Vreuninckx

Hi,

My story: last year I did a memory stress test by simply creating a
bunch of small objects (8 bytes in size, that is, an integer plus the
implicit vmt pointer) and storing them in an array (500,000 elements).
When the array was full, a new array was created and populated, and so on
and so on. This went well up until a certain memory load. Beyond that
point, further allocation began taking more and more time. You could also
notice lots of hard disk activity, which must have been swap activity.
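
For what it's worth, the test was roughly along these lines (this is a
reconstruction from memory; the names are mine):

using System;
using System.Collections;

class Small
{
    public int Value;   // the small object described above: an int plus the vmt pointer
}

class StressTest
{
    static void Main()
    {
        ArrayList blocks = new ArrayList();
        while (true)
        {
            // Fill a 500,000-element array with small objects and keep it
            // reachable, so the total memory load keeps growing.
            Small[] block = new Small[500000];
            for (int i = 0; i < block.Length; i++)
                block[i] = new Small();
            blocks.Add(block);
            Console.WriteLine("Allocated {0} blocks", blocks.Count);
        }
    }
}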
My conclusion: everything goes well when there's enough free physical
memory. No (or less) swapping is needed, the garbage collector does not
run very often, and allocation is pretty fast. But when free memory becomes
very scarce (the OS is already swapping things back and forth), the GC runs
more often. When the GC runs, it scans whatever it thinks is necessary,
causing even more page faults and causing the application to choke. This
does not happen in applications without a garbage collector (pretty much
every application before the .NET/Java/... period). With 'native'
applications, pages that are swapped out of memory stay on disk until the
application references them. There's no garbage collector that needs to
scan (read: reload from the swap file) the *entire* memory space for
unreferenced objects.
So, your application scales with the amount of free physical memory.
(IMHO)

Cheers,
Benoit.
 
Sherif ElMetainy

Hello

You are right, but in your test you are allocating a lot of large arrays,
which go to the large object heap. Allocating a lot of large objects
causes a lot of generation 2 collections, which hurts performance badly.
Typically you should design your application so that large-object
allocations are not frequent, and when they do happen, the large objects
should not be short-lived.
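
As a rough illustration (my own example; GC.CollectionCount needs .NET 2.0
or later): each array of 500,000 object references is about 2 MB on a
32-bit system, well over the roughly 85,000-byte large-object threshold,
so every one of those arrays lands on the large object heap, and you can
watch the generation 2 collections pile up:

using System;

class LohDemo
{
    static void Main()
    {
        for (int i = 0; i < 1000; i++)
        {
            // ~2 MB of references per array: over the large-object
            // threshold, so each allocation goes straight to the LOH.
            object[] big = new object[500000];
            GC.KeepAlive(big);
        }
        Console.WriteLine("Gen 2 collections: {0}", GC.CollectionCount(2));
    }
}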

Best regards,
Sherif
 
M

Mihai Diac

Thanks guys, your posts are useful and also gave me some insight into the
problem. I'll post back when I get things working (hopefully soon).
 
Mike P

I had a similar problem, and I found that the key is finding out why
references are left around when they shouldn't be (or when you think they
shouldn't be). I found a tool called Memory Profiler. For example, setting
up an event handler from one class to another creates a link between the
two classes. To break the link you need to unhook the event handler. (This
was not obvious.)
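
To illustrate the kind of link I mean (the class names here are made up for
the example):

using System;

class Publisher
{
    public event EventHandler SomethingHappened;
}

class Subscriber : IDisposable
{
    private Publisher publisher;

    public Subscriber(Publisher p)
    {
        publisher = p;
        publisher.SomethingHappened += new EventHandler(OnSomethingHappened);
    }

    private void OnSomethingHappened(object sender, EventArgs e) { }

    public void Dispose()
    {
        // Without this line, the publisher's event keeps the subscriber
        // reachable for as long as the publisher itself is reachable.
        publisher.SomethingHappened -= new EventHandler(OnSomethingHappened);
    }
}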



However, even with all the memory profiling tools and using Dispose
everywhere, I still noticed growth in memory consumption over time. So, in
the end, I think the problem can be controlled with reference counting and
Dispose, but I still believe the GC and the memory management system in
.NET need more work. I hope version 2.0 fixes some of these problems.
 
Ludovic SOEUR

I had similar problems. I tried ".NET Memory Profiler 2.0" to profile memory
allocation.
Most of the leaks were obvious: a link to a parent class that was not
unset before setting the object to null.
But there were 3 leaks that were not obvious at all:
RichTextBox, MainMenu and ToolTip must be disposed BEFORE setting the object
to null:

public void unRegister()
{
    myRichText.Dispose();
    myMainMenu.Dispose();
    myToolTip.Dispose();
}

In the calling code:

myObject = new MyObject();
.....
.....
myObject.unRegister();
myObject = null;


Hope it helps,
Ludovic Soeur.
 

Ask a Question

Want to reply to this thread or ask your own question?

You'll need to choose a username for the site, which only take a couple of moments. After that, you can post your question and our members will help you out.

Ask a Question

Top