Is this a bug in .NET garbage collection?

Guest

I found some very strange behavior while writing a C# Windows application. If I
allocate a huge chunk of memory for an array, the memory is never
released by .NET.

The problem can be demonstrated as follows:

1. Just create the simplest windows form project and add a button and
handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for(int i=0; i<aaa.Length; i++)
        aaa[i] = 10;
}

2. After executing the above code, I watched the memory in Task Manager.
The Commit Charge Total jumped by 300M, and MEM Usage showed the same
thing. The problem is that no matter what I do, the memory usage never drops
back. I have tried opening many other applications, and NOTHING makes the
memory get released.

3. The variable aaa has gone out of scope, so it does not make any sense
for .NET to still hold the memory. Now the performance of the whole system
is degraded.

4. The only way I can make the memory return to Windows is to call
GC.Collect(). Is this the only way to release that memory? Am I supposed to
do this?
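(For reference, the behavior in step 4 can be watched from code rather than Task Manager. This is a minimal console sketch of the same experiment, not the original Windows Forms project, using GC.GetTotalMemory to report the managed heap before and after a forced collection:)

```csharp
// Sketch: allocate a large array, let it become unreachable,
// then measure the managed heap before and after GC.Collect().
using System;

class GcProbe
{
    static void Allocate()
    {
        byte[] aaa = new byte[300000000];
        for (int i = 0; i < aaa.Length; i++)
            aaa[i] = 10;
        // aaa becomes unreachable when this method returns
    }

    static void Main()
    {
        Allocate();
        Console.WriteLine("Before collect: {0} bytes", GC.GetTotalMemory(false));
        GC.Collect();
        GC.WaitForPendingFinalizers();
        Console.WriteLine("After collect:  {0} bytes", GC.GetTotalMemory(false));
    }
}
```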

Can anybody confirm whether this is a bug in .NET? Apparently it does not make
sense to hold the memory FOREVER even if nobody uses it.

Thanks,

Yang.
 
Daniel O'Connell [C# MVP]

What version of the framework? If memory serves, there is a bug where the 1.0
runtime won't free large objects in certain circumstances.
 
Guest

The garbage collector works well for .NET 1.0 (SP1+) and .NET 1.1, but you
cannot always predict when it will reclaim your unused array. I am pretty
sure, though, that if you click the button twice, the memory will still be 300M
rather than 600M.
If you would like to really understand how the garbage collector works, I
recommend that you download the shared source
http://www.microsoft.com/downloads/...FA-7462-47D0-8E56-8DD34C6292F0&displaylang=en
and step through the garbage collection source code (e.g. gcXXX.cpp).
Aleksey Nudelman,
csharpcomputing.com
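(Aleksey's "click twice stays at 300M" prediction can be checked from code. A rough sketch, assuming the first array is already unreachable when the second allocation happens, so the collector is free to reuse that space:)

```csharp
// Sketch: two successive large allocations. Because the first array
// is unreachable before the second allocation, the GC can reclaim it,
// so the managed heap should stay near one array's size, not two.
using System;

class LohReuse
{
    static void Allocate()
    {
        byte[] aaa = new byte[300000000];
        aaa[0] = 10;
    }

    static void Main()
    {
        Allocate();   // first "click"
        Allocate();   // second "click" - reuses the reclaimed space
        Console.WriteLine("Managed heap: {0} bytes", GC.GetTotalMemory(false));
    }
}
```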
 
Imran Koradia

The 1.1 framework seems to work fine. The GC performs a collection and releases all
the memory very soon (about a second) after the array goes out of scope. If you're
using 1.0, then it could very well be what Daniel mentioned.

Imran.
 
Guest

Thanks for the reply. I am using .NET 1.1 SP1 (the very latest update). You
are right that if I click twice, the memory will still be 300M. But the problem
is that this 300M is there FOREVER. I also tried increasing the size of the array to
900M on my machine with 512M RAM. That 900M is there FOREVER too! And my
machine has become very slow...

csharpcomputing.com said:
The garbage collector works well for .NET 1.0 (SP1+) and .NET 1.1, but you
cannot always predict when it will reclaim your unused array. [...]
 
Guest

Thanks for the reply, but I don't think that is true in my case. I am using .NET
1.1 with the latest SP1. I also left the application running overnight and opened
many other applications. Basically, whatever I do, the memory is never
released.

Imran Koradia said:
The 1.1 framework seems to work fine. The GC performs a collection and releases
all the memory very soon after the array goes out of scope. [...]

 
Guest

Thanks for the reply. Please see my other replies. I am using .NET 1.1 SP1
(NOT 1.0 SP1). The About box in Visual Studio .NET shows version
1.1.4322 SP1.

Daniel O'Connell said:
What version of the framework? If memory serves, there is a bug where the 1.0
runtime won't free large objects in certain circumstances.

 
Daniel O'Connell [C# MVP]

Yang said:
Thanks for the reply. Please see my other replies. I am using .NET 1.1
SP1.
(NOT the 1.0 SP1). The About box in Visual Studio.NET shows the version
1.1.4322 SP1.

Then it sounds like you are holding onto the memory somewhere. Can you show
a small example program that exhibits the problem?

 
Guest

Daniel,

Thanks for the fast response. I showed the sample code in my original
post. Let me repeat it here:

Just create the simplest Windows Forms project and add a button and handler
as follows. This is all you need to reproduce the problem. Do you think the
variable aaa is held somewhere? I don't think so...

private void button1_Click(object sender, System.EventArgs e)
{
    byte[] aaa = new byte[300000000];
    for(int i=0; i<aaa.Length; i++)
        aaa[i] = 10;
}





Daniel O'Connell said:
Then it sounds like you are holding onto the memory somewhere. Can you show
a small example program that exhibits the problem?

 
Chris Lyon [MSFT]

Hi Imran

Task Manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessarily the same thing.

To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.

Thanks
-Chris

--------------------
Thanks for the reply. But I think it is not true in my case. I am using .NET
1.1 with the latest SP1. I also left the application overnight and opened
many other applications. Basically whatever I do, the memory has never been
released.


--

This posting is provided "AS IS" with no warranties, and confers no rights. Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm

Note: For the benefit of the community-at-large, all responses to this message are best directed to the newsgroup/thread from which they originated.
 
Guest

Chris,

I hope you can write a couple of lines of code and reproduce this problem on
your computer.

I am very sure this is a problem, because I examined it with .NET Memory
Profiler, GC.GetTotalMemory() and Task Manager.

Here is the .NET Memory Profiler I used: http://www.scitech.se/memprofiler/

Also, 2 of my coworkers got the same result on their computers.

One thing is obvious - you don't even need any tools to verify - after you
allocate such a big chunk of memory (I tried 900MB), your whole system
becomes slow and never comes back until you close the app.

AGAIN, I HOPE EVERYONE REPEATS MY TEST BEFORE REPLYING. Tell me whether the
same problem happens on your machine.

"Chris Lyon [MSFT]" said:
Task Manager reports the working set, which is not necessarily the same thing.
To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler. [...]
 
Chris Lyon [MSFT]

I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been
performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of virtual memory being used after the collection is the large object heap itself. It expands to fit your large objects, but doesn't shrink when it's emptied. This is
similar to the way Windows manages its virtual memory.

See http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnbda/html/DBGch02.asp for more information
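(The distinction drawn here - what is on the managed heap versus what address space the process still holds - can be shown side by side. A sketch, assuming Environment.WorkingSet is available, as it is on v1.1:)

```csharp
// Sketch: after a full collection, the managed heap is small again,
// but the working set that Task Manager shows can remain large because
// the (now empty) large object heap segments are still reserved.
using System;

class HeapVsWorkingSet
{
    static void Allocate()
    {
        byte[] aaa = new byte[300000000];
        aaa[0] = 10;
    }

    static void Main()
    {
        Allocate();
        // GC.GetTotalMemory(true) forces a full collection first.
        Console.WriteLine("Managed heap: {0}", GC.GetTotalMemory(true));
        // What the OS sees the process holding - often much larger.
        Console.WriteLine("Working set:  {0}", Environment.WorkingSet);
    }
}
```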

-Chris

--------------------
Chris,

I hope you can write a couple of lines of code and reproduce this problem on
your computer. I am very sure this is a problem, because I examined it with
.NET Memory Profiler, GC.GetTotalMemory() and Task Manager. [...]
 
Guest

Thanks Chris,

I noticed the same thing: GC.GetTotalMemory(true) and GC.Collect() can
release the memory to Windows.

However, I still think the algorithm here is not optimal. Say my application
occupied 900MB of memory; other applications running on the same computer cannot
get that memory. The reason is simply that my application allocated that memory
ONCE and never uses it any more.

I think it is good for .NET to cache that big block of memory for performance, but
it should be smart enough to release it when my application does not need it
while other applications do.

In my sample code, do you think I should call GC.Collect()? I think if I
do that, it will cause other problems which have been explained in many
articles - for example, it may take a long time to collect memory of all
generations...

One more question: since in this case I know that memory is not used,
is it possible for me to call some function to release just this particular
object?

Thanks again.

"Chris Lyon [MSFT]" said:
Calling GC.GetTotalMemory(false) after the method exits shows the array still
in memory. Calling GC.GetTotalMemory(true) shows the memory has been
collected. [...]
 
Chris Lyon [MSFT]

Hi Yang

I think the issue is that once your large object has been collected, the CLR doesn't know if you'll ever need that much memory again. It's cheaper to keep it once it's been
allocated instead of reallocating a huge chunk every time, and possibly failing that allocation. The GC bases its heuristics on your past allocation pattern, hoping it will be the
same in the future.

You are right, you don't want to call GC.Collect(). The large object will get collected the next time a full (expensive) collection occurs, so you're better off waiting, unless you've
profiled and found performance is better with your Collect.

No, there's no way to force the GC to collect individual objects.

-Chris
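(While there is no per-object collect, you can at least observe when a particular object has been reclaimed. A sketch using WeakReference, which tracks an object without keeping it alive:)

```csharp
// Sketch: a WeakReference lets you check whether the GC has reclaimed
// a specific object, without the reference itself preventing collection.
using System;

class ObserveCollection
{
    static void Main()
    {
        byte[] aaa = new byte[300000000];
        WeakReference tracker = new WeakReference(aaa);
        aaa = null;          // drop the only strong reference

        GC.Collect();        // full collection (normally you would just
                             // wait for one to happen on its own)
        Console.WriteLine("Array collected: {0}", !tracker.IsAlive);
    }
}
```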

--------------------
Thanks Chris,

I noticed the same thing that GC.GetTotalMemory(true) and GC.Collect() can
release the memory to Windows.

However I still think the algorithm here is not optimal. Say, my application
occupied 900MB memory, other applications run on the same computer cannot get
those memory. The reason of this is just because my application allocated
that memory ONCE and never use it any more.

I think it is good for .NET to cache that big memory for performance, but it
should be smart enough to release it if the situation is my application does
not need it while other applications need it.

In my sample code, do you think if I should call GC.Collect()? I think if I
do that, it will cause other problems which have been explained in many
articles, for example, it may take long time to collect memory of all
generations...

My one more opinion is: since in this case I know that memory is not used,
is it possible for me to call some functions to just release this paticular
object?

Thanks again.

"Chris Lyon [MSFT]" said:
I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been
performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of vitural memory being used after the collection is the large object heap itself. It expands to fit your large objects, but doesn't shrink when it's emptied. This is
similar to the way Windows manages its virtual memory.

See http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnbda/html/DBGch02.asp for more information

-Chris

--------------------
Chris,

I hope you can write a couple lines of code and reproduce this problem on
your computer.

I am very sure this is a problem, because I examed it by .NET Memory
Profiler, GC.GetTotalMemory() and Task Manager.

Here is the .Net Memory Profiler I used : http://www.scitech.se/memprofiler/

Also 2 of my coworkers got the same result on their computers respectively.

One thing is obvious - you even don't need any tools to verify - after you
allocate such a big thunk of memory (I tried 900MB), your whole system
becomes slow and never come back until you close the app.

AGAIN, I HOPE EVERYBODY REPEAT MY TEST BEFORE REPLYING. Tell me if the same
problem happens on your machine or not.

:

Hi Imran

Task manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessary the same thing.

To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.

Thanks
-Chris

--------------------


Thanks for the reply. But I think it is not true in my case. I am using .NET
1.1 with the latest SP1. I also left the application overnight and opened
many other applications. Basically whatever I do, the memory has never been
released.

:

1.1 framework seems to work fine. The GC performs a collect and releases all
the memory very soon (a second) after the array runs out of scope. If you're
using 1.0 then it could very well be what Daniel mentioned.

Imran.


I found a very strange behavior when I write a C# windows application. If I
allocate a huge chunk of memory by an array, the memory will never be
released by .NET.

The problem can be demostrated as follows:

1. Just create the simplest windows form project and add a button and
handler like this:

private void button1_Click(object sender, System.EventArgs e)
{
byte[] aaa = new byte[300000000];
for(int i=0; i<aaa.Length; i++)
aaa = 10;
}

2. After executing the above code, I observed the memory in Task Manager.
The Commit Charge Total jumped to 300M and the MEM Usage shows the same
thing. The problem is whatever you do the memory usage can never drop
back. I
have tried to open many other applications and NOTHING can make the memory
get released.

3. The variable aaa has gone out of scope, so it does not make any sense
that the .NET still holds the memory. Now the performance of the whole
system
is downgraded.

4. The only thing I can make the memory release to Windows is to call
GC.Collect(). Is this the only way to release that memory? Do I suppose to
do
this?

Can anybody confirm if this is a bug in .NET? Appearantly it does not make
sense to hold the memory FOREVER even if nobody uses it.

Thanks,

Yang.






--

This posting is provided "AS IS" with no warranties, and confers no rights. Use of included script samples are subject to the terms specified at
http://www.microsoft.com/info/cpyright.htm

Note: For the benefit of the community-at-large, all responses to this message are best directed to the newsgroup/thread from which they originated.



 
G

Guest

Chris,

Thanks for the fast response. Please see my comments below:

"Chris Lyon [MSFT]" said:
Hi Yang

I think the issue is that once your large object has been collected, the CLR doesn't know if you'll ever need that much memory again. It's cheaper to keep it once it's been
allocated instead of reallocating a huge chunk every time, and possibly failing that allocation.

I understand this part. But I am talking about an extreme
situation where that retained memory degrades the entire system's performance.
My point is that in this extreme situation, .NET should release that memory to
Windows and allow other applications to use it.
You are right, you don't want to call GC.Collect(). The large object will get collected the next time a full (expensive) collection occurs, so you're better off waiting, unless you've
profiled and found performance is better with your Collect.

This confused me. From my observation, if I don't call GC.Collect() to force
.NET to release the memory, then every time I allocate a bigger block of memory,
.NET keeps that memory in its heap. In other words, the total memory occupied
by the application will be the maximum amount of memory it has ever allocated.

In my example, I allocated 300MB of memory. Can you tell me under what conditions
that 300MB will be reduced? From my observation, if I allocate 10MB
afterwards, the 300MB is still there. If I allocate 400MB, then my application
holds onto that 400MB! Basically, the memory never goes down.
No, there's no way to force the GC to collect individual objects.

-Chris
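There is no per-object free in .NET, but the closest approximation - dropping the last reference and then forcing a full collection - can be sketched as follows. This is only a sketch of the workaround discussed above (GC.Collect() is generally discouraged in production code, and the class name is illustrative):

```csharp
using System;

class ReleaseDemo
{
    static void Main()
    {
        byte[] aaa = new byte[300000000];   // large enough to land on the large object heap
        for (int i = 0; i < aaa.Length; i++)
            aaa[i] = 10;

        long before = GC.GetTotalMemory(false);

        aaa = null;                     // drop the only reference to the array
        GC.Collect();                   // force a full collection, including the LOH
        GC.WaitForPendingFinalizers();

        long after = GC.GetTotalMemory(true);
        Console.WriteLine("{0:N0} -> {1:N0} bytes", before, after);
    }
}
```

After the forced collection, GC.GetTotalMemory reports far less than the 300MB that was live, even though the reserved LOH segment itself may remain.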

--------------------
Thanks Chris,

I noticed the same thing: GC.GetTotalMemory(true) and GC.Collect() can
release the memory to Windows.

However, I still think the algorithm here is not optimal. Say my application
occupied 900MB of memory; other applications running on the same computer cannot
get that memory. The reason is simply that my application allocated
that memory ONCE and never uses it any more.

I think it is good for .NET to cache that big memory for performance, but it
should be smart enough to release it when my application no longer needs it
while other applications do.

In my sample code, do you think I should call GC.Collect()? I think if I
do, it will cause other problems that have been explained in many
articles; for example, it may take a long time to collect memory across all
generations...

One more question: since in this case I know that memory is no longer used,
is it possible for me to call some function to release just this particular
object?

Thanks again.

"Chris Lyon [MSFT]" said:
I tried running the test myself.

Calling GC.GetTotalMemory(false) after the method exits shows the array still in memory. This is expected, since a full collection of the large object heap has not yet been
performed. Calling GC.GetTotalMemory(true) shows the memory has been collected. This is using v1.1 SP1.

The amount of virtual memory still in use after the collection is the large object heap itself. It expands to fit your large objects but doesn't shrink when it's emptied. This is
similar to the way Windows manages its virtual memory.

See http://msdn.microsoft.com/library/default.asp?url=/library/en-us/dnbda/html/DBGch02.asp for more information

-Chris

--------------------


Chris,

I hope you can write a couple lines of code and reproduce this problem on
your computer.

I am very sure this is a problem, because I examined it with the .NET Memory
Profiler, GC.GetTotalMemory() and Task Manager.

Here is the .Net Memory Profiler I used : http://www.scitech.se/memprofiler/

Also, two of my coworkers got the same result on their computers.

One thing is obvious - you don't even need any tools to verify it - after you
allocate such a big chunk of memory (I tried 900MB), your whole system
becomes slow and never recovers until you close the app.

AGAIN, I HOPE EVERYBODY REPEATS MY TEST BEFORE REPLYING. Tell me whether the same
problem happens on your machine or not.

:

Hi Imran

Task Manager is a bad way to determine how much memory your .NET application is using. Task Manager reports the working set, which is not necessarily the same thing.

To verify the GC is working, use GC.GetTotalMemory() or the CLR Profiler, and you should find the memory does in fact get collected.

Thanks
-Chris
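Chris's point can be checked from inside the process itself. A minimal sketch comparing the two numbers (Environment.WorkingSet is roughly what Task Manager's "Mem Usage" column reports; the class name is illustrative):

```csharp
using System;

class MeasureDemo
{
    static void Main()
    {
        byte[] big = new byte[100000000];
        for (int i = 0; i < big.Length; i++)
            big[i] = 1;                      // touch every page so it is really committed

        Console.WriteLine("Managed heap: {0:N0}", GC.GetTotalMemory(false));
        Console.WriteLine("Working set:  {0:N0}", Environment.WorkingSet);

        big = null;

        // The managed heap shrinks once a full collection has run...
        Console.WriteLine("Managed heap after collect: {0:N0}", GC.GetTotalMemory(true));
        // ...but the working set / reserved LOH segment can stay large,
        // which is what Task Manager keeps showing.
        Console.WriteLine("Working set after collect:  {0:N0}", Environment.WorkingSet);
    }
}
```

The two measurements diverging after the collection is exactly the situation in this thread: the GC has reclaimed the object, while the process still holds the address space.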

 
C

Chris Lyon [MSFT]

Hi Yang
I understand this part. But I am talking about an extreme
situation where that retained memory degrades the entire system's performance.
My point is that in this extreme situation, .NET should release that memory to
Windows and allow other applications to use it.

Have you found this to be true with a non-trivial .NET application? The GC tunes itself to your application. If all your app does is allocate a huge chunk of memory, then exits, it
won't try to reduce its size in memory.
This confused me. From my observation, if I don't call GC.Collect() to force
.NET to release the memory, then every time I allocate a bigger block of memory,
.NET keeps that memory in its heap. In other words, the total memory occupied
by the application will be the maximum amount of memory it has ever allocated.

I think you're confusing objects in memory and the size of the large object heap. GC.GetTotalMemory and profilers will tell you that the object has been collected, but the size of
the reserved large object heap remains the same.
In my example, I allocated 300MB of memory. Can you tell me under what conditions
that 300MB will be reduced? From my observation, if I allocate 10MB
afterwards, the 300MB is still there. If I allocate 400MB, then my application
holds onto that 400MB! Basically, the memory never goes down.

According to the article I pointed you to, the large object heap doesn't shrink. Again, if this is a problem in a real-world application, and not a trivial test, let me know. If that's the
case, then you probably need to redesign your application with memory usage in mind. I can point you to several resources if you need.

Thanks
-Chris
 
G

Guest

Hi Chris,

I totally understand what you are talking about, and I don't think I'm confused
about anything. Actually, I found this problem in the REAL
project I am currently working on. In this project, we need to run a
simulation that involves very intensive math calculations and a huge array.
That is how we found this problem.

You might be right that we need to redesign the project because .NET does
not have the ability to handle huge arrays correctly. If I interpret your
suggestion about redesign correctly, you are telling us NOT to use huge arrays.
But in our calculation it is almost impossible not to use a huge array. Even if
someday we change our simulation algorithm to avoid the huge array,
it would make the algorithm very complex and degrade performance. BUT -
since I have a 1GB memory chip in my machine, why can't I use it? Why do I
have to close my application to release its memory to other applications?

I have described how to handle this situation ideally in
my previous posts. You simply do not want to admit I am right, and you
do not want to say .NET needs improvement - this is the part that really
confuses me...

Yang.
 
J

Jon Skeet [C# MVP]

But do you only need this huge array once, and your application will
keep running for a long time after it's no longer used, never
allocating large objects? That's the situation in which it becomes a
problem, as I understand it.
 
G

Guest

But do you only need this huge array once, and your application will
keep running for a long time after it's no longer used, never
allocating large objects? That's the situation in which it becomes a
problem, as I understand it.

Our application controls an irrigation system. One of its features is
simulating how much water will be used. If users want to simulate one day's
water usage, it consumes less memory, but if they want to simulate water usage
over a couple of months, it needs a huge amount of memory. Also, users can
run the simulation at any time: maybe once a day, or maybe once a week...

What our QA team observed was that the memory used by this application,
after the simulation, can increase but never decreases. After researching for
a few days, I found that our situation can be generalized by the sample code I
posted here.
 
C

Chris Lyon [MSFT]

Hi Yang
You might be right that we need to redesign the project because .NET does
not have the ability to handle huge arrays correctly. If I interpret your
suggestion about redesign correctly, you are telling us NOT to use huge arrays.

I don't think I ever said .NET doesn't handle huge arrays correctly :)
What I mean is that the way you should design your managed application, with respect to memory allocations, is very different from unmanaged. In C++ you can allocate a huge array
and, as soon as you're done with it, delete it, and the memory is freed. Obviously with a GC you can't do that, and large objects are handled differently, since they are so expensive to
create, deallocate, re-create, etc.

Instead of a huge contiguous chunk of memory, is it possible for you to use an ArrayList or some other dynamic structure? That way it won't all be allocated on the LOH, and it can be
collected earlier, possibly in pieces. It's hard to give concrete advice without seeing your application, but in general you want to avoid huge memory allocations in .NET because
they limit what the GC can do (mainly for perf reasons). See Rico Mariani's blog for more on GC and perf: http://weblogs.asp.net/ricom/
I have described how to handle this situation ideally in
my previous posts. You simply do not want to admit I am right, and you
do not want to say .NET needs improvement - this is the part that really
confuses me...

Of course .NET needs improvement, that's why I have a job ;)
But I think your algorithm doesn't take into account the other issues involved in a general garbage collector's implementation. There's an excellent book on garbage collection
technology by Jones and Lins that I would recommend, if you're interested in learning more.

-Chris
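Chris's ArrayList suggestion can be taken one step further: a buffer made of many small arrays stays off the large object heap entirely, because no single allocation reaches the ~85,000-byte LOH threshold. A hypothetical sketch (ChunkedBuffer is an illustrative name, not a framework type; generics are used for brevity - on 1.1, an ArrayList of byte[] works the same way):

```csharp
using System;
using System.Collections.Generic;

// Stores data as many small arrays so that no single allocation
// reaches the ~85,000-byte large-object-heap threshold.
class ChunkedBuffer
{
    const int ChunkSize = 65536;   // well under the LOH cutoff
    readonly List<byte[]> chunks = new List<byte[]>();
    readonly long length;

    public ChunkedBuffer(long length)
    {
        this.length = length;
        long remaining = length;
        while (remaining > 0)
        {
            int size = (int)Math.Min(remaining, ChunkSize);
            chunks.Add(new byte[size]);
            remaining -= size;
        }
    }

    public long Length { get { return length; } }

    // Map a flat index onto (chunk, offset) so callers can treat
    // the buffer as one logical array.
    public byte this[long index]
    {
        get { return chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)]; }
        set { chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)] = value; }
    }
}
```

Because each chunk is an ordinary small-object-heap allocation, the GC can collect and compact the pieces individually once they are unreachable, instead of holding one giant reserved LOH segment.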

 
