Memory leak with socket BeginReceive?

ianrae88

We are getting a memory leak in code that uses async sockets. The
Private Bytes perf counter keeps increasing, as does Bytes in All
Heaps. CLRProfiler shows a lot of old AsyncResult objects (180 seconds
old) in the Gen 2 heap.

It looks very similar to the thread (link below) about how
BeginReceive causes the buffer you pass in to be pinned, creating
"holes" in the .NET memory heap. The pinned objects prevent heap
compaction, and since allocation always occurs at the top of the heap,
the CLR must continually allocate more Win32 memory.

http://groups.google.ca/group/micro...ocket+gc+pinned&rnum=1&hl=en#06e053ecb340f2f6

Strangely, the addition of a Thread.Sleep(1) seems to make the memory
leak go away.

Basically, here is the BeginReceive callback function:

void AsyncReceiveCallback(IAsyncResult result)
{
    VeoAsyncIOInfo thisConnection = (VeoAsyncIOInfo)result.AsyncState;
    int num_read = thisConnection.m_socket.EndReceive(result);
    if (0 != num_read) {
        ProcessNewData(thisConnection.m_buffer, num_read,
            thisConnection.m_socket);
    }

    Thread.Sleep(1); //hack hack hack

    thisConnection.m_socket.BeginReceive(thisConnection.m_buffer, 0,
        MAXMSGSIZE, SocketFlags.None,
        new AsyncCallback(AsyncReceiveCallback), thisConnection);
}

It appears as though the sleep delays the next BeginReceive until
after a reply has been sent on the same socket. ProcessNewData()
queues the data; another thread consumes it and sends the reply (a
rough sketch of that hand-off follows).
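
Roughly like this, assuming a simple locked Queue and a dedicated
reply thread; ReplyWorker, Enqueue, and Run are invented names for
illustration, not the actual ProcessNewData implementation:

using System;
using System.Collections;
using System.Net.Sockets;
using System.Threading;

class ReplyWorker
{
    private readonly Queue m_queue = new Queue();

    // Called from the receive callback (the ProcessNewData role): copy
    // the received bytes and wake the worker thread.
    public void Enqueue(byte[] buffer, int count, Socket socket)
    {
        byte[] copy = new byte[count];
        Array.Copy(buffer, 0, copy, 0, count);
        lock (m_queue)
        {
            m_queue.Enqueue(new object[] { copy, socket });
            Monitor.Pulse(m_queue);
        }
    }

    // Runs on its own thread: dequeue each message and send a reply on
    // the same socket the data arrived on.
    public void Run()
    {
        while (true)
        {
            object[] item;
            lock (m_queue)
            {
                while (m_queue.Count == 0)
                    Monitor.Wait(m_queue);
                item = (object[])m_queue.Dequeue();
            }
            byte[] data = (byte[])item[0];
            Socket socket = (Socket)item[1];
            socket.Send(data); // echo-style reply; real work omitted
        }
    }
}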

Are there rules about sending on a socket while a BeginReceive is
active? It works, but it leaks.
 
ianrae88

Good point. Most examples do this, but I'll try removing it.

It doesn't, however, explain why the memory leak goes away when a
sleep is added.

I can't decide whether the problem is the ordering of the synchronous
send and the BeginReceive, or a timing issue with the GC.
 
ianrae88

Update. This leak is a byproduct of two things. First, all allocation
in the CLR takes place at the top of the heap. Second, the GC can't
compact memory below the highest (newest) pinned object. Every call to
socket.BeginReceive pins the buffer object that you pass in. It
remains pinned for a long time (until data arrives on the socket),
followed by a brief instant between EndReceive and the next
BeginReceive when it's unpinned and you process the received data.
Only if a GC occurs during that brief instant, AND it happens on the
highest pinned buffer, will memory be compacted. That's why a
Thread.Sleep helps.
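
To make "pinned" concrete: a byte[] handed to BeginReceive is pinned
by the overlapped I/O layer in much the same way an explicit GCHandle
pin fixes it in place. This is only a minimal stand-alone sketch of
pinning with GCHandle, not the socket code itself:

using System;
using System.Runtime.InteropServices;

class PinningDemo
{
    static void Main()
    {
        byte[] buffer = new byte[8192];

        // Pinning fixes the array's address so the GC cannot move it;
        // while the handle is alive, the heap cannot be compacted past
        // this object.
        GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
        Console.WriteLine("Pinned at 0x{0:X}",
            handle.AddrOfPinnedObject().ToInt64());

        // BeginReceive keeps the equivalent of this handle alive until
        // the receive completes, which can be a long time on an idle
        // socket.
        handle.Free();
    }
}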

The real fix is to allocate a fixed pool of buffers early during
program startup. They're still pinned, but they sit low in memory (a
rough sketch of such a pool follows). Or use .NET 2.0, which is
supposed to fix this problem :)
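
A sketch of that kind of pool, assuming the names BufferPool, Take,
and Return (all made up here); a real one would also need to track
which connection owns which buffer:

using System;
using System.Collections;

class BufferPool
{
    private readonly Stack m_free = new Stack();

    // Allocate every buffer up front, at startup, so they all land
    // together near the bottom of the heap. When BeginReceive pins
    // them, the "hole" they create sits below almost everything else.
    public BufferPool(int count, int bufferSize)
    {
        for (int i = 0; i < count; i++)
            m_free.Push(new byte[bufferSize]);
    }

    public byte[] Take()
    {
        lock (m_free)
        {
            if (m_free.Count == 0)
                throw new InvalidOperationException("pool exhausted");
            return (byte[])m_free.Pop();
        }
    }

    public void Return(byte[] buffer)
    {
        lock (m_free)
        {
            m_free.Push(buffer);
        }
    }
}

Each connection would Take() a buffer when it is created and Return()
it when the socket closes, so the same low-in-memory buffers are
reused instead of new ones being allocated (and pinned) near the top
of the heap.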
 
