Low .NET Socket performance!


Guest

Hi,

I have experienced low performance using the System.Net.Sockets.Socket object:
1. An unmanaged application that uses I/O completion ports performs about 1.33x better than the same app written in C# using the Begin*/End* APIs.
2. I did an additional test using .NET sockets: I created an application that uses the blocking interface of the .NET sockets to receive data ( Socket.Receive ). Surprisingly, the 'blocking' application performed about 15% better than the application that uses async I/O, even though the performance monitor showed that the async I/O app produces fewer context switches and less interop marshaling. Why is that happening? Why does the blocking 'server' give better performance than the async server?

Note that I used the same client for both of the servers.

Attached is the server/client code used to produce the results; I have tried to make the code as compact as possible. It is a single executable that runs both client and server ( according to command-line arguments ). The test was done on the same computer, so the network bottleneck is eliminated. Following is the actual code (just cut and paste it into a console application):
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

namespace SocketTester
{
    class Program
    {
        static protected byte[] m_btBuffer = new byte[1024 * 30];
        static protected Socket m_Socket = null;
        static protected Socket m_socClient = null;
        static protected long m_lEndTime = 0;
        static protected long m_lDataReceived = 0;
        static protected AsyncCallback m_ReceptionDelegate = null;

        static protected void ReceptionCallback(IAsyncResult ar)
        {
            long lReceivedAmount = (long)m_socClient.EndReceive(ar);
            m_lDataReceived += lReceivedAmount;
            if (0 == lReceivedAmount)
            {
                m_lEndTime = Environment.TickCount;
                ((EventWaitHandle)ar.AsyncState).Set();
                return;
            }
            m_socClient.BeginReceive(m_btBuffer, 0, m_btBuffer.Length, 0, m_ReceptionDelegate, ar.AsyncState);
        }

        static void Main(string[] args)
        {
            if ('l' == args[0][0]) // [L]istener
            {
                m_Socket = new Socket(AddressFamily.InterNetwork,
                                      SocketType.Stream,
                                      ProtocolType.Tcp);
                m_Socket.Bind(new IPEndPoint(0, 5001));
                m_Socket.Listen(10);
                m_socClient = m_Socket.Accept();
                Console.WriteLine("Session Started...");
                int lTime = Environment.TickCount;
                int iBytesRead = 0;
                if ('s' == args[0][1]) // [L]istener [S]ynchronous
                {
                    while (0 != (iBytesRead = m_socClient.Receive(m_btBuffer)))
                        m_lDataReceived += iBytesRead;
                    m_lEndTime = Environment.TickCount;
                }
                else if ('a' == args[0][1]) // [L]istener [A]synchronous
                {
                    EventWaitHandle evntCompletion = new EventWaitHandle(false, EventResetMode.ManualReset);
                    m_ReceptionDelegate = new AsyncCallback(ReceptionCallback);
                    m_socClient.BeginReceive(m_btBuffer, 0, m_btBuffer.Length, 0, m_ReceptionDelegate, evntCompletion);
                    evntCompletion.WaitOne();
                }
                float fDuration = (m_lEndTime - lTime) / 1000.0f;
                Console.WriteLine("Duration: {0}sec", fDuration);
                Console.WriteLine("Throughput: {0}MB/sec", (m_lDataReceived / (1000 * 1000)) / fDuration);
                Console.WriteLine("Session terminated...");
            }
            else if ("c" == args[0]) // [C]lient, data originator.
            {
                TcpClient Client = new TcpClient("127.0.0.1", 5001);
                NetworkStream Stream = Client.GetStream();
                int iAmount = (int.Parse(args[1]) * 1000 * 1000) / m_btBuffer.Length;
                for (int i = 0; i < iAmount; i++)
                    Stream.Write(m_btBuffer, 0, m_btBuffer.Length);
                Stream.Close();
                Client.Close();
            }
            else
            {
                Console.WriteLine("SocketTester.exe c <amount to send in MB> - Client");
                Console.WriteLine("                la - Listener, Asynch");
                Console.WriteLine("                ls - Listener, Synch");
            }
        }
    }
}

Nadav
http://www.ddevel.com
 

Jon Skeet [C# MVP]

Attached is the server/client code used to produce the results; I have
tried to make the code as compact as possible. It is a single
executable that runs both client and server ( according to
command-line arguments ). The test was done on the same computer, so
the network bottleneck is eliminated. Following is the actual code
(just cut and paste it into a console application):

Thanks for the code, but it doesn't compile - EventWaitHandle isn't
found. Is this a Whidbey class, or is it one of yours and you haven't
posted the code for it?
 

Guest

Here is the same code for the 2003 (.NET 1.1) version; EventWaitHandle is replaced with ManualResetEvent.

using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

namespace SocketTester
{
    class Program
    {
        static protected byte[] m_btBuffer = new byte[1024 * 30];
        static protected Socket m_Socket = null;
        static protected Socket m_socClient = null;
        static protected long m_lEndTime = 0;
        static protected long m_lDataReceived = 0;
        static protected AsyncCallback m_ReceptionDelegate = null;

        static protected void ReceptionCallback(IAsyncResult ar)
        {
            long lReceivedAmount = (long)m_socClient.EndReceive(ar);
            m_lDataReceived += lReceivedAmount;
            if (0 == lReceivedAmount)
            {
                m_lEndTime = Environment.TickCount;
                ((ManualResetEvent)ar.AsyncState).Set();
                return;
            }
            m_socClient.BeginReceive(m_btBuffer, 0, m_btBuffer.Length, 0, m_ReceptionDelegate, ar.AsyncState);
        }

        static void Main(string[] args)
        {
            if ('l' == args[0][0]) // [L]istener
            {
                m_Socket = new Socket(AddressFamily.InterNetwork,
                                      SocketType.Stream,
                                      ProtocolType.Tcp);
                m_Socket.Bind(new IPEndPoint(0, 5001));
                m_Socket.Listen(10);
                m_socClient = m_Socket.Accept();
                Console.WriteLine("Session Started...");
                int lTime = Environment.TickCount;
                int iBytesRead = 0;
                if ('s' == args[0][1]) // [L]istener [S]ynchronous
                {
                    while (0 != (iBytesRead = m_socClient.Receive(m_btBuffer)))
                        m_lDataReceived += iBytesRead;
                    m_lEndTime = Environment.TickCount;
                }
                else if ('a' == args[0][1]) // [L]istener [A]synchronous
                {
                    ManualResetEvent evntCompletion = new ManualResetEvent(false);
                    m_ReceptionDelegate = new AsyncCallback(ReceptionCallback);
                    m_socClient.BeginReceive(m_btBuffer, 0, m_btBuffer.Length, 0, m_ReceptionDelegate, evntCompletion);
                    evntCompletion.WaitOne();
                }
                float fDuration = (m_lEndTime - lTime) / 1000.0f;
                Console.WriteLine("Duration: {0}sec", fDuration);
                Console.WriteLine("Throughput: {0}MB/sec", (m_lDataReceived / (1000 * 1000)) / fDuration);
                Console.WriteLine("Session terminated...");
            }
            else if ("c" == args[0]) // [C]lient, data originator.
            {
                TcpClient Client = new TcpClient("127.0.0.1", 5001);
                NetworkStream Stream = Client.GetStream();
                int iAmount = (int.Parse(args[1]) * 1000 * 1000) / m_btBuffer.Length;
                for (int i = 0; i < iAmount; i++)
                    Stream.Write(m_btBuffer, 0, m_btBuffer.Length);
                Stream.Close();
                Client.Close();
            }
            else
            {
                Console.WriteLine("SocketTester.exe c <amount to send in MB> - Client");
                Console.WriteLine("                la - Listener, Asynch");
                Console.WriteLine("                ls - Listener, Synch");
            }
        }
    }
}

Nadav
http://www.ddevel.com
 

Jon Skeet [C# MVP]

Nadav said:
Same code for the 2003 version.

<snip>

Hmm. Well, I can reproduce the problem. I'm not entirely sure why it
happens, to be honest. I'll have a think and try to investigate it
further.

Btw, you do realise that there's no point in doing

if (0==x)
type comparisons in C# rather than the (IMO) more readable
if (x==0)

I assume this is a hangover from C/C++ coding, where it avoids
accidental typos of if (x=0) - but that doesn't compile in C# anyway,
as the condition of an if statement has to be boolean.
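
Jon's point can be checked with a minimal snippet (mine, not from the thread): in C# the condition of an if must be of type bool, so the accidental assignment is rejected at compile time (error CS0029), which makes the reversed "0 == x" ordering unnecessary.

```csharp
using System;

class IfDemo
{
    static void Main()
    {
        int x = 0;
        // if (x = 5) { }  // does not compile: error CS0029,
        //                 // "Cannot implicitly convert type 'int' to 'bool'"
        if (x == 0)        // the natural ordering is already safe in C#
            Console.WriteLine("x is zero");
    }
}
```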
 

Guest

A little fix-up on what I was saying in my original post: the async listener ( the one that uses I/O completion ports ) produced many more context switches than the blocking listener ( according to the performance counter ). This result is the opposite of what was expected... My guess is that because EndReceive is used to wait for the operation's completion, it must wait on OVERLAPPED.hEvent. That is fine as long as it is done OUTSIDE the completion callback; inside the completion callback the wait is not needed, since 'we already know' the operation has completed. If there were a way of getting the number of bytes read other than calling EndReceive, it would probably solve the problem ( assuming my guess is correct ).
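
A sketch of that idea (mine, not the original poster's code, and using SocketAsyncEventArgs, which was only added later in .NET 3.5): the completion handler reads the byte count directly from e.BytesTransferred, so it never has to call EndReceive or wait on an overlapped event. The example pulls 1 MB over loopback and prints the total received.

```csharp
using System;
using System.Net;
using System.Net.Sockets;
using System.Threading;

class SaeaSketch
{
    static long s_received;
    static ManualResetEvent s_done = new ManualResetEvent(false);

    // Completion handler: the byte count is available directly as
    // e.BytesTransferred - no End* call, no wait on an overlapped event.
    static void OnCompleted(object sender, SocketAsyncEventArgs e)
    {
        Socket sock = (Socket)e.UserToken;
        do
        {
            if (e.SocketError != SocketError.Success || e.BytesTransferred == 0)
            {
                s_done.Set(); // peer closed the connection (or an error occurred)
                return;
            }
            s_received += e.BytesTransferred;
            // ReceiveAsync returns false when it completed synchronously;
            // the Completed event does not fire in that case, so loop here.
        } while (!sock.ReceiveAsync(e));
    }

    static void Main()
    {
        Socket listener = new Socket(AddressFamily.InterNetwork,
                                     SocketType.Stream, ProtocolType.Tcp);
        listener.Bind(new IPEndPoint(IPAddress.Loopback, 0)); // any free port
        listener.Listen(1);
        int port = ((IPEndPoint)listener.LocalEndPoint).Port;

        // Client thread: push 1 MB over loopback, then close.
        Thread client = new Thread(delegate()
        {
            TcpClient c = new TcpClient("127.0.0.1", port);
            byte[] payload = new byte[1024];
            for (int i = 0; i < 1024; i++)
                c.GetStream().Write(payload, 0, payload.Length);
            c.Close();
        });
        client.Start();

        Socket conn = listener.Accept();
        SocketAsyncEventArgs ea = new SocketAsyncEventArgs();
        ea.SetBuffer(new byte[1024 * 30], 0, 1024 * 30);
        ea.UserToken = conn;
        ea.Completed += OnCompleted;
        if (!conn.ReceiveAsync(ea))
            OnCompleted(null, ea); // first receive completed synchronously
        s_done.WaitOne();

        client.Join();
        conn.Close();
        listener.Close();
        Console.WriteLine("Received {0} bytes", s_received);
    }
}
```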

Nadav
http://www.ddevel.com
 