Jerome Symons
Hi everyone,
I have developed a C# client/server system that creates bitmap images
at the server and sends them to the client as an array of bytes using
a Socket.
When the data rate through the socket reaches about 4.5 MB/s, the
Socket.Read call at the client end consumes all of my CPU. I don't
understand why reading the data from the socket is so CPU-intensive.
My socket is a TCP/IP stream socket, and the images sent from the
server are about 300 kB each. I can send about 15 images/s before the
communication between the server and client breaks down. I assume this
is happening because the client's CPU runs out of steam.
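For reference, here is a minimal sketch of the kind of client-side receive loop I mean (the names `ReceiveImage` and `socket` are illustrative, not my actual code). A single blocking `Socket.Receive` into a preallocated buffer, rather than many small reads or polling on `Socket.Available`, is the pattern I was aiming for:

```csharp
// Hypothetical sketch of the client-side read path, assuming the
// server sends a fixed-size (~300 kB) image per frame.
using System;
using System.Net.Sockets;

static class ImageReceiver
{
    // Reads exactly 'length' bytes from a connected TCP socket.
    // A blocking Receive into one large preallocated buffer avoids
    // the busy-polling and per-call overhead that can pin a CPU core.
    public static byte[] ReceiveImage(Socket socket, int length)
    {
        var buffer = new byte[length];
        int received = 0;
        while (received < length)
        {
            // Receive blocks until at least one byte is available.
            int n = socket.Receive(buffer, received,
                                   length - received, SocketFlags.None);
            if (n == 0) // remote end closed the connection
                throw new SocketException((int)SocketError.ConnectionReset);
            received += n;
        }
        return buffer;
    }
}
```
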
Thanks for any help.
Jerome
PS - No, I cannot use JPEG compression to reduce the size of the
images.