Socket.BeginSend single threaded - How to?


Sharon

Hello all,

I'm using Socket.BeginSend.
In my code, a stress situation may lead to several calls to BeginSend
before any completion callback has had a chance to execute.
How can I make sure that, when the system processes the completion
events, they are serialized on a single thread rather than being
executed in parallel by several threads?
 

Peter Duniho

Sharon said:
Hello all,

I'm using Socket.BeginSend.
In my code, a stress situation may lead to several calls to BeginSend
before any completion callback has had a chance to execute.
How can I make sure that, when the system processes the completion
events, they are serialized on a single thread rather than being
executed in parallel by several threads?

Not enough information in your question for a precise answer. You have
some options though:

– Fix the design so that concurrent processing of completions of
sends isn't a problem. IMHO, this is actually the best solution,
assuming it's doable.

– If you simply need serialized processing, but the actual order
doesn't matter, then set up a producer/consumer. Or, if the processing
is not expensive, just serialize via the "lock" statement.

– If the order of processing also matters, then you'll need to
maintain a queue of sends as they are begun. Then, as each send
completes, you'll update its status in the queue and process each
completed send at the front of the queue, stopping when you find the
first one that hasn't yet completed. Again, you'll still need some
synchronization to serialize access to the queue, and whether the queue
is processed in a producer/consumer pattern, or simply in a completion
callback, will depend on how expensive you anticipate the processing
will be.
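The third option might be sketched like this in C#. This is only a sketch of the approach described above; PendingSend, ProcessSend, and the field names are illustrative, not part of any real API:

```csharp
using System;
using System.Collections.Generic;
using System.Net.Sockets;

class OrderedSender
{
    private readonly object _queueLock = new object();
    private readonly Queue<PendingSend> _pending = new Queue<PendingSend>();

    private class PendingSend
    {
        public bool Completed;
        public int BytesSent;
    }

    public void Send(Socket socket, byte[] buffer)
    {
        var entry = new PendingSend();
        lock (_queueLock)
        {
            _pending.Enqueue(entry);    // record the order sends are begun
        }

        socket.BeginSend(buffer, 0, buffer.Length, SocketFlags.None, ar =>
        {
            int n = ((Socket)ar.AsyncState).EndSend(ar);
            lock (_queueLock)
            {
                entry.Completed = true;
                entry.BytesSent = n;

                // Process completed sends from the front of the queue,
                // stopping at the first one still outstanding, so that
                // processing happens in the original send order.
                while (_pending.Count > 0 && _pending.Peek().Completed)
                    ProcessSend(_pending.Dequeue());
            }
        }, socket);
    }

    private void ProcessSend(PendingSend send)
    {
        // Hypothetical processing; replace with your own logic.
        Console.WriteLine("Processed send of {0} bytes", send.BytesSent);
    }
}
```

Note that processing inside the lock keeps the sketch simple but blocks other completion threads; if the processing is expensive, hand the dequeued entries off to a producer/consumer instead, as described above.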

Pete
 

Peter Duniho

Sharon said:
Hi Pete,

Thanks for your reply.

I'll rephrase the problem: since this async model is implemented by the
Framework using (I believe) a thread/thread pool to dequeue completion
packets, I just need to make sure that only one thread dequeues the
completion packets of my BeginSend calls (or a thread pool of size 1,
for that matter).
Is that controllable?

No.

If you don't care about the scaling benefits of using IOCP, which is
implicitly supported using the asynchronous model of the Socket class,
then you could of course implement your own asynchronous model in which
you use only one thread. How that would qualify as "asynchronous", I'm
not really sure, but it sounds like that would be good enough for you.

That said, there should be no reason that your completion callbacks
themselves need serialization. Whatever design goal you have in mind,
I'm positive it's compatible with the default implementation of the
asynchronous model of the Socket class.

Pete
 

Tim Roberts

Sharon said:
I'll rephrase the problem: since this async model is implemented by the
Framework using (I believe) a thread/thread pool to dequeue completion
packets, I just need to make sure that only one thread dequeues the
completion packets of my BeginSend calls (or a thread pool of size 1,
for that matter).

That would create a strong dependency on an operating system implementation
detail -- not a healthy thing. What if they used a different model in the
next update?

The much better solution is to use some kind of lock, so YOU can make sure
your processing is serialized. That's what they're there for.
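In code, that advice amounts to something like the following sketch; the field and method names here are illustrative, not prescribed by the Socket API:

```csharp
using System;
using System.Net.Sockets;

class Sender
{
    private readonly object _completionLock = new object();

    private void SendCallback(IAsyncResult ar)
    {
        Socket socket = (Socket)ar.AsyncState;
        int bytesSent = socket.EndSend(ar);   // always complete the I/O first

        // Several pool threads may reach this point at once; the lock
        // guarantees the processing runs one thread at a time (though
        // not in any particular order).
        lock (_completionLock)
        {
            ProcessCompletion(bytesSent);
        }
    }

    private void ProcessCompletion(int bytesSent)
    {
        // Hypothetical processing; replace with your own logic.
        Console.WriteLine("Sent {0} bytes", bytesSent);
    }
}
```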
 

Sharon

Hi Tim,
You are correct. But if I want to use my own concurrency model, can I create
my own thread pool?
 

Tim Roberts

Sharon said:
You are correct. But if I want to use my own concurrency model, can I create
my own thread pool?

Sure, but Socket.BeginSend isn't going to use it.

If you really need lower-level control, you'll need to implement your own
socket code.
 
