Frames lost when buffer is exceeded


Dan

I have a 13-PC network running on Windows 2000 Server.

We are regularly getting timed out on the Internet, and the network traffic is
what makes me believe this is the cause.

I have taken a screenshot of "Network Monitor" (please see HERE).

You'll see I have highlighted the frames being lost when the buffer is
exceeded.

I know for sure that this shouldn't be happening, but what I do not know is
why it's happening.

Can anyone here point me in the right direction as to what appears to be
happening?

Thank you

Dan
 

Phill

When you monitor the network, you want to control how much data you collect.
Let's say 1 MB, and let's call this the capture buffer. Once that 1 MB is full,
you start losing information: "% Buffer Utilized" at 100% means the buffer is
full, and because the buffer is full you have lost 2789 frames. There is an
option to increase the buffer size.
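
To picture what is going on, here is a minimal sketch (Python, purely for
illustration; Network Monitor itself is not scripted like this) of a fixed-size
capture buffer: once it fills, every additional frame is counted as lost
rather than stored.

```python
class CaptureBuffer:
    """Fixed-size capture buffer: frames arriving after it fills are dropped."""

    def __init__(self, size_bytes):
        self.size_bytes = size_bytes
        self.used_bytes = 0
        self.frames = []
        self.frames_lost = 0

    def capture(self, frame):
        if self.used_bytes + len(frame) > self.size_bytes:
            self.frames_lost += 1          # buffer exceeded -> frame is lost
        else:
            self.frames.append(frame)
            self.used_bytes += len(frame)

    def utilization(self):
        return 100.0 * self.used_bytes / self.size_bytes


# 1 MB buffer, steady stream of 512-byte frames (sizes are made up)
buf = CaptureBuffer(1 * 1024 * 1024)
for _ in range(5000):
    buf.capture(b"x" * 512)

print(f"% Buffer Utilized: {buf.utilization():.0f}%")   # 100%
print(f"Frames captured:   {len(buf.frames)}")
print(f"Frames lost:       {buf.frames_lost}")
```

Increasing the buffer size simply raises the point at which this drop
behaviour starts; it does not change the traffic on the wire.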

The Network Monitor you are using is probably the cut-down version (the full
version comes with SMS v2). The cut-down version only shows packets to and
from the box running the monitor, hence the LOCAL and Broadcast entries.
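
To illustrate that last point, here is a hedged sketch of the kind of filter
the cut-down tool effectively applies: only frames addressed to or from the
monitoring machine's own MAC address, or to the broadcast address, are shown.
The MAC addresses below are made up for illustration.

```python
LOCAL_MAC = "00-10-5A-44-12-AB"   # hypothetical MAC of the box running the monitor
BROADCAST = "FF-FF-FF-FF-FF-FF"

def visible_to_cut_down_monitor(frame):
    """Non-promiscuous capture: keep only LOCAL and Broadcast traffic."""
    return (frame["src"] == LOCAL_MAC
            or frame["dst"] == LOCAL_MAC
            or frame["dst"] == BROADCAST)

frames = [
    {"src": LOCAL_MAC, "dst": "00-10-5A-01-02-03"},            # sent by this box  -> shown
    {"src": "00-10-5A-01-02-03", "dst": LOCAL_MAC},            # sent to this box  -> shown
    {"src": "00-10-5A-01-02-03", "dst": BROADCAST},            # broadcast         -> shown
    {"src": "00-10-5A-01-02-03", "dst": "00-10-5A-09-08-07"},  # third-party traffic -> hidden
]

print([visible_to_cut_down_monitor(f) for f in frames])        # [True, True, True, False]
```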
 
