Odd statistical results when measuring time to get a WebResponse

Joannes Vermorel

I am currently doing some CS research that requires collecting some "web
page retrieval latency" statistics. Basically I have a list of URLs, and I
want to measure the retrieval time for each of them. So far, it seems very
simple to do:

<code>
DateTime beforeCall = DateTime.Now;
myHttpRequest.GetResponse();      // blocks until the response comes back
DateTime afterCall = DateTime.Now;
</code>
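
For reference, here is a fuller, self-contained version of the measurement loop (the URL list, the class name and the choice of HttpWebRequest are just illustrative placeholders, not my actual research code):

<code>
using System;
using System.Net;

class LatencySampler
{
    static void Main()
    {
        // Placeholder URL list; substitute the real one.
        string[] urls = { "http://example.com/", "http://example.org/" };

        foreach (string url in urls)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

            DateTime beforeCall = DateTime.Now;
            WebResponse response = request.GetResponse();   // blocks until the response arrives
            DateTime afterCall = DateTime.Now;
            response.Close();                               // release the connection

            double milliseconds = (afterCall - beforeCall).TotalMilliseconds;
            Console.WriteLine("{0}: {1} ms", url, milliseconds);
        }
    }
}
</code>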

The delays come out in milliseconds, and my code works (both on .NET and on
Mono). What is bothering me are the results: with .NET, about 80% of the
millisecond measurements end in a zero (80 ms, 90 ms, ...). With Mono, the
results are more "randomly" distributed.

My question is: why do I get a rounded result with .NET 80% of the time?
I would understand 100% (always rounded) or 10% (a uniformly distributed
last digit), but this 80% is very weird.

Does anyone have an idea about this?

Thanks,
Joannes
 
OK, I think I figured out the problem, looking at
http://www.eggheadcafe.com/articles/20021111.asp :

"The reason is because of the thread context switch process, the
DateTime.Now value has a resolution of 16 milliseconds or thereabouts on
most machines." (quoted from that article)
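
One way to see the effect directly (just a rough sketch, separate from the measurement code above) is to spin on DateTime.Now and print how far the clock jumps each time it advances:

<code>
using System;

class ClockResolution
{
    static void Main()
    {
        // Spin until DateTime.Now changes, a few times, and print the step size.
        DateTime previous = DateTime.Now;
        for (int i = 0; i < 10; i++)
        {
            DateTime current = DateTime.Now;
            while (current == previous)
            {
                current = DateTime.Now;   // busy-wait until the clock advances
            }
            Console.WriteLine("clock advanced by {0} ms",
                (current - previous).TotalMilliseconds);
            previous = current;
        }
    }
}
</code>

On most Windows machines the jumps come out in the 10-16 ms range, which matches the quote above.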

My next question is: how do I get a precise counter from within .NET
without having to go through QueryPerformanceCounter?
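
(If moving to .NET 2.0 or later is an option, I understand that System.Diagnostics.Stopwatch exposes the high-resolution counter without any P/Invoke; it still relies on QueryPerformanceCounter under the covers, but you never call it directly. A sketch under that assumption, with a placeholder URL:)

<code>
using System;
using System.Diagnostics;
using System.Net;

class StopwatchTiming
{
    static void Main()
    {
        // Placeholder URL; substitute one from the real list.
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");

        Stopwatch timer = Stopwatch.StartNew();
        WebResponse response = request.GetResponse();   // blocks until the response arrives
        timer.Stop();
        response.Close();

        Console.WriteLine("IsHighResolution = {0}", Stopwatch.IsHighResolution);
        Console.WriteLine("latency = {0} ms", timer.Elapsed.TotalMilliseconds);
    }
}
</code>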

Joannes
 
