Measuring Time, to the millisecond


Brian

I'm writing a small app; one of its functions is like a stopwatch that counts
elapsed time down (or up). I need the elapsed time to at least tenths of a
second (otherwise this would be a snap).

When updating a label with this time, it seems that I cannot get more
precision than a whole second when asking what "now" is. For example, to begin
prototyping I have a 100 ms timer continually update a label. Each iteration
of the timer compares the current time to the static start time. Even
measuring the difference in ticks reveals only one-second accuracy.

To continue the test, I can simply add 100 ms (the timer interval) as a
timespan to the start time each iteration, and this works. However, I fear
this is really not accurate: I'm no longer measuring elapsed time, just
accumulating iterations that in theory reflect time.

Any solutions to this? Hoping I'm doing something wrong.
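The two approaches described above can be sketched in Python (standing in for whatever framework the app actually uses), with `time.perf_counter()` as the high-resolution clock; the variable names are illustrative, not from the original post:

```python
import time

# Approach 1: measure elapsed time against a fixed start point.
# A high-resolution clock (here time.perf_counter) gives sub-second
# precision; a clock that only ticks once per second would not.
start = time.perf_counter()
time.sleep(0.25)  # stand-in for the work done between timer ticks
elapsed = time.perf_counter() - start

# Approach 2: accumulate the nominal timer interval each iteration.
# This is what "adding 100 ms to the start time" does, and it drifts:
# each real tick takes interval + callback overhead, but the sum
# only ever reflects the nominal interval.
interval = 0.1
accumulated = 0.0
for _ in range(3):
    time.sleep(interval)     # the timer never fires exactly on schedule
    accumulated += interval  # but this assumes it did

print(f"measured:    {elapsed:.3f} s")
print(f"accumulated: {accumulated:.3f} s")
```

The accumulated value is exactly 3 x 0.1 s no matter how long the loop really ran, which is the inaccuracy Brian is worried about.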

Brian
 

Chris Tacke, eMVP

Managed code is not deterministic enough to provide millisecond accuracy.
You could use a performance counter or GetTickCount in a DLL and
query it, but the simple act of querying it again won't be reliably
accurate. You will, however, be guaranteed that at the time of the read, the
value read is right.

-Chris
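Chris's suggestion of querying a high-resolution counter can be illustrated in Python, whose `time.perf_counter_ns()` wraps the platform's performance counter (QueryPerformanceCounter on Windows); this is a sketch of the idea, not the exact native API calls:

```python
import time

# Query the high-resolution performance counter twice, back to back.
# The counter itself is precise to well under a millisecond, but the
# cost of making the second call -- plus anything the scheduler does
# in between -- is what limits repeatable accuracy, as Chris notes.
t0 = time.perf_counter_ns()
t1 = time.perf_counter_ns()
gap_ms = (t1 - t0) / 1_000_000
print(f"back-to-back query gap: {gap_ms:.6f} ms")

# The clock reports its own resolution and guarantees monotonicity:
info = time.get_clock_info("perf_counter")
print(f"resolution: {info.resolution} s, monotonic: {info.monotonic}")
```

Each individual read is correct at the instant it is taken; it's only the interval between two reads that is at the mercy of the OS.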
 

Brian

Thanks Peter, that's exactly what I've done once I realized it was there;
it seems accurate so far in my tests.

Thanks for posting,
Brian
 

Dick Grier

Hi,

This will give you the resolution that you seek. The accuracy will be less
than that, and, as Chris says, it is non-deterministic -- it will actually
vary quite a bit with program activity, and even more if you are
multitasking.
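The resolution-versus-accuracy distinction can be demonstrated with a sketch like the following (Python again, with `time.sleep`-based scheduling as a stand-in for a GUI timer): each tick is scheduled 100 ms apart, and the per-tick error is the jitter being described:

```python
import time

# Resolution is the unit a clock reports in; accuracy is how close each
# reading lands to true time. Schedule ten ticks at 100 ms intervals
# from a fixed start, then record how late each tick actually fires.
# The lateness (jitter) grows with whatever else the machine is doing.
interval = 0.1
start = time.perf_counter()
errors_ms = []
for i in range(1, 11):
    target = start + i * interval
    time.sleep(max(0.0, target - time.perf_counter()))
    errors_ms.append((time.perf_counter() - target) * 1000)

print(f"worst tick error: {max(errors_ms):.3f} ms")
```

Scheduling each tick against the fixed start (rather than sleeping a flat 100 ms per loop) keeps the errors from accumulating, but the individual jitter per tick remains, and it is load-dependent.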

Dick

--
Richard Grier (Microsoft Visual Basic MVP)

See www.hardandsoftware.net for contact information.

Author of Visual Basic Programmer's Guide to Serial Communications, 3rd
Edition ISBN 1-890422-27-4 (391 pages) published February 2002.
 

Brian

I don't think I have a choice but to trust it. It's informational, not
mission-critical; also, I'm only displaying tenths of a second, and it seems
accurate enough for this application. I've run it through a number of tests
and at this resolution it hasn't missed yet, though I realize that's a
possibility. I think it's a fact of the OS and I'll just deal with it...

Brian
 

Dick Grier

Hi,

Under normal circumstances 0.1-second accuracy can be assumed, I think.
And as long as the timing isn't "mission critical," as you say, an occasional
outlier should be OK, too. Just realize that things can change as soon as
you release your program to your users -- they have a nasty habit of doing
things that affect timing that you didn't anticipate.

Dick

--
Richard Grier (Microsoft Visual Basic MVP)

 