Timer fires a few milliseconds before the actual Due-Time


Guest

I am using VS 2005 Beta - C#

Problem: The Timer fires a few milliseconds before the actual Due-Time

Let's say a timer is created in the following manner:
System.Threading.Timer m_timer = null;

Let's declare a constant:
const Int32 m_TimePeriod = 10000;


In the Initialize method:
m_timer = new System.Threading.Timer(new TimerCallback(XXXTimerProc));
m_timer.Change(m_TimePeriod, 0); // due in m_TimePeriod msec.; period 0 = fire once


In the WndProc of the Form:
m_dtCurrentTime = DateTime.Now; // get the current time and store it

// Change the Due-Time and Period to enable the timer
m_timer.Change(m_TimePeriod, 0);


In the XXXTimerProc:
if ((DateTime.Now - m_dtCurrentTime).TotalMilliseconds >= m_TimePeriod)
{
    // Perform some action
    ...

    // Change the Due-Time and Period to disable the timer
    m_timer.Change(System.Threading.Timeout.Infinite,
        System.Threading.Timeout.Infinite);
}


In the WndProc of the Form, the timer's Change method is called to set the
due time to m_TimePeriod. This enables the timer. But the timer fires a few
milliseconds before m_TimePeriod has elapsed. As a result, in XXXTimerProc,
the condition checking the difference between the DateTime values fails and
the necessary actions do not take place.

Is there something wrong with what I am doing, or is it a bug in VS 2005 Beta?
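
A minimal, self-contained console sketch of the same pattern (simplified; the real code lives in a Form, and the names here are illustrative):

using System;
using System.Threading;

class TimerRepro
{
    const int TimePeriod = 10000;   // due time in milliseconds
    static DateTime s_start;
    static Timer s_timer;

    static void Main()
    {
        // Record the start time, then arm the timer for a single shot.
        s_start = DateTime.Now;
        s_timer = new Timer(new TimerCallback(TimerProc));
        s_timer.Change(TimePeriod, Timeout.Infinite);

        Thread.Sleep(TimePeriod + 2000);    // keep the process alive
    }

    static void TimerProc(object state)
    {
        double elapsed = (DateTime.Now - s_start).TotalMilliseconds;
        // On some runs this prints slightly less than 10000, which is
        // the behaviour being reported.
        Console.WriteLine("Callback after {0} ms", elapsed);

        // Disable the timer again, as in the code above.
        s_timer.Change(Timeout.Infinite, Timeout.Infinite);
    }
}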
 

Ignacio Machin ( .NET/ C# MVP )

Hi,

First of all, you will NEVER get the event executed exactly when you
specified it; winXX is not a real-time OS. When the timer is due, all that
happens is that an event is generated, and the handler is then executed on
a thread from the thread pool.
As you can imagine, all of these actions take time.

With that said, the comparison ( Current Time - m_dtCurrentTime ) will
never be exact.
And I really don't see the point of the comparison; after all, you are
calling this from a timer, so what is the point of checking the timer's
elapsed time?


Why are you calling Change twice? That may be where your problem is.

cheers,
 

John Vottero

I have also seen this problem and I haven't found an explanation.

See in-line comments:

Ignacio Machin ( .NET/ C# MVP ) said:
Hi,

First of all, you will NEVER get the event executed exactly when you
specified it; winXX is not a real-time OS. When the timer is due, all that
happens is that an event is generated, and the handler is then executed on
a thread from the thread pool.
As you can imagine, all of these actions take time.


We realize that Windows isn't a real-time OS. I would understand if the
timer went off a few milliseconds too late, or even seconds late, but the
problem is that it's going off too soon. If I set a timer to go off 50,000
milliseconds from now, why would it go off 49,998 milliseconds from now?
The only explanation I can think of is that the Windows Time service
adjusts the time forward a few milliseconds, but that shouldn't affect when
timers fire. That's just a guess.
With that said, the comparison ( Current Time - m_dtCurrentTime ) will
never be exact.
And I really don't see the point of the comparison; after all, you are
calling this from a timer, so what is the point of checking the timer's
elapsed time?

In our case, we have a collection of things that have to happen at specific
times. We walk through that collection, find the nearest time, and
calculate the number of milliseconds between now and that item's action
time. Then we set a timer to go off in that number of milliseconds. When
the timer goes off, we walk through the list again, expecting to find at
least one item with an action time that has passed, but sometimes we don't,
because the timer went off a few milliseconds too soon.

In our case, it just means that we reset the timer for just a few
milliseconds, but I would still like to know what's going on.
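
A simplified sketch of that scheduling pattern (illustrative names, not our
production code), with the early-wakeup case handled by simply re-arming:

using System;
using System.Collections.Generic;
using System.Threading;

class Scheduler
{
    readonly List<DateTime> _actionTimes = new List<DateTime>();
    readonly Timer _timer;

    public Scheduler() { _timer = new Timer(OnTimer); }

    public void Add(DateTime when)
    {
        lock (_actionTimes) { _actionTimes.Add(when); }
        Arm();
    }

    void Arm()
    {
        lock (_actionTimes)
        {
            if (_actionTimes.Count == 0) return;

            // Find the nearest action time.
            DateTime next = DateTime.MaxValue;
            foreach (DateTime t in _actionTimes)
                if (t < next) next = t;

            double ms = (next - DateTime.Now).TotalMilliseconds;
            // Never pass a negative due time; 0 fires as soon as possible.
            _timer.Change((long)Math.Max(0, ms), Timeout.Infinite);
        }
    }

    void OnTimer(object state)
    {
        lock (_actionTimes)
        {
            // Remove every item whose time has passed (a real implementation
            // would also run its action). If the timer fired early, this may
            // remove nothing at all.
            _actionTimes.RemoveAll(t => t <= DateTime.Now);
        }
        Arm(); // re-arm for the next item, or retry after an early wakeup
    }
}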


John Vottero
 

Robert Heuvel

Why don't you use a thread instead that runs constantly, does Sleep(0-1),
and distributes tasks as they come due? It's the most precise way to do it
and is only a tick more difficult...
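
Something like this (just a sketch):

using System;
using System.Threading;

class PollingDispatcher
{
    // A dedicated thread sleeps about 1 ms at a time and fires the task
    // once its due time has been reached, instead of arming a one-shot timer.
    public static void RunAt(DateTime due, ThreadStart task)
    {
        Thread t = new Thread(delegate()
        {
            while (DateTime.Now < due)
                Thread.Sleep(1);
            task();
        });
        t.IsBackground = true;
        t.Start();
    }
}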
 

John Vottero

Robert Heuvel said:
Why don't you use a thread instead that runs constantly, does Sleep(0-1),
and distributes tasks as they come due? It's the most precise way to do it
and is only a tick more difficult...

I don't see why that would solve the problem. If I calculate the number of
milliseconds until 1:00:00.0000 and do a Sleep(thatNumber), I could still
wake from the sleep to find that it's 12:59:59.9773.
 

Guest

Hi,

Well, regarding Change being called twice: after the timer fires, it is
disabled. Subsequently, on receiving a message in the WndProc of the Form,
I would like to enable the timer again. Hence Change is called twice - once
to disable the timer and once to re-enable it.

I could understand if the timer fired a few milliseconds later than the due
time. But the problem is that it fires a few milliseconds before the due
time. Can anybody help out?

Thanks.
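
In code, the sequence is simply:

// On receiving the message in WndProc: record the time and enable the timer.
m_dtCurrentTime = DateTime.Now;
m_timer.Change(m_TimePeriod, 0);    // one shot: period 0 means no repeat

// In the timer callback, once the work is done: disable the timer again.
m_timer.Change(Timeout.Infinite, Timeout.Infinite);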
 

Willy Denoyette [MVP]

HL said:
[snip]

Is there something wrong with what I am doing, or is it a bug in VS 2005 Beta?


All timers in the Framework use the system clock as their timer source.
This clock has a (system-wide default) resolution of x milliseconds (where
x is 10 msec. on most x86-based hardware) - that means that the clock
advances (ticks) every x (10) msec.
Say you initialize a timer to fire after 10 seconds, and say this happens 2
msec. after the last system clock tick. The timer then counts 1000 clock
ticks before it fires, that is, after 8 msec. (to the first tick) + 999 *
10 msec. = 9998 msec.
When you initialize the timer, the current system time you read is the time
of the system clock at the last tick, that is, the timer initialization
time - 2 msec.; let's call it t1. When the timer fires, the current time is
again the time of the system clock, that is, 1000 clock ticks after t1, or
exactly 10000 msec. That means that your real timer interval will always be
up to x - 1 msec. less than the wanted interval.
What you could do to solve your problem (though I don't get why you are
doing this) is to correct the start time by adding the clock interval to
the start time t1 (your m_dtCurrentTime).
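
One way to apply that correction is to allow the elapsed-time check to fall
short by up to one clock tick, for example (a sketch, assuming a 10 msec.
tick):

using System;

class TickTolerantCheck
{
    // Assumed clock tick; query GetSystemTimeAdjustment for the real value.
    const double ClockTickMs = 10.0;

    // True once periodMs has elapsed since start, allowing the measured
    // time to fall short by up to one tick, since DateTime.Now only
    // advances in clock-tick steps.
    public static bool HasElapsed(DateTime start, int periodMs)
    {
        double elapsed = (DateTime.Now - start).TotalMilliseconds;
        return elapsed >= periodMs - ClockTickMs;
    }
}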

Willy.
 

John Vottero

Willy Denoyette said:
[snip]

What you could do to solve your problem (though I don't get why you are
doing this) is to correct the start time by adding the clock interval to
the start time t1 (your m_dtCurrentTime).

Thanks. Is there a way to get the clock interval? Is it a fairly safe bet
that the clock interval will always be <= 10ms?
 

Willy Denoyette [MVP]

John Vottero said:
Willy Denoyette said:
[snip]

Thanks. Is there a way to get the clock interval? Is it a fairly safe
bet that the clock interval will always be <= 10ms?

Yep, using PInvoke...

using System.Runtime.InteropServices;

[DllImport("kernel32.dll")]
static extern bool GetSystemTimeAdjustment(out uint lpTimeAdjustment,
    out uint lpTimeIncrement, out bool lpTimeAdjustmentDisabled);

....
uint adjustment, clockInterval;
bool adjustmentDisabled;
GetSystemTimeAdjustment(out adjustment, out clockInterval,
    out adjustmentDisabled);

clockInterval returns the clock interval in 100-nsec. units. Divide this
number by 10,000 to get the timer interval in msec.

Willy.
 

Guest

Thanks Willy. The solution was very nicely explained.

Willy Denoyette said:
[snip]
 
