How to measure performance in milliseconds?

Guest

I need to test the performance of some code of mine.
I tried using Environment.TickCount, but its resolution is too low (from
the MSDN remarks: the resolution of the TickCount property cannot be less
than 500 milliseconds). I also tried DateTime, and it also gives me a
minimum resolution of 500 milliseconds.
But I need to measure much less than 500 milliseconds; I need it to be
several milliseconds.
How can I do that?
 
Guest

Here is some code I use that wraps the QueryPerformanceCounter API:


using System;
using System.Runtime.InteropServices;
using System.ComponentModel;
using System.Threading;

namespace PAB
{
    public class HiPerfTimer
    {
        [DllImport("Kernel32.dll")]
        private static extern bool QueryPerformanceCounter(out long lpPerformanceCount);

        [DllImport("Kernel32.dll")]
        private static extern bool QueryPerformanceFrequency(out long lpFrequency);

        private long startTime;
        private long stopTime;
        private long freq;

        /// <summary>
        /// ctor
        /// </summary>
        public HiPerfTimer()
        {
            startTime = 0;
            stopTime = 0;
            freq = 0;
            if (QueryPerformanceFrequency(out freq) == false)
            {
                throw new Win32Exception(); // timer not supported
            }
        }

        /// <summary>
        /// Start the timer
        /// </summary>
        /// <returns>long - tick count</returns>
        public long Start()
        {
            QueryPerformanceCounter(out startTime);
            return startTime;
        }

        /// <summary>
        /// Stop the timer
        /// </summary>
        /// <returns>long - tick count</returns>
        public long Stop()
        {
            QueryPerformanceCounter(out stopTime);
            return stopTime;
        }

        /// <summary>
        /// Return the duration of the timer (in seconds)
        /// </summary>
        /// <returns>double - duration</returns>
        public double Duration
        {
            get
            {
                return (double)(stopTime - startTime) / (double)freq;
            }
        }

        /// <summary>
        /// Frequency of the timer (number of counts in one second on this machine)
        /// </summary>
        /// <returns>long - frequency</returns>
        public long Frequency
        {
            get
            {
                QueryPerformanceFrequency(out freq);
                return freq;
            }
        }
    }
}
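
For example, the timer might be used like this (the loop being timed is just a placeholder for your own code):

HiPerfTimer timer = new HiPerfTimer();
timer.Start();

for (int i = 0; i < 1000000; i++) { }   // code under test goes here

timer.Stop();
Console.WriteLine("Duration: {0} seconds", timer.Duration);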
 
William Stacey [MVP]

I think DateTime.Now is accurate to about 10 ms. Not sure if that is too coarse for your
app. Also, this link seems to suggest Environment.TickCount is accurate to 1 ms,
based on my interpretation of the article and empirical evidence. Not sure.
http://www.informit.com/guides/printerfriendly.asp?g=dotnet&seqNum=241&rl=1

--
William Stacey [MVP]

| I need to test the performance of some code of mine.
| I tried using Environment.TickCount, but its resolution is too low (from
| the MSDN remarks: the resolution of the TickCount property cannot be less
| than 500 milliseconds). I also tried DateTime, and it also gives me a
| minimum resolution of 500 milliseconds.
| But I need to measure much less than 500 milliseconds; I need it to be
| several milliseconds.
| How can I do that?
|
| --------
| Thanks
| Sharon
 
Chris Dunaway

If you are using .NET 2.0, check out the Stopwatch class. You call
its Start method, then call its Stop method and check the Elapsed
property to get the elapsed time. I believe it will use a high-resolution
timer if the hardware supports it. Check the
IsHighResolution field to find out.
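
For illustration, a minimal sketch of that approach in .NET 2.0 (the DoWork method is just a stand-in for the code being measured):

using System;
using System.Diagnostics;

class StopwatchDemo
{
    static void Main()
    {
        // True when Stopwatch is backed by the high-resolution performance counter.
        Console.WriteLine("High resolution: {0}", Stopwatch.IsHighResolution);

        Stopwatch sw = Stopwatch.StartNew();   // creates and starts the timer
        DoWork();                              // placeholder for the code under test
        sw.Stop();

        Console.WriteLine("Elapsed: {0} ms", sw.Elapsed.TotalMilliseconds);
    }

    static void DoWork()
    {
        // Hypothetical workload; replace with the real code.
        for (int i = 0; i < 1000000; i++) { }
    }
}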
 
Lucian Wischik

Sharon said:
I need to test the performance of some code of mine.
I tried using Environment.TickCount, but its resolution is too low (from
the MSDN remarks: the resolution of the TickCount property cannot be less
than 500 milliseconds). I also tried DateTime, and it also gives me a
minimum resolution of 500 milliseconds.
But I need to measure much less than 500 milliseconds; I need it to be
several milliseconds.

I use System.Diagnostics.Stopwatch.

But in any case, I think it's a bad idea to measure such a small
period. You should repeat your operation a thousand times, get the
overall time, and then divide by a thousand. This avoids edge effects
and the like.
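
A rough sketch of that idea with Stopwatch (the iteration count and the Operation method are placeholders):

using System;
using System.Diagnostics;

class AverageTiming
{
    static void Main()
    {
        const int iterations = 1000;   // repeat enough times to swamp the timer resolution

        Stopwatch sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            Operation();               // placeholder for the code under test
        }
        sw.Stop();

        Console.WriteLine("Average per call: {0} ms",
                          sw.Elapsed.TotalMilliseconds / iterations);
    }

    static void Operation()
    {
        // Hypothetical operation; replace with the real work.
    }
}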
 
Michel Walsh

Hi,

Doing so, you also time background tasks with higher priority, which are not
necessarily under your control. If you run code1 1000 times at noon and
compare it with 1000 runs of code2 started at 9:00 AM, you may see a
huge difference in background load, sustained over the duration of one of
the 1000-run executions but not the other. Maybe irrelevant in
general, but one has to be aware of the possibility.


Vanderghast, Access MVP
 
Guest

The minimum resolution of DateTime.Now and Environment.TickCount is 500
milliseconds. Only above that duration can you get the 10 ms accuracy.
 
Jon Skeet [C# MVP]

Sharon said:
The minimum resolution of DateTime.Now and Environment.TickCount is 500
milliseconds. Only above that duration can you get the 10 ms accuracy.

I think you misunderstand what "minimum resolution" means here. It
means it will be at least as good as 500ms, but in practice it's likely
to be much better - typically 10 or 15ms. Here's a simple program which
attempts to detect the resolution:

using System;

class Test
{
    static void Main()
    {
        long start = Environment.TickCount;

        long now;
        while ((now = Environment.TickCount) == start)
        {
        }

        Console.WriteLine("Resolution: {0}", now - start);
    }
}

On my XP laptop, that always gives 15 or 16. I would be very surprised
indeed to see it give 500.

If you need better resolution than that, you'll need to use P/Invoke to
call QueryPerformanceCounter or whatever if you're using .NET 1.1. As
another poster said, however, it would be better to time more
iterations if possible.
 
Willy Denoyette [MVP]

| The minimum resolution of DateTime.Now and Environment.TickCount is 500
| milliseconds. Only above that duration can you get the 10 ms accuracy.
|
| -------
| Thanks
| Sharon

On modern Intel x86 based systems the clock interval is 10 msec; on AMD
it's 15.625 msec; on SMP systems I have seen values between 10 msec and 60
msec, depending on the system HAL. I have never seen 500 msec as the interval.
Where do you get this value from?

Willy.
 
Willy Denoyette [MVP]

Or:

[DllImport("kernel32", SetLastError=false)]
static extern void GetSystemTimeAdjustment( out uint adjustment,
out uint clockInterval,
out int adjustmentDisabled );

.....
uint adjustment;
uint clockInterval; // in 100nsec units.
int adjustmentDisabled;
GetSystemTimeAdjustment(out adjustment, out clockInterval, out
adjustmentDisabled);
Console.WriteLine ("Resolution: {0} msec.",
clockInterval/10000.0);

Willy.

| I think you misunderstand what "minimum resolution" means here. It
| means it will be at least as good as 500ms, but in practice it's likely
| to be much better - typically 10 or 15ms. Here's a simple program which
| attempts to detect the resolution:
|
| [snip]
|
| On my XP laptop, that always gives 15 or 16. I would be very surprised
| indeed to see it give 500.
