On a decent motherboard you should be able to get near microsecond
accuracy with that, which means you probably want to use
sw.Elapsed.TotalMilliseconds instead of sw.ElapsedMilliseconds.
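Something like this, for concreteness (a rough sketch only - DoWork is
just a stand-in for whatever you're actually timing):

using System;
using System.Diagnostics;
using System.Threading;

class TimingDemo
{
    static void Main()
    {
        Stopwatch sw = Stopwatch.StartNew();
        DoWork();                              // whatever you're measuring
        sw.Stop();

        // Fractional milliseconds from the high-resolution counter
        Console.WriteLine(sw.Elapsed.TotalMilliseconds);

        // Whole milliseconds, as a long
        Console.WriteLine(sw.ElapsedMilliseconds);
    }

    static void DoWork()
    {
        Thread.Sleep(50);                      // placeholder workload
    }
}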
I disagree. If your test is so sensitive that +/- 1ms makes any
difference, it's running for way too short a time to be meaningful,
IMO. I'd be very suspicious of any test running in less than a second
- and moderately suspicious of a test running in less than 10 seconds,
unless it's to show that (say) an algorithm taking 5 seconds is much
slower than one taking half a second. The variation between runs is
very, very rarely going to be less than a millisecond, so where's the
benefit in giving more precision?

The benefit of giving *less* precision is that integers are, IMO,
easier to immediately recognise in terms of magnitude than reals. It's
easier to compare at a glance,
say, 12532 and 3250 than 12532.23401 and 3250.195323.
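To make that concrete, here's roughly the sort of harness I mean (a
sketch only - the iteration count and DoWork are placeholders; pick a
count that keeps the whole run well over a second, and keep a checksum
so the loop can't be optimised away):

using System;
using System.Diagnostics;

class Benchmark
{
    static void Main()
    {
        const int iterations = 10000000;   // tune so the run takes several seconds

        Stopwatch sw = Stopwatch.StartNew();
        long checksum = 0;
        for (int i = 0; i < iterations; i++)
        {
            checksum += DoWork(i);         // stand-in for the code under test
        }
        sw.Stop();

        // Whole milliseconds are plenty at this scale, and easier to eyeball
        Console.WriteLine("{0} iterations: {1}ms (checksum {2})",
                          iterations, sw.ElapsedMilliseconds, checksum);
    }

    static long DoWork(int i)
    {
        return (long) i * i % 97;
    }
}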
I've also
seen a few weird glitches with the latter where occasionally the
machine will hiccup and you'll be off by six orders of magnitude.
That's very odd - never seen anything like that.
Jon