Date Precision Problem

Gugale at Lincoln

Hi,

I am working on an application which uses a date as a primary key. All my
records are at least a few milliseconds apart and are in the form
"20070630T12:50:24.207". SQL Server has a precision of 1/3000 of a second for
storing dates. I would like to bring down the precision of my data to match
SQL Server's precision. I am doing this to avoid conflicts in the ADO.NET
DataSet: ADO.NET is more precise and accepts duplicate rows when it shouldn't,
which later generates an error while updating the data. Speed is critical for
this application, and the solution I have developed is not fast enough.

private DateTime AdjustToSQLPrecision(DateTime t)
{
    // SQL Server datetime keeps 1/300-second accuracy: the last millisecond
    // digit is snapped to 0, 3 or 7. The table maps each digit to the
    // corresponding offset in 100 ns ticks.
    long[] adjust = { 0, 0, 30000, 30000, 30000, 70000, 70000,
                      70000, 70000, 100000 };
    long ticks = t.Ticks;
    int remainder = (int)(ticks % 100000);    // ticks past the current 10 ms boundary
    ticks = ticks - remainder;                // truncate to the 10 ms boundary
    ticks = ticks + adjust[remainder / 10000];
    return new DateTime(ticks);
}

Can someone suggest a better solution?

Thanks
SG
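
For reference, a minimal sketch of the same rounding with the lookup table
hoisted into a static readonly field, so the array is not reallocated on every
call. The field and method names are made up for the example, and this assumes
the per-call allocation is the dominant cost, which may not be the case:

private static readonly long[] SqlAdjust =
    { 0, 0, 30000, 30000, 30000, 70000, 70000, 70000, 70000, 100000 };

private static DateTime AdjustToSqlPrecisionFast(DateTime t)
{
    long ticks = t.Ticks;
    long remainder = ticks % 100000;    // ticks past the current 10 ms boundary
    return new DateTime(ticks - remainder + SqlAdjust[remainder / 10000]);
}

The arithmetic is identical to the original method; only the per-call
allocation goes away.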
 
Ray Booysen

My solution is not to use DateTime as a primary key at all. This simply doesn't
scale as the table gets hit more often.

I would seriously reconsider your DB design (if you're able to) to
incorporate a proper primary key.

Kind Regards
Ray Booysen
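
As an illustration of that advice (the table and column names below are
hypothetical, not from the original post), the DataSet side could key rows on
an auto-increment surrogate ID and treat the timestamp as an ordinary column:

using System;
using System.Data;

static DataTable BuildReadingsTable()
{
    DataTable readings = new DataTable("Readings");

    // Surrogate key: client-assigned negative IDs avoid clashing with
    // IDENTITY values the server assigns when the rows are inserted.
    DataColumn id = readings.Columns.Add("ReadingId", typeof(int));
    id.AutoIncrement = true;
    id.AutoIncrementSeed = -1;
    id.AutoIncrementStep = -1;

    // The timestamp is now ordinary data, so ADO.NET's extra precision can
    // no longer produce duplicate-key conflicts inside the DataSet.
    readings.Columns.Add("RecordedAt", typeof(DateTime));

    readings.PrimaryKey = new DataColumn[] { readings.Columns["ReadingId"] };
    return readings;
}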
 
Göran Andersson

Gugale said:
> Hi,
>
> I am working on an application which uses a date as a primary key.

The problem is that you are using a date as the key. As the ways of storing
a date value differ, you get potential conversion errors at every step.

> All my records are at least a few milliseconds apart and are in the form
> "20070630T12:50:24.207". SQL Server has a precision of 1/3000 of a second for
> storing dates. I would like to bring down the precision of my data to match
> SQL Server's precision. I am doing this to avoid conflicts in the ADO.NET
> DataSet: ADO.NET is more precise and accepts duplicate rows when it shouldn't,
> which later generates an error while updating the data. Speed is critical for
> this application, and the solution I have developed is not fast enough.

> [AdjustToSQLPrecision code snipped - quoted in full above]

If you have performance problems, it's hardly because of that method.
It's possible to speed it up, but compared to any database operation the
time spent in the method is negligible.

> Can someone suggest a better solution?

Use a data type that does not have precision problems.
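
One concrete way to follow that advice (the table, column and method names
below are invented for the example) is to store the raw tick count in a bigint
column, which holds the 64-bit value exactly, so nothing gets rounded on either
side:

using System;
using System.Data;
using System.Data.SqlClient;

// Hypothetical schema: Readings(ReadingTicks bigint primary key, Value float)
static void InsertReading(SqlConnection conn, DateTime recordedAt, double value)
{
    using (SqlCommand cmd = new SqlCommand(
        "insert into Readings (ReadingTicks, Value) values (@t, @v)", conn))
    {
        // DateTime.Ticks is a count of 100 ns intervals; bigint stores it
        // exactly, and new DateTime(ticks) converts it back losslessly.
        cmd.Parameters.Add("@t", SqlDbType.BigInt).Value = recordedAt.Ticks;
        cmd.Parameters.Add("@v", SqlDbType.Float).Value = value;
        cmd.ExecuteNonQuery();
    }
}

The connection is assumed to be open; the point is only that a bigint key
carries the full DateTime precision, so the ADO.NET and SQL Server values can
never disagree.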
 
Stephen Ahn

> I am working on an application which uses a date as a primary key. All my
> records are at least a few milliseconds apart and are in the form
> "20070630T12:50:24.207". SQL Server has a precision of 1/3000 of a second for
> storing dates.

Actually, I think the resolution of the datetime datatype in SQL Server is
3.33 milliseconds (1/300 of a second). Try this:

declare @dt1 datetime
declare @dt2 datetime
set @dt1 = '2007-06-22T02:23:10.007'
set @dt2 = '2007-06-22T02:23:10.005'

if @dt1 = @dt2
    print 'are equal'
else
    print 'not equal'

This prints "are equal".

As others have said, the DateTime type is probably not suitable to use as a
primary key for what you are trying to do.
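
The same behaviour can be reproduced on the .NET side with
System.Data.SqlTypes.SqlDateTime, which stores the time of day in 1/300-second
units. A small sketch (the class name is arbitrary, output shown in comments):

using System;
using System.Data.SqlTypes;

class SqlPrecisionDemo
{
    static void Main()
    {
        DateTime a = DateTime.Parse("2007-06-22T02:23:10.007");
        DateTime b = DateTime.Parse("2007-06-22T02:23:10.005");

        // Converting to SqlDateTime rounds both values to the same
        // 1/300-second tick, matching the T-SQL result above.
        SqlDateTime sa = new SqlDateTime(a);
        SqlDateTime sb = new SqlDateTime(b);

        Console.WriteLine(sa == sb);     // True
        Console.WriteLine(sa.Value);     // the rounded DateTime
    }
}

Round-tripping through SqlDateTime (via its Value property) is also a
convenient way to pre-round DateTime values before they reach the DataSet.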
 
