Gugale at Lincoln
Hi,
I am working on an application which uses a date as a primary key. All my
records are at least a few milliseconds apart and are in the form
"20070630T12:50:24.207". SQL Server stores datetime values with a precision
of 1/300 of a second (milliseconds are rounded to .000, .003, or .007). I
would like to bring the precision of my data down to match SQL Server's. I am
doing this to avoid conflicts in an ADO.NET DataSet: ADO.NET is more precise,
so it accepts rows as distinct when it shouldn't, which later generates an
error while updating the data. Speed is very critical for this application,
and the solution I have developed is not fast enough.
// Lookup table: maps the last milliseconds digit (0-9) to the tick
// adjustment SQL Server applies (datetime values always end in .000,
// .003, or .007). Made static readonly so it is allocated once rather
// than on every call, which was a hidden cost in the original version.
private static readonly long[] adjust =
    { 0, 0, 30000, 30000, 30000, 70000, 70000, 70000, 70000, 100000 };

private DateTime AdjustToSQLPrecision(DateTime t)
{
    long ticks = t.Ticks;
    int remainder = (int)(ticks % 100000);  // ticks past the last 10 ms boundary
    ticks -= remainder;                     // truncate to a whole 10 ms block
    ticks += adjust[remainder / 10000];     // re-add the SQL-rounded millisecond
    return new DateTime(ticks);
}
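For comparison, here is a sketch that leans on the framework instead of a hand-rolled table: System.Data.SqlTypes.SqlDateTime performs the same 1/300-second rounding when constructed from a DateTime. Whether it is actually faster than the lookup above is an assumption that would need measuring against the real workload.

```csharp
using System;
using System.Data.SqlTypes;

class SqlPrecisionDemo
{
    // Round a DateTime to SQL Server datetime precision by round-tripping
    // through SqlDateTime, which stores the time-of-day in 1/300-second units.
    static DateTime RoundToSqlDateTime(DateTime t)
    {
        return new SqlDateTime(t).Value;
    }

    static void Main()
    {
        DateTime t = new DateTime(2007, 6, 30, 12, 50, 24, 205);
        DateTime rounded = RoundToSqlDateTime(t);
        // The milliseconds now end in 0, 3, or 7, matching what SQL Server stores.
        Console.WriteLine(rounded.ToString("yyyyMMdd'T'HH:mm:ss.fff"));
    }
}
```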
Can someone suggest a better solution?
Thanks
SG