Precision error converting from .NET Double to SQL Decimal with SqlBulkCopy


Guest

I'm using the new SqlBulkCopy in .NET 2.0 Beta 2 to insert a number of values
into my DB. In the DB, the column is of type Numeric(18,4), and in my app I
receive the value as a .NET Double (I can't control this). When I build a
DataTable with Double values for those columns and then use SqlBulkCopy to
insert it into the table (using WriteToServer), the precision gets screwed
up. I'm not even sure I would call it precision; the decimal point just ends
up in completely the wrong place. For example, if the number I want to insert
is 186.3535335, I end up with 1.8635 in the DB. I'm not sure exactly how this
is happening, but it's really screwing things up for me. I assumed that
converting to Decimal would just lop off the extra digits, since I don't need
precision beyond 4 decimal places, but this is way off. I've since converted
the column types to float and it works fine, but this still seems like a bug
to me.
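
Here's roughly what I'm doing, stripped down (the table and column names and
the connection string are just placeholders, not my real code):

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class BulkCopyRepro
{
    static void Main()
    {
        // DataTable column typed as System.Double; the destination column
        // in the DB is Numeric(18,4).
        DataTable table = new DataTable("Measurements");
        table.Columns.Add("Value", typeof(double));
        table.Rows.Add(186.3535335);   // I expect 186.3535 in the DB

        // Placeholder connection string.
        using (SqlConnection conn = new SqlConnection(
            "Server=.;Database=Test;Integrated Security=true"))
        {
            conn.Open();
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                bulk.DestinationTableName = "Measurements";
                bulk.WriteToServer(table);   // row arrives as 1.8635, not 186.3535
            }
        }
    }
}
```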

Also, as a side note, I tried building my DataTable with SqlTypes so that the
conversion would happen before I called SqlBulkCopy (in case the bug was in
SqlBulkCopy itself), and I got an error saying it could not convert from
Double to type SqlDecimal. That seems strange to me, since SqlDecimal has a
constructor that takes a .NET Double...
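
For reference, this is the kind of explicit conversion I expected to be able
to lean on (again just a sketch, not exactly my code):

```csharp
using System.Data;
using System.Data.SqlTypes;

class SqlDecimalConversion
{
    static void Main()
    {
        double incoming = 186.3535335;

        // SqlDecimal does have a constructor that takes a Double...
        SqlDecimal asSqlDecimal = new SqlDecimal(incoming);

        // ...so I figured a DataTable column typed as SqlDecimal would
        // take the value once I'd converted it myself up front.
        DataTable table = new DataTable("Measurements");
        table.Columns.Add("Value", typeof(SqlDecimal));
        table.Rows.Add(asSqlDecimal);
    }
}
```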

I have no idea what's going on, but something is seriously wrong here.
 
