Guest
I save a number in a table and want to read that number back, but the
number I get has lower precision than I expect. For example, when I divide
10/3 I get 3.3333333333333335 if the variable is of type Double. But saving
this result into a table column of type Double decreases the precision
to 3.333333333333333, so when I read the number back and multiply it by 3
I do not get the exact original number, which is 10.
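To make the arithmetic concrete, here is a minimal C# sketch (not taken from the original post; the output comments assume the classic .NET Framework, where Double.ToString() prints at most 15 significant digits unless the "R" round-trip format is requested):

    using System;

    class DoubleDemo
    {
        static void Main()
        {
            double d = 10.0 / 3.0;

            // The stored 64-bit value multiplies back to exactly 10 here.
            Console.WriteLine(d * 3 == 10.0);    // True

            // Default formatting (classic .NET Framework) shows at most
            // 15 significant digits of the stored value...
            Console.WriteLine(d);                // 3.33333333333333

            // ...while the "R" (round-trip) format shows them all.
            Console.WriteLine(d.ToString("R"));  // 3.3333333333333335
        }
    }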
How can I solve this problem? If a field in SQL Server is of type float
and can hold a number such as 3.3333333333333335, then the ADO.NET table
column should also be able to store that number when it is of type Double
or Decimal.
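As one illustration of where digits can be dropped, a minimal C# sketch of the Decimal case (this assumes .NET's documented behavior that the explicit double-to-decimal conversion keeps at most 15 significant digits):

    using System;

    class DecimalDemo
    {
        static void Main()
        {
            double d = 10.0 / 3.0;       // 3.3333333333333335 as a Double

            // The explicit double-to-decimal conversion keeps at most
            // 15 significant digits, so the trailing ...35 is lost.
            decimal dec = (decimal)d;
            Console.WriteLine(dec);      // 3.33333333333333
            Console.WriteLine(dec * 3);  // 9.99999999999999, not 10
        }
    }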
I had no choice except to use String!