Guest
In the code below, mOnePlusTwo evaluates to 0.30000000000000004 (although it
displays as ".3"). Consequently, comparing that value to mThree (.3) yields
false.
What's going on? Is there something else I should do (other than switching to
decimal types)?
double mOne = .1;
double mTwo = .2;
double mThree = .3;
double mOnePlusTwo = mOne + mTwo;   // stored as 0.30000000000000004, not exactly 0.3

bool mEqual = false;
if (mOnePlusTwo == mThree)          // false: the two doubles differ in their last bits
{
    mEqual = true;
}

Console.WriteLine("mOne = " + mOne);
Console.WriteLine("mTwo = " + mTwo);
Console.WriteLine("mThree = " + mThree);
Console.WriteLine("mOnePlusTwo = " + mOnePlusTwo);
Console.WriteLine("mThree == mOnePlusTwo? = " + mEqual);
displays as ".3"). Consequently, comparing that value to mThree (.3) results
in false.
What's going on? Is there something else I should do (other than switch to
using decimal types)?
double mOne = .1;
double mTwo = .2;
double mThree = .3;
double mOnePlusTwo = mOne + mTwo;
bool mEqual = false;
if (mOnePlusTwo == mThree)
{
mEqual = true;
}
Console.WriteLine("mOne = " + mOne);
Console.WriteLine("mTwo = " + mTwo);
Console.WriteLine("mThree = " + mThree);
Console.WriteLine("mOnePlusTwo = " + mOnePlusTwo);
Console.WriteLine("mThree == mOnePlusTwo? = " + mEqual);