Hi,
I've finally found what the difference is:
it seems a difference in the least significant bit
is enough to cause a problem with the part of my code
which determines whether a point is inside a 3D model.
It works as long as the reference inside point does not lie exactly on
any one surface, and that holds true 99.999% of the time in the debugger.
Sometimes my plane ends up with X=0.9999999 in the debugger
and X=1.000000 without the debugger attached, and that is enough to make
the difference between the point being exactly on the plane or not.
The inside point is used as the start of the ray,
and if the start lies on the plane then Ray.IntersectPlane returns null
(rather than zero).
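To show what I mean, the heart of the test boils down to arithmetic
like this (names simplified; my real code uses the engine's Ray and
Plane types, this is just the part that goes wrong):

    // Simplified sketch of the brittle comparison, not the engine's code.
    static bool IsExactlyOnPlane(float nx, float ny, float nz, float d,
                                 float px, float py, float pz)
    {
        // Signed distance of point P from the plane N.P + d = 0.
        float s = nx * px + ny * py + nz * pz + d;

        // Brittle: a one-ulp change in any input (0.9999999f vs 1.0f)
        // flips s between exactly 0.0f and a tiny non-zero value, which
        // is exactly the "on the plane or not" difference above.
        return s == 0.0f;
    }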
It also seems to make a difference when I add more code to test for the
specific points which caused the error last time and then print out more
detailed debug info, because then the point at which it goes wrong changes.
Is there something which can affect the least significant bit like this?
Could it be using higher-accuracy floating point without the debugger
attached? If so, why do code changes which don't alter the data at all
make a difference?
Ideally my code should not be susceptible to this, so I am going to change
it anyway.
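What I have in mind is roughly this (the epsilon is a first guess and
would need tuning to the model's scale):

    // First guess at a tolerance; not tuned yet.
    const float OnPlaneEpsilon = 1e-5f;

    // Never compare the signed distance to exactly zero; treat a small
    // band around the plane as "on the plane" instead.
    static bool NearPlane(float signedDistance)
    {
        return Math.Abs(signedDistance) < OnPlaneEpsilon;
    }

    // If the reference inside point falls inside that band for any face,
    // nudge it along that face's normal by more than the band width and
    // re-test, so the ray never starts exactly on a plane.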
But I would still like it to be totally deterministic,
even down to the last significant digit, as I'm only using float.
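From what I've read, the C# spec says an explicit cast to float forces
the value to be rounded to single precision even if the JIT kept it in a
higher-precision register, so if the extra-precision theory is right,
something like this ought to pin the result down (I haven't verified
that it fixes my case):

    // Force every intermediate back to single precision with explicit
    // (float) casts, which per the spec discard any extra precision the
    // FPU registers may carry.
    float s = (float)(nx * px);
    s = (float)(s + (float)(ny * py));
    s = (float)(s + (float)(nz * pz));
    s = (float)(s + d);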
thanks
Colin =^.^=