0.1*0.1=0.010000000000000002 (error)

Still don’t really get this:

In JavaScript, all numbers are floating-point numbers, so when we write an expression like 0.1 + 0.2, it returns a result we don't expect, like 0.30000000000000004.
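
For reference, here is a quick sketch of what the console shows (any standard JavaScript engine uses IEEE 754 double precision, so the output should match):

```js
// Both operations are done in IEEE 754 double precision,
// so the printed results carry a tiny rounding error.
console.log(0.1 + 0.2);          // 0.30000000000000004
console.log(0.1 * 0.1);          // 0.010000000000000002

// Which is why direct comparisons against the decimal literals fail:
console.log(0.1 + 0.2 === 0.3);  // false
console.log(0.1 * 0.1 === 0.01); // false
```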

0.1 and 0.2 are stored as approximations of their true values. The approximation of 0.2 is slightly larger than its rational equivalent, but the closest approximation of 0.3 is slightly smaller than the rational number.

So the sum of 0.1 and 0.2 ends up slightly larger than the rational number 0.3, because 0.2's approximation (and 0.1's as well) is slightly too large.
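
To see the direction of those approximations, printing extra decimal places is enough (again just a sketch; toFixed(20) shows enough digits of the values actually stored):

```js
// More decimal places reveal the stored (rounded) double values.
console.log((0.1).toFixed(20));       // 0.10000000000000000555 -> slightly above 0.1
console.log((0.2).toFixed(20));       // 0.20000000000000001110 -> slightly above 0.2
console.log((0.3).toFixed(20));       // 0.29999999999999998890 -> slightly below 0.3

// The sum of two values that are each a bit too large lands above 0.3,
// and it is a different double than the literal 0.3:
console.log((0.1 + 0.2).toFixed(20)); // 0.30000000000000004441
```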

I mean, why would a computer need to approximate 0.2 when it's already an exact value (1/5)? I could understand approximating 1/3 or another repeating decimal.
