
10 Jan 2008
3:22 a.m.
Hi! Why is 0/0 (which is NaN) > 1 == False, and at the same time 0/0 < 1 == False? Does this mean that 0/0 == 1? No, because 0/0 == 1 == False as well. I understand that the proper mathematical behavior would be that, since 0/0 is mathematically undefined, 0/0 cannot even be compared to 1. There is probably an implementation reason behind it, but do we really want such "hidden" behavior? Wouldn't it be better to throw some kind of an error? Mitar
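
[Editor's note: a minimal sketch of the comparisons being described, assuming Python. Note that a literal 0/0 actually raises ZeroDivisionError in Python, so the NaN is built explicitly here. The "implementation reason" is IEEE 754: every ordered comparison involving NaN is defined to evaluate to false.]

    import math

    nan = float("nan")  # 0/0 raises ZeroDivisionError in Python, so build NaN explicitly

    print(nan > 1)          # False: ordered comparisons with NaN are always false
    print(nan < 1)          # False
    print(nan == 1)         # False
    print(nan == nan)       # False: NaN is not even equal to itself
    print(math.isnan(nan))  # True: the reliable way to test for NaN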