I've been on this kick lately where I have started questioning some of the basic assumptions I learned in math... things like "you can't easily define pi" or the concept of an "irrational number". Today's philosophical debate is on the idea of division by zero.
This came up because I was creating a ContinuedFraction interface for integers... The number 3, for instance, comes out as 3 + 0/0 + 0/0... etc... I had to replace those zeros with nulls in order to prevent the app from trying to compute the 0/0 bits... That got me thinking... maybe it should be allowed...
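Here's a minimal sketch of that null workaround, in Python rather than whatever the original interface was written in; the names (`evaluate`, the leading-integer-plus-terms layout) are my own assumptions, not the actual ContinuedFraction API:

```python
def evaluate(leading, terms):
    """Evaluate a continued fraction: leading + n0/(d0 + n1/(d1 + ...)).

    A term of None marks a slot that must not be evaluated -- the
    stand-in for 0/0 described above -- so an integer like 3 carries
    only None terms and nothing is ever divided.
    """
    value = 0.0
    for num, den in reversed([t for t in terms if t is not None]):
        value = num / (den + value)
    return leading + value

# The integer 3: all fractional terms are nulled out, so no division runs.
print(evaluate(3, [None, None]))  # 3.0
```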
Let's take a simple concept. Let's say there are 10 pennies on the table, and it is decided that the pennies will be split evenly amongst everyone at the table. [Yes, pennies - I don't want anyone getting the bright idea to make change].
Now, if there are 5 people, everyone gets 2 pennies. Simple.
If there are 3 people, everyone gets 3 pennies and 1 is left on the table.
If there are no people, then there are still 10 pennies on the table.
What's so difficult about that?
So, in essence:
10/5 = 2r0
10/3 = 3r1
10/0 = 0r10
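The penny rule above is easy to write down as code. This is a sketch of the proposed convention (the function name is mine), where dividing among zero people hands out nothing and leaves everything on the table:

```python
def penny_split(pennies, people):
    """Quotient-and-remainder division under the penny convention:
    with nobody at the table, nobody gets anything (quotient 0) and
    every penny stays on the table (remainder = pennies)."""
    if people == 0:
        return 0, pennies
    return divmod(pennies, people)

print(penny_split(10, 5))  # (2, 0)  -> 10/5 = 2r0
print(penny_split(10, 3))  # (3, 1)  -> 10/3 = 3r1
print(penny_split(10, 0))  # (0, 10) -> 10/0 = 0r10
```

Note that the only thing added to ordinary integer division is the one special case; everything else falls through to the usual divmod.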
Seems like Occam's Razor would say that the whole concept has been arbitrarily inflated to give us errors, exceptions and NaN.
What about the other way around?
If there are 0 pennies...
0/5 = no pennies for anyone
0/3 = same
0/0 = same
Seems like we really don't have an issue there.
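For contrast, today's languages refuse exactly these cases rather than picking a convention. Python, for one, raises on 0/0 in integer division:

```python
# Standard integer division rejects the zero-divisor cases that the
# penny argument above treats as unproblematic.
try:
    divmod(0, 0)
except ZeroDivisionError as e:
    print("Python says:", e)
```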
Of course, this would mean that things such as limits get screwed up -- but so what? If we think we have irrational numbers, why not stir the pot a bit? Maybe they were not correctly defined in the first place.
Besides, that will open up the ContinuedFractions to more interesting options... like