I was remembering one of my classes from my first year of computer engineering at university: the definition of the XOR operator.
One of the ways to define XOR is:
A ⊕ B = ~A . B + A . ~B
In other words, one and only one of the variables is true. Great... couldn't be simpler.
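As a quick sanity check, here is a small Python sketch of that definition (the function name `xor` and the use of Python are my own illustration, not part of the original):

```python
# Two-variable XOR built directly from the definition A ⊕ B = ~A.B + A.~B
def xor(a: bool, b: bool) -> bool:
    return (not a and b) or (a and not b)

# Exhaustive check against Python's built-in inequality on booleans
for a in (False, True):
    for b in (False, True):
        assert xor(a, b) == (a != b)
```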
When you consider operators like AND and OR, you find that you can break these operations into smaller pieces when more than two variables are involved, because they are associative... like this:
A + B + C = ( A + B ) + C = A + ( B + C )
A . B . C = ( A . B ) . C = A . ( B . C )
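Associativity can be verified exhaustively over all truth assignments. This is a hypothetical helper of my own, not from the original:

```python
from itertools import product

def check_associative(op):
    # op is a two-argument boolean function;
    # verify (a op b) op c == a op (b op c) for every assignment
    return all(op(op(a, b), c) == op(a, op(b, c))
               for a, b, c in product((False, True), repeat=3))

print(check_associative(lambda a, b: a or b))   # True
print(check_associative(lambda a, b: a and b))  # True
```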
All is fine and dandy... but then let's say we find a case of exclusive or where all the variables (more than two) have to be considered at the same time.
Say you have three variables, and the result should be true when one and only one of them is true. If we try the same logic that was applied to AND or OR, we would end up with this:
A ⊕ B ⊕ C = A ⊕ ( B ⊕ C )
If none of them is true, the result is false. If exactly one of the variables is true, the result is true. If two of them are true, the result is false. But (and here's the problem) if all three are true, the result is true... and according to the problem definition that is wrong, because one and only one of them had to be true for the result to be true. The chained XOR actually computes parity (whether an odd number of inputs is true), not "exactly one is true". In other words, all the variables have to be considered at the same time to decide whether the result is true.
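The failure case is easy to reproduce. This sketch (my own illustration in Python; the function name `chained_xor` is hypothetical) chains two-input XORs left to right:

```python
# Chained XOR computes parity (odd number of True inputs),
# not "exactly one input is True".
def chained_xor(*xs: bool) -> bool:
    result = False
    for x in xs:
        result = result != x   # != on booleans is XOR
    return result

print(chained_xor(True, False, False))  # True  (one True: matches the intent)
print(chained_xor(True, True, True))    # True  (three True: parity says True,
                                        #        but "exactly one" wants False)
```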
For this case I think there should be a different operator, called "MultiXOR" or something like that. The problem with this operator is that there's no easy way to break it into smaller pieces. Here's a shot at a recursive definition:
x1 m⊕ x2 m⊕ x3 ... m⊕ xn = x1 . ~( x2 + x3 ... + xn ) + ~x1 . ( x2 m⊕ x3 ... m⊕ xn )
m⊕ being the MultiXOR operator, with the base case that a single remaining variable is its own value.
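That recursive definition translates almost line for line into code. A sketch in Python (the name `multi_xor` and the single-variable base case are my assumptions):

```python
# Recursive "MultiXOR" (exactly one input is True), following
# x1 m⊕ x2 ... m⊕ xn = x1.~(x2+...+xn) + ~x1.(x2 m⊕ ... m⊕ xn)
def multi_xor(*xs: bool) -> bool:
    if len(xs) == 1:
        return xs[0]            # base case: one variable is its own value
    head, tail = xs[0], xs[1:]
    # either the head is True and everything else is False,
    # or the head is False and exactly one of the rest is True
    return (head and not any(tail)) or (not head and multi_xor(*tail))

print(multi_xor(True, False, False))  # True  (exactly one True)
print(multi_xor(True, True, True))    # False (more than one True)
```

Unlike the chained XOR, this agrees with the problem definition on every assignment: it is true only when exactly one input is true.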
There you have it.