Euler's identity is as misunderstood as it is misattributed: Euler himself never committed the equation to paper.
Firstly, it is not truly an identity, because the values involved are not absolute.
Secondly, major misconceptions in mathematics arise from unspoken rules; in this case, the rule that the symbol = means equal to a given level of significance.
For example, 3.456432 = 3.4563456 (to 4 s.f.).
To one significant figure both numbers round to 3; to five significant figures they are no longer equal.
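The significant-figure comparison above can be sketched in Python. The helper `round_sig` is a hypothetical name introduced here for illustration; it rounds a number to a chosen count of significant figures so the two values can be compared at each precision:

```python
import math

def round_sig(x: float, sig: int) -> float:
    """Round x to `sig` significant figures."""
    if x == 0:
        return 0.0
    exponent = math.floor(math.log10(abs(x)))
    factor = 10 ** (sig - 1 - exponent)
    return round(x * factor) / factor

a, b = 3.456432, 3.4563456
print(round_sig(a, 4), round_sig(b, 4))  # both 3.456: "equal" at 4 s.f.
print(round_sig(a, 5), round_sig(b, 5))  # 3.4564 vs 3.4563: no longer equal
```

At one significant figure both values collapse to 3, matching the example in the text.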
The equation e^{\pi i} + 1 = 0 involves e, \pi, and i: the first two are irrational, and the last is imaginary. Raising these numbers to one another is not going to make them absolutely equal to anything; infinitely close, yes, but not an absolute value.
If we apply the laws of algebra to Euler's identity, it fails immediately: rearranging gives e^{\pi i} = -1.
This means that, to one significant figure, the answer was negative one; however, adding the arbitrary absolute value of 1 to the equation created an inequality, a value infinitesimally small but still greater than zero.
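The tiny-but-nonzero value described above can at least be observed in machine arithmetic: in double-precision floating point, where \pi itself can only be stored approximately, evaluating e^{\pi i} + 1 leaves a minuscule nonzero residue. A minimal sketch:

```python
import cmath

# e^(pi*i) + 1 evaluated with double-precision complex numbers.
residual = cmath.exp(cmath.pi * 1j) + 1

print(residual)       # approximately 1.2e-16j
print(abs(residual))  # tiny, but not exactly zero in floats
```

Note this residue is a floating-point artifact of the stored approximation of \pi, so it reflects the precision of the computation rather than an exact result.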
Going back to our algebraic rules and actual identities, we know that e^{\pi i} = (e^\pi)^i.
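This power-law step can be checked numerically: taking the principal branch of the complex power, raising the real number e^\pi to the power i does come back to -1, up to floating-point rounding. A quick sketch:

```python
import math

# Raise the real number e^pi to the power i (principal branch of the power).
lhs = math.exp(math.pi) ** 1j

print(lhs)  # approximately -1 + 0j
```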
If we take the i-th root of both sides, we get e^\pi = \sqrt[i]{-1}.
Dividing both sides by \sqrt[i]{-1}, we find e^\pi / \sqrt[i]{-1} = 1.
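This final quotient can also be checked numerically. Since 1/i = -i, the i-th root of -1 is (-1)^{-i}, which on the principal branch works out to e^\pi (Gelfond's constant, about 23.1407), so the ratio is 1 up to rounding:

```python
import math

# The i-th root of -1: (-1)^(1/i) = (-1)^(-i), principal branch.
root = complex(-1) ** (1 / 1j)

print(root)                      # approximately 23.1407, i.e. e^pi
print(math.exp(math.pi) / root)  # approximately 1 + 0j
```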
Now we have equivalence to an absolute value and this is the true essence of Euler's identity.