'Limits' in Maths

In an earlier blog on why the number 1 is not a prime number, I’d quoted these lines from Steven Strogatz’s The Joy of x:

“It pulls back the curtain on how maths is sometimes done. The naïve view is that we make our definitions, set them in stone, then deduce whatever theorems happen to follow from them. Not so. That would be too passive. We’re in charge and can alter definitions as we please.”


I found another such instance of this “(we) can alter definitions as we please” power in Jordan Ellenberg’s book, How Not to Be Wrong. It involves a question everyone encounters in school maths: what is the value of the infinite series

0.9 + 0.09 + 0.009 + 0.0009 + … (the ellipsis means the terms continue forever)
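
In symbols (my notation, not the book’s), the running total after n terms has a simple closed form:

$$S_n = \sum_{k=1}^{n} \frac{9}{10^k} = 1 - \frac{1}{10^n}$$

Each extra term shrinks the gap between $S_n$ and 1 by a factor of 10.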

Common sense tells us the sum keeps getting closer to 1 (0.9, then 0.99, then 0.999, and so on). Such examples led mathematicians to a deeper question:

“What is the numerical value of an infinite sum?”


This is not just a silly, only-mathematicians-would-care query. It is foundational to calculus, a subject that is critical in physics and so many other real-world fields. In fact, this was a nagging question in the calculus that Newton had created. Newton himself didn’t bother to answer it; he was interested in calculus only as a tool to explain gravity, not in the rigour of the underlying maths.


The great mathematician Cauchy introduced the notion of a “limit” into calculus via a striking innovation:

“What is the numerical value of an infinite sum? It doesn’t have one – until we give it one.”

This sounds crazy. How can you just decide (set) the answer? Isn’t maths about calculating the answer?


Yes, that is true. But remember, Cauchy was talking only of special cases, like the 0.9 series above: cases where the running total (1) gets closer to a certain value the more terms you add, and (2) never drifts away from that value again as further terms come in. In such cases, Cauchy announced, you can define the sum as the value the running total is closing in on. In the 0.9 series example, you define the sum as 1.
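
In modern textbook language (a formalization later sharpened by Weierstrass, sketched here in my notation, not the book’s), the definition reads: the infinite sum equals L precisely when the running totals get arbitrarily close to L and stay close:

$$\sum_{k=1}^{\infty} a_k = L \quad\text{means}\quad \forall \varepsilon > 0 \;\; \exists N : \left| \sum_{k=1}^{n} a_k - L \right| < \varepsilon \;\text{ for all } n \ge N$$

For the 0.9 series, the gap is $|S_n - 1| = 10^{-n}$, which can be made smaller than any $\varepsilon$ by taking n large enough; so the defined value of the sum is 1.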

“And then he (Cauchy) worked very hard to prove that committing oneself to his definition didn’t cause horrible contradictions to pop up elsewhere. By the time this labour was done, he’d constructed a framework that made Newton’s calculus completely rigorous.”


While Cauchy’s approach has worked for over two centuries without ever creating any “horrible contradictions”, it violates common sense on a different front. Common sense tells us the sum of the 0.9 series is obviously 0.999…, i.e., endless 9s after the decimal point. And here we have Cauchy telling us the answer is 1. How can two expressions that are obviously different (0.999… vs 1) stand for the same number? The mind-blowing answer: Cauchy was telling us that “the uniqueness of the decimal expression” needed to “go out of the window”!
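
A quick consistency check (standard algebra, not quoted from the book) shows why this causes no trouble: if 0.999… names a number x, and multiplying by 10 shifts the decimal point, then

$$x = 0.999\ldots \;\implies\; 10x = 9.999\ldots = 9 + x \;\implies\; 9x = 9 \;\implies\; x = 1$$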


However weird it may sound, mathematicians agree with Cauchy.
