The Bishop Who Hated Infinitesimals (and Why It Matters)
Bishop Berkeley and why we should fix AI’s Black Box
Bishop Berkeley’s critique of calculus is a timely reminder for AI developers and for society at large.
Many of us know about the Newton-Leibniz feud, that bitter quarrel between the pioneers of calculus over who invented it first, who borrowed from whom, and who ultimately deserved the glory. But behind this main act was a peculiar side story, one that features neither Newton's gravity-defying apple nor Leibniz's elegant notation. This is the story of George Berkeley, a philosopher-priest who took direct aim at the very core of mathematics and asked a simple yet devastating question: What, in God’s name, is an infinitesimal? The answer, he implied, was that there really wasn't one, not in any meaningful, provable way.
Who Was This Bishop, Anyway?
George Berkeley was not, by any stretch, your typical critic of mathematics. He was a philosopher, first and foremost, and he wore the robe of an Anglican bishop in a corner of Ireland not exactly famous for producing math prodigies. Berkeley lived in a world shaped by the Enlightenment, an era where reason and empirical science were beginning to challenge religious and traditional authorities. Newtonian physics was rapidly becoming a kind of intellectual gospel, a framework through which one might explain both falling apples and the movement of stars. The rise of empirical methods and a mechanistic worldview presented a direct challenge to metaphysical and theological explanations, creating a cultural backdrop in which Berkeley's critiques took on profound significance. Yet Berkeley's own interests were less about the celestial and more about the nature of knowledge. He was deeply concerned with how we know what we know, not just the content of our knowledge. And when he took a long, hard look at Newtonian calculus, he saw a big, glaring problem.
To Berkeley, calculus appeared to rest upon shaky conceptual ground. The mysterious infinitesimals, those infinitely small quantities that appear in the numerator or denominator only to vanish without a trace, bothered him. For example, in early calculus, one might consider the derivative of a function as the ratio of two infinitesimally small changes in variables, such as Δy/Δx, with Δx approaching zero. These quantities seemed like ghosts, phantasms that were somehow necessary to calculate derivatives but vanished the moment they had served their purpose. Berkeley coined the phrase, “ghosts of departed quantities,” and it stuck. He wanted mathematicians to explain, coherently and rigorously, how they justified their reliance on such specters. It wasn't until the introduction of the rigorous (ε, δ) definition of limits by mathematicians like Cauchy and Weierstrass that calculus found a solid foundation. This approach allowed mathematicians to precisely define what it means for a function to approach a value, addressing Berkeley's concerns about the logical inconsistencies of infinitesimals. To Berkeley, these ghostly quantities were no better than a magician's sleight of hand, a trick where mathematicians pretended rigor while actually cutting corners.
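For readers who want to see how the ghost was eventually exorcised, here is a minimal sketch in LaTeX of the modern limit definition of the derivative, worked through on f(x) = x², a function chosen purely for illustration rather than anything taken from Berkeley's or Newton's texts:

% Modern limit definition of the derivative: no infinitesimal ever has to
% "exist" on its own; we only ask what the difference quotient approaches.
\[
  f'(x) \;=\; \lim_{h \to 0} \frac{f(x+h) - f(x)}{h}
\]
% Worked example with f(x) = x^2: the increment h cancels algebraically
% before the limit is taken, so nothing has to vanish mysteriously.
\[
  \frac{(x+h)^2 - x^2}{h} \;=\; \frac{2xh + h^2}{h} \;=\; 2x + h
  \qquad\longrightarrow\qquad 2x \quad (h \to 0)
\]

Nothing in this calculation asks an "infinitely small" quantity to be both zero and nonzero at once, which is exactly the sleight of hand Berkeley objected to.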
How to Roast a Mathematician
To be fair, Berkeley wasn’t against mathematics. He was against bad epistemology. He published his arguments in The Analyst (1734), ostensibly as a critique of calculus, but also, perhaps even primarily, as a jab at atheists who used calculus to boast about the superiority of scientific over religious reasoning. His work challenged both mathematicians and those who saw themselves as the new intellectual priests of a secular age. Mathematicians such as Colin Maclaurin attempted to counter Berkeley's arguments by defending the practical success of calculus, even if the foundational rigor was lacking. Maclaurin, for instance, worked to justify Newton's methods and demonstrated that the results of calculus were not merely coincidental but systematically reliable, despite Berkeley's philosophical objections. Berkeley did what any good philosopher does: he got under everyone's skin. He asked, in essence, ‘If you’re so smart, why are you still relying on ideas you can’t even define properly?’ The result was irritation, sure, but also introspection.
What Berkeley managed to do was shake the faith in the calculus as it then stood. Not because he proved it wrong (the answers calculus produced were undeniably correct), but because he revealed that no one was quite sure why they were right. Mathematicians had powerful tools but lacked a firm philosophical foundation. The power of Newton's fluxions or Leibniz's differentials was there for all to see, but Berkeley's critique revealed that the scaffolding underneath was less than sturdy. How could they talk about “infinitely small quantities” and act as though these unseeable, untouchable entities had real, measurable existence? They couldn't, at least not without a clearer articulation of the concepts in play.
The Mathematicians Respond
Berkeley may have lacked mathematical training, but his philosophical training was sharp enough to provoke a substantial response. The very discomfort he caused led to a slow but monumental shift in mathematics. His critique highlighted the need for rigor. Mathematicians like Augustin-Louis Cauchy and later Karl Weierstrass set out to reframe calculus in terms that even a skeptic could accept. As I mentioned above, enter the (ε, δ) definition of limits, a notion that allowed mathematicians to rigorously define what they meant when they said something was approaching zero, but not quite there. It took roughly a century for calculus to shed the phantoms Berkeley had identified, but it did. The ‘rigorization’ of calculus became one of the crowning achievements of 19th-century mathematics. Moreover, in the 20th century, infinitesimals themselves were given a rigorous foundation in non-standard analysis, a branch of mathematics developed by Abraham Robinson. This new approach provided a formal way to work with infinitesimals, addressing Berkeley's concerns in a modern context. So, while Berkeley's critiques were valid in his time, mathematics has since evolved to address these issues comprehensively.
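For the curious, here is the standard (ε, δ) definition written out in LaTeX; the symbols f, a, and L are the usual generic placeholders, not notation lifted from Cauchy's or Weierstrass's original papers:

% The (epsilon, delta) definition of a limit, in the form Cauchy and
% Weierstrass made standard: "f(x) approaches L as x approaches a" means
% that for any tolerance epsilon, some window delta keeps f within it.
\[
  \lim_{x \to a} f(x) = L
  \quad\Longleftrightarrow\quad
  \forall \varepsilon > 0 \;\; \exists \delta > 0 :\;
  0 < |x - a| < \delta \;\Rightarrow\; |f(x) - L| < \varepsilon
\]

Everything here is a finite, ordinary number; the "approaching" is captured by quantifiers rather than by ghostly quantities, which is why this definition finally answered Berkeley's objection.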
The Irony of Berkeley's Victory
There’s a twist in this story, though, which Berkeley might have appreciated. In questioning calculus, he paved the way for a stronger, more resilient mathematical framework, even if it meant giving legitimacy to the very secular science he so often combated. His critique indirectly led to developments that would ultimately make calculus indispensable in the very scientific worldview that was undermining religion's intellectual authority. He never got to see this outcome; Berkeley died in 1753, long before the mathematicians took his rebuke to heart. But his role as the gadfly of early calculus is, if not celebrated, certainly acknowledged by those who understand the history of mathematics. He forced math to grow up, to address its own inconsistencies, and to become what it is today: a field rooted in careful, precise definition.
A Bishop's Legacy for the Digital Age
So what are the lessons for today? Well, in an era of machine learning, quantum computing, and abstract mathematics, Berkeley’s challenge is profound. How often do we accept black boxes in our models, happy to treat them as magic so long as they deliver results? Berkeley’s relentless questioning reminds us that rigor isn’t just an academic pursuit, it’s a necessary condition for true understanding. Critical thinking and a demand for rigor are essential for the advancement of knowledge, regardless of the field. Without such foundations, even the most powerful tools risk becoming unreliable. He teaches us to ask hard questions, to poke at the foundational assumptions, and to demand not just utility, but clarity.
It’s tempting to dismiss Berkeley as a relic, a curious figure from an age before formal proofs were what they are now. But perhaps he’s more relevant than ever, a reminder that the quest for understanding should never rest comfortably. We still have our ghosts, only today, they wear the face of neural networks and probabilities instead of infinitesimals. Berkeley’s revenge, it seems, is an eternal one: the insistence that behind every useful equation, there ought to be a clear and coherent idea. And in a world increasingly driven by algorithms and automation, that’s advice we would do well to remember.
Stay curious
Colin
Image source: Wikipedia