Impossible, irrational, delusionary, absurd, untrustworthy, fictitious, imaginary: You can't read much about religion today without encountering these adjectives, each intended to leave religious belief with a tired, messy, belittled sort of look.
You see it most with those who claim to be speaking in the name of scientific reason. The history of science itself, however, shows that the best scientists do not limit reality to the tangible or even the comprehensible. Unafraid of wild speculation, these are the minds making breakthroughs, thinking what was once unthought, discovering what was deemed impossible. Scientific positivists may scoff at the irrational and the imaginary. But science does not.
Think of this in the history of mathematics. For most of us, the square root of negative one, √-1 (the elusive i), is a hazy memory from high-school algebra class. But such key fields as quantum mechanics and electromagnetism depend on the mathematics of imaginary numbers. When engineers design airplane wings or cell-phone towers, imaginary numbers are vital to their calculations.
Yet, until the nineteenth century (recent mathematical history, considering that Pythagoras lived in the sixth century B.C. and Euclid in the third), the square root of negative one was seen as mystical, mysterious, and downright absurd. And for good reason: The square root of x, denoted √x, is a number that can be multiplied by itself to produce x. But since 1 × 1 = 1 and (-1) × (-1) = 1 too, what is √-1? Beginning algebra students are naturally confused, and so were centuries of mathematicians.
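To see the dead end concretely, here is the textbook argument in modern notation (a restatement for the reader, not a formulation any of these historical figures would have used):

\[
1 \times 1 = 1, \qquad (-1) \times (-1) = 1, \qquad \text{and in general } x \cdot x \ge 0 \text{ for every real number } x,
\]

so the equation x · x = -1 has no solution anywhere on the real number line.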
Imaginary numbers almost appeared in the geometry of Heron of Alexandria in the first century A.D. Attempting to compute the volume of a truncated pyramid, he came across the expression √(81-144), which produces the square root of a negative number, √-63. Without explaining his logic or identifying his dilemma, Heron bypassed the negation and wrote √63. His mistake might seem sloppy, but negative numbers themselves were regarded warily (if known at all) in his time. No wonder he ignored their imaginary square roots, which must have seemed doubly absurd.
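Filling in the step Heron skipped, in modern notation (anachronistic, since neither negative numbers nor the symbol i were available to him):

\[
\sqrt{81 - 144} = \sqrt{-63} = \sqrt{63}\,\sqrt{-1} = \sqrt{63}\,i \approx 7.94\,i,
\]

whereas Heron simply dropped the sign and recorded √63 ≈ 7.94.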
Imaginary numbers did not return to the mathematical scene until the sixteenth century, when Italian algebraists began solving cubic and higher-order equations. Scipione del Ferro made some progress in this field, but he still overlooked the complex roots ("impossible solutions," he called them) lurking in his equations. Instead, imaginary numbers made their entrance into algebra with Girolamo Cardano. Calling Cardano a scientist is both apt and preposterous. He worked as a physician and astrologer for some of Europe's greatest men; he passed through Milan, Pavia, and Bologna as a mathematics professor (banished from each as scandals arose). Eventually exiled from the academic world, in part because he claimed to have cast Christ's horoscope, he ended up as astrologer to the Vatican.
In the midst of his travels and debauchery, however, Cardano managed to have a productive mathematical career. He was the first Western mathematician to admit the possibility of negative numbers, and he went on to explore their square roots. Still, he wasn't ready to give credence to his results, adding the disclaimer, "So progresses arithmetic subtlety . . . as refined as it is useless." But even if Cardano considered √-1 useless, at least he considered it. His predecessors had deemed it dangerously magical, and given his reputation, Cardano was not the best spokesman for propriety. But slowly, and ironically, he helped dispel the superstitions of science.
Bolognese mathematician Rafael Bombelli made the next major move in 1560, recognizing that complex roots always come in pairs and that multiplying these pairs gives real numbers. The result was for him startling and disconcerting: "It was a wild thought in the judgment of many," he wrote in his Algebra, "and I too for a long time was of the same opinion. The whole matter seemed to rest on sophistry rather than on truth. Yet I sought so long, until I actually proved this to be the case."
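Bombelli's observation is easy to check in modern notation (the letters a and b and the symbol i are later conventions; this is a sketch of his result, not his own formulation). A complex root a + bi of a polynomial with real coefficients always arrives with its conjugate a - bi, and the pair multiplies out to a real number:

\[
(a + bi)(a - bi) = a^2 - (bi)^2 = a^2 - b^2 i^2 = a^2 + b^2,
\]

since i² = -1.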
Over the next centuries, the opinion of the mathematical world wavered between skepticism and rejection, with a healthy dose of befuddlement thrown in. Gottfried Leibniz, Newton's archrival, added his two cents in 1702 by calling √-1 "an elegant and wonderful resource of the divine intellect, an unnatural birth in the realm of thought, almost an amphibium between being and non-being."
Finally, the eighteenth-century mathematical genius Leonhard Euler began grappling with √-1, bringing it down to earth for practical study and computation. Swiss by birth but dividing his adult life between the St. Petersburg and Berlin academies of science, Euler derived the pivotal identity named in his honor: e^(πi) + 1 = 0. In simplified form, this identity says that e to the power of πi equals -1, thus linking the mysterious i with readily accepted mathematical terms. Moreover, Euler introduced the notation i to represent √-1, which he called an imaginary number.
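In modern notation, the identity drops out of Euler's formula linking the exponential to the trigonometric functions; evaluating it at θ = π gives the simplified form quoted above (this is the standard textbook route, not necessarily Euler's own derivation):

\[
e^{i\theta} = \cos\theta + i\sin\theta, \qquad e^{i\pi} = \cos\pi + i\sin\pi = -1 + i \cdot 0 = -1.
\]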
Euler was not the first to speak in such language. In his 1637 Géométrie, Descartes had observed that "the roots . . . are not always real, but sometimes imaginary." The French philosopher, however, was using the word as simply another term in √-1's litany of grimy adjectives: absurd, impossible, fictitious, and so forth. With Euler's use, imaginary gradually came to be an actual mathematical term with a universally recognized definition. "All such expressions as √-1, √-2 . . . are consequently impossible or imaginary numbers," Euler wrote, "for we may assert that they are neither nothing, nor greater than nothing, nor less than nothing, which necessarily renders them imaginary or impossible."
Thus, in the same breath, Euler denied the existence of √-1 (after all, it is impossible) and gave reality, at least mental reality, to that impossibility. It is almost as though he left a clause unspoken: i is impossible to see or touch or grasp with contemporary mathematical norms, but it still deserves thought and discussion. We cannot count in i's, but i has true applications. Leaving aside the debate about the ontological status of numbers (whether they actually exist or are only mental concepts), Euler persuasively showed that his i has as much mathematical validity as the more familiar real numbers.
And so imaginary numbers made their entrance into the mathematical world. Of course, there were still the skeptics, and no minor skeptics at that. Despite the advancements all around him, the great Victorian mathematician Augustus De Morgan expressed imperious contempt: "We have shown the symbol √-1 to be void of meaning, or rather self-contradictory and absurd." De Morgan-like suspicion lingered for a few decades, even at strongholds of learning such as Cambridge and Harvard. But as more applications surfaced and the dust of imaginary insurrection settled, no self-respecting scientist dared any longer to reject i.
The history of mathematics has left De Morgan's doubts far behind. As physicist Stephen Barr has observed in the pages of First Things, and as the work of centuries of mathematicians illustrates, "Science has shown us not only possibilities but limitations. . . . A mystery is not something incomprehensible in itself. It is something uncomprehended by us."
Scientific positivists, pencil and paper in hand, peer through shatterproof, UV-protected glasses at a world of animals, vegetables, and minerals. But genuine scientists, true seekers of knowledge, are not afraid to let the sunlight dazzle them, not afraid to seek and imagine what our myopic reason calls absurd.
Impossible, irrational, delusionary, absurd, untrustworthy, fictitious, imaginary: It is always easier to approach, or rather ignore, mysteries of math by dismissing them as false or unintelligible. And how much more so for mysteries of faith. So is God like an imaginary number, waiting to be discovered and accepted in a renaissance of faith? The simile is ridiculous on its face. But, in a curious way, the ramblings of scientific history remind those who strive for reason just how vast reality is. The realization is at once unsettling and exhilarating: Truth is far richer than our minds, always confined by the here and now, can prove or even imagine.
Amanda Shaw is a Junior Fellow at First Things.