The discipline of history is the science of incommensurable things and unrepeatable events. Which is to say, it is no science at all. We had best be clear about that from the outset. This melancholy truth may be a bitter pill to swallow, especially for those zealous modern sensibilities that crave precision more than they covet accuracy. But human affairs, by their very nature, cannot be made to conform to the scientific method—unless, that is, they are first divested of their humanness.
This is not to criticize the scientific method. It is an admirable thing, when used in the right way, for the right purposes. You can simultaneously drop a corpse and a sack of potatoes off the Tower of Pisa, and together they will illustrate a precise law of science. But such an experiment will not tell you much about the spirit that once animated that plummeting corpse—its consciousness, its achievements, its failures, its progeny, its loves and hates, its petty anxieties and large presentiments, its hopes and aspirations, its moments of grace and transcendence. Physics will not tell you who that person was, or about the world within which he lived. All such things will have been edited out, until only mass and acceleration remain.
By such a calculus our bodies may indeed become indistinguishable from sacks of potatoes. But thankfully that is not the calculus of history. You won’t get very far into the study of history with such expectations, unless you choose to confine your attention to inherently trivial or boring matters—in which case, studying the past will soon become its own punishment. One could propose it as an iron rule of historical inquiry: there is an inverse proportion between the importance of the question and the precision of the answer. This is, of course, no excuse to be gassy and grandiose in one’s thinking, a lapse that in its own way is just as bad as being trivial. Nor does it challenge Pascal’s mordant observation that human beings are, in some respects, as much automatons as they are humans. It merely asserts that the genuinely interesting historical questions are irreducibly complex, in ways that exactly mirror the irreducible complexity of the human condition. Any author who asserts otherwise should be read skeptically (and, life being short, as quickly as possible).
Take, for example, one of the most fascinating of these issues: the question of what constitutes greatness in a leader. The word “great” itself implies a comparative judgment. But how do we go about making such comparisons intelligently? There are no quantitative units into which we can translate, and no scales upon which we can weigh, the leadership quotients of Pericles, Julius Caesar, Genghis Khan, Attila, Elizabeth I, Napoleon, Lincoln, Stalin, and Lyndon Johnson. We can and do compare such leaders, however—or others like them, such as the long succession of American Presidents—and learn extremely valuable things in the process. But in doing so, we cannot detach these very different leaders from their contexts, treating them as pure abstractions, ignoring relevant details such as whom they were leading, where they were going, and what they were up against. “Leadership” means nothing if not leadership exercised in very specific circumstances. How does one compare a twentieth-century democratic leader with an absolute monarch or tribal chieftain? Yet what is the point in studying the past if each epoch is to be treated as though sealed unto itself? Comparisons are both irresistible and perilous—and the more interesting they are, the more difficult they are. If made entirely without context, comparisons become meaningless. But if made entirely within context, comparisons become impossible.
There is, then, a certain quixotic absurdity built into the very task historians have taken on. History strives, like all serious thought, for the clarity of abstraction. We would like to make its insights as pure as geometry, and its phrases as effortless as the song warbled by Yeats’ golden bird of Byzantium. But its subject matter—the tangled lives of human beings, in their unique capacity to be both subject and object, cause and effect, active and passive, free and situated—forces us to rule out that goal in advance. Modern historians have sworn off forays into the ultimate. It’s just not part of their job description anymore. Instead, their generalizations are always generalizations of the middle range, carefully hedged about by messy qualifications and caveats, and weighted down by a certain plodding literal-mindedness.
This can, and does, degenerate into such an obsession with conscientious nuance that modern historians begin to sound like the Prufrocks of the intellectual world—self-henpecked, timid, and bloodless, never daring to eat a peach unless they are certain that they’re doing so in proper context. Historians too often are writing for other historians, consumed by the compulsions and nervous tics of the guild, parsing one another’s footnotes, thumping the tub for the latest theoretical gimmickry, heedless of the increasingly remote possibility that there might be a lay reader or two listening in on their seminars and catfights, hoping to find words of wisdom and insight. Yet there is something admirable in this professional modesty, however wrapped in self-absorption it may be. It at least preserves something of the genius of history, which is to remind us of our limits and boundaries, and of the knotty problems that are inherent in any act of self-awareness.
History reminds us that our origins linger on in us. It reminds us that we can never entirely remove the incidentals of our time and place, because they are never entirely incidental. At the same time, it reminds us that this has always been true, for all men and women at all times. In other words, it reminds us that historicity is a part of the human condition. Therefore an appreciation of the past cannot be reached by mere introspection, although it probably cannot be reached without it, and without a wide range of lived experience. C. S. Lewis, who was very far from being a relativist, nevertheless warned against the universalizing oversimplifications of what he called the “doctrine of the Unchanging Human Heart,” which posits that “the things that separate one age from another are superficial.”
Just as, if we stripped the armor off a medieval knight or the lace off a Caroline courtier, we should find beneath them an anatomy identical with our own, so, it is held, if we strip off from Virgil his Roman imperialism, from Sidney his code of honor, from Lucretius his Epicurean philosophy, and from all who have it their religion, we shall find the Unchanging Human Heart, and on this we are to concentrate.
None of which is to say that those authors are reducible to those attributes, so that Virgil becomes nothing but Roman imperialism, as the flat-footed historicists of our age might well contend. Nor is it to deny great literature’s power to touch chords of universal humanity, a position for which Lewis would be the least likely of spokesmen. It is, however, to point out that such universals as are available to us can be apprehended only through careful attention to particulars. They cannot be reduced to neat propositions, neatly ingested. Like everything worthwhile, generalizations must be earned.
And even if they could be neatly codified, they would not stay that way for long. We can never finally reduce what we know about ourselves to a set of inert propositions, because whatever we know about ourselves, or think we know, becomes a part of what we are, at the moment we come to know it. At the very moment we absorb such propositions, we inch beyond their grip. Self-knowledge is, in that sense, constantly transformative. Writing history is even more so, because it means taking ever-moving aim at an ever-moving target with ever-changing eyes, ever-transforming weapons, and ever-protean intentions. “History,” writes the Hungarian-American historian John Lukacs, “by its very nature, is ‘revisionist,’” because it is “the frequent, and constant, rethinking of the past,” an enterprise that, unlike a court of law, “tries its subjects through multiple jeopardy.” The past changes, not only because it is constantly growing, but because the things we need from it change too.
The appropriation of this ever-changing past is, then, a paradoxical undertaking. And it becomes progressively more difficult precisely as one becomes more skilled, knowledgeable, and conscientious. Indeed, it is surprisingly easy to write bad history, and even easier to deliver oneself of crude but profound-sounding historical comparisons. It is easy, for example, for any layman to opine portentously about the ominous parallels between the histories of America and Rome, or between America and the Weimar Republic. And such parallels there may well be. But it is very difficult for experienced and knowledgeable historians to specify wherein those parallels are to be found—so hard that, these days, they will almost certainly refuse to try, particularly since they have no professional incentive to do so.
It is easy, in short, to treat the past as if it were just an overflowing grab bag of anecdotes, and careful professional historians are right to admonish those who do so. But only partly right. For man does not live by pedantry and careful contextualization alone. Historical insight is irreducibly an act of the constructive imagination, as much as it is a science of careful reconstruction. That will always be true, because the leap from a mountain of carefully compiled data to a compelling narrative or a persuasive theory will always be shrouded in mystery, propelled by the ineffable force of what Michael Polanyi called “tacit knowledge,” no matter the discipline in which the leap occurs.
And it will always be true, because the writing of history will always take its bearings from the needs of the present. How, indeed, could it be otherwise? So long as history is still a vital intellectual undertaking, indispensable to our civilized existence, then it will always be proper—and necessary—for us to seek out precedents in the past, and to do so energetically and earnestly, not being content to confine the past to a comfortable imprisonment in its own context. Nothing really has changed since Thucydides penned his History of the Peloponnesian War, sustained by the fragile hope that it would be consulted by “those who desire an exact knowledge of the past as a key to the future, which in all probability will repeat or resemble the past.” Probabilities aside, it remains true that the past’s few precedents are the only clues we have about the likely outcomes for similar endeavors in the present and future. Elusive as it is, the past is all we really have to work with, and all we can genuinely know. Clio’s laboratory may be disorderly and makeshift. But it has to be, if it is to remain true to the things it studies.
History, then, is a laboratory of sorts. By the standards of science, it makes for a lousy laboratory. No doubt about that. But it is all we have to work with. It is the only laboratory available to us for assaying the possibilities of our human nature in a manner consistent with that nature. Far from disdaining science, we should imitate many of its characteristic dispositions—the fastidious gathering and sifting of evidence, the effort to be dispassionate and even-handed, the openness to alternative hypotheses and explanations, the caution in propounding sweeping generalizations. Although we continue to draw upon history’s traditional storytelling methods, we also can use sophisticated analytical models to discover patterns and regularities in individual and collective behavior. We even can call what we are doing “social science” rather than history, if we like.
But we cannot follow the path of science much further than that, if only for one stubborn reason: we cannot devise replicable experiments and still claim to be studying human beings rather than corpses. It is as simple as that. You cannot experiment upon human beings, at least not on the scale required to make history “scientific,” and at the same time continue to respect their dignity as human beings. To do otherwise is murdering to dissect. It is not science but history that tells us that this is so. It is not experimental science, but history, that tells us how dreams of a “worker’s utopia” gave rise to one of the most corrupt tyrannies of human history, or how civilized, technically competent modern men saw fit to place their fellow men in gas chambers. These are not events that need to be replicated. Instead, they need to be remembered, as pieces of evidence about what civilized men are capable of doing, and perhaps by extension about the kinds of political regimes and moral reasonings that seem likely to unleash—or to inhibit—such moral horrors.
Thankfully, not all of history’s lessons are so gruesome. The history of the United States, for example, provides one reason to hope for the continuing improvement of the human estate, and such sober hopefulness is, I believe, reinforced by an honest encounter with the dark side of the American past. Hope is not real and enduring unless it is based upon the truth, rather than the power of positive thinking. The dark side is always an important part of the truth, just as everything that is solid casts a shadow when placed in the light.
Chief among the things history should teach us, especially those of us who live nestled in the comfortable bosom of a prosperous America, is what Henry James called “the imagination of disaster.” The study of history can be sobering and shocking, and morally troubling. One does not have to believe in original sin to study it successfully, but it probably helps. By relentlessly placing on display the pervasive crookedness of humanity’s timber, history brings us back to earth, equips us to resist the powerful lure of radical expectations, and reminds us of the grimmer possibilities of human nature—possibilities that, for most people living in most times, have not been the least bit imaginary. With such realizations firmly in hand, we are far better equipped to move forward in the right way.
So we work away in Clio’s makeshift laboratory, deducing what we can from the patient examination and comparison of singular examples, each deeply rooted in its singular place and moment. From the perspective of science, this is a crazy way to go about things. It is as if we were reduced to making deductions from the fragmentary journal of a mad scientist who constructed haphazard experiments at random and never repeated any of them. But the oddness is unavoidable. It indicates how different is the approach to knowledge afforded by the disciplines we call the humanities, among whose number history should be included. It also explains why the “results” can so often be murky.
There is not a sinister conspiracy behind this. Our professional historians do not, by and large, go out of their way to be obscure or inaccessible. They are hardworking, conscientious, and intelligent people. But their graduate training, their socialization into the profession of historical writing, and the structure of professional rewards and incentives within which they work have so completely focused them upon the needs and folkways of their guild that they find it exceedingly hard to imagine looking beyond them. Their sins are more like those of sheep than those of wolves.
Add to this, however, the fact that, for a small but increasing number of our academic historians, the principal point of studying the past is to demonstrate that all our inherited institutions, beliefs, conventions, and normative values are arbitrary—“social constructions” in the service of power—and therefore without legitimacy or authority. For them, history is useful not because it tells us about the things that made us who we are, but because it releases us from the power of those very things, and thereby confers the promise of boundless possibility. All that has been constructed can presumably be dismantled and reconstructed, and all contemporary customs and usages, being merely historical, can be cancelled. In this view, it would be absurd to imagine that the past should have anything to teach us, or the study of the past any purpose beyond the needs of the present. History’s principal value, in this view, is not as a glue but as a solvent.
There is some truth in these assertions. In the first place, scrupulous history cannot be written to please the crowd. And yes, history ought to be an avenue whereby the present escapes from the tutelary influence of the past. But the study and teaching of history ought to be directed not only at the accumulation of historical knowledge and the overturning of myths and legends, but also at the cultivation of a historical consciousness. Which means that history is also an avenue whereby the present can escape, not only from the past, but from the present. Historical study ought to enlarge us, deepen us, and draw us out of ourselves, by bringing us into a serious encounter with the strangeness—and the strange familiarity—of a past that is already a part of us.
In drawing us out, it “cultures” us, in all the multiple senses of that word. As such, it is not merely an academic subject or an accumulated body of knowledge, but a discipline formative of the soul. Historians should not forget that they fulfill an important public purpose simply by doing what they do. They do not need to justify themselves by their “practical” contributions to the formulation of public policy. They do their part when they preserve and advance a certain kind of consciousness and memory, traits of character upon which a culture of relentless change and instant erasure has all but declared war. Human beings are by nature remembering creatures, and storymaking creatures. History embraces and affirms those traits, even as it insists upon refining them by the light of truth. To do that alone is to do a great deal, at a time when all the forces seem to be arrayed on the other side.
History cannot do those things, however, unless it continues to be understood as Thucydides understood it, as a search for truth. And that proposition is, so to speak, very much in play. There are two characteristic fallacies that arise when we speak of truth in history, and we should be wary of them both. The first is the confident belief that we can know the past definitively. The second is the resigned conviction that we can never know the past at all. They are the respective fallacies of positivism and skepticism, stripped down to their essences. They are the mirror images of one another. And they are equally wrong.
The first fallacy has lost some of its appeal among academic historians, though rather less of it with the general public. One hears this particular reliance upon the authority of history expressed all the time, and most frequently in sentences that begin, “History teaches us that. . . .” Professional historians and seasoned students, to their credit, tend to cringe at such words. And indeed, it is surprising, and not a little amusing, to see how ready the general public is to believe that history, unlike politics, is an entirely detached, objective, impersonal, and unproblematic undertaking. Not only the unsophisticated make this error. Even the jaded journalists who cover the White House, and the politicians they cover, imagine that the question of a particular President’s historical standing will be decided by the impartial “verdict of history.” I say surprising and amusing, but such an attitude is also touching, because it betrays such immense naive confidence in the transparency of historical authority. Many people still believe that, in the end, after all has been done and said, History Speaks.
Whatever their folly in so believing, however, it does not justify a movement to the opposite extreme—the dogmatic skepticism and relativism implicit in the second fallacy. That, in its crudest form, is the belief that all opinions are created equal, and since the truth is unknowable and morality is subjective, we all are entitled to think what we wish, and deserve to have our opinions and values respected, so long as we don’t insist too strenuously upon their being “true.” Such a perspective is not only wrong: because it renders genuine debate and inquiry impossible, it is damaging to the entire historical undertaking.
Truth is the basis of our common world. If we cannot argue constructively about historical truth and untruth, and cannot thereby open ourselves to the possibility of persuasion, then there is no reason for us even to talk. If we cannot believe in the reasonable fixity of words and texts, then there is no reason for us to write. If we cannot believe that an author has something to offer us beyond the mere fact of his or her “situatedness,” then there is no reason for us to read. If we cannot believe that there is more to an author, or a book, than political or ideological commitments, then there is no reason for us to listen. If history ever ceases to be the pursuit of truth, then it will in time become nothing more than self-regarding sentimentalism, which in turn masks the sheer will to power, and the war of all against all.
This description sounds rather dire, but there is no reason to believe that we have reached such a pass, notwithstanding the academy’s current multitudinous follies. Our actions as readers and writers of history betray the fact that we continue to believe in these things implicitly, and would be lost without them. And not only as readers and writers of history. It cannot be noted often enough that even high-flying postmodernist (and post-postmodernist) scholars tend to be notoriously literal-minded when it comes to the terms of their contracts. This brute Johnsonian fact offers more of the heft of truth than the collective oeuvre of all the Gallic savants and their epigones. Perhaps it does not reflect especially well on us that we indulge so much patent silliness. It is appalling to think of how much valuable time and talent is thereby wasted, a thought that cannot fail to occur to a sensitive soul contemplating what happens to bright students entering our top-flight graduate programs in the humanities. But serious writing has survived far worse travails, and it will likely survive these too—simply because it addresses itself to profound human needs that cannot be willed away. History’s tortoise-like resistance to theory may prove to have been one of its saving graces in a hare-brained era.
Still, we all are better off when we make the effort to acknowledge our actual operating notions and motives—and thereby make them available for rational examination. This need not entail the task of formulating a grand Philosophy of History (too often a grand distraction from actually studying history). It may be enough to remember the two fallacies, which I will for convenience’s sake dub the Fallacy of Misplaced Precision and the Fallacy of Misplaced Skepticism, as the extremes we want to avoid. There is a world of difference between saying that there is no truth, and saying that no one is fully in possession of it. Yes, the truth is elusive, and only fleetingly and partially glimpsed outside the mind of God. But it is no folly to believe that the truth is there, and that we are drawn by our nature to search endlessly for it. Indeed, the real folly is in claiming otherwise.
Wilfred M. McClay holds the SunTrust Chair of Excellence in Humanities at the University of Tennessee at Chattanooga, and is the author most recently of The Student’s Guide to U.S. History (ISI Books), from which this essay is adapted.