It is of course the case that only God knows what will happen in the next century and the next millennium. But we human beings are created with an irrepressible disposition toward the future, as well as a capacity to recall the past. In the last year we published a “millennium series” of articles remembering, century by century, where we have been. Now we ask a group of notable thinkers, all of whom are familiar to our readers, what they expect of the future. Writers were free to choose their own topics, and we trust our readers will agree that the result is a suggestive, and frequently provocative, contribution to the right ordering of expectations, hopes, and fears for the century and millennium now upon us.
—The Editors
Elliott Abrams
Europe is the cradle of American civilization and the source of our language and politics. Many of our greatest statesmen have played out their foreign policies there—from our war for independence to the world wars in this century. Even in the last year of the millennium, American troops were landing in Europe again.
But in the coming century, this will change. Engagements like Kosovo are sideshows, and the future of American world power will not be determined by wars in Europe. Russia’s population is shrinking and its economy shows no signs of recovery, so its return to great power status is unlikely. Nor will world politics and our future depend on any likely events in Latin America or Africa, where no country is powerful enough to threaten our interests or become a world power. Wrenching though it will be for our traditional Eurocentrism, the great game is moving to Asia.
Consider the numbers first: China has 1.25 billion people and India 1 billion, and we should not overlook Indonesia at over 200 million, Pakistan at 150 million, and Bangladesh at 125 million. By 2050, India’s population will probably rise to 1.5 billion, Bangladesh’s will nearly double to 245 million, and Pakistan’s will more than double—to roughly the same level as our own. Put another way, Asia will become the world’s population center. It will be an increasingly powerful economic center as well, whatever the occasional setback, and it is clear from the amounts currently being spent on weaponry that India and China will increase their military influence, in the region and beyond.
The rise of Asia will present us with an ideological challenge as great as the economic and military challenges. The American notion that our own political system constitutes man’s highest political achievement will come under assault. This challenge will come most strongly not from dictatorships such as China, but from variants of the Singaporean—or perhaps a future Chinese—system that put far greater emphasis on community, order, and duty than on individual rights and personal autonomy.
Once the Asian systems become reasonably democratic, the debate will not be over elementary human rights but over which model of social and political organization is “best.” Since societies differ so much, that argument is not very useful in the abstract. It will, however, be a very good one for Americans to have in the twenty-first century, for it will force us to reconsider what the American system is and why we want to keep it. Too often in recent decades we have described it as a mixture of free markets and electoral institutions, a dry and lifeless formulation that would have surprised the Founders. They, like those in the Confucian tradition, understood that some conception of virtue—of what constitutes a life well led and a society well organized—must underlie our political institutions if they are to be sustained. The external debate over Asian vs. American “values” and political systems will enrich the discussion within this country. It will remind Americans of their own communal values, their own understanding of personal and civic virtue, and the Founders’ now nearly forgotten conclusion that religion was the irreplaceable source of both. Meanwhile, the large numbers of Latin and Asian immigrants to the United States, and the related debates over issues such as bilingual education and affirmative action, will force us to consider once again what it means to be an American and what “American values” really are.
In the early decades of the century the time will be ripe for us to begin this discussion. It is now widely understood that our current approach to these matters has failed. Just as the call for greater personal responsibility within the welfare system became bipartisan, many similar moves—to reform the value-free atmosphere of the public schools, to permit more religion in the public square, to end the years of sexual license—will achieve wide acceptance. The social mores of the last third of the twentieth century will be understood as an aberration, a largely failed experiment.
The alternatives presented by the very different societies coming to power in Asia will be helpful correctives, but not acceptable models. The genius of the traditional American system will, I predict, be revealed when we see it in the context of those alternatives—for Americans if not for the world. The unique American balance between community and individual, between rights and responsibilities, between private life and patriotic duty, between wealth and charity, and between local and national government, will be clarified, and Americans’ commitment to maintaining that balance strengthened. The view that the American way constitutes the only sound model for all mankind’s future may lose ground, but Americans’ understanding of and commitment to that way, and their pride in it, will grow. By the end of the twenty-first century it may even be as high as it was at the beginning of the twentieth.
Elliott Abrams is President of the Ethics and Public Policy Center, located in Washington, D.C.
Charlotte Allen
Predicting the future usually means extrapolating today’s fashions into long-term trends. In the early 1970s I read an article prophesying that by the year 2000 (now!) there would be no more broccoli. The idea was that since frozen and powdered foods, the latest innovations in their field, had captured much of the market, fresh produce would soon disappear, except maybe for those cube-shaped, agribusiness-developed tomatoes that were supposed to be on their way in.
Of course the author didn’t foresee Dean & Deluca: that just twenty-five years later, humble low-tech vegetables, preferably dirt-farm organic and with names you’ve never heard of, like burdock root, would play a key role as turn-of-the-millennium status symbols, the consumption of which would separate snobs from slobs at high-end restaurants and upscale grocery chains. It would be the giant food-processing megacorporations that turned out to have the problems, and they are currently consolidating hurriedly in the face of static customer bases.
The lesson: whether you are a Long Boom optimist or a long-gloom pessimist, palm-reading the lifestyles of the future usually sets you up to be proved wrong. Right now the Silicon Valley is hot, so futurists are preaching that the next thousand years are going to be just like the Silicon Valley, only without the traffic and without disease, war, poverty, or death. Maybe—but don’t count on it. “Life is a great surprise,” wrote Vladimir Nabokov, and that goes for the next five years as well as the next five hundred.
It is possible, however, to make some very general predictions about the next millennium, not by projecting the fads of the present but by studying the patterns of the past. That is because those who do not understand history are condemned to repeat it, and hardly anyone understands history. One way to imagine what the long-term future will bring is to examine what one could have imagined the future would bring if one were living at the beginning of the second millennium in the year 1000, or the first millennium in the year One.
Let us look at the earlier period first. Two thousand years ago, most of the Western world lived under the political and cultural hegemony of a “lone remaining superpower” known as Rome. Over the preceding several centuries this upstart Italian city had extended its civic borders to encircle the Mediterranean and had demolished and absorbed the Hellenistic empire that Alexander the Great had cobbled together. Superbly trained and equipped Roman legions stationed everywhere ensured peace (except for border skirmishes) and allowed a maritime shipping trade to flourish on a scale that would not be surpassed until the nineteenth century. To its colonies, Rome offered a universal rule of law, a universal language (Latin), and the possibility of universal citizenship even for the foreign-born. The Romans did not have the microprocessor, but they did have a kind of functional equivalent: slaves, whose labors underlay the creation of immense wealth. Catering to the new prosperous class, Virgil, Horace, Ovid, and Livy turned out epic and lyric poetry and elegant prose to rival the finest literature of the Greeks.
The Roman Empire seemed destined to last forever. However, a shrewd observer of the first-century scene, perhaps a Roman general posted in Pannonia or Syria, might have noticed the beginnings of massive changes that would sweep away the entire edifice over the next few centuries. Those changes were not political but cultural and demographic. Our general would probably not have noticed the harbinger of the most significant change of all: the little Jewish boy playing with wood-shavings in the obscure hamlet of Nazareth. A few decades later, though, he might have observed with alarm, as the emperor Nero certainly did, that increasing numbers of people, even in Rome, were worshiping this Jew as a god (and refusing to worship the emperor, or to respect the empire beyond a point), even though the supposed god had been executed as a felon after full and fair Roman justice.
Our sharp-eyed general would have noticed other things: that the Latin language, for example, although widespread and official, was in slow decline. Already in the early first century, the best years of classical Latin literature were over, and by the middle of the second century the empire’s finest thinkers and stylists would be writing in Greek: Plutarch, Galen, the satirist Lucian. Even Marcus Aurelius, Edward Gibbon’s ideal noble Roman, used Greek for his Meditations. Among the unlettered, spoken Latin was either dying out or imploding into the Romance languages.
There were also ever fewer Romans, culturally speaking. As early as Jesus’ time, the imperial city was recruiting many of its fighting men from Asia, and they brought their customs and gods with them into Europe. Gradually, the empire’s center of gravity moved geographically and ethnically eastward. By the time of the emperor Decius in the mid-third century, the imperial throne itself was a preserve of ethnic Balkans, and not long after that, the emperor Diocletian split the empire into east and west. Most tellingly of all, Rome never managed to extend its northeastern frontier much beyond the Danube, and it spent some five hundred years waging defensive warfare against the increasingly numerous German tribesmen on the other side. Our prescient Roman general might have foreseen that eventually his government would lose its fighting energy, and that the descendants of Julius Caesar’s harriers would pour into every corner of Western Europe, pushed by even more aggressive nomads from farther east.
By the year 1000, the beginning of our own millennium, there was still an entity that called itself the “Roman Empire.” There were two such entities, in fact: a glittering, polyglot theocracy on the Sea of Marmara, and a loosely organized Frankish kingdom in Northern Europe. Neither bore much resemblance to the Rome of old. In contrast to the confident imperial mood at the beginning of the first millennium, the mood at the beginning of the second millennium was generally one of exhaustion. Christianity itself seemed tired, its monasteries fat but spiritually moribund, its prelates beholden to secular rulers, and its eastern and western flanks so embittered by their cultural differences that in 1054 they acrimoniously split apart. The populations, economies, and cities of Western Europe were shadows of their former selves, beaten down by centuries of invasions, most recently from the most belligerent of the Germans, the Vikings. Some thought the world was about to end.
And yet, during the very year 1000, something happened that turned out to set the course of the entire millennium in a new direction: one of the Vikings, Leif Ericsson, who had decided to convert to Christianity a few months before, discovered America. Fifty years later, Leif’s distant kinsman Thorfinn, the last of the Norse pirate chiefs, retired his longboat and made a pilgrimage to Rome; that was the end of Viking raids.
In the year 1016, the Benedictine abbey at Cluny in France declared itself independent of outside supervision and spearheaded a rolling reform of the monasteries that swept through Europe. Cluniac monks in turn supported Pope Gregory VII’s singleminded and ultimately successful efforts to free the Church from secular control later in the century (see Robert Louis Wilken, “Gregory VII and the Politics of the Spirit,” FT, January 1999). The monasteries had always been centers of learning, and they now became powerhouses of creative invention. Such medieval innovations as the horse-drawn plow and the water mill (which greatly enhanced farm productivity), the glass lens, the clock, the flying buttress, and double-entry bookkeeping marked the beginnings of the long wave of technological, scientific, and commercial development that, for good or ill, is the very signature of the second millennium in the West.
A very prescient monk at Cluny in the year 1000 would not have foreseen exactly how things would turn out today—he could not have predicted the Internet—but he might well have suspected that the renewed spiritual energy at his own monastery would flow infectiously outward. In fact, the eleventh century proved to be one of the West’s most energetic; it was the time of El Cid, the earliest Crusades, the Norman Conquest, and the beginning of a new synthesis of classical and Germanic civilizations.
All this says something about what it may be like in the year 3000, if the Last Days don’t overtake us first. Our own country, like Rome, is the richest and most powerful on earth, but it is also in the throes of sweeping demographic changes, as Rome was at the beginning of the first millennium. America’s language, English, is in universal use, but it is also in a state of grammatical decomposition and literary decline. A thousand years from now, there may still be an entity that calls itself “the United States,” just as there was a Rome in the year 1000, but don’t expect it to be much like today’s U.S.A. At the same time, there is currently a crisis of cultural self-confidence in the West that resembles the malaise of a thousand years ago. The combination of new people (waves of immigrants from south and east) and a fading sense of common values seems to spell disaster, and there will undoubtedly be upheavals aplenty in the future as there have been in the past.
But if history teaches us anything, a vibrant new civilizational synthesis may also be just around the corner. Furthermore, we can take comfort from the things that have survived for 2,000 years and are likely still to be around when another thousand have passed: wine and song; dogs and ball games; parties and horoscopes; sandals and earrings; the Greek and Latin classics; lovely young ladies and obnoxious aunts; courage and hope and fear of death; the love of parents for their children.
And faith. As the third millennium begins we seem to be on the verge of a great religious revival, and we should remember that it was a new spiritual beginning that set in motion those momentous changes the last two times around.
Charlotte Allen is the author of The Human Christ: The Search for the Historical Jesus (Free Press).
Andrew Bacevich
The dawn of the new millennium finds the United States flirting once again with its old Wilsonian Temptation. When Woodrow Wilson set out to “make the world safe for democracy,” he acted with the certainty that Providence had chosen this nation as its agent of global salvation. This was America’s calling and its duty. If we take seriously the rhetoric issuing from the nation’s foreign policy establishment, that remains America’s duty today.
Wilson failed, his diplomacy subverted by Clemenceau and Lloyd George and his vision trumped, in the eyes of some, by Lenin’s own promise of utopia. But ever since, the conviction has persisted that Wilson erred chiefly in being premature. Rippling through the undercurrents of American politics, this conviction periodically resurfaces, shimmering with expectation. The passing of the Cold War has opened the way for the latest such reappearance.
Embarking upon his “war to end all wars,” President Wilson promised that the outcome would “bring peace and safety to all nations and make the world itself at last free.” With far less eloquence and little of his moral fervor, Wilson’s successors in the aftermath of the Cold War have embarked upon an analogous quest.
Analogous, but not identical. The melody remains, but the lyrics have changed. For Wilson, politics—above all the creation of a League of Nations—was paramount. His latter-day disciples place economic considerations at the forefront. The new name of the game is globalization. America will export not its political principles—at least not immediately—but its economic precepts and its lifestyle. “Opening” the world to trade, investment, technology, and popular culture will make possible the creation of wealth on a scale hitherto unimaginable. In the wake of abundance will come democracy, peace, and unprecedented opportunities for human fulfillment.
Although paying ritual obeisance to the successor to Wilson’s League, U.S. officials know that the real action has long since moved elsewhere. It’s not the United Nations that counts, but the World Trade Organization, along with Wall Street, Silicon Valley, Hollywood—and Washington.
Indeed, the Wilsonian Temptation is enjoying its latest revival not because present-day American leaders identify with the twenty-eighth President himself—for starters, his no-nonsense Presbyterianism clashes with the frothy religiosity of contemporary politics—but because the statement of American purpose that Wilson first formulated has since become irreplaceable. Wilson’s enduring achievement was to reconcile the nation’s origins as an anti-imperial republic with its aspirations to global preeminence. To justify U.S. entry into a war that he abhorred, Wilson gave voice to the ultimate expression of American exceptionalism. Unlike the empires it was soon to supersede, the United States acted not in pursuit of selfish interests but on behalf of universal principles (indistinguishable, according to Wilson, from American principles) and in pursuit of common international interests (an extension, in Wilson’s view, of America’s own interests). So it was in 1917 and so it has once again become today.
A major question of the new millennium’s first century is whether the neo-Wilsonian prophets of globalization will come any closer to achieving their goals than did Wilson himself.
In one sense, the conditions appear to be more favorable. In terms of ideological competitors, American-style democratic capitalism has swept the field. And although Jacques Chirac, Tony Blair, and others among the current crop of statesmen are no more given to flights of altruism than were their World War I predecessors, the people they govern have long since lost their stomach for power politics. The nations that once vied with the United States for dominance now receive the honorific title “Great Power” only as a courtesy.
But in another sense, the conditions are less favorable. Woodrow Wilson’s America understood that no achievement comes without cost. “Peace without victory” did not imply peace without sacrifice. In the Republic of Good Times that is Bill Clinton’s America, concepts like self-sacrifice or self-denial appear increasingly antiquated. Indeed, the allure of globalization lies in the expectation that Americans can in the long run do good while in the short term doing very well for themselves. Popular willingness to enlist in this variant of a Wilsonian crusade derives from the promise of gain without pain.
But the paramount lesson of the post-Cold War era’s first decade—made manifest in the Persian Gulf, “Kurdistan,” Bosnia, Haiti, Somalia, Rwanda, Sierra Leone, Congo, Afghanistan, Russia’s “near abroad,” Kosovo, and East Timor—is that the process of globalization won’t advance on autopilot. The second lesson, displayed in the U.S. response to many of those same crises, is that when it comes to enforcing the ground rules of an “open world,” the American people balk. They are willing to expend little treasure and less blood.
In short, a yawning gap separates the grand designs of the political class from the willingness of citizens to foot the bill. The story of U.S. foreign policy in the 1990s has been the story of searching for ways to paper over that gap, with cruise missiles, high-altitude bombing, and spurious peacekeeping missions as the preferred instruments. How long the United States can conceal this disparity between national aims and popular will looms as one of the larger questions of the century now beginning.
Andrew J. Bacevich directs the Center for International Relations at Boston University.
Stephen M. Barr
A person living in the year 1000 could scarcely have imagined how much we would now understand about the physical world. So forecasting the next millennium in science is certainly rather foolhardy. Looking a century ahead is less unreasonable. After all, some of the most revolutionary insights in what is still taught as “modern physics” are now almost a century old: Planck’s discovery of the quantum in 1900, and Einstein’s of relativity in 1905. Of course, a truly new idea cannot be forecast, but we can at least say what the big questions are now that have some relevance to religion, and whether the answers to them are likely to come in the next century. Here is my list of the top ten questions in science.
1) What are the ultimate laws of physics?
2) Are those laws deterministic?
3) What is the correct physical description of what happened at the Big Bang? Was it the beginning of time? And, if not, did time have a beginning?
4) Is the universe infinite in size? And are there an infinite number of planets?
5) How did life begin?
6) How did evolution happen? Is natural selection enough to account for it?
7) Did life begin elsewhere, and are there other intelligent creatures in the universe?
8) How does the human brain work in detail?
9) What is consciousness, and how does it fit into our theories of the physical world?
10) What are the limits of computers, and can the human mind be completely explained in computational terms?
Every one of these questions is far harder than any scientific question that has ever been answered before. For most of them, a solution by direct observation or experiment is out of the question. We can never directly observe what (if anything) existed before the Big Bang. We can never directly see the entire universe and thus verify that it is finite (if it is) since relativity theory does not permit us to. There are no fossils of the first living things or the prebiotic entities from which they are presumed to have sprung. It is unlikely that we will ever completely map out the neural circuitry of a human brain because of the number of nerve cells and connections involved. And as far as consciousness and free will are concerned, scientists and philosophers seem divided between those who think that there is nothing to explain and those who think that the explanations are too hard for us ever to understand. Except for the first, perhaps, I see no grounds for confidence that any of the questions on this list will be answered definitively by the year 2100.
The three questions in this list of greatest theological importance (because they intersect with dogma) are the determinism of physical law, the beginning of time, and the nature of the human mind. While science may never be able to give definitive answers to these three questions, one can make educated guesses.
Determinism was overthrown by quantum mechanics in the 1920s, so the question here is really whether determinism will make a comeback. There does exist a way to reinterpret the mathematical formalism of quantum mechanics deterministically—it is called the “many worlds interpretation.” However, there are certain serious technical objections to this interpretation; and even if it can be shown to be technically viable, it is thought to be impossible, even in principle, to verify whether it is correct. Another possibility is that quantum theory itself will be overthrown. But if the superstring theorists are right, then the fundamental principles of quantum theory are probably here to stay. The bogey of physical determinism is thus likely gone for good.
As far as the beginning of time is concerned, there are certainly interesting speculations according to which something existed before the Big Bang. However, it is very hard to imagine at present how these ideas could ever be tested. And even if one of them were found to be right, it is hard to see how it could be settled whether or not time had a beginning at some point earlier than the Big Bang. But since the idea of a universe without temporal beginning does not seem to sit very well with the Second Law of Thermodynamics, I would guess that cosmological theory at the end of the next century will not prefer an eternal universe.
Finally, in spite of the unfounded optimism of the proponents of “strong Artificial Intelligence,” there are compelling reasons to disbelieve the computational theory of the mind. And so, while I would expect dramatic advances in what computers can do, I do not expect that machines will be built which can understand abstract concepts or exercise the other powers that philosophers traditionally attributed to the “active intellect” in man.
This brings us to some broader reflections. What gave birth to science and remains its motive force is a belief in the power of human reason. But this belief can be grounded only in a transcendent view of man and thus, ultimately, in religion. The early modern era lost sight of this fact, in part because the attempt of the Church to repress error was seen as an attempt to repress man himself and his intellect. Thus, for many, the scientific spirit came to be defined in opposition to faith. This hostility to religion on the part of some scientists, however, really involves an inner contradiction that is now coming to the surface.
As a human enterprise, science magnifies man. It displays in an unparalleled way the tremendous power of human reason. And yet, as a horizon of thought, physical science is extremely narrow. It deals with matter in motion, with what can be measured, and weighed, and clocked. It is all too tempting for the scientist to try to fit the human being within that horizon. But what is left when that is done? One is left with the human mind as “nothing but a pack of neurons,” in the words of Francis Crick, and nothing but a “machine made of meat,” in the words of Marvin Minsky. One is left with human concepts, even the very mathematical concepts out of which science itself is built, as “neurological creations,” in the words of cognitive scientist Stanislas Dehaene. The scientist who succumbs to materialism is conflicted in his view of man. Like Hamlet he says, “How noble in reason! How infinite in faculty! . . . In apprehension, how like a god! . . . And yet, to me what is this quintessence of dust?”
As the present Pope has emphasized, revelation is not only about God; it also contains “the truth about man.” God reveals man to himself. Man is revealed to be a rational being, made in the image of God. Pope John Paul II has stated that human freedom is rooted ultimately in this truth about man. Science, too, is rooted in it. And that is why the battle of the next century will not be between science and religion, but within science and for its soul. By proclaiming the truth about man, religion will be found to be not an enemy of reason, which of course it never was, but perhaps its last defender.
Stephen M. Barr is a theoretical particle physicist at the Bartol Research Institute of the University of Delaware.
Robert H. Bork
Isaiah Berlin observed that “not one among the most perceptive social thinkers of the nineteenth century had ever predicted . . . the great ideological storms [of the twentieth century] that have altered the lives of virtually all mankind.” With that in mind, the one thing we may confidently predict for the next millennium is that our forecasts will have little relation to reality. Nevertheless . . .
Major changes in a civilization may be foreshadowed when words that once had power and weight become empty formulas rather than expressing vital, living principles. During the 1998 campaign on the Michigan referendum on legalizing assisted suicide, for example, opponents discovered, to their considerable surprise, that the phrase “the sanctity of human life” had absolutely no resonance with the public. What did give the opposition traction, and led to the defeat of the measure, was the example of the Netherlands, where assisted suicide gradually became euthanasia and then unconsented killing. People could imagine themselves as victims. The “sanctity of human life” had shriveled to the sanctity of one’s own life.
So it is today with the phrase “rule of law.” Born in Europe, exported to America, and fundamental to Western Civilization, the concept no longer commands much more than verbal allegiance. It would be possible to multiply examples, but the fact became obtrusive during the impeachment of Bill Clinton. It was indisputable that Clinton had committed perjury and obstructed justice, but his supporters and most of the public decried the proceedings as “just about sex,” which is apparently no longer morally serious. Perjury and obstruction of justice themselves were only as serious as what was being covered up. This means that law as such no longer has independent moral force or weight of its own.
That truth is worth reflection. Aside from the fact that justice is now defined on an ad hoc basis, there is the ominous reality that the rule of law is central to the practice of democracy. Rule by the people means that voters choose legislators according to the policies the candidates offer, that elected representatives will enact rules, and that judges and juries will apply those rules impartially and as intended. Unless that is true, public debate, elections, and legislative deliberation have little significance. Disconnected from governance, politics will become entertainment, and those who vote are likely to cast their ballots on the basis of celebrity. (Which leads to the somber thought that the Reform Party may be the model for the future.) Perhaps a highly complex and dynamic society, one that continually introduces more complicated issues than our institutions can cope with satisfactorily, cannot adhere to older notions of the rule of law and self-government.
But it is a matter for concern: when law is personalized and politicized, its force and impact are controlled by public relations and private moralities, not by majority preference. We have been on this course for some time, as shown by judicial rule without recourse to law, jury nullification of law, and, perhaps especially, bureaucracies that lay down most of the law that governs us with, at best, minimal accountability to either the people or their elected representatives and without concern for consistency. These developments could not occur without the inertia and weariness of the public, even their willingness to abandon the long-term safeguards and benefits of process for the short-term gratification of desires.
There surely has always been an element of this in our use of law, but that element seems to be expanding rapidly. If it is, the “rule of law” and “democracy” in the next millennium will be radically different from the idealized versions of them that most of us carry in our heads. Ruleless “law” will be a political weapon and control of the judiciary will therefore be a political prize. “Democracy” will consist of the chaotic struggle to influence decision makers who are not responsive to elections.
This is not a forecast of doom but merely of a continuing transformation of law and government, a journey to a new polity and society whose details we cannot even begin to imagine today. It may be tolerable, as many societies have been without the rule of law or democratic self-government. The future being uncertain, however, it will be the part of wisdom to resist the changes we see.
Robert H. Bork is the John M. Olin Scholar in Legal Studies at the American Enterprise Institute and author of The Tempting of America: The Political Seduction of the Law and Slouching Towards Gomorrah: Modern Liberalism and American Decline.
William A. Dembski
In a memorable scene from the movie The Graduate, Dustin Hoffman’s parents throw him a party to celebrate his graduation from college. The parents’ friends are all there congratulating him and offering advice. What should Hoffman do with his life? One particularly solicitous guest is eager to set him straight. He takes Hoffman aside and utters a single word—plastics!
In participating in this symposium, I feel like that guest. On the relation between religion and science in the coming millennium, I offer one word—information! Information is the primary stuff of the coming age. With the rise of the computer, we have come to appreciate the importance of information for technology, communications, and commerce. But already we are getting glimmers that information is something far more fundamental.
Mathematician Keith Devlin, for instance, ponders whether information should be regarded as “a basic property of the universe, alongside matter and energy (and being ultimately interconvertible with them).” Origin-of-life researchers like Manfred Eigen increasingly see the problem of the origin of life as a problem of generating information. Physicist Paul Davies sees information as poised to replace matter as the “primary stuff,” and with this replacement envisions a resolution of the mind-body problem. As he puts it, “If matter turns out to be a form of organized information, then consciousness may not be so mysterious after all.”
Although the information revolution is now in full swing on the technological front, it has lagged behind in the sciences. In part this is because information theory as originally developed in the 1940s was a purely mathematical theory that focused exclusively on the transmission of alphanumeric characters across communication channels. This limited conception of information is now being extended and generalized in ways that someday will allow us to fundamentally rethink the sciences.
Consider physics, for instance. Physics since Newton has sought to understand the physical world by positing certain fundamental entities (particles, fields, strings), specifying the general form of the equations to characterize those entities, prescribing initial and boundary conditions for those equations, and then solving them. Often, these are equations of motion that on the basis of past states predict future states. Within this classical conception of physics, the ultimate accomplishment is to formulate a “theory of everything”—a set of equations that characterize the constitution and dynamics of the universe at all levels of analysis.
But with information as its fundamental entity, this conception of physics gives way. No longer is the physical world to be understood by identifying an underlying structure that has to obey certain equations no matter what. Instead, the world consists of various systems that convey information, and the job of physical theory is to extract as much information from those systems as possible. Thus, rather than see the physicist as Procrustes, forcing nature to conform to mathematics, this informational approach turns him into an inquirer who asks nature questions, obtains answers, but must always remain open to the possibility that nature has more information to divulge.
Nothing of substance is lost with this informational approach to physics. As Roy Frieden has shown, the full range of physics as developed to date can be embedded within this informational approach (see his Physics from Fisher Information: A Unification, Cambridge University Press, 1998). The one thing that does give way, however, is the idea that physics is a bottom-up affair in which knowledge of a system’s parts determines knowledge of the system as a whole. Within the informational approach, the whole is truly greater than the sum of its parts, for the whole can communicate information that none of the parts can individually.
A universe with information as its primary stuff is radically open, and the physics describing it places no limits on what a physical system can in principle communicate. Physics without information as its primary unifying principle, on the other hand, always attempts to put reality back together from the bottom up. And while rebuilding the whole from its parts can admit a certain degree of novelty and unexpectedness, the information revealed will in the end (as in complex systems theory) still be determined by its parts.
Why are these considerations important to religion and faith? A world in which information is not primary is a world seriously hampered in what it can reveal. We’ve seen this with the rise of modern science—the world it gave us reveals nothing about God except that God is a lawgiver. But if information is the primary stuff, then there are no limits whatsoever on what the world can in principle reveal. In such a world, a man called Jesus can reveal the fullness of God, and bread and wine can reveal the fullness of this Jesus’ life and death. A world in which information is the primary stuff is a sacramental world; a world that mirrors the divine life and grace; a world that is truly our home.
William A. Dembski is a fellow of the Center for the Renewal of Science and Culture at the Seattle-based Discovery Institute. His latest book, Intelligent Design: The Bridge Between Science and Theology, was recently published by InterVarsity Press.
Francis Cardinal George
For most of this violent century, realism has been the dominant paradigm for theorists and practitioners of international relations in the United States. There are varieties of realism, of course. Classical realists such as Hans Morgenthau explain perpetual international competition and conflict by highlighting our selfish human nature, while neorealists such as Kenneth Waltz insist that all regimes behave similarly because they are similarly constrained by the “anarchic” structure of the post-Westphalian international system of sovereign states. The neorealist attempt to abstract from human nature raises several questions, one of which is “Why is the lack of certainty with regard to other states’ intentions a problem in the first place?” As realist explanations go, perhaps Hobbes’—which includes both human agents and structure—is the most analytically complete.
But differences in causal emphasis notwithstanding, most realists agree on the inevitability of the hallmarks of international politics: fear, attention to relative capabilities, the security dilemma, balancing through arming and alliances, costly arms races (and the supporting arms market), and war. Fortunately, most realists—contra Machiavelli—also agree that the challenges posed by human nature and freedom do not eliminate normative constraints upon the use of force, including forbidding the targeting of civilians. Instead, responsible realists have prescribed a combination of prudent balancing against capabilities in order to deter, and have argued for the just use of force when deterrence fails.
In his 1993 Foreign Affairs article “The Clash of Civilizations?” and later in his 1996 book, Samuel P. Huntington seems to argue from realist premises—with a civilizational twist—in predicting the international future and prescribing security measures. Huntington asserts that the perceived security and cultural threats posed by the developed and liberal democratic states of the West will be opposed by the collective will of modernizing and increasingly identity-conscious non-Western states, particularly those of Sinic and Islamic civilizations.
History is not ending. Instead, because of ideas—especially religious beliefs—and culture, the realist clash of states will be replaced with the multipolar clash of civilizations. For the West, Huntington adds, the very culture that threatens others also makes it vulnerable: consumerism and multiculturalism erode the strength and unity necessary for this coming clash. His conclusions follow logically enough from these premises. Since “what happens within a civilization is as crucial to its ability to resist destruction from external sources as it is to holding off decay from within,” Western security in this emerging world order will depend upon its collective ability to reverse cultural decay and incoherence. Huntington also recommends noninterventionist policies toward conflicts within other civilizations and core state mediation to prevent escalating conflicts along “fault lines” between civilizations.
While this argument is not devoid of theoretical and empirical weaknesses (e.g., interstate conflicts within civilizations still loom large, perceptions of the West among non-Westerners are more complex than Huntington allows, and the power of the West relative to other civilizations may not be declining), I wish to highlight one of its theoretical strengths. Although subscribing to realism in asserting that uncertainty about intentions will produce fear between civilizations, Huntington asserts that comfort with intentions can occur within civilizations. In his words, “Publics and statesmen are less likely to see threats emerging from people they feel they understand and can trust because of shared language, religion, values, institutions, and culture.” In other words, as liberal peace and critical theorists have been asserting for years, ideas shape human nature and intentions, and these intentions can be known.
The weakness of Huntington’s idealism lies in his overly simplistic depiction of how ideas can mitigate the security dilemma. More important than the mere sharing of ideas is the quality of the ideas shared. Two societies that universalize the notion of rights ought to be more comfortable with one another than two that simply share a commitment to forceful self-aggrandizement.
Weaknesses and oversimplifications notwithstanding, Huntington’s paradigm—a combination of realist and idealist elements—has merit, and illuminates an important international role for the Church in the coming decades. Although factions within Islamic societies may overstate the security threat posed by the West, they are rightly concerned with the effects of modernity and postmodernity on their premodern, faith-oriented cultures. But as the Church’s often difficult relationship with the Enlightenment has helped us to see, not all modern ideas are inconsistent with faith and human fulfillment—or security.
In the coming decades, the dialogue between Christianity and Islam can be a powerful influence on how Muslims perceive modernity and the West’s intentions. Such a dialogue can assure the Muslim world that science and authentic liberalism are neither totally grounded in nor necessarily conducive to secularism, relativism, and individualism, and that liberal societies’ human rights foundations promote peace among them. At the same time, this dialogue can also assure Islam that Christians are well aware of modern societies’ cultural shortcomings and that the Church is working to convert these cultures—an effort that Pope John Paul II has called “the evangelization of culture.” Christians should welcome Muslim cooperation in addressing the moral failures of modernity.
At the end of his book, Huntington offers a final peace-promoting recommendation to the West: “Renounce universalism, accept [global] diversity, and seek commonalities” that constitute the “thin minimalist morality” of “Civilization.” But as mentioned earlier, common ground is not as important as true ground. Cultures share truths because civilizations are products of a human nature created and graced by the one God. Cultures also deviate from the truth—in ways both universal and home-grown—because of the same human nature, created free and prone to pride. As an evangelizing Church makes publicly available the truth that is Jesus Christ, she is in a unique position in this next century and millennium to offer Christ’s gifts to all cultures and to create not just a common civilization but a globalization of solidarity—perhaps even what Pope Paul VI did not hesitate to call a civilization of love.
Francis Cardinal George is the Archbishop of Chicago.
Robert P. George
Anyone gazing into a crystal ball with the aim of divining the future of relations among members of different religious communities in the new millennium would do well to remember how things appeared as recently as 1965. In the euphoria occasioned by the Second Vatican Council, observers looked forward to a flowering of ecumenism and perhaps even the reunification of the Christian Church. Official interfaith commissions were formed to reexamine issues that had historically divided Eastern and Western Christians, Protestants and Catholics, Christians and Jews. Denominational leaders sought opportunities for interfaith cooperation, and theologians explored possible compromises and new understandings to overcome differences in areas of doctrine, discipline, and authority.
One thing seemed certain: the ecumenical action would be on the left wing of the various religious communities, not on the right. Traditional Catholics, conservative Protestants, and observant Jews were viewed as part of the problem, not part of the ecumenical solution. After all, interfaith dialogue would require “flexibility,” “openness,” “tolerance”—virtues of the religious and sociopolitical left (it was supposed), not the right. Indeed, the “rigidity,” “dogmatism,” and “authoritarianism” of conservative religious believers would (it was thought) make them obstacles to the dialogical enterprise. Ecumenism would have to proceed despite anticipated conservative resistance.
Then came the culture war.
The massive assault of the secularist left—largely acquiesced in and very often abetted by the religious left—on traditional Judeo-Christian moral beliefs about sexuality, marriage and family, and the sanctity of human life brought conservative elements of the various religious communities together in the pro-life/pro-family movement. In the beginning, the pan-orthodox alliance was understood by religious conservatives themselves as a sort of marriage of convenience. And even today there are religious conservatives—including some who are active in the movement—who view it that way. (Perhaps it goes without saying that liberal critics of the pan-orthodox alliance are certain that the alliance can never be anything other than a marriage of convenience.)
What is remarkable, and what was in 1965 surely unpredictable, however, is that at century’s end an alliance that began as a marriage of convenience in the moral-political sphere would, without anybody planning or even foreseeing it, blossom into a genuine—and profound—spiritual engagement. As things have turned out, the serious ecumenical action is almost entirely on the religious right—and we have the cultural depredations of the left to thank for it. God really does have a sense of irony, if not humor!
Today, traditional Catholics, Eastern Orthodox Christians, evangelical and other conservative Protestants, and believing Jews are not only working, but praying, together. Interfaith cooperation in pursuit of operational objectives in the culture war (e.g., banning partial-birth abortion, preserving the institution of marriage) has occasioned the emergence of genuine, and unprecedented, spiritual fellowship.
The ecumenism of the pan-orthodox alliance is what Baptist theologian Timothy George calls “an ecumenism of the trenches.” It unites Protestants and Catholics, Christians and Jews, who have in common very practical worries about what Dr. Ruth has in mind for their children and what Dr. Kevorkian has in store for their parents. It brings together people from different communities of faith who listen to Dr. Dobson for advice about parenting and to Dr. Laura for reassurance that they aren’t crazy.
Occasionally, pan-orthodox ecumenism takes the form of theologians and denominational officials sitting down to hammer out joint statements about “justification” or other doctrinal matters. More often, though, it happens when people of different religious traditions find themselves praying together in front of abortion clinics, marching together for life, and working together to assist pregnant women in need.
Although it sometimes takes the form of interfaith worship, it more commonly manifests itself in informal shared prayer, counseling, and mutual spiritual support among people who got to know each other not necessarily in church or synagogue, but at a school board meeting where they had come together to protect their children against, say, indoctrination into the mystery cult of secular sex education.
To focus on the “grass roots” ecumenism of the pan-orthodox alliance is not to suggest that it lacks an intellectual core or that it swings free of the judgments and actions of religious authorities in the various traditions. Andrew Sullivan may or may not be correct to say that First Things is the “spiritual nerve center of the new conservatism,” but it is surely the intellectual nerve center of pan-orthodoxy. The influential initiative known as “Evangelicals and Catholics Together” is the most prominent of many programs of the Institute on Religion and Public Life (which publishes First Things) aimed at deepening ecumenical engagement. And religious officials from the Pope and Catholic leaders such as Richard John Neuhaus, to national evangelical leaders such as Chuck Colson and Bill Bright, to notable rabbis such as David Novak, Marc Gellman, and Daniel Lapin are doing their parts to encourage the pan-orthodox movement and consolidate (and, where possible, formalize) its ecumenical achievements.
The ecumenism growing out of the pan-orthodox alliance is the real thing—it is ecumenism that takes religious faith, and therefore religious differences, seriously. It neither ignores nor trivializes (much less relativizes) the important points of doctrine, discipline, and authority that divide Protestants and Catholics, Christians and Jews. It proceeds not by pretending that all religions are equally true or that doctrinal differences don’t matter, but rather by respectful engagement of theological disagreements.
But this creates a puzzle. How can there be genuine spiritual fellowship between people who sincerely consider each other to be in error on profoundly important religious questions? Protestants and Catholics differ over issues of sacraments, priesthood, papal authority, the Marian dogmas; Jews and Christians disagree about whether Jesus of Nazareth is the messiah promised to the Jews, the eternally begotten son of the heavenly Father, the second person of the triune God.
The spiritual fellowship of the pan-orthodox alliance has been made possible by the promotion of interfaith understanding. The experience of the past three decades reveals that the misperceptions and mistrust that long impeded pan-orthodox fellowship in the days before the culture war were rooted in misunderstanding of the scope and content of religious differences. By largely eradicating misperceptions and overcoming mistrust, the pan-orthodox movement has been transformed from a mere marriage of political convenience. Without ignoring their differences, orthodox Protestants and Catholics, Christians and Jews, have, in other words, come to understand and appreciate that what they have in common goes far beyond a common morality. They share a larger set of beliefs—a worldview—that includes much that is common in theology, anthropology, sacred history, and religious practice.
Protestants need not accept Marian doctrines to understand that Catholics are truly Christians and not “worshipers” of Mary. Catholics do not compromise such doctrines when they understand Protestants who decline to accept them as Christian brothers. Christians do not abandon their belief in the divinity of Christ when they join Pope John Paul II in recognizing Jews as “elder brothers in faith.” Nor do even the most traditional Jews turn from the Torah when they acknowledge Christians as worshipers of the one true God rather than as pagans and idolaters.
What about the future of the pan-orthodox alliance?
Because it is built on a strong base of shared understanding and common worldview, I am confident that it will flourish in the twenty-first century. I have no doubt of its capacity to survive defeats, should they come, in the moral-political struggle. I am even hopeful of its capacity to survive victories—though that, of course, is the far greater challenge.
Will there be bumps along the way? Of course there will be. The “Evangelicals and Catholics Together” initiative has been vigorously opposed by some conservative Protestants and received only tepidly by some traditional Catholics. A lack of agreement on moral issues such as capital punishment and contraception could prove divisive. Catholics and Southern Baptists may continue to squabble about whether the United States should send an ambassador to the Vatican. Many Jews are offended by Baptist and other evangelical ministries directed to their conversion. Some Jews think that Catholic apologies for past anti-Semitism aren’t sufficient; some Christians think that Jewish accounts of the intellectual origins of the Holocaust often overlook the pagan and anti-Christian sources of Nazism.
If the pan-orthodox alliance is to flourish, there are certain developments that must occur. The process of healing the racial divide within American Christianity must begin in earnest. Grass roots spiritual engagement must bring about the fellowship of black and white believers. Of course, this is mainly, though not exclusively, an issue for the black and white evangelical communities, though Jews and Catholics have, as John DiIulio observes, an important supporting role to play in facilitating reconciliation. And believers from the black churches must be welcomed into, and must be willing to join in, the pro-life/pro-family movement at every level—including its leadership. There are black evangelical leaders such as Boston’s Rev. Eugene Rivers who are certainly able, and appear to be willing, to step into the breach.
In the area of Jewish-Christian relations, important intellectual work must be high on the agenda. Christian thinkers following the lead of the Pope must press more deeply to understand the Jewish core of Christian faith and the continuing religious significance of living Judaism. Jewish scholars must similarly work to achieve a theology that respects and makes sense of the Christianity that spread God’s Torah throughout the world.
Perhaps none of this will happen. It is possible that yet again the problem of interfaith relations will deceive the crystal ball. Whether unanticipated, even unimaginable, events will derail the pan-orthodox alliance, God alone knows. But for those of us who, from our various traditions of faith, seek to do His will, there is every reason to hope that He will continue to bless our cooperation.
Robert P. George is the McCormick Professor of Jurisprudence at Princeton University.
Paul J. Griffiths
To be human is to acquire and maintain a habit of being. Such acquisition and maintenance requires, in turn, institutional forms. And at the end of the second millennium there are, worldwide, only three institutional forms through which enough power flows to provide translocal habits of being: nation-states, corporations, and churches. A millennium ago, as the water clocks and the candles marked the turn of the era, two of these (nation-states and corporations) were either altogether absent or utterly insignificant. The churches, though, were then very much present and in most ways dominant; they were the principal donors of habits of being.
But as the silicon-controlled computer clocks mark the beginning of Y2K, things are different. The churches, all of them, are at the margins. The world’s intellectual life goes on without them, in universities resting comfortably in the arms of the corporations and nation-states that fund them; and its political life goes on without them, formed by the demands of the market and the calculations of realpolitik. Concomitantly and inevitably, the forces giving us our habits of being are now primarily economic and secondarily political: we have become consumers who occasionally think of ourselves as citizens. Any tattered remnants of religious habits of being are now subservient to (indeed, usually understood precisely in terms of) habits of consumption.
This means that nation-states and corporations are now the principal determinants of everyone’s character and action. The Church (speaking now of the Catholic Church) has always understood the deep importance of institutional forms that carry culture translocally; it is, after all, itself one of them. It should, then, care deeply about the situation we’re in, and should be reading the signs of the times with close and prayerful attention, attending with the wisdom of a serpent and the innocence of a dove in an effort to see where things are moving and how to shape them into conformity with the alien designs of God.
The first sign of the times is that the influence and significance of nation-states are declining while those of corporations are advancing. Many transnational corporations now have larger budgets than all but the largest of nations; and some, like the Korean conglomerate Daewoo, have larger loads of debt than all but the most heavily laden nations. National politics is, to an increasing extent, subject to the needs and demands of corporations without national identity or loyalty. Nation-states are dissolving themselves into economically motivated transnational groups, as in the case of the European Union.
As the third millennium continues, our habits of being will increasingly be formed principally by corporate forces, not national ones. Nation-states will become parasites upon corporations, subservient to the flow of capital and to the institutional forms produced by that flow. And we will come to understand ourselves primarily not as Americans or Japanese or Indonesians (much less as Christians or Buddhists or Muslims—that option is scarcely available now), but as consumers of the products of the Microsoft Corporation, or General Motors, or Sony.
This is not a comfortable development for the churches. No church—and certainly not the Catholic Church—can be happy that the principal donors of habits of being are corporations, institutions ordered by appetite. But here there is another, more hopeful sign of the times. It seems, increasingly, that the human heart remains unsatisfied with the rewards of appetite sated, or even with the passions of appetite kindled. Something more is always wanted, something deeper and broader and longer-lasting, and as the possibility of finding this something in citizenship fades, and as the bankruptcy and corruption of the corporate promise begins slowly to become evident, people turn again to the churches, and with renewed passion. And so we have resurgent Islam across the world, the explosive growth of Christianity in much of Africa and parts of East Asia, and the increasing evidence of inchoate desires on the part even of jaded and sophisticated European and American Catholic Christians for a habit of being that is truly Catholic, truly all-embracing.
This means that the issue for the churches, as the third millennium advances, will be to find ways to offer to their faithful just such all-embracing habits of being. This will mean, among other things, a depth of commitment to learning from one another not so far evident. Catholic Christians, for example, have much to learn from Muslims about (for instance) the senses in which the American experiment of separating church from state is a Trojan horse from which the corrosively destructive forces of the transnational corporation inevitably burst forth. Tibetan Buddhists in exile have much to learn from Jews about what it is like to attempt survival as a diaspora people. The churches must, if they are to survive as anything other than decorative appendages to consumerist capitalism, if they are to provide anything more than the kind of formation that permits their members to fill in the religious preference box on the census form, look away from the nation-state, already effectively in history’s dustbin, away from the corporation, the principal power of this present age, and toward one another, the only institutional forms that offer a hope of resistance.
A good beginning could be made by developing serious discussions among Catholic, Orthodox, and Protestant Christians, Muslims, and believing Jews about how best to offer to the faithful a catechesis of resistance and contradiction so that they might resist the demonic powers abroad in the world as the millennium begins. If this is not done, the idolatry and violence whose proliferation has soaked the last century of the second millennium in blood will proliferate still further in the third, for there will be nowhere to turn for a habit of being that could act as a sign of contradiction.
Paul J. Griffiths is Professor of the Philosophy of Religions in the Divinity School at the University of Chicago.
Hilton Kramer
It is with a feeling of immense foreboding that I regard the future of cultural life in the first decade of the new millennium, never mind what might be in store for our society further on in the coming century. All the portents point to an acceleration of the merry, mindless, technology-driven surrender to the complacent nihilism that has already overtaken so many of the institutions of cultural life—the universities, the museums, the book-publishing industry, the entertainment media, and the demoralized remains of liberal journalism. What looks to be a certainty in the next decade is that the telecommunications revolution will further imperil the already fragile sanctuaries of high culture—and thus the treasured intellectual traditions of the West that require a vigorous and creative high culture for their survival and renewal.
It is already a fact of life that the telecommunications revolution has had the effect of further immoralizing and infantilizing almost every aspect of popular culture, which is now massified on a greater scale than ever before. It is also a fact of life that our democratic society has lost the power to protect its citizens—and particularly its children—from the evil effects of this cultural imperative.
Consider, for example, the fate of the campaign to limit easy access—which means, of course, mass access—to pornography. In certain cities—New York most conspicuously—the effort to “clean up” districts formerly dominated by porn shops is regarded as a significant success. And in some respects it has been a success. In my own neighborhood in Manhattan—in what used to be called Hell’s Kitchen, a short walk from Times Square—the changes at street level have been dramatic. Yet the vilest forms of pornography are now more easily accessible on the Internet than they ever were on the streets. That, too, is a paradigm of the cultural future.
So is the fact that some of the milder forms of pornography—and some not so mild—have now become standard fare in mainstream television programs, movies, Broadway shows, museum exhibitions, and the many glossy magazines that cater to the fashion and “lifestyle” interests of young adult men and women. A similar decline in the standards of public decency has now become the norm in certain branches of the fashion advertising industry, especially those promoting the sale of underwear and jeans. Sexual freakishness is the norm, too, in, for example, the fashion pages of the New York Times Magazine, which deliberately compete with some of the seamier, so-called “transgressive” forms of visual art in the galleries and museums in an effort to appeal to young consumers. These, too, are portents of a cultural future that is likely to be further immoralized.
Against the growing power of this immoralizing imperative, our democratic institutions have so far proved to be powerless. Far from resisting it, our schools have largely surrendered to it, and so for the most part have the courts. Our political leaders talk about “family values” yet do nothing to support the moral integrity of marriage and the family. There are pockets of resistance, of course. I think the home-schooling movement may be the most significant, for what seems to be the driving force of this movement is not only the desire to secure a more sound education for children than can now be obtained in the public schools, but also a well-founded fear of the immoralizing influence that the culture of the public schoolroom may now be expected to exert on innocent minds. For the schoolroom, too, is now more decisively influenced by the popular culture than ever before. For these reasons, I would expect the home-schooling movement to continue to grow in the coming century, yet its very nature will prevent it from becoming anything more than a marginal phenomenon.
As for the fate of high culture, everything will depend on its ability to marshal a principled resistance to the influence of popular culture. With few exceptions, our universities can no longer be counted upon to contribute anything significant to that resistance. Our academic culture has become part of the problem. So, for the most part, have the liberal media. Everything now depends on those fragile sanctuaries of high culture that continue to exist at a distance from our mainstream institutions. Whether these embattled sanctuaries will play as great a role in cultural life in the next century as they did in some earlier periods of this century, no one can say with any certainty.
But we can take some solace, perhaps, in the fact that so-called “coterie” movements in high culture have contributed most of what we still value in the cultural life of this century. The odds against repeating that difficult success are now greater, to be sure, and we enter the next century with our ranks depleted. Yet the outcome in the cultural sphere of our civilization, as in the political sphere, will depend on the quality of its leadership, and no one can yet say what that will be.
Hilton Kramer is Editor of the New Criterion.
Peter J. Leithart
A millennium ago, Europe faced the dual prospect of economic and social collapse on the one hand and spiritual awakening on the other. From 970 to the mid-eleventh century, there were forty-eight years of famine. Trade and communications broke down, and travelers were threatened by bands of roving brigands. During much of the tenth century, the Church was in continual tumult, with seventeen Popes over fifty years. Simultaneously, the Cluniac movement was giving fresh impetus to monasticism, and during the 990s pagan peoples were flooding the Church: Poles, Magyars, Icelanders, Danes, and, in the East, the Kievan Rus. It was, as someone once said, the best of times and the worst of times.
So, it seems, are all times, including the years at the close of the second millennium of the Christian era. But if our day is also the best and worst of times, “best” and “worst” have been inverted. Materially, Western Europe and America enjoy unimaginable wealth, and the comparative peace of the past fifty years is, if not unprecedented, nonetheless a striking anomaly in world history. Yet the same nations whose rulers swore fealty to Christ a millennium ago are now among the most secular nations on the earth. In Christian terminology, the past several centuries have witnessed widespread apostasy.
As a result, the West is, as it was at the end of the tenth century, a mission field. Indeed, the West is the Church’s main challenge in the coming centuries. Evangelization of pagans has become something of a specialty; if Celts and Magyars and Vikings can be persuaded to beat their swords into plowshares, so can the Sawi and the Hutu. In Europe and America, however, the Church faces a mission field unlike any it has encountered before. Never has a civilization seemed so capable of successfully completing what T. S. Eliot called the experiment in forming “a civilized but non-Christian mentality.” Barbarians may be ruling us, but they are damned efficient barbarians.
More importantly, the West poses a unique religious challenge. New Age faddism notwithstanding, ours is not a pagan civilization, but a civilization once Christian and, in spite of itself, still Christian in many respects. Introduced into the world as heady new wine, Christianity is a hard habit to shake, and the part of the globe once known as “Christ’s domain” has a lingering hangover. During the past three centuries, Western history has been a search for relief that will enable us to turn on the light without wincing. Thus far, the search has been fruitless, but it continues, and even intensifies, as the new millennium approaches.
One prospect is that the medicine will be found; the destruction of what remains of Christendom may actually succeed, which would mean an end to what we know as “the West.” I disagree profoundly with the pop-apocalypticism recently portrayed in Tim LaHaye’s Left Behind series, and I am profoundly skeptical of the techno-apocalypticism associated with the Y2K bug. In one sense, though, I must stand with the loonies: history is not a seamless development but one full of gnarls and tangles, fits and starts, rises and falls. Catastrophes do happen, current trends don’t continue, it can happen here, and there is a God who judges. Success on the mission field of the West may come only when the West is no longer the West. Perhaps the rubble must be cleared before new building can begin. I do not say this with complacency. We simply cannot imagine the horrors that would be unleashed if the remaining constraints were removed. If this house falls, great will be the fall of it.
This is, as the apostle Paul would say, a perplexing prospect, but not a reason for despair. If the current trend of prosperity is not guaranteed to continue, neither is apostasy. Repentance on this scale cannot be manipulated and it is not a product of strategy. Renewal will come, if it comes, not from the head office, but from unexpected backwaters. Martyrs make the most successful missionaries, and air-conditioned churches with padded pews hardly nurture a spirit of martyrdom. Elsewhere, however, there are armies of martyrs-in-training. While the West has been busily trying to forget why it sent missionaries in the first place, missionary efforts have continued and in the past two centuries have met with unparalleled success. Mission fields have been transformed into fields of missionaries.
If it survives, the West will have to relearn the habits of Christian civilization from those once considered barbarians; and light will shine from what was once the deepest darkness. If good things can come from Galilee—and they can—then surely good things can come from Kenya and South Korea.
Peter J. Leithart is Fellow in Theology and Literature at New St. Andrews College in Moscow, Idaho.
George McKenna
It is hard to find many encouraging prospects for the culture of life in the new century. The culture of death seems to be on a roll.
In 1973, after Roe v. Wade was decided, pro-lifers predicted that the right to kill unborn children would turn into the right to kill already-born children and other vulnerable people. The reaction, from their opponents and from the press: “Oh, come on!” It took twenty years for “Oh, come on!” to become “Why not?” They started pulling out plugs on respirators in 1976, and in the 1980s they started pulling out feeding tubes. In 1987 a North Dakota court declared that even spoon-feeding is “artificial and intrusive.” In the 1990s the culture of death made the next big leap: to the act of neo-infanticide that is partial-birth abortion.
Then, in 1996, two U.S. Appeals Courts, one in California and one in New York, struck down state laws banning euthanasia. (The appeals court in California explicitly confirmed the “slippery slope” predictions of abortion opponents by basing its decision on the “compelling similarities between right-to-die cases and abortion cases.”) The Supreme Court reversed the two decisions, but the majority opinion left the door open to future claims, and a series of concurring opinions, adding up to a majority, outlined circumstances that would justify a constitutional “right” to die with the help of doctors.
The state of Oregon hasn’t waited for the courts. In 1994, 52 percent of Oregon voters backed a measure allowing physician-assisted death. California and Washington voters had turned down similar proposals, but this one was sold to the public as ultra-cautious: physicians could only prescribe, not administer, the drugs; the drugs were only for patients with six months or less to live; and the patients had to request them three times.
We know what happens to these guidelines. The Netherlands—the North Star of the right-to-die movement—began physician-assisted suicide fifteen years ago with guidelines like Oregon’s, and virtually every one of the guidelines has been violated with impunity. At least one out of five so-called assisted suicides in Holland is involuntary (called “termination of the patient without explicit request”). That seems to be the direction we’re heading.
Is there any reason to hope that things can be turned around? Well, consider this statement: “Our progress in degeneracy appears to me to be pretty rapid.” Abraham Lincoln wrote it in 1855 as he reflected on some discouraging developments, including the Kansas-Nebraska Act of 1854, which had opened the western territories to slavery. Slavery was spreading like a cancer (a “wen,” Lincoln called it). Yet, within ten years, the whole damned thing was gone. Even before the terrible war, even by 1860, it had been shaken to its foundations by acting men and women: by ministers, novelists, newspaper editors, “conscience Whigs,” and by Lincoln himself.
The present “progress in degeneracy” is as rapid as that of slavery in 1855. But the death culture has the same vulnerability as the slave culture: just about everyone knows that it is wrong. In his 1858 debates with Stephen A. Douglas, Lincoln kept probing the raw nerve of slavery: “The real issue in this controversy—the one pressing upon every mind—is the sentiment on the part of one class that looks upon the institution of slavery as a wrong, and of another class that does not look upon it as a wrong.”
And so today: Even most pro-choicers know that killing innocent people is wrong. The mutterings of conscience can be heard not only in the public’s responses to polling questions but in the embarrassed, euphemistic language used by abortion advocates: abortion is “reproductive choice,” abortion clinics are “women’s clinics,” unborn children are “products of conception,” and so on. Consider the New York Times’ struggles to avoid saying “partial-birth abortion.” First they called it “a rarely used procedure,” until they found out it wasn’t rare; then it was “a procedure known officially as D&X,” until they found out that “D&X” was not in the medical literature but was just made up by its inventor; now they call it “a type of late-term abortion.” That a newspaper that prides itself on its lucid English should resort to such weirdly elliptical language shows that something is troubling its editors.
It is getting so hard to deny the humanity of unborn children. Remember when pro-choicers used to call them “blobs of tissue”? Ultrasound pictures have retired that expression, and fetology keeps giving us astounding new information about their mental and physical activities in the womb. It may be just because of all these developments, because the perceived line between the unborn and the newly born has thinned into nonexistence, that the most hardened pro-choicers are tempted to go all the way to infanticide with Peter Singer.
My hunch is that most in the abortion movement will not. But, speaking as someone interested in political change, I don’t much care which way hard-core abortion advocates go. They are a distinct minority. The majority of Americans belong in the “mushy middle,” allowing women to have a “right” to abortion while agreeing that it is wrong and should be limited.
Eventually, as Lincoln said of slavery, it will be all one thing or all the other. Right now the death culture seems to be winning. But history is not like astronomy; we are not passively watching the inexorable movement of things. Human beings act into history, nudging events into surprising tangents. Suppose a candidate in a general election were to say: “The underlying division in this campaign is between those who regard abortion as a wrong, and would limit it, and those who do not look upon it as a wrong and would tolerate its extension.” This might or might not help the candidate (I think it would) but it would certainly blow away much of the smog that has obscured public thought on this issue. And, once people start thinking, all bets are off.
The rest of us, the nonpoliticians, have our own role to play. Like those in the 1850s who refused to stand back and watch the “progress in degeneracy,” we must act into history. In his 1947 novel The Plague, Albert Camus wrote of what had to be done “by all who, while unable to be saints, but refusing to bow down to pestilences, strive their utmost to be healers.” Reaching out to the ambivalent majority, getting it to think, and getting its support for feasible measures that will stop the further spread of the abortion culture and put it in the course of ultimate extinction must be on the agenda of healers in this new century.
George McKenna is Professor of Political Science at City College of New York and author of The Drama of Democracy (McGraw Hill).
David Novak
The twentieth century has witnessed the worst and the best in Jewish-Christian relations. On the negative side of the ledger is the Holocaust. Even if responsibility for the systematic murder of six million Jews lies with the anti-Christian ideology of the Nazis, the fact is that traditional Christian anti-Judaism was easily appropriated by that ideology, and too many Christians either supported the murder of the Jews or did nothing about it when much could have been done against it. But on the positive side of the ledger, in the postwar era Jews and Christians have been talking and working together in a constructive and wholly unprecedented way. Why did the worst and the best emerge in the same century? How one answers this question will suggest an agenda for good Jewish-Christian relations in the twenty-first century.
The world Jews and Christians now inhabit is neither Jewish nor Christian. That has led Jews and Christians interested in surviving in this world to become more genuinely religious for the sake of their own identity. This must be seen from the background of a more startling fact: in this past century, especially during its latter half, the political power of Jews has grown at the same time the political power of Christians has shrunk. Our culture, which formerly could be considered “Christian,” no longer looks to Christianity for its justification. This has come as a great shock to many Christians, and it largely explains the worst and the best of the ways Christians have related to Jews.
At worst, Christians have blamed Jewish support of secularization for their loss of power, and some of them have turned to anti-Semitism as a way to restore the anti-Judaism that once was an element of Christian social and cultural hegemony. But the Holocaust showed that modern, racial anti-Semitism not only does not need religious anti-Judaism, but is actually intent on destroying the Jewish roots of Christianity. Without those roots, Christianity cannot survive.
At best, the full realization of what the Holocaust meant has influenced Christians to seek out Judaism and Jews, and to come to understand that the inherent theological rivalry between Judaism and Christianity cannot be overcome by Christian defeat of Judaism and the Jewish people. Today, Christian scholars learn Judaism from Jewish scholars and thinkers, along the way discovering just who the Jews really are. The results, both intellectual and political, have been impressive.
Despite the Holocaust, Jewish power in the secular world has grown enormously. Proof of this is how well the Jewish people survived the Holocaust, with a greater determination to be more active and less vulnerable in the world. Jews have not only become equal citizens in Western democracies, they have become leading citizens. And, of course, the reestablishment of the State of Israel has given Jews a political presence in the world they have not had since biblical times. In the course of all this, many Jews have looked to the increasing secularity of the world as the source of their newly won power. Jews of this mindset are usually anti-Christian, since they regard Christians as the primary group wanting the Jews to return to the ghetto—or worse.
However, an increasing number of Jews are now realizing that it is problematic to look to secularism for our survival. Secularism has no need for Judaism, for what makes Jews Jewish in the first place. It is, therefore, a recipe for our disappearance—either with a bang or a whimper. That is why more and more Jews are turning inward to the religious content of the Jewish tradition to justify their continued identity. And, while some of these returning Jews look at Christianity and Christians as an ancient foe, others are beginning to realize that Christians are facing challenges similar to ours, and for the same reasons. Christians and Jews alike are the new exiles of the contemporary world, struggling with how to sing the Lord’s song in a strange land.
I think Jewish-Christian relations in the next century and millennium will grow in breadth and depth if Jews and Christians accept that neither community can or should control the secular realm. Christians cannot and should not attempt to regain a world they have lost, and Jews cannot and should not try to gain a world they never had. When Jews and Christians are able to say more and more “I am a stranger on earth, do not hide from me Your commandments” (Psalm 119:19), they will find how much they need each other to be able to keep these commandments here and now.
David Novak holds the J. Richard and Dorothy Shiff Chair of Jewish Studies at the University of Toronto.
Edward T. Oakes
Leaving aside genuine scientific discoveries such as relativity and the helical structure of the genetic molecule DNA, or the truly original innovations in linguistics from Noam Chomsky, the twentieth century has, in retrospect, shown itself to be singularly poor in intellectual creativity. Even a quick glance at the events from 1900 to the present reveals that nearly every idea driving the passions of what Nietzsche called “Great Politics” in the last one hundred years was conceived during the reign of Queen Victoria: communism, anticolonialism, psychoanalysis, socialism, protest atheism, evolution by natural selection (and its malign cousin, Social Darwinism), feminism, utilitarian ethics, behaviorism, racism (and the racist version of anti-Semitism)—all were hatched in that overheated terrarium, the late-nineteenth-century mind. (Fascism might seem uniquely twentieth century in its genealogy, but it is merely the fusion, however awkward, of socialism and ethnocentric racism, as the term National Socialism already indicates.)
Moreover, with the lone exceptions of feminism and evolution, not a single one of these ideas has survived its (at times bloody) trials in the laboratory of the twentieth century. In fact the two major legacies of the eighteenth century, free-market capitalism and universal human rights, show much more promise of animating the politics of the next millennium than anything conjured up in the nineteenth, let alone the twentieth, century.
Something similar applies to Christian theology, although because there is almost always a significant time lag between developments in secular culture and the Christian churches, one must move the threshold ahead by almost a century. For in the decades between 1919 (when Karl Barth published the first edition of his famous commentary on Romans) and 1988 (when Hans Urs von Balthasar died) the Christian churches witnessed an extraordinary outpouring of theological genius, while after that date everything seems faintly derivative and hollow, where positions have already begun to harden into rote ideologies, and “ignorant armies clash by night.” To compare the achievements of Barth and Rudolf Bultmann on the Protestant side or Balthasar and Karl Rahner on the Catholic side with contemporary efforts immediately leads to the dismaying realization that perhaps the era of theological creativity will pass from the churches in the next century in much the same way that genuine intellectual creativity almost completely disappeared from twentieth-century culture generally (again, Einstein, Chomsky, and a few others excepted).
One recent and sure sign of contemporary decadence is the habit of pigeonholing every theological position into the hoary categories of liberal and conservative and then judging the position on that basis, often without any direct acquaintance with the text advocating that position. I do not wish to deny some initial utility to these terms, but their constant invocation remains worrisome. For one thing, these terms are largely irrelevant in the history of theology (was Arius conservative or liberal?). And if the liberal/conservative spectrum cannot truly describe the past, why are these categories not similarly pointless today?
In my opinion, the reason for this dreary state of affairs is that great curse of the contemporary intellect, what I shall call the digital mind. In the famous debate over whether computers can think, I hold with John Searle and Roger Penrose that they cannot, precisely because they operate by manipulating binary oppositions. But I would also say the same of humans: they too prove incapable of anything passing for real thought whenever they browbeat every insight, like cattle being herded in a slaughterhouse, into the binary boxcars of liberal/conservative, progressive/traditional, liberating/constraining.
For this reason, predicting the future of theology becomes not just impossible but bootless. Any real theology will be startlingly new, as unexpected as the emergence of Augustine’s Confessions, or Pascal’s Pensées—or Karl Barth’s famous No to liberal theology with his commentary on Romans, which, in Karl Adam’s well-known phrase, hit the playground of the theologians like a bombshell. But such newness, should it come, will strike contemporaries not just as unexpected but also as a grace. For whatever the history of theology teaches, it surely must be that theological creativity is not a birthright of the Church. On the contrary, some centuries are numbingly sterile in terms of theological creativity: the sixth to the tenth centuries in Western Europe; the era of nominalism in the fourteenth and fifteenth; the eighteenth in Roman Catholic seminaries.
Speaking of seminaries, Catholic theologians in the academy are currently in a dither about Roman control of their product, especially after the promulgation of Ex Corde Ecclesiae and Ad Tuendam Fidem, Roman documents that, to judge by some reactions, have reduced the role of the theologian to that of a public relations officer for Microsoft Church. Cardinal Bellarmine once said that the Church’s visibility differs in no way from that of the Republic of Venice; and in that regard the Catholic Church has in recent times taken on (perhaps inevitably, if Bellarmine is right) certain aspects of corporate culture, with gleaming skyscraper chanceries, slick videos, banks of lawyers advising the CEO bishop. (I once called an out-of-town chancery and got a voice-mail message that gave a menu of options that came close to this parody: “for annulments, press 1; for liturgical complaints against your local pastor, press 2,” and so forth.)
But this lament ignores Rome’s entirely legitimate worries about an even more obvious ideologization of the academy. “Pious scholars are rare,” Pascal once observed, and under that rubric the increased professionalization of theology cannot be an entirely unmixed blessing. I hunger for real debate in theology as much as anyone, free of the instant kibitzing of some low-rung Vatican cleric in the Roman Curia; indeed I salute the medieval university precisely because it embodied the reality of structured debate so well (in which, of course, the hierarchy was fully a participant, making an institutional adversary relationship between theologian and Magisterium inconceivable).
But is this true of the academy today? For example, how much debate really goes on in the Catholic Theological Society of America nowadays? Has it finally become, as so many critics assert, the equivalent of the General Assembly of the United Nations, that is, a group that likes to pass resolutions that seem daring in their desire to épater les Curialistes (that is, to tweak Rome on women’s ordination, nuclear weapons, and so forth) but which in fact never stray from the politically correct norms of the academic left?
Rather than answer such questions or seek to predict the future, I will simply say that insofar as either Rome or the academy regards any particular position across a range of issues—especially pertaining to the deliverances of science—as the only one that can be publicly advocated, then we are still stuck in the era of digital theology, which in fact is not theology but ideology.
Edward T. Oakes, S.J., teaches in the Religious Studies Department of Regis University in Denver, Colorado.
John J. Reilly
The term “postmodern” is an unsatisfactory description of the last few decades of the twentieth century. Postmodern is a definition-by-negation, which is rarely a good idea: consider the example of those atheists who devote their lives to combating a nonexistent god. Moreover, there never really was much evidence that the period was moving beyond the modern era in any serious sense. In both its popular and elite forms, the postmodern spirit is largely a matter of living off the achievements of the modern age by making fun of them.
But then, what of the term “modern” itself? Strictly speaking, any era can (and does) call itself modern. When we speak of modernity, we usually have something more specific than “the present” in mind. Even so, the term is elastic: modernity can mean the twentieth century after the First World War, or the nineteenth and twentieth centuries, or everything after Columbus. The historian William McNeill once plausibly suggested that the modern world system actually began in eleventh-century China.
It makes most sense, I think, to consider that our modern world began with the French Revolution. Our era is an episode within the Enlightenment, some of whose possibilities it realized and some it forever precluded. Modernity has had a great deal in common with the Hellenistic Age of the classical West and with the Warring States period in ancient China. It is a good bet that, like those epochs, it will last rather less than three centuries. Probably some watershed like 1789 lies ahead in the twenty-first century—more likely in its second half than in its first—on the other side of which, history flows in another direction.
The future will look after its own nomenclature, but I for one find it hard to resist speculation about how the future will characterize our modernity. Even if we entertain the notion that there have been analogous periods in the past, still every such era must also be unique. “Warring States” would not be appropriate for the modern West, for instance, since the era has not been one of continual warfare, but of unusually long periods of tranquillity punctuated by apocalyptic explosions. Herman Hesse made a better suggestion in The Glass Bead Game, where modernity is seen from the future as the “Age of Feuilletons.” That is just strange enough to happen.
Certainly the name would have to evoke the tendency toward analysis and reduction that has characterized the West these last two centuries. The great movements in intellectual life, from philosophy to economics, have been toward atomization, even as sovereign states multiplied in accordance with the principle that every little language must have its own country. The modern era is really the Age of Nominalism. As for its postmodern coda, these decades are simply the stage when nominalism achieved its natural culmination in solipsism, in language speaking itself.
This brings us to the age to come. In naming the future, it seems fitting to proceed with a little help from Hegel. Historical epochs do tend to react against the excesses of their predecessors, though that is never all they do. If the Age of Nominalism is the thesis, then any medievalist can tell you that the obvious antithesis will be an Age of Realism.
Maybe already we see the beginnings of an age that is more interested in synthesis than analysis. These adumbrations take various forms, from the proposals for a “final theory” of physics to the two-steps-forward, one-step-back progress toward world government. Perhaps we see a hint of the mind of the future in E. O. Wilson’s ambitious and metaphysically naive notion of “consilience,” a universal structure of knowledge that would have a sociobiological backbone. More ambitious and not at all naive is the project outlined in John Paul II’s Fides et Ratio, which looks toward a harmonization of our understanding of all levels of reality, something not seen since the Thomistic synthesis. None of these projects is likely to have quite the results their proponents have in mind, but they may tell us something about the cultural climate of 2100.
John J. Reilly is a writer living in Jersey City, New Jersey.
Jeffrey Satinover
Millennia are really big events—like when the odometer on your car rolls over from 9999.9. Well, the kid brought his car in for its 2,000-year check-up, so I did it. But I warned him, the news is really bad, so fixing it is really going to cost a lot.
Here’s one problem. The redeemers have already arrived. The Thousand Year Reich may seem to have lasted a scant few years, but if you look carefully, you’ll see that after three days in the bunker, almost every one of its core ideas was resurrected to radiate future-ward over ever spreading territory. Mercy killing, abortion, infanticide, the whole conceptual structure of lebensunwertes Leben (“life unworthy of life”), once seen as repulsive, has within but one generation been transformed into the very portrait of beauty. Eugenics, insistent racialism, and nationally demarcated socialism are now the common heritage of all enlightened Westerners. Governor George W. Bush got it (inadvertently) right: it’s ridiculous to whine that we’re “slouching toward Gomorrah”—we’re in a dead sprint, chest at the tape, proud of our imminent triumph.
That’s one problem. Another is this: fairness. The hereditary aristocracies have vanished, but a new cognitive elite has just begun its ascent to dominance, and everyone is invited to join—if they are smart enough to know how. At our best universities, the largest proportion of students allow their brains to be shaped by useless drivel: deconstructionism, alternate sexualities, loony methods for achieving “social justice.” The American mind isn’t just being closed, it’s being evacuated.
But there is another, much smaller, group on campus. It pursues quantitative studies. Year by year, the content grows ever more sophisticated and complex. Many years of training in sophisticated mathematics will allow a student to but scratch the surface of, say, early quantum mechanics. To really get its present state requires many years more. This group is naturally both elite and meritocratic.
Math and physics students are now being recruited by Wall Street because even finance is becoming ever more quantitative. The right shoes and the right social network may still help at the beginning, but increasingly, it’s the right number on your Stanford-Binet, and the application of that number to the right stuff, that does the trick. It’s not for nothing that the riches of Silicon Valley were created by nerds. Of all the major universities, MIT produces the largest proportion of entrepreneurs and has the reputation elsewhere in the world of being “the best.”
When the rising elite have consolidated their ability to manipulate emerging biomolecular and quantum computational technologies, they will form a club whose barriers to entry will be the most scrupulously fair in history—and the most ruthlessly impenetrable to the unqualified. Having little need to preserve dominance by force or trickery, they may form, if they have a mind to, the most benign and self-centered ruling class imaginable. The arrogance of today’s “caring” elites is a mere foretaste of the unasked-for helpfulness to come.
Perhaps we will even alter human nature itself, and turn ourselves into something utterly alien, a race for whom the old standards—wisdom, humility, nobility, kindness—will be discarded like a serpent’s skin. We’ll just become winners, until we meet an alien race better at it than we, but since at that point we will be genetically convinced that might alone makes right, it won’t matter.
Yet another vehicular system that seems to be failing is religion. I suppose that God Himself is doing just fine, but His earthly defenders are on the ropes—and it’s our own fault. Religion deservedly comes in for more criticism in its failures than does science, because genuine religion claims for itself the ability to know what’s true, whereas genuine science claims for itself only the ability to quantify the probability of a thing being wrong. (Bad science and bad religion simply swap roles, the former proclaiming Truth, the latter worshiping Doubt.) Religion’s bête noire is the fact that a genuine truth arrogantly asserted—that is, without so much as a moment’s consideration that it might be false—is a most pernicious kind of falsehood, far worse in its effects on the humane than a flat mistake.
It’s a matter of modesty. It never uses the term, but science itself is a method to ensure modesty of claims (however arrogant its practitioners). Religion, on the other hand, speaks constantly of the virtues, and then, on the whole, displays them with no greater consistency than does any other human institution.
This defect interacts dangerously with a second one. The rising cognitive elite doesn’t care for religion. For them, fine-sounding phrases about “brotherly love” are a joke. In their world, it’s simple: if you’ve got the brains and can back them up with action, you’re a full-fledged member. It is among this elite that the highest proportion of truly multiracial progeny can already be found, and more than anything else, that expanding reality will be a far more convincing argument that they’re right and the religionists wrong.
But the biggest problem is this: the world is changing far more dramatically than I think the boy can appreciate. It’s a world where quantum teleportation, quantum computation, and quantum cryptography, for example, are not only being taken seriously; some have already been implemented at practical scales and are the object of intensive commercial research and development. It’s not just a matter of some really cool technologies for us to gape at, but of a world where only those capable of mastering the wizardry behind the technologies will rise, and where our creations may well outstrip their creators.
How about, say, self-evolving brains composed of teleporting quantum computational elements processing information simultaneously in multiple universes? Science fiction? Nope. Between July 16 and 19, 1999, at the Jet Propulsion Laboratory, NASA and the Department of Defense held their first annual conference on evolvable hardware. A sample of the presentations:
- Evolving Circuits by Means of Natural Selection
- Embryologic Electronics
- The Design and Use of Massively Parallel Fine-Grained Self-Reconfigurable Infinitely Scalable Architecture
- Self-Repairing Evolvable Hardware
- Genetically Engineered Nanoelectronics
- Co-Evolutionary Robotics
- Evolving Wire Antenna
Willy-nilly, we have embarked upon an adventure that leads to shores far more distant and alien than any we have ever set out for before. Attempts are being made, naturally, to link all this weirdness to philosophies and theologies of yore, to take the utterly mysterious and make it at least sound familiar. But I suspect that we are on the verge of something that we’re not going to be able to grasp quite so simply. It’s possible that we’ve all been wrong in important ways all along. Even human immortality is not so remote a scientific possibility as was once thought.
Picture a world, then, in which, long before the dawning of the fourth millennium, mankind has created conscious, brilliant semiconductor simulacra made of endlessly self-repairing parts; it has itself eaten of the second tree in the Garden, that of Life, and thus acquired the immortality it has long sought—or at least a select group has, whose members have likewise devised methods for the enhancement of their already concentrated pool of intelligence-associated genes. Where in such a world would there be a place for divine justice? Nowhere. (Unless, of course, there really is a Heaven, in which case the justice would be perfect.)
So, what were the damages for all this? You won’t be surprised at the reaction I got: “Change myself? But it’s the car that’s got the problem!” All I can say is, between now and the next checkup he better bring the thing in to an authorized service center—and on a regular basis. And you know, I don’t get his dad. What lunatic gives a teenager with a long record of moving violations a souped-up Lamborghini, an instruction booklet, and a set of keys? It’s no wonder that sometime around 2450 a group of traditionalists are going to take off for the new world found orbiting around Cygnus 351.
Jeffrey Satinover is the author of Homosexuality and the Politics of Truth, Cracking the Bible Code, and The Quantum Brain (Wiley, forthcoming). He has long been a psychiatrist in private practice in Connecticut and is currently a student in physics at Yale.
Glenn Tinder
Our clear awareness that there are many different ways of looking at the world, and at man and his place in the world, is one of the most troubling circumstances of our time, and it is sure to endure well into, if not throughout, the third millennium of the Christian era. It is troubling simply because it makes it hard to think that one’s own particular way of looking at things is altogether true. The idea that there are numerous and conflicting truths, or, to put the same idea in other words, that there are numerous and conflicting illusions—all shaped mainly by the historical situations of those clinging to them—is, for many people, nearly irresistible.
Manifestly, present cultural diversities, which not only appear on a global scale but encounter one another within single societies, must on the whole be tolerated. But tolerance easily becomes acquiescence in the submergence of truth into a shifting variety of opinions and impressions. To those with a merely sentimental attachment to a culture or group, tolerance of that sort may be acceptable. It cannot be acceptable to followers of the God of Israel, however, and Christians and Jews are challenged, as they enter the new millennium, to develop an attitude toward the religious and cultural confusions surrounding them that is tolerant, yet, in refusing any dalliance with relativism, is distinct from traditional tolerance. To mark the distinction I shall call this attitude “forbearance.”
The nature of forbearance can be understood in terms of three principles, all drawn from St. Paul. The first, eloquently described in 1 Corinthians 13, is charity, or, to accentuate the aspect of charity that concerns us mainly, neighborly love. Our neighbors today, in a world of instant communication, swift travel, and extensive emigration, are Muslims, Buddhists, Confucianists, and Hindus, as well as Marxists, Freudians, and innumerable other kinds of agnostics and atheists. Ignoring them would not express neighborly love, but neither would trying to force them all to think, feel, and live as we do. The only charitable relationship to them is one formed by attentiveness, which is a readiness to listen in a genuine effort at understanding, and a readiness to speak truthfully, persuasively, and even, in proper times and places, evangelically. In short, neighborly love amid diverse faiths means communality, or dialogical patience.
Such is one mark of forbearance. Yet the God of Israel did not call his people into communality with the Canaanites. In view of God’s “jealousy”—a divine characteristic stressed about as forcefully in the Bible as are righteousness and mercy—how can communality, on the part of believers, be justified?
The answer may lie in a second Pauline principle defining forbearance. This is found in chapter 11 of the letter to the Romans, where the apostle grapples with one of the most distressing facts of his life: that most of his fellow Jews rejected Christ. Such “diversity” did not shake his faith, but it tried him sorely. The insight which quieted his concern—that the recalcitrance of the Jews was part of a divine strategy of redemption, encompassing Jews and Gentiles alike—seems to me of utmost significance for people struggling toward truth amid a multiplicity of creeds. It tells us that such a multiplicity is not accidental. It is set in the context of what Augustine calls “the beauty of the ages”—that is, the providential form of all historical time—and its ultimate consequence will be clarification. It has a place in the process of divine instruction that is fashioning the human race into a perfect and enduring community, the kingdom of God. Accordingly, the jealousy of God does not command intellectual tyranny but a heedful and articulate fidelity to His word, a truth not vulnerable to the doubts and perplexities inherent in human culture.
Forbearance cannot be understood, however, wholly in terms of a general rule of conduct like communality, or of universal human destiny. It needs to be sharply focused on the life of every individual. This is made clear in a third Pauline principle, the doctrine of election. The beauty of the ages is a drama in which every person is given a distinctive part. Every part is played by caring in one’s own assigned way for the truth. Election is often thought of as ordination to salvation, and of course it is. But it carries responsibilities, and these decisively affect the situation of the elect. Not only are they, so to speak, given custody of the truth; if they fail to meet their responsibilities they may, at least for a time, be cast aside. Such occurrences belong to what Paul refers to as God’s “unsearchable judgments” and “inscrutable ways” (Romans 11:33).
Hence there is no room for pride or complacency. One may be sure of possessing the truth—but only in a mood of fear and trembling. Nor is there room for contempt toward those still in darkness, for rejection is no more an inalterable determination than election is. Illustrative of the meaning of forbearance, given a fully nuanced consciousness of election, is Paul’s anguished love and inextinguishable hope for his Jewish brethren. Certitude, humility, and charity are perfectly reconciled.
Forbearance means bearing the discord of minds and hearts occasioned by our fallen state. (Since the word “tolerance” derives from tolerare, meaning to bear, forbearance may be seen as tolerance in its precise signification.) Given the complex obligations it imposes—embracing doctrinal and cultural opponents in a spirit of communal readiness, enduring diversity as mysteriously integral to the divine work of redemption, and discerning and meeting distinctive responsibilities for the care of truth—forbearance is demanding and severe. It must therefore be an art. Like any art, however, it does not only involve difficulty and labor; it can bring peace and happiness. Thus Paul ends his agonizing reflections on the Jewish rejection of Christ with an exultant exclamation—”O the depth of the riches and wisdom and knowledge of God!”—which breaks out when he realizes the providential power with which God will enlist every intellectual and spiritual disorder in the cause of truth (Romans 11:33). In an era that says to us every day, “There is no Truth,” the art of forbearance might at least help us resist the temptations of relativism. And it might even help us enter with joy into a destiny that will finally show forth the Truth in such plainness and splendor that no one who has ever lived will be able to misunderstand or ignore it.
Glenn Tinder is Professor of Political Science Emeritus at the University of Massachusetts at Boston and author of The Political Meaning of Christianity.