Dame Rebecca West had a theory that the history of civilization since Christ could be divided into three panels like a triptych. In the first panel, stretching roughly from the Crucifixion to the Middle Ages, the language of theology so dominated learned debate that all complaints were expressed in religious terms, even when the problem at issue was economic or political. The poor and discontented “cried out to society that its structure was wrong . . . and said that they did this because they had had a peculiar revelation concerning the Trinity. The hungry disguised themselves as heretics.” After a few brief centuries of clarity, mankind proceeded to the third panel, in which the opposite problem prevails: “Those suffering from religious distress reverse the process, and complain of it in economic terms. Those who desire salvation pretend that they are seeking a plan to feed the hungry.”
West, writing in 1949, was thinking primarily of communism. From the Stalinists recently ascendant in her beloved Balkans to Fabian grandees Sidney and Beatrice Webb in England, from whose dinner parties she had lately been banned for being too argumentative, the socialists of her day were united in their endorsement of the Marxist axiom that all human behavior can be traced back to material motivations. The purpose of this logical razor was to discredit their opponents by attributing all bourgeois beliefs to class interest, with religion and morality reduced to power plays designed to keep the proletariat in subjection.
West was clever enough to realize that vulgar Marxism was just as likely to be directed inward. A socialist, especially a Western European one, was often someone who had perceived within himself certain longings that an earlier age would have properly identified as moral or religious, but whose intellectual equipment could only process these longings as commitment to social justice. An entire generation had developed a warped idea of what moral seriousness sounds like, and they ended up pledging their souls to economics as a result.
The moral vocabulary that now prevails in the United States is less Marxist but no less vulgar, for it is just as adamant that all moral claims be translated into material terms. The only difference is that material self-interest is now permitted to coexist with material altruism. Bad behavior can be condemned only if it is shown to correlate with some quantifiable negative outcome like a greater likelihood of receiving a free or reduced-price lunch among grade-schoolers, a higher incidence of antidepressant use among adults, or a measurable decline in the national GDP. Moral questions are treated as if they were, at the end of the day, merely empirical. We are hesitant, almost to the point of paralysis, about making moral claims on moral grounds.
This error is not the same thing as scientism, that ripe intellectual leftover of the Progressive Era. In fact, it is almost the reverse. Scientism was an error of extravagant overconfidence, an optimistic faith that experimentation could lead us to grand truths formerly unknown. Nowadays, we suffer more from timidity than arrogance. Rather than expecting science to solve all our hitherto insolubles, we lean on science when making even the most modest claims. The arrogance has not entirely abated—especially not among economists—but still: It is one kind of madness to expect science to put a permanent end to war abroad and inequality at home, as the Progressives did, and another kind of madness to hope that science will someday find evidence suggesting that adultery is in fact wrong or drug addiction in fact undesirable.
In its 2005 decision in Roper v. Simmons, the Supreme Court ruled that criminals who commit their offenses before turning eighteen are protected from execution by the Eighth Amendment. It is possible that the court erred in its treatment of precedent and legislative prerogative, but certainly it erred in relying as heavily as it did on studies by developmental psychologists showing, for example, that “adolescents are overrepresented statistically in virtually every category of reckless behavior,” as if such findings could settle the moral meaning of cruelty. The Roper majority did not explicitly refer to any neuroscientific studies, though several amici did. This omission was rectified in the subsequent juvenile-offender case Miller v. Alabama (2012), in which Justice Elena Kagan, distressingly confident that scientific and social-scientific conclusions are reliable and impartial and mean just what they appear to mean to the average layman, cited an amicus brief that states: “It is increasingly clear that adolescent brains are not yet fully mature in regions and systems related to higher-order executive function such as impulse control, planning ahead, and risk avoidance.”
The Roper and Miller majorities obviously meant to convey the message, “Executing juvenile offenders, or sentencing them to life without parole, is wrong. We abhor the idea. It revolts us. We would as soon bring back drawing and quartering as affirm this practice in twenty-first-century America.” In a decision hinging on the definition of cruelty, such ethical resolve is entirely appropriate. Yet when taking this moral stand, the justices felt it necessary to reach for evidence, and when reaching for evidence, they looked to behavioral science and neurology.
Perhaps judges are just overscrupulous by nature, or by training, and should be allowed to err on the side of evidential punctilio. If that’s the case, then surely politicians should tend in the opposite direction. Making grand moral pronouncements based on no more than anecdotal evidence was once every politician’s bread and butter. It is much less so now. In reviewing the two parties’ platforms ahead of the 2013 parliamentary elections here in Australia, I came across this startling sentence in a platform brochure about mental health policy: “The financial cost to Australia of mental illness in young people aged 12 to 25 was $10.6 billion in 2009, approximately 70 percent of which is productivity lost due to lower employment, absenteeism, and premature death.” There are only two things to be said about that $10.6 billion figure: as an attempt at numerical accuracy, it is useless; as an attempt to demonstrate that youth mental illness is a bad thing, it is superfluous.
Plenty of politicians talk this way in America, too—take President Obama’s claim that universal pre-K will return seven dollars for every dollar invested by reducing drop-out rates, teen pregnancy, and violent crime, as if that could possibly be true. But infinitely worse are the journalists. Writers on economics predominate among the left’s star columnists and bloggers (Paul Krugman, Matthew Yglesias, Ezra Klein, et al.), and they share a bad habit of denigrating anyone who does not approach politics the way an economist would as unserious or indifferent to facts.
Lesser pundits and journalists parrot academic studies as if they were unimpeachable, even when the resulting headlines are as absurd as “Racial Inequality Costs GDP $1.9 Trillion,” “Feminists Have Better Sex Lives,” or (my favorite, courtesy of Yahoo! News) “Holy Water May Be Harmful to Your Health, Study Finds.” Even Ross Douthat, generally reputed to be a moralist, can be caught buttressing with social-scientific evidence a claim as uncontroversial as that serious downsides attend being a pothead: Excessive marijuana use, he reports, “can limit educational attainment, and with it economic mobility.” The impression left by these sorts of citations is not rigor so much as lack of confidence in one’s assertions, and persuasion, like seduction and stand-up comedy, is 90 percent confidence.
At this point in the argument, an optimist might suggest that we should count our blessings. It wasn’t so long ago that vast swaths of the mainstream left were unwilling to pass any moral judgments at all. Today’s bien-pensant elites may praise and condemn things in a weirdly bloodless and utilitarian way, but at least they praise and condemn things. Half the time it’s even the right things they praise, like the two-parent family (boosts test scores) or regular church attendance (lowers divorce rates). They are extending a hand of friendship, after a fashion. Do we really want to bat it aside?
This argument occurred to Thomas Babington Macaulay, who was the first man to attempt a systematic unmasking of utilitarianism as a pernicious influence on his country’s political discourse. The vehicle for this rhetorical assassination attempt was a review of James Mill’s Essay on Government, which was itself a distillation of the thought of Mill’s mentor, Jeremy Bentham (incidentally, the man who invented the word “maximize”). Macaulay’s review pointed out that utilitarianism was, among other things, pointless: “If the only fruit of the ‘magnificent principle’ is to be that the oppressors and pilferers of the next generation are to talk of seeking the greatest happiness of the greatest number, just as the same class of men have talked in our time of seeking to uphold the Protestant constitution—just as they talked under Anne of seeking the good of the Church, and under Cromwell of seeking the Lord—where is the gain?” Though Macaulay asked it only in order to sneer, this is the crucial question: What is gained, and what is lost, when political discussion must be conducted in utilitarian and social-scientific terms?
The first three answers to that question are related, for they all have to do with the tendency of social scientists to run around identifying problems in need of solutions. Daniel Patrick Moynihan thought that this proactivity was the defining characteristic of modern social scientists, and he saw it as the driving force behind the transformation in American political culture between the 1930s and the 1960s. During the Depression, the problems that government sought to address had mostly been brought to its attention by cries from below, expressed by people who could see the problem with their own eyes. From Kennedy’s presidency onward, bureaucrats armed with national statistics—then a fairly new phenomenon, not coincidentally—began searching their data for problems to solve, whether popular demand for such solutions existed or not. Moynihan’s term for this phenomenon was “the professionalization of reform,” and although a social scientist himself, he was intensely ambivalent about it.
When professionals put such zest and seriousness into persuading people that they have a problem that can be solved, several things can go wrong. It may be that the targets of their attentions have a problem that cannot be solved. It may be that they do not have a problem at all. Or it may be that they do have a problem and it can be solved, but it would be better for them in the meantime to be able to appreciate, relish, draw from, or find the richness in their problem instead of simply deprecating it. The professionals’ response to each of these three possibilities ends in false hope, false despair, or false resentment for the sufferers, yet ever greater self-satisfaction for their would-be saviors.
For the first two of these offenses, the rogues’ gallery is especially full of representatives of the medical sciences. The American Academy of Anti-Aging Medicine has for the last twenty years tried to convince patients that old age is not a fact of life but merely a collection of symptoms, each of which is in theory curable (their motto: “Aging is not inevitable!”). Purveyors of estrogen-progestin treatments tell women that the “disease” of menopause can be deferred indefinitely. And of course, no profession has been more aggressive in applying the label of “disorder” to normal human experiences than psychology.
But social scientists engage in solution mongering of the same kind, and more of it lately. To the extent that inequality is a live topic in politics today, it is entirely attributable to the efforts of economists who think it ought to be an issue because their graphs tell them so. Lack of prosperity is something people suffer from, but “rising inequality” is something social scientists notice in data. In the field of education—probably the most massive black hole of bogus expertise in the policy world—professionals use statistical legerdemain to make an immutable fact of human life sound like an eminently remediable policy problem. If the governor of New York were to promise to abolish stupidity within ten years, anyone hearing him would think, “Physician, heal thyself.” Framed instead in the language of narrowing achievement gaps and bringing scores up to grade level, the same futile endeavor begins to sound rather sensible.
The third offense—when professionals identify a solvable problem but, by seizing hold of it, flatten the experience of it—is the cruelest of the three, because it snatches longstanding consolations from the hands of the most vulnerable. The process is simple. The first thing a social scientist will do when addressing a problem is to break it down into distinct elements, to make it easier to analyze and to tweak. In the case of poverty, that means separating out its component difficulties. In the case of marginalized groups, it means isolating all the obstacles imposed by their race, sex, gender, disability, or whatever other characteristic.
Applying this analytic solvent is helpful to the social scientist’s mission because it separates a permanent problem (like poverty or sex differences) into fragments that in themselves seem fixable (finding jobs for welfare recipients or eliminating some statistical disparity between men and women). But speaking about disadvantaged groups in these discrete and objective terms robs their experience of its coherence. It takes a rich identity and shatters it to pieces. Membership in the lower class, for example, has never been a picnic, but it used to be something that a person could draw from and take pride in. Described in the terms that politics permits us to use today, as “socioeconomic disadvantage” (or worse, “lack of privilege”), it sounds like nothing more than a list of things to complain of.
Finally, it should not go unmentioned that, for all its vaunted empiricism, social science is very often wrong. Sometimes it is wrong in ways we do not necessarily regret, as in the case of Dr. Kenneth B. Clark’s “doll tests,” which cemented the Supreme Court’s decision in Brown v. Board of Education but were so flawed in their method as to be scientifically worthless. Sometimes a social science is saved from being wrong only by being persistently inconclusive, with economics being the prime example. As the George Mason University professor Russ Roberts is fond of asking, “Is there any statistically sophisticated study of a politically controversial question in economics that was so iron-clad and undeniably reliable that it has persuaded the people on the other side of the issue to change their views?” Bernard Lewis, speaking from outside the discipline, put the case more mordantly: “In the history of human thought, science has often come out of superstition. Astronomy came out of astrology. Chemistry came out of alchemy. What will come out of economics?”
Something similar happened to Lewis’ own discipline in the 1960s with the rise of “social history.” Oxford professor Brian Harrison, who studied for his Ph.D. during the social historians’ heyday, remembers that they “were all keen to count things wherever possible, to display material diagrammatically and graphically, to engage in prosopographical analysis, and to see how far sociological and other models could help to make sense of the material.” Often this quantifying tendency was justified as a way to capture the voices of the poor and powerless, who did not leave the same paper trails as kings and politicians but did turn up in census rolls.
The vogue for social history subsided for many reasons—not least because its plodding books made for excruciating reading—but perhaps most relevant to the current situation was the realization that it did not actually rescue the neglected populations it was intended to serve. Georg Iggers summarizes this aspect of the backlash in his survey of twentieth-century historiography: “‘Little people’ . . . had been neglected as much in social science–oriented history as they had been in the conventional political history that focused on the high and mighty . . . . [It] missed the point by attending to material conditions without examining how these conditions were experienced” (emphasis added).
How did we get here? The most obvious answer is that there is simply more social science being done today than ever before. The old Progressives would have loved to have been able to ground all of their favored policies in academic studies with irrefutable-looking charts and graphs, but there were not enough practitioners around to produce them. As late as the 1960s, only a minority of professors at four-year colleges published regularly. Now, not only are there vastly more credentialed social scientists, but they are producing more findings, thanks to the peer-reviewed-journal treadmill on which their careers have come to depend.
On the demand side, the audience for these social-scientific pronouncements has increased with the growth of college enrollment. Most adult Americans today receive some form of higher education, which, whatever else it may do for them, acculturates them into a mindset where academic findings are granted serious weight and being skeptical of eggheads is considered philistine. Not since the Edwardian era, when philanthropy was replaced by social work, has there been such a pronounced swing against amateurs and toward professionals. Indeed, echoes of that old battle can be heard in today’s expressions of contempt for those who dare to put forward a political argument without a white paper’s worth of research to support it.
One remembers how Mrs. Ramsay in To the Lighthouse (1927) “wrote down in columns carefully ruled for the purpose wages and spending, employment and unemployment, in the hope that thus she would cease to be a private woman whose charity was half a sop to her own indignation, half a relief to her own curiosity, and become what with her untrained mind she greatly admired, an investigator, elucidating the social problem.” (Lady Bountiful and Mrs. Jellyby have not become any more popular in the intervening century, though in light of what replaced them, their particular brand of foolishness seems more harmless than not.)
It may also be that the supremacy of social science came about not entirely because of impersonal forces but because people actively wanted it. The medicalization of personal suffering is probably the trend most similar to the economicization of political suffering, and the thing everyone forgets about medicalization is that it has been largely patient driven. The term “medical imperialism,” though catchy, is a misnomer. The mantra that alcoholism is a disease, the extension of ADHD to adult patients, the application of the term “PTSD” to trauma sufferers other than combat veterans, the use of human growth hormone to treat “idiopathic short stature” in children who are shorter than average but not disablingly so—it was the sufferers themselves (or, in the case of ISS, their parents) who drove these developments.
For such sufferers, convincing a doctor to legitimize their moderate levels of suffering with a medical diagnosis brings no small amount of psychic relief: validation that their suffering is real; hope that their situation can be cured, since it is officially abnormal and not one of the ordinary pains of human existence; and, of course, absolution from moral responsibility. It is not hard to see how some of these same comforts might accompany the recategorization of social ills as economic or sociological errors.
It should be clear from the foregoing description that there are a great many people with good reason to resent the “social-science-ification” of public discourse. There is, first of all, the large portion of the population that would like to participate in the political conversation but cannot because they lack the familiarity with economic jargon that has been made a criterion for being taken seriously. There are those who know social science well enough to frame their arguments in those terms but who feel that in doing so, rather a lot would get lost in translation.
Among consumers of news media, there are—or I hope there are—many who rue what social science has done to the quality of prose. Much of the writing that appears in political magazines nowadays is unreadable, which is a dereliction of both art and civic duty on the authors’ part. It is not entirely their fault. Poor things, it is simply not possible to write an elegant sentence with two statistics in it.
Finally, there are those who rightly complain that social science has made politics dull. This is not a frivolous point. Edward Banfield, the great theoretician of urban politics, observed that when the big-city bosses were replaced by colorless reformers, the lower and working classes stopped paying attention to urban politics. The decline in entertainment value led to a decline in participation among those who needed local politics most, as one of the very few arenas where they could make their voices heard. Dullness is regressive.
Should this coalition of the ignored, the reluctant, and the bored to tears ever attempt to take back our political discourse from the technocrats, the first lesson we will have to keep in mind is not to fight the enemy on its own terms. During the Cold War, especially its early stages, the books written in defense of the Soviet model fairly bristled with statistics. Wisely, the West’s more effective defenders did not attempt to refute tractor-production figures from the Ukraine with tractor-production figures from Moline, Illinois. They made more fundamental points, like the difficulty of collecting accurate statistics in a police state, or the conclusiveness with which even accurate statistics are trumped by the brute fact of mass starvation.
At a more Kirkian level of abstraction, there were such simple observations as: Our people are free, yours are not; we produce poetry, you produce propaganda; our cities are beautiful, yours are hideous. The equivalent arguments in the modern context might be (1) no amount of creative accounting will convince a sane person that you have made a money-saver out of a vast new entitlement like Obamacare; (2) no study could ever refute the fact that character is both a cause and a casualty of government-subsidized poverty; and (3) I will listen to econometricians as soon as you show me one who can write with more fluency than a high school sophomore.
The mistake most frequently made by modern anti-social-science polemicists is to focus too much on their target’s philosophical shortcomings. This would be excellent high ground to stake out if Americans were attached to terms like “empirical” and “falsifiable” because they had all given deep consideration to the works of Bertrand Russell and Karl Popper, but the reality is more superficial. They put their faith in academics and research analysts because they have an idealized picture of the sciences as a self-policing community of disinterested truth-seekers with laboratories and databases and state-of-the-art modeling programs.
This superficiality is not negligence, really. Most of our decisions about whom to trust take place at this aesthetic level. You and I do not know enough neuroscience to refute a convinced phrenologist, but if we have a feel for the pseudoscientist type (say we have read Martin Gardner), we will have no trouble identifying our bumpologist as a textbook example. So it is, in reverse, with social scientists. We come to them with a jumble of opinions and half-formed personal judgments, and they repeat them back to us as facts. Their main contribution is not information, but authority.
But there are other kinds of authority than a philosophiae doctor. The civil rights movement deployed not only the authority of the gospel but also the moral authority that comes from nonviolent protest. Anyone who can withstand brutal treatment and still speak in a calm voice is someone saintly enough to demand a hearing. It is not so different from the moral authority of ex-generals, another phenomenon with precedent in American politics. In circumstances less conducive to displays of moral authority—and political issues very rarely invite actual heroism—there are other options. Christopher Hitchens was not an expert in anything, but people cared what he had to say for two reasons: It was evident that he had read widely, and he expressed himself beautifully. Both of these are forms of authority. Things that require discipline often are.
Authority does not have to be self-generated. It can also be borrowed, or rather assumed, like a mantle. When Rep. Herbert Parsons spoke in favor of child-labor laws on the floor of the House in 1909, he quoted Matthew 18: “Our doctrine . . . is that, if possible, ‘not one of these little ones should perish.’” I would guess that Congressman Parsons began with the same starting point a modern politician would have: a vague but definite conviction that the law ought to address child labor. When he proceeded to ask himself what it would sound like to make that assertion in a public forum in a way that would command agreement, the answer came back to him: Scripture. To a modern, it would have been: lifetime earnings differentials.
I do not mean to say that all political arguments should be made more biblical. I only suggest that, when we find ourselves looking for ways to bring some authority to our political convictions (or looking for spokesmen who might carry such authority), we should broaden our search. It may be that integrity, erudition, literary genius, holiness, or wisdom carry as much weight in a democracy as expertise.
Herman Kahn was no one’s idea of a moral authority. Hugely obese, slovenly in dress, and insensitive in manner, Kahn was the RAND Corporation egghead who tallied up megadeaths and mineshaft gaps in earnest, years before Dr. Strangelove turned the concepts to satire. He was also the most passionately utilitarian policy analyst of the last century. When the public ridiculed him for subjecting the moral issue of nuclear war to such grotesque calculation, he snapped back, “Would you prefer a warm, human error? . . . a nice emotional mistake?” His colleague Malcolm Hoag laid out RAND’s position in more detail: “The man who solves complex problems in the space of five minutes on the intuitive basis of ‘sound’ . . . judgment and experience” will sleep well, and the analyst who employs Kahn’s method will “sleep badly, but only because he has become acutely aware of all the pitfalls in the problem.”
This is what it looks like to try to achieve authority by sheer mechanical effort. Modern politics is full of people who think like this, who prefer to make decisions in a way that makes them sleep badly because the laboriousness seems like proof of their own diligence and good intentions. The giveaway is their tendency to lose enthusiasm for social science at exactly the point in the process where it might do some good: afterwards, to evaluate whether a policy has actually worked. They prefer their empirical analysis front-loaded as they try by sheer force of will to put a harness on fate. If they had lived in the first panel of Rebecca West’s triptych, they would have been Pelagians.
But authority, like salvation, is a gift, not an item for purchase. It will take effort for non–social scientists to reclaim some of it, but we should not make our opponents’ mistake of thinking that sheer quantity of effort is all that is required.
Helen Andrews, a former associate editor of National Review, has written for the American Conservative, the Weekly Standard, and the American Spectator. This essay was underwritten by the Issachar Fund.