The following is excerpted and adapted from The Dumbest Generation Grows Up: From Stupefied Youth to Dangerous Adults, out today.
A dozen years ago, those of us watching with a skeptical eye couldn’t decide which troubled us more: the fifteen-year-olds averaging eight hours of media per day or the adults marveling at them. How could the older and wiser ignore the dangers of adolescents’ reading fewer books and logging more screen hours?
There should have been many, many more critics. The evidence was voluminous. Even as the cheerleaders were hailing the advent of digital youth, signs of intellectual harm were multiplying. Instead of heeding the signs, people in positions of authority rationalized them away. Bill Gates and Margaret Spellings and Barack Obama told Millennials they had to go to college to acquire twenty-first-century skills to get by in the information economy. The schools went on to jack up tuition, dangle loans, and leave them, five years after graduation, in the state of early-twentieth-century sharecroppers, the competence they had developed in college and the digital techniques they had learned on their own often proving to be no help in the job market. The solution? Be more flexible, mobile, adaptive! High school students bombed NAEP exams (“the Nation’s Report Card”) in U.S. history and civics, but many shrugged: Why worry, now that Google is around? The kids can always look it up! An August 2013 column in Scientific American featured an author recalling his father paying him five dollars to memorize the U.S. presidents in order and reflecting, “Maybe we’ll soon conclude that memorizing facts is no longer part of the modern student’s task. Maybe we should let the smartphone call up those facts as necessary.”
Given how pedestrian Facebook, Twitter, and Wikipedia seem today, not to mention the oddball auras of their founders and CEOs, it is difficult to remember the masters-of-the-universe, march-of-time cachet they enjoyed in the Web 2.0 phase of the Revolution (the first decade of the twenty-first century). Change happens so fast that we forget the spectacular novelty of it all, the days when digiphiles had all the momentum, the cool. As a friend who’d gone into technical writing in the ’90s told me recently, “It was sooo much fun back then.” Nobody wanted to hear the downsides, especially when so much money was being made. SAT scores in reading and writing kept slipping, but with all the texting, chatting, blogging, and tweeting, it was easy to find the high schoolers expressive in so many other ways, writing more words than any generation in history.
A much-discussed 2004 survey by the National Endowment for the Arts (NEA), Reading at Risk: A Survey of Literary Reading in America, found an astonishing drop in young adults’ consumption of fiction, poetry, and drama: only 43 percent of them read any literature at all in leisure hours, 17 percentage points lower than in 1982. But when I presented the findings at dozens of scholarly meetings and on college campuses (I had worked on the NEA project), the professionals dismissed them as alarmist and reactionary, arising from a “moral panic” no different from the stuffy alarm about Elvis and comic books fifty years earlier.
On October 26, 2018, a story appeared in the New York Times about a surprising trend in Silicon Valley. It bore the title “The Digital Gap Between Rich and Poor Kids Is Not What We Expected,” and it cited the common concern during the late 1990s and 2000s that well-off kids would have abundant access to digital tools and the internet, while poor kids, lacking a computer, would fall further behind in academic achievement and workplace readiness. The digital revolution, the fear ran, would not be a great equalizer; it would exacerbate inequalities, with privileged students “gaining tech skills and creating a digital divide,” the story said.
In 2018, however, eleven years after the first iPhone was sold and fourteen years after Facebook was founded, something different and unexpected was happening: “But now, as Silicon Valley’s parents increasingly panic over the impact screens have on their children and move toward screen-free lifestyles, worries over a new digital divide are rising.” As public schools serving poor and minority kids were pushing one-to-one laptop programs, the reporter observed, executives in Palo Alto and Los Altos were sending their children to vigilantly low-tech private campuses such as the Waldorf Schools. A psychologist who had written a recent book about the hazards of screens told the reporter that when he urged poor families in the East Bay to pull their kids away from the internet, they blinked in surprise, while parents in Silicon Valley crowded his seminars, having already read and appreciated his work.
The troubled parents quoted in the story were the opposite of Luddites. Nor were they social conservatives, fundamentalist Christians, or Great Books types. They came right out of the belly of the digital beast, including the ex-Microsoft executive who noted the customary hype (“There’s a message out there that your child is going to be crippled and in a different dimension if they’re [sic] not on the screen”) and added a remark whose understatement communicated his disdain nicely: “That message doesn’t play as well in this part of the world.” The story doesn’t mention him, but Steve Jobs himself famously kept his own household and kids fairly tech-free, and a parallel Times story published at the same time by the same reporter, Nellie Bowles, found more tech celebrities doing likewise. Why? Because, explained Chris Anderson, ex-editor of Wired and head of a robotics company, “We thought we could control it. And this is beyond our power to control. This is going straight to the pleasure centers of the developing brain. This is beyond our capacity as regular parents to understand.” He actually compared it to crack cocaine.
No ideological or principled objections to social media on these defectors’ part, just a desire not to have their kids swallowed up in screen time. They want their children to go to Stanford and Caltech, and they know that hours online don’t help. They’ve seen how much money tech companies make selling tools to school districts (“Apple and Google compete furiously to get products into schools and target students at an early age”), because once a youth adopts a brand, he tends to stay with it. They are familiar, too, with the many psychologists helping companies with “persuasive design,” the science of getting people onto a site and keeping them there. They didn’t need to watch the previous year’s 60 Minutes segment on “brain-hacking” to recognize the manipulations at work, or to hear Bill Maher’s comment on it: “The tycoons of social media have to stop pretending that they are friendly nerd-Gods building a better world and admit that they’re just tobacco farmers in T-shirts selling an addictive product to children.” Nobody could claim that these parents were uninformed alarmists. They knew too much.
The people interviewed in the story weren’t outliers, either—not within their elite group. They exemplified a national trend, a contrary digital divide: Kids in lower-income households in the United States tally 42 percent more minutes of screen diversions per day than kids in upper-income households (487 minutes to 342 minutes, according to a Common Sense Media study). While academics insisted for years, in urgent and radical terms, that youths needed to acquire the latest tools in order to get ahead and join that elite (“To keep pace with a globalized technological culture, we must rethink how we educate the next generation[,] or America will be ‘left behind’”), the most successful and aware individuals in this super-competitive techno-culture acted the opposite way. When they observed their own children at the screen, these high-tech wizards regretted what they had created.
That these skeptics inhabited the very industry that produced the tools, yet warned against the damage those tools do more loudly than the professional guardians of education and tradition did; that people who worked at Google showed more circumspection than humanities teachers, school consultants, and culture journalists; that public school leaders pressed ahead with wiring devices into classrooms so insistently that the inventors of those very devices pulled their own kids out—this wasn’t merely an ironic twist. It was a condemnation of the professionals. From the very start, they should have been telling everyone to slow down—above all the kids.
Mark Bauerlein is contributing editor at First Things.