The War on Stupid People

American society increasingly mistakes intelligence for human worth.


As recently as the 1950s, possessing only middling intelligence was not likely to severely limit your life’s trajectory. IQ wasn’t a big factor in whom you married, where you lived, or what others thought of you. The qualifications for a good job, whether on an assembly line or behind a desk, mostly revolved around integrity, work ethic, and a knack for getting along—bosses didn’t routinely expect college degrees, much less ask to see SAT scores. As one account of the era put it, hiring decisions were “based on a candidate having a critical skill or two and on soft factors such as eagerness, appearance, family background, and physical characteristics.”

The 2010s, in contrast, are a terrible time to not be brainy. Those who consider themselves bright openly mock others for being less so. Even in this age of rampant concern over microaggressions and victimization, we maintain open season on the nonsmart. People who’d swerve off a cliff rather than use a pejorative for race, religion, physical appearance, or disability are all too happy to drop the s‑bomb: Indeed, degrading others for being “stupid” has become nearly automatic in all forms of disagreement.

It’s popular entertainment, too. The so-called Darwin Awards celebrate incidents in which poor judgment and comprehension, among other supposedly genetic mental limitations, have led to gruesome and more or less self-inflicted fatalities. An evening of otherwise hate-speech-free TV-watching typically features at least one of a long list of humorous slurs on the unintelligent (“not the sharpest tool in the shed”; “a few fries short of a Happy Meal”; “dumber than a bag of hammers”; and so forth). Reddit regularly has threads on favorite ways to insult the stupid, and fun-stuff-to-do.com dedicates a page to the topic amid its party-decor ideas and drink recipes.


This gleeful derision seems especially cruel in view of the more serious abuse that modern life has heaped upon the less intellectually gifted. Few will be surprised to hear that, according to the 1979 National Longitudinal Survey of Youth, a long-running federal study, IQ correlates with chances of landing a financially rewarding job. Other analyses suggest that each IQ point is worth hundreds of dollars in annual income—surely a painful formula for the 80 million Americans with an IQ of 90 or below. When the less smart are identified by lack of educational achievement (which in contemporary America is closely correlated with lower IQ), the contrast only sharpens. From 1979 to 2012, the gap in median income between a family headed by two earners with college degrees and one headed by two earners with high-school degrees grew by $30,000, in constant dollars. Studies have furthermore found that, compared with the intelligent, less intelligent people are more likely to suffer from some types of mental illness, become obese, develop heart disease, experience permanent brain damage from a traumatic injury, and end up in prison, where they are more likely than other inmates to be drawn to violence. They’re also likely to die sooner.

Rather than looking for ways to give the less intelligent a break, the successful and influential seem more determined than ever to freeze them out. The employment Web site Monster captures current hiring wisdom in its advice to managers, suggesting they look for candidates who, of course, “work hard” and are “ambitious” and “nice”—but who, first and foremost, are “smart.” To make sure they end up with such people, more and more companies are testing applicants on a range of skills, judgment, and knowledge. CEB, one of the world’s largest providers of hiring assessments, evaluates more than 40 million job applicants each year. The number of new hires who report having been tested nearly doubled from 2008 to 2013, says CEB. To be sure, many of these tests scrutinize personality and skills, rather than intelligence. But intelligence and cognitive-skills tests are popular and growing more so. In addition, many employers now ask applicants for SAT scores (whose correlation with IQ is well established); some companies screen out those whose scores don’t fall in the top 5 percent. Even the NFL gives potential draftees a test, the Wonderlic.

Yes, some careers do require smarts. But even as high intelligence is increasingly treated as a job prerequisite, evidence suggests that it is not the unalloyed advantage it’s assumed to be. The late Harvard Business School professor Chris Argyris argued that smart people can make the worst employees, in part because they’re not used to dealing with failure or criticism. Multiple studies have concluded that interpersonal skills, self-awareness, and other “emotional” qualities can be better predictors of strong job performance than conventional intelligence, and the College Board itself points out that it has never claimed SAT scores are helpful hiring filters. (As for the NFL, some of its most successful quarterbacks have been strikingly low scorers on the Wonderlic, including Hall of Famers Terry Bradshaw, Dan Marino, and Jim Kelly.) Moreover, many jobs that have come to require college degrees, ranging from retail manager to administrative assistant, haven’t generally gotten harder for the less educated to perform.

At the same time, those positions that can still be acquired without a college degree are disappearing. The list of manufacturing and low-level service jobs that have been taken over, or nearly so, by robots, online services, apps, kiosks, and other forms of automation grows longer daily. Among the many types of workers for whom the bell may soon toll: anyone who drives people or things around for a living, thanks to the driverless cars in the works at (for example) Google and the delivery drones undergoing testing at (for example) Amazon, as well as driverless trucks now being tested on the roads; and most people who work in restaurants, thanks to increasingly affordable and people-friendly robots made by companies like Momentum Machines, and to a growing number of apps that let you arrange for a table, place an order, and pay—all without help from a human being. Together, these two categories account for jobs held by an estimated 15 million Americans.

Meanwhile, our fetishization of IQ now extends far beyond the workplace. Intelligence and academic achievement have steadily been moving up on rankings of traits desired in a mate; researchers at the University of Iowa report that intelligence now rates above domestic skills, financial success, looks, sociability, and health.

The most popular comedy on television is The Big Bang Theory, which follows a small gang of young scientists. Scorpion, which features a team of geniuses-turned-antiterrorists, is one of CBS’s top-rated shows. The genius detective Sherlock Holmes has two TV series and a blockbuster movie franchise featuring one of Hollywood’s most bankable stars. “Every society through history has picked some trait that magnifies success for some,” says Robert Sternberg, a professor of human development at Cornell University and an expert on assessing students’ traits. “We’ve picked academic skills.”

What do we mean by intelligence? We devote copious energy to cataloging the wonderfully different forms it might take—interpersonal, bodily-kinesthetic, spatial, and so forth—ultimately leaving virtually no one “unintelligent.” But many of these forms won’t raise SAT scores or grades, and so probably won’t result in a good job. Instead of bending over backwards to find ways of discussing intelligence that won’t leave anyone out, it might make more sense to acknowledge that most people don’t possess enough of the version that’s required to thrive in today’s world.

A few numbers help clarify the nature and scope of the problem. The College Board has suggested a “college readiness benchmark” that works out to roughly 500 on each portion of the SAT as a score below which students are not likely to achieve at least a B-minus average at “a four-year college”—presumably an average one. (By comparison, at Ohio State University, a considerably better-than-average school ranked 52nd among U.S. universities by U.S. News & World Report, freshmen entering in 2014 averaged 605 on the reading section of the SAT and 668 on the math section.)

How many high-school students are capable of meeting the College Board benchmark? This is not easy to answer, because in most states, large numbers of students never take a college-entrance exam (in California, for example, at most 43 percent of high-school students sit for the SAT or the ACT). To get a general sense, though, we can look to Delaware, Idaho, Maine, and the District of Columbia, which provide the SAT for free and have SAT participation rates above 90 percent, according to The Washington Post. In these states in 2015, the percentage of students averaging at least 500 on the reading section ranged from 33 percent (in D.C.) to 40 percent (in Maine), with similar percentages scoring 500 or more on the math and writing sections. Considering that these data don’t include dropouts, it seems safe to say that no more than one in three American high-school students is capable of hitting the College Board’s benchmark. Quibble with the details all you want, but there’s no escaping the conclusion that most Americans aren’t smart enough to do something we are told is an essential step toward succeeding in our new, brain-centric economy—namely, get through four years of college with moderately good grades.


Many people who have benefited from the current system like to tell themselves that they’re working hard to help the unintelligent become intelligent. This is a marvelous goal, and decades of research have shown that it’s achievable through two approaches: dramatically reducing poverty, and getting young children who are at risk of poor academic performance into intensive early-education programs. The strength of the link between poverty and struggling in school is as close to ironclad as social science gets. Still, there’s little point in discussing alleviating poverty as a solution, because our government and society are not seriously considering any initiatives capable of making a significant dent in the numbers or conditions of the poor.

That leaves us with early education, which, when done right—and for poor children, it rarely is—seems to largely overcome whatever cognitive and emotional deficits poverty and other environmental circumstances impose in the first years of life. As exemplified most famously by the Perry Preschool Project in Ypsilanti, Michigan, in the 1960s; more recently by the Educare program in Chicago; and by dozens of experimental programs in between, early education done right means beginning at the age of 3 or earlier, with teachers who are well trained in the particular demands of early education. These high-quality programs have been closely studied, some for decades. And while the results haven’t proved that students get a lasting IQ boost in the absence of enriched education in the years after preschool, measures of virtually every desirable outcome typically correlated with high IQ remain elevated for years and even decades—including better school grades, higher achievement-test scores, higher income, crime avoidance, and better health. Unfortunately, Head Start and other public early-education programs rarely come close to this level of quality, and are nowhere near universal.

In lieu of excellent early education, we have embraced a more familiar strategy for closing the intelligence gap. Namely, we invest our tax money and faith in reforming primary and secondary schools, which receive some $607 billion in federal, state, and local revenues each year. But these efforts are too little, too late: If the cognitive and emotional deficits associated with poor school performance aren’t addressed in the earliest years of life, future efforts aren’t likely to succeed.

Confronted with evidence that our approach is failing—high-school seniors reading at the fifth-grade level, abysmal international rankings—we comfort ourselves with the idea that we’re taking steps to locate those underprivileged kids who are, against the odds, extremely intelligent. Finding this tiny minority of gifted poor children and providing them with exceptional educational opportunities allows us to conjure the evening-news-friendly fiction of an equal-opportunity system, as if the problematically ungifted majority were not as deserving of attention as the “overlooked gems.” Press coverage decries the shortage of Advanced Placement courses at poor schools, as if their real problem were a dearth of college-level physics or Mandarin.

Even if we refuse to prevent poverty or provide superb early education, we might consider one other means of addressing the average person’s plight. Some of the money pouring into educational reform might be diverted to creating more top-notch vocational-education programs (today called career and technical education, or CTE). Right now only one in 20 U.S. public high schools is a full-time CTE school. And these schools are increasingly oversubscribed. Consider Chicago’s Prosser Career Academy, which has an acclaimed CTE program. Although 2,000 students apply to the school annually, the CTE program has room for fewer than 350. The applicant pool is winnowed down through a lottery, but academic test scores play a role, too. Worse, many CTE schools are increasingly emphasizing science, technology, engineering, and mathematics, which risks shifting their mission away from aiding students who struggle academically and toward serving those who want to burnish already excellent college and career prospects. It would be far better to maintain a focus on food management, office administration, health technology, and, sure, the classic trades—all updated to incorporate computerized tools.

We must stop glorifying intelligence and treating our society as a playground for the smart minority. We should instead begin shaping our economy, our schools, even our culture with an eye to the abilities and needs of the majority, and to the full range of human capacity. The government could, for example, provide incentives to companies that resist automation, thereby preserving jobs for the less brainy. It could also discourage hiring practices that arbitrarily and counterproductively weed out the less-well-IQ’ed. This might even redound to employers’ benefit: Whatever advantages high intelligence confers on employees, it doesn’t necessarily make for better employees. Among other things, the less brainy are, according to studies and some business experts, less likely to be oblivious to their own biases and flaws, to mistakenly assume that recent trends will continue into the future, to be anxiety-ridden, and to be arrogant.

When Michael Young, a British sociologist, coined the term meritocracy in 1958, it was in a dystopian satire. At the time, the world he imagined, in which intelligence fully determined who thrived and who languished, was understood to be predatory, pathological, far-fetched. Today, however, we’ve almost finished installing such a system, and we have embraced the idea of a meritocracy with few reservations, even treating it as virtuous. That can’t be right. Smart people should feel entitled to make the most of their gift. But they should not be permitted to reshape society so as to instate giftedness as a universal yardstick of human worth.