Google Isn’t Grad School

Having so much information at our fingertips is useful but seductive, easily fooling us into thinking we know more than we do.

Illustration by Jan Buchczik

The goal of “How to Build a Life” is fairly simple: to bring the world of academic social-science research to a wide audience, using my academic training to translate sometimes-esoteric scholarship into practical happiness lessons. In the course of this project, I find my task is as likely to be combating poor or incomplete advice that people have read on the internet as giving solid counsel based on scholarship and science. The web is full of self-styled experts in my field who claim to have the One Weird Trick that will change your life completely.

And not just in my field. The internet has fed a huge reservoir of good information, but it has also created an explosion of nonsense: technical-sounding nutrition advice about a new dietary supplement that miraculously stimulates the body to convert fat into muscle, financial jargon pushing dubious investment tips, health guidance that promises a miracle treatment your physician doesn’t know about. As my own doctor once told me, his greatest challenge these days is “undoing the handiwork of Dr. Google.”

Some of what people see is straight-up fake news—predatory attempts to swindle consumers. But much of the bad advice on the web actually originates in a psychological phenomenon called “the illusion of explanatory depth.” Understanding this illusion can make you a better consumer of knowledge, as well as less likely to promote bad information yourself.

In 2002, two psychologists noticed in experiments that when people are first exposed to technical information, they usually overestimate how deeply they understand it. The researchers asked graduate students to read basic descriptions of how eight common mechanical items worked: a speedometer, a zipper, a piano key, a flush toilet, a cylinder lock, a helicopter, a quartz watch, and a sewing machine. Then they asked the students to rate their understanding on a 1–7 scale. The average self-rating was about 4.

Next, the researchers asked the participants to re-rate their knowledge after being prompted to explain clearly, in their own words, how the items worked (without simply parroting what they had heard). The students were also quizzed on the information and had to compare their own understanding with a true expert’s. Nearly every participant’s self-rating dropped at these stages, with the average eventually falling to about 3. In other words, the participants initially felt as if they had more expertise than they really did.

The researchers dubbed their finding the illusion of explanatory depth. The phenomenon is similar to the famous Dunning-Kruger effect, which describes how people with low levels of skill in an activity tend to overrate their competence. One explanation for both is “hypocognition”: people simply don’t know what they don’t know.

We all exhibit this tendency. When you first hear an explanation intended for a layperson of string theory, you aren’t aware of the immense quantity of technical scholarship behind the physics; you just feel that you “get it” and experience a surge of intellectual power. But when you yourself have to explain something as complex as the structure of a Bach fugue, or hear an expert in the field actually go deep on such a subject, you realize that you have barely skimmed the surface.


The overconfidence of people laboring under the illusion of explanatory depth can lead to the spread of misinformation. As researchers have shown, when a person’s confidence is high but their actual knowledge is low, they become very believable to others, despite being unreliable. And the more inaccurate people are, or perhaps the more they want to believe in the validity of their own perception, the more they tend to be swayed by their own underinformed overconfidence.

This explains the problem of internet experts and those who rely on them: Practically everywhere you look on the web, you can find technical information of dubious accuracy. This is not necessarily because we are being deliberately lied to, although plenty of that is going on too, but because the internet is a free, democratic platform. That very freedom and accessibility lead many people to succumb to the illusion of explanatory depth, confidently sharing expertise newly acquired from reading a single article or watching a couple of videos.

The two ways we fall prey to the illusion are as consumers and as producers. The plight of the consumer of misinformation is the harder to address, because it isn’t always easy to know whether someone is a true expert or just flush with false confidence. The key question to ask is, Does the source of this technical assertion have a genuine technical background? If the answer is no, proceed with caution.

If you’re hearing from a nonexpert who is relying on the work of researchers, consult the original sources if you can, to make sure that they are reliable and not cherry-picked to support the author’s biases. A good rule of thumb is that if a piece of technical information seems too good to be true, it probably is. That generally applies to any promise of a simple, easy solution to a problem that has been around forever.

The second condition—being a supplier of bad information—is easier to treat. Just remember: Google isn’t graduate school. Learning about novel ideas is a thrill, and indeed, many researchers believe that interest itself is a positive emotion—a source of pleasure rooted in the evolutionary imperative to learn new things. Cruising the web in search of interesting things is great fun. But beware your own susceptibility to the illusion of explanatory depth. If you think you understand something technical and complicated after cursory exposure, you might be able to put the knowledge to good use in your life, but you almost certainly don’t understand it well enough to hold forth on the topic.

I have written here about people with insufficient expertise in a technical field who inadvertently pass on bad information. But I’d be the first to acknowledge that experts can give bad information as well. This is especially true when it comes to predictions about the future, an endeavor in which experts tend to be right only a little more often than a coin flip. But experts can also be wrong about what is right in front of them—falling prey like anyone else to groupthink, social convention, politics, threat of community disapproval, and cultural fads. I try to remind myself of this fact every day.

No matter whom you are taking advice from, think for yourself and never entirely suspend your skepticism. No one has perfect knowledge or insight; everyone has biases and blind spots. And if you are the expert, remember that there really is One Weird Trick that solves a lot of problems: It’s called humility.

Arthur Brooks is a contributing writer at The Atlantic and the host of the How to Build a Happy Life podcast.