The Right to Not Have Your Mind Read

Can “neurorights” protect us from the future?


Updated at 9:20 p.m. ET on August 21, 2023

Jared Genser in many ways fits a certain Washington, D.C., type. He wears navy suits and keeps his hair cut short. He graduated from a top law school, joined a large firm, and made partner at 40. Eventually, he became disenchanted with big law and started his own boutique practice with offices off—where else—Dupont Circle. What distinguishes Genser from the city’s other 50-something lawyers is his unusual clientele: He represents high-value political prisoners. If you’re married to a troublesome opposition leader in a place where the rule of law is thin on the ground, one night the secret police might kick in your door, slip a hood over your spouse, and vanish into the dark. That’s when you call Genser.

Earlier this year, Genser helped obtain the release of two men who had run for president against Daniel Ortega, Nicaragua’s on-again, off-again strongman, and found themselves imprisoned for their trouble. He still remembers the early-morning call letting him know that his clients were airborne and headed for Dulles International Airport. But not every case ends in a euphoric release. Genser has represented the three most recent imprisoned winners of the Nobel Peace Prize, including the Chinese prodemocracy activist Liu Xiaobo, who died in custody at the age of 61, and Ales Bialiatski, who was just sentenced to 10 years in a grim penal colony in Belarus, where inmates receive beatings between long shifts of hard labor.

Genser’s clients face the full technological powers of the Leviathan. By the time they’ve been arrested, in many cases after a mass protest, they may have been spied on for months, if not years, by plainclothes police and networks of cameras. Their personal messages, website clicks, and purchases could already be in the hands of the state. Post-arrest, they may be tortured by agents looking to extract the sort of secrets that a prisoner stores only in the inner sanctum of their mind: future plans, the names of people who send them money, any informants they might have in government. Genser’s clients have even been subjected to electrocution, and recently, he has begun to worry that dictators will soon have access to another tool of interrogation: mind-reading devices that no human being can resist.


In theory, nothing about the brain’s squishy wetware prevents its internal states from being observed. “If you could measure every single neuron in the brain in real time, you could potentially decode everything that was percolating around in there,” Jack Gallant, a cognitive scientist at UC Berkeley, told me. That includes “all of your perceptions, all of your intentions, all of your motor actions, and also a bunch of stuff you’re not even consciously aware of,” he said.

Scientists have no way of measuring the individual activity of every neuron in the brain, or even a sizable fraction of them, so mind reading of the sort that Gallant described is, for now, impossible. But there are cruder ways to get at neural data: A person could be slid into an MRI machine, for example, and have their brain’s activity imaged by a head-permeating magnetic field. Configured in a certain way, an MRI can detect minute, local shifts in oxygenated blood flow inside the skull. Because neurons that have just fired tend to need more oxygen, these shifts are a decent proxy for the brain’s activity. They give a blurry afterimage of thought.

In 2011, Gallant published a set of experiments that pushed mind-reading technology into a new era. He asked volunteers to watch hours of video clips while their heads were stuck in an MRI machine, and then trained a neural network on a dataset that linked every moment of each video to the brain activity recorded by the machine. Afterward, he asked the volunteers to watch new videos. When Gallant’s team fed the resulting data into the AI, it was able to generate very rudimentary reconstructions of some of the new imagery.
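The logic of that experiment—pair stimuli with recorded brain responses, fit a model on those pairs, then run the model on brain data from unseen stimuli—can be sketched in miniature. The toy below is illustrative only, not the lab’s actual method: it stands in synthetic “voxel” responses for real fMRI data and a simple ridge-regression decoder for Gallant’s neural network, but the train-then-decode structure is the same.

```python
# Toy sketch of stimulus decoding (assumption: real work uses fMRI data
# and far more complex models; this only illustrates the pipeline shape).
import numpy as np

rng = np.random.default_rng(0)

n_train, n_test = 200, 20
n_features = 5   # stands in for properties of each video frame
n_voxels = 50    # stands in for measured voxel activations

# Simulated "brain": voxels respond linearly (plus noise) to stimuli.
encoding = rng.normal(size=(n_features, n_voxels))

X_train = rng.normal(size=(n_train, n_features))  # training stimuli
Y_train = X_train @ encoding + 0.1 * rng.normal(size=(n_train, n_voxels))

# Fit a decoder mapping voxels back to stimulus features by ridge
# regression: W = (Y^T Y + lam*I)^-1 Y^T X.
lam = 1.0
W = np.linalg.solve(Y_train.T @ Y_train + lam * np.eye(n_voxels),
                    Y_train.T @ X_train)

# "New videos": reconstruct their features from brain data alone.
X_test = rng.normal(size=(n_test, n_features))
Y_test = X_test @ encoding + 0.1 * rng.normal(size=(n_test, n_voxels))
X_hat = Y_test @ W

corr = np.corrcoef(X_test.ravel(), X_hat.ravel())[0, 1]
print(f"correlation between true and decoded features: {corr:.2f}")
```

Because the simulated brain here really is linear and nearly noiseless, the decoded features track the true ones closely; with real neural data, the correspondence is far blurrier, which is why Gallant’s reconstructions were only rudimentary.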

Genser worries that the same approach—using learning algorithms to correlate neuronal activity with mental states—could be scaled up in power and eventually deployed in wearable, mind-reading caps. He imagines secret police plopping one onto a client’s head. They could then ask questions, he said, while watching a real-time feed of whatever pictures or words popped unbidden into a prisoner’s mind. “This will transform interrogations around the world,” he told me.

Genser first became concerned about this risk in 2020, when he met Rafael Yuste, a Spanish American neurobiology professor at Columbia University. Yuste, now 60, helped create the project that became President Barack Obama’s BRAIN Initiative, and has since become a prominent scientific voice arguing that advances in AI and neuroscience may require a new legal regime. He told me that he thinks of it as his second career. Shortly after he met Genser, he invited him to be a co-founder of the NeuroRights Foundation. Among its goals: a globally enshrined right to mental privacy and free will that would forbid anyone from ever using brain-imaging technology to force open a rear window onto your theater of consciousness.

I asked Gallant about the urgency of this campaign. He is well positioned to know how soon this technology could really be upon us: In addition to his pioneering image-reconstruction work, he has mentored several of the field’s younger practitioners. (His former student Alexander Huth runs a lab at the University of Texas at Austin that recently managed to reconstruct the rough gist of a text narrative that had been read aloud to a person in an MRI machine.) Gallant told me that the deep-learning revolution of the past 10 years has yielded greater success in decoding brain activity. The reconstructed imagery from his 2011 mind-reading study wasn’t very precise. “If you look at the pictures, it’s not random; there’s something there,” he said. More recent work, like that from a team led by Yu Takagi at Osaka University, in Japan, produced more-accurate reconstructions of mental imagery. Scientists are getting better at reading minds.

That’s not to say that the world’s tyrants will soon be buying mind-reading kits off the shelf. The mental reconstructions that are possible right now are a far cry from the where-is-the-rebel-base scenario that Genser fears. Even if methods like the ones described above could be used in interrogations, there would be practical challenges. Takagi’s and Huth’s experiments required research subjects to spend many motionless hours inside an MRI machine to generate training data for AI models. That alone could pose problems for, say, a dictator who hoped to peer inside the head of his prisoners. If someone wanted to resist, Gallant told me, “all they would have to do is wiggle their head a little to mess up the signals.”

Companies are developing portable helmets that use small, pulsed lasers to monitor changes in the brain’s blood flow. In 2021, a start-up called Kernel debuted a model that cost just $50,000. But the spatial resolution of the brain data they capture is lower than that of the data you get from an MRI machine. According to Gallant, the helmets are able to gather sufficient data to tell whether a person is sleeping, or whether they’re paying attention, but not to perform image or narrative reconstruction. Overall, he told me, he shares Yuste’s belief that this technology will eventually pose new ethical concerns, but he made clear that, in his view, mind-reading caps are a long way off.


In the meantime, Genser and Yuste are working on other threats to mental privacy that aren’t quite as lurid. In recent years, the consumer market for devices that collect brain data has been growing fast; even Apple has applied for a patent on a new earbud outfitted with electrodes that could, in theory, detect brain activity. Medical devices that use this technology are of course highly regulated, but products that you can buy with a few taps on Instagram may not be.

The NeuroRights Foundation recently reviewed the user agreements of 17 neurotech companies for a report that it plans to release in September. The agreements cover headsets that record electrical activity generated by the brain to monitor sleep patterns, mental concentration, or even meditative calm. “Every one of them takes possession of all the brain data of the user,” Yuste said. To be clear, this sort of brain data could not be used to read someone’s inner thoughts; at best, it provides something more like an impressionistic image of their mental state. Marcello Ienca, a philosopher at the Swiss Federal Institute of Technology in Lausanne, told me that even these data deserve special protections.

“In the digital world, we have been trading privacy for services almost nonstop for the last 20 years,” he said. But however mesmerized we might be by the dopamine slot machines that are our social-media feeds, our online activity is still voluntary. We can decide whether to post a given thought on Instagram, or to keep it in our minds, where it’s not accessible to the outside world, Ienca said. When it comes to brain data, we may not even know what we are sharing, and companies may be in no rush to tell us. Nor would we know where our data might end up: Yuste told me that almost all of the user agreements reviewed for the NeuroRights Foundation’s forthcoming report allowed the company to send data to third parties.

In some workplaces, sharing brain data may become a condition of employment. Chinese companies are reportedly using neuromonitoring technology to record the brain activity of high-speed-train conductors and people who execute important functions in nuclear plants, Ienca said. These devices may be recording only concentration levels and emotional states. But nothing prevents those companies—or the Chinese military, which is reportedly also monitoring cognitive focus in troops—from banking as much brain data as they can for later analysis. “If this isn’t a human-rights issue,” Yuste said, “what is a human-rights issue?”

Jennifer Blumenthal-Barby, a medical-ethics professor at the Baylor College of Medicine, isn’t quite sure that it’s a human-rights issue, or at least not a novel one. When I spoke with her, she pointed out that we already have a right to privacy under international law. Under Article 17 of the International Covenant on Civil and Political Rights—a treaty that has been ratified by 173 countries, including the United States—“no one shall be subjected to arbitrary interference with his privacy.” (Of course, dictators have routinely flouted the very treaties their countries have signed.) Many countries have also passed domestic laws that forbid various invasions of privacy. These existing treaties and laws could cover cases where a person’s mental states are read without their consent, Blumenthal-Barby said.

Genser and Yuste disagree, and argue that without more-explicit guarantees, current human-rights law may not protect mental privacy. But Blumenthal-Barby said that such guarantees, if enacted, could be overly expansive. “We have to be able to draw a line,” she told me. “We read off people’s mental states by their behavior all the time without their consent,” by looking at facial cues or gestures or body language, and “we don’t want to include those cases.” In place of a catch-all mental-privacy right, she said that she’d be a lot more comfortable seeing laws that address specific technologies—consumer headsets, for instance—that could be used to retrieve brain data without consent.

Yuste and Genser are still focused on getting the word out about their efforts—they recently collaborated on a documentary about neurotechnology with Werner Herzog—but they have also achieved genuine legislative victories. Yuste was instrumental in the drafting of a law passed by Chile’s national legislature near the end of 2021, which enshrined several neurorights. (Memories of Augusto Pinochet’s purges and mass internments are still fresh in Chile’s national psyche, he told me.) The NeuroRights Foundation is now working with Brazil to draft a constitutional amendment modeled on Chile’s law. Yuste said they’re also in talks with Colorado’s governor about the first such legislation at the state level in the United States.

Genser told me that it takes at least a decade to stand up a new international rights treaty, but that changes in how current treaties are interpreted could be achieved on a much shorter timeline. If Gallant is right that we won’t see anything close to a mind-reading helmet for a while, the NeuroRights Foundation may not need to rush. That’s not to say that the group’s work isn’t useful, if only to name the risks, but it’s operating in a competitive space. A great many people are currently scanning the horizon for threats from emerging technologies, especially those powered by AI. Policy makers are doing their level best to address the most pressing threats. The line between foresight and alarmism can sometimes seem blurry, like the readout from an MRI.


This article previously misstated the year when Jared Genser met Rafael Yuste.

Ross Andersen is a staff writer at The Atlantic.