This story was originally published in the June 2017 issue of National Geographic magazine.
In the fall of 1989 Princeton University welcomed into its freshman class a young man named Alexi Santana, whose life story the admissions committee had found extraordinarily compelling.
He had barely received any formal schooling. He had spent his adolescence almost entirely on his own, living outdoors in Utah, where he’d herded cattle, raised sheep, and read philosophy. By running in the Mojave Desert, he had trained himself to be a distance runner.
Santana quickly became something of a star on campus. Academically, too, he did well, earning A’s in nearly every course. His reserved manner and unusual background suffused him with an enigmatic appeal. When a suitemate asked Santana how his bed always seemed to be perfectly made, he answered that he slept on the floor. It seemed perfectly logical that someone who had spent much of his life sleeping outdoors would have no fondness for a bed.
Except that Santana’s story was a lie. About 18 months after he enrolled, a woman recognized him as somebody she’d known as Jay Huntsman at Palo Alto High School in California six years earlier. But even that wasn’t his real name. Princeton officials eventually learned that he was actually James Hogue, a 31-year-old who had served a prison sentence in Utah for possession of stolen tools and bike parts. He was taken away from Princeton in handcuffs.
In the years since, Hogue has been arrested several times on theft charges. In November, when he was arrested for stealing in Aspen, Colorado, he tried to pass himself off as someone else.
The history of humankind is strewn with crafty and seasoned liars like Hogue. Many are criminals who spin lies and weave deceptions to gain unjust rewards—as the financier Bernie Madoff did for years, duping investors out of billions of dollars until his Ponzi scheme collapsed. Some are politicians who lie to come to power or cling to it, as Richard Nixon famously did when he denied any role in the Watergate scandal.
Sometimes people lie to inflate their image—a motivation that might best explain President Donald Trump’s demonstrably false assertion that his Inauguration crowd was bigger than President Barack Obama’s first one. People lie to cover up bad behavior, as American swimmer Ryan Lochte did during the 2016 Summer Olympics by claiming to have been robbed at gunpoint at a gas station when, in fact, he and his teammates, drunk after a party, had been confronted by armed security guards after damaging property. Even academic science—a world largely inhabited by people devoted to the pursuit of truth—has been shown to contain a rogues’ gallery of deceivers, such as physicist Jan Hendrik Schön, whose purported breakthroughs in molecular semiconductor research proved to be fraudulent.
These liars earned notoriety because of how egregious, brazen, or damaging their falsehoods were. But their deceit doesn’t make them as much of an aberration as we might think. The lies that impostors, swindlers, and boasting politicians tell merely sit at the apex of a pyramid of untruths that have characterized human behavior for eons.
Lying, it turns out, is something that most of us are very adept at. We lie with ease, in ways big and small, to strangers, co-workers, friends, and loved ones. Our capacity for dishonesty is as fundamental to us as our need to trust others, which ironically makes us terrible at detecting lies. Being deceitful is woven into our very fabric, so much so that it would be truthful to say that to lie is human.
The ubiquity of lying was first documented systematically by Bella DePaulo, a social psychologist at the University of California, Santa Barbara. Two decades ago DePaulo and her colleagues asked 147 adults to jot down for a week every instance they tried to mislead someone. The researchers found that the subjects lied on average one or two times a day. Most of these untruths were innocuous, intended to hide one’s inadequacies or to protect the feelings of others. Some lies were excuses—one subject blamed the failure to take out the garbage on not knowing where it needed to go. Yet other lies—such as a claim of being a diplomat’s son—were aimed at presenting a false image. While these were minor transgressions, a later study by DePaulo and other colleagues involving a similar sample indicated that most people have, at some point, told one or more “serious lies”—hiding an affair from a spouse, for example, or making false claims on a college application.
That human beings should universally possess a talent for deceiving one another shouldn’t surprise us. Researchers speculate that lying as a behavior arose not long after the emergence of language. The ability to manipulate others without using physical force likely conferred an advantage in the competition for resources and mates, akin to the evolution of deceptive strategies in the animal kingdom, such as camouflage. “Lying is so easy compared to other ways of gaining power,” notes Sissela Bok, an ethicist at Harvard University who’s one of the most prominent thinkers on the subject. “It’s much easier to lie in order to get somebody’s money or wealth than to hit them over the head or rob a bank.”
As lying has come to be recognized as a deeply ingrained human trait, social science researchers and neuroscientists have sought to illuminate the nature and roots of the behavior. How and when do we learn to lie? What are the psychological and neurobiological underpinnings of dishonesty? Where do most of us draw the line? Researchers are learning that we’re prone to believe some lies even when they’re unambiguously contradicted by clear evidence. These insights suggest that our proclivity for deceiving others, and our vulnerability to being deceived, are especially consequential in the age of social media. Our ability as a society to separate truth from lies is under unprecedented threat.
When I was in third grade, one of my classmates brought a sheet of racing car stickers to school to show off. The stickers were dazzling. I wanted them so badly that I stayed back during gym class and transferred the sheet out of the classmate’s backpack into mine. When the students returned, my heart was racing. Panicking that I would be found out, I thought up a preemptive lie. I told the teacher that two teenagers had shown up on a motorbike, entered the classroom, rifled through backpacks, and left with the stickers. As you might expect, this fib collapsed at the gentlest probing, and I reluctantly returned what I had pilfered.
My naive lying—I got better, trust me—was matched by my gullibility in sixth grade, when a friend told me that his family owned a flying capsule that could transport us anywhere in the world. Preparing to travel on this craft, I asked my parents if they could pack me a few meals for the journey. Even when my older brother snickered, I refused to disbelieve my friend’s claim, and it was left to my friend’s father to finally convince me that I’d been duped.
These lies that my friend and I told were nothing out of the ordinary for kids our age. Like learning to walk and talk, lying is something of a developmental milestone. While parents often find their children’s lies troubling—for they signal the beginning of a loss of innocence—Kang Lee, a psychologist at the University of Toronto, sees the emergence of the behavior in toddlers as a reassuring sign that their cognitive growth is on track.
To study lying in children, Lee and his colleagues use a simple experiment. They ask kids to guess the identity of toys hidden from their view, based on an audio clue. For the first few toys, the clue is obvious—a bark for a dog, a meow for a cat—and the children answer easily. Then the sound played has nothing to do with the toy. “So you play Beethoven, but the toy’s a car,” Lee explains. The experimenter leaves the room on the pretext of taking a phone call—a lie for the sake of science—and asks the child not to peek at the toy. Returning, the experimenter asks the child for the answer, following up with the question: “Did you peek or not?”
Monitoring the children with hidden cameras, Lee and his researchers have found that most can’t resist peeking. The percentage of the children who peek and then lie about it depends on their age. Among two-year-old transgressors, only 30 percent are untruthful. Among three-year-olds, 50 percent lie. And by eight, about 80 percent claim they didn’t peek.
Kids also get better at lying as they get older. In guessing the toy that they secretly looked at, three- and four-year-olds typically blurt out the right answer, without realizing that this reveals their transgression and lying. At seven or eight, kids learn to mask their lying by deliberately giving a wrong answer or trying to make their answer seem like a reasoned guess.
Five- and six-year-old kids fall in between. In one study Lee used Barney the dinosaur as the toy. A five-year-old girl who denied having looked at the toy, which was hidden under a cloth, told Lee she wanted to feel it before guessing. “So she puts her hand underneath the cloth, closes her eyes, and says, ‘Ah, I know it’s Barney,’ ” Lee recounts. “I ask, ‘Why?’ She says, ‘Because it feels purple.’ ”
What drives this increase in lying sophistication is the development of a child’s ability to put himself or herself in someone else’s shoes. Known as theory of mind, this is the facility we acquire for understanding the beliefs, intentions, and knowledge of others. Also fundamental to lying is the brain’s executive function: the abilities required for planning, attention, and self-control. The two-year-olds who lied in Lee’s experiments performed better on tests of theory of mind and executive function than those who didn’t. Even at 16, kids who were proficient liars outperformed poor liars. On the other hand, kids on the autism spectrum—known to be delayed in developing a robust theory of mind—are not very good at lying.
On a recent morning, I took an Uber to visit Dan Ariely, a psychologist at Duke University and one of the world’s foremost experts on lying. The inside of the car, though neat, had a strong odor of sweaty socks, and the driver, though courteous, had trouble finding her way. When we finally got there, she asked me smilingly if I would give her a five-star rating. “Sure,” I replied. Later, I gave her three stars. I assuaged my guilt by telling myself that it was better not to mislead thousands of Uber riders.
Ariely became fascinated with dishonesty about 15 years ago. Looking through a magazine on a long-distance flight, he came across a mental aptitude test. He answered the first question and flipped to the key in the back to see if he got it right. He found himself taking a quick glance at the answer to the next question. Continuing in this vein through the entire test, Ariely, not surprisingly, scored very well. “When I finished, I thought—I cheated myself,” he says. “Presumably, I wanted to know how smart I am, but I also wanted to prove I’m this smart to myself.” The experience sparked Ariely’s enduring interest in the study of lying and other forms of dishonesty.
In experiments he and his colleagues have run on college campuses and elsewhere, volunteers are given a test with 20 simple math problems. They must solve as many as they can in five minutes and are paid based on how many they get right. They are told to drop the sheet into a shredder before reporting the number they solved correctly. But the sheets don’t actually get shredded. A lot of volunteers lie, as it turns out. On average, volunteers report having solved six problems, when it was really more like four. The results are similar across different cultures. Most of us lie, but only a little.
The question Ariely finds interesting is not why so many lie, but rather why they don’t lie a lot more. Even when the amount of money offered for correct answers is raised significantly, the volunteers don’t increase their level of cheating. “Here we give people a chance to steal lots of money, and people cheat only a little bit. So something stops us—most of us—from lying all the way,” Ariely says. The reason, according to him, is that we want to see ourselves as honest, because we have, to some degree, internalized honesty as a value taught to us by society. Which is why most of us—unless we are sociopaths—place limits on how much we are willing to lie. How far most of us are willing to go—Ariely and others have shown—is determined by social norms arrived at through unspoken consensus, like the tacit acceptability of taking a few pencils home from the office supply cabinet.
Patrick Couwenberg's staff and fellow judges in the Los Angeles County Superior Court believed he was an American hero. By his account, he had been awarded a Purple Heart in Vietnam. He’d participated in covert operations for the Central Intelligence Agency. The judge boasted of an impressive educational background as well—an undergraduate degree in physics and a master’s degree in psychology. None of it was true. When confronted, Couwenberg’s defense was to blame a condition called pseudologia fantastica, a tendency to tell stories containing facts interwoven with fantasy. The argument didn’t save him from being removed from the bench in 2001.
There appears to be no agreement among psychiatrists about the relationship between mental health and lying, even though people with certain psychiatric disorders seem to exhibit specific lying behaviors. Sociopathic individuals—those diagnosed with antisocial personality disorder—tend to tell manipulative lies, while narcissists may tell falsehoods to boost their image.
But is there anything unique about the brains of individuals who lie more than others? In 2005 psychologist Yaling Yang and her colleagues compared the brain scans of three groups: 12 adults with a history of repeated lying, 16 who met the criteria for antisocial personality disorder but were not frequent liars, and 21 who were neither antisocial nor had a lying habit. The researchers found that the liars had at least 20 percent more neural fibers by volume in their prefrontal cortices, suggesting that habitual liars have greater connectivity within their brains. It’s possible this predisposes them to lying because they can think up lies more readily than others, or it might be the result of repeated lying.
Psychologists Nobuhito Abe at Kyoto University and Joshua Greene at Harvard University scanned the brains of subjects using functional magnetic resonance imaging (fMRI) and found that those who acted dishonestly showed greater activation in the nucleus accumbens—a structure in the basal forebrain that plays a key role in reward processing. “The more excited your reward system gets at the possibility of getting money—even in a perfectly honest context—the more likely you are to cheat,” explains Greene. In other words, greed may increase one’s predisposition to lying.
One lie can lead to another and another, as evidenced by the smooth, remorseless lying of serial con men such as Hogue. An experiment by Tali Sharot, a neuroscientist at University College London, and colleagues showed how the brain becomes inured to the stress or emotional discomfort that arises when we lie, making it easier to tell the next fib. In the fMRI scans of the participants, the team focused on the amygdala, a region that is involved in processing emotions. The researchers found that the amygdala’s response to lies got progressively weaker with each lie, even as the lies got bigger. “Perhaps engaging in small acts of deception can lead to bigger acts of deception,” Sharot says.
Much of the knowledge we use to navigate the world comes from what others have told us. Without the implicit trust that we place in human communication, we would be paralyzed as individuals and cease to have social relationships. “We get so much from believing, and there’s relatively little harm when we occasionally get duped,” says Tim Levine, a psychologist at the University of Alabama at Birmingham, who calls this idea the truth default theory.
Being hardwired to be trusting makes us intrinsically gullible. “If you say to someone, ‘I am a pilot,’ they are not sitting there thinking: ‘Maybe he’s not a pilot. Why would he say he’s a pilot?’ They don’t think that way,” says Frank Abagnale, Jr., a security consultant whose cons as a young man, including forging checks and impersonating an airline pilot, inspired the 2002 movie Catch Me If You Can. “This is why scams work, because when the phone rings and the caller ID says it’s the Internal Revenue Service, people automatically believe it is the IRS. They don’t realize that someone could manipulate the caller ID.”
Robert Feldman, a psychologist at the University of Massachusetts, calls that the liar’s advantage. “People are not expecting lies, people are not searching for lies,” he says, “and a lot of the time, people want to hear what they are hearing.” We put up little resistance to the deceptions that please us and comfort us—be it false praise or the promise of impossibly high investment returns. When we are fed falsehoods by people who have wealth, power, and status, they appear to be even easier to swallow, as evidenced by the media’s credulous reporting of Lochte’s robbery claim, which unraveled shortly thereafter.
Researchers have shown that we are especially prone to accepting lies that affirm our worldview. Memes that claim Obama was not born in the United States, deny climate change, accuse the U.S. government of masterminding the terrorist strikes of September 11, 2001, and spread other “alternative facts,” as a Trump adviser called his Inauguration crowd claims, have thrived on the Internet and social media because of this vulnerability. Debunking them does not demolish their power, because people assess the evidence presented to them through a framework of preexisting beliefs and prejudices, says George Lakoff, a cognitive linguist at the University of California, Berkeley. “If a fact comes in that doesn’t fit into your frame, you’ll either not notice it, or ignore it, or ridicule it, or be puzzled by it—or attack it if it’s threatening.”
A recent study led by Briony Swire-Thompson, a doctoral candidate in cognitive psychology at the University of Western Australia, documents the ineffectiveness of evidence-based information in refuting incorrect beliefs. In 2015 Swire-Thompson and her colleagues presented about 2,000 adult Americans with one of two statements: “Vaccines cause autism” or “Donald Trump said that vaccines cause autism.” (Trump has repeatedly suggested there’s a link, despite the lack of scientific evidence for it.)
Not surprisingly, participants who were Trump supporters showed a decidedly stronger belief in the misinformation when it had Trump’s name attached to it. Afterward the participants were given a short explanation—citing a large-scale study—for why the vaccine-autism link was false, and they were asked to reevaluate their belief in it. The participants—across the political spectrum—now accepted that the statements claiming the link were untrue, but testing them again a week later showed that their belief in the misinformation had bounced back to nearly the same level.
Other studies have shown that evidence undermining lies may in fact strengthen belief in them. “People are likely to think that familiar information is true. So any time you retract it, you run the risk of making it more familiar, which makes that retraction actually less effective, ironically, over the long term,” says Swire-Thompson.
I experienced this phenomenon firsthand not long after I spoke to Swire-Thompson. When a friend sent me a link to an article ranking the 10 most corrupt political parties in the world, I promptly posted it to a WhatsApp group of about a hundred high school friends from India. The reason for my enthusiasm was that the fourth spot in the ranking was held by India’s Congress Party, which in recent decades has been implicated in numerous corruption scandals. I chortled with glee because I’m not a fan of the party.
But shortly after sharing the article, I discovered that the ranking, which included parties from Russia, Pakistan, China, and Uganda, wasn’t based on any metrics. It had been done by a site called BBC Newspoint, which sounded like a credible source. But I found out that it had no connection to the British Broadcasting Corporation. I posted an apology to the group, noting that the article was in all likelihood fake news.
That didn’t stop others from reposting the article to the group several times over the next day. I realized that the correction I’d posted had not had any effect. Many of my friends—because they shared my antipathy toward the Congress Party—were convinced the ranking was true, and every time they shared it, they were unwittingly, or perhaps knowingly, nudging it toward legitimacy. Countering it with fact would be in vain.
What then might be the best way to impede the fleet-footed advance of untruths into our collective lives? The answer isn’t clear. Technology has opened up a new frontier for deceit, adding a 21st-century twist to the age-old conflict between our lying and trusting selves.