What do you really know about gullibility?

By Hugo Mercier

Not Born Yesterday explains how we decide who we can trust and what we should believe—and argues that we’re pretty good at making these decisions. In this lively and provocative book, Hugo Mercier demonstrates how virtually all attempts at mass persuasion—whether by religious leaders, politicians, or advertisers—fail miserably. Drawing on recent findings from political science and other fields ranging from history to anthropology, Mercier shows that the narrative of widespread gullibility, in which a credulous public is easily misled by demagogues and charlatans, is simply wrong. Mercier has identified 10 things about gullibility you may not know.

1) People believe they’re not gullible, but other people are

Do you feel like your political opinions are strongly influenced by the ads you see on TV or on Facebook? Probably not. But do you suspect that other people’s political opinions are dictated by MSNBC, Fox News, or ads specially targeted by Cambridge Analytica? Maybe. After all, you might think, how could so many people be so wrong? The problem is that everyone thinks they’re hard to influence, while others are easily swayed even by the most shallow or biased information. This is called the third-person effect. Clearly, not everyone can be right on this. Which is it? Are we more gullible than we think, or are others less gullible than we think?

2) People aren’t gullible

It’s the latter. People aren’t gullible; they aren’t easy to fool into believing unfounded things. There is now a wealth of studies in experimental psychology showing that people, instead of accepting everything they read or hear, consider a variety of cues to decide how much they should listen to others. For starters, we compare what we’re told with what we already believe and, if that doesn’t fit, our first reaction is to reject what we’re told. Fortunately, we can overcome this initial reaction if we have reasons to believe the source of the information is well-informed, competent, well-intentioned, part of a broader consensus, or if they offer us good arguments. Indeed, over the past twenty years, a slew of research in developmental psychology has shown that preschoolers—3- to 5-year-olds—can already take all of these cues into account, being more likely to listen to someone who is well-informed, competent, etc. For instance, my colleagues and I have shown that two-and-a-half-year-olds are more receptive to strong arguments than to vacuous, circular arguments.

3) Propaganda fails

Even if psychological experiments show that adults and children, in the lab, can be very sophisticated when it comes to deciding who to trust and what to believe, isn’t that contradicted by so much of history? What about the Germans saluting the Führer in unison, or North Koreans wailing at the death of Kim Jong-il? Could these people have been exercising discernment in who they trusted? Hadn’t they gullibly taken in government propaganda? In fact, all the existing quantitative research suggests that propaganda in totalitarian regimes doesn’t change anyone’s mind. It can allow existing preferences to express themselves more forcefully. For example, it seems Nazi propaganda made already anti-Semitic Germans engage in more anti-Semitic acts, but it had no effect, or the opposite effect, on non-anti-Semitic Germans. By and large, displays of allegiance to authoritarian regimes stem not from persuasion but from self-interest, as people seek to ingratiate themselves with those in power or, on the contrary, fear their wrath.

4) Political campaigns fail

Could modern political campaigns, with their sophisticated data analytics, targeted advertising, and professional spin doctors, do better than repetitive, stuffy, dull authoritarian propaganda? Apparently not—or maybe only at the margins. A recent meta-analysis looked at all the studies on the effects of political campaigns in the US. These studies had carefully estimated whether mailing, canvassing, cold calling, or advertising could get people to vote for another candidate. In general elections—such as the presidential election—there are simply no effects on voting behavior. Only when voters can’t rely on simple heuristics—“I’ll vote Republican,” “I’ll vote Democrat”—as in primaries or some ballot measures, do campaigns seem to have an impact, and even then it remains quite modest.

5) Advertising fails 

The handful of billions spent on political campaigns in the US pales in comparison with the tens of billions spent on advertising. To what end? As early as 1982, researchers were sounding the alarm bell, as their review of the literature led them to ask, in the very title of their article, “Are you overadvertising?” Nearly 40 years later, the state of the field is even more dire. A working paper, the first to conduct a large-scale literature review taking into account publication bias (which makes positive results more likely to be published), found that only a small minority of ads had positive effects, most had no discernible effects, and a few backfired. Whether the ads are for politicians or for products, the vast majority are wasted on people who, far from gullibly taking in whatever they’re told, mostly ignore the messages.

6) Making people dumb doesn’t make them gullible 

There is a longstanding association between lack of intellectual sophistication and gullibility. Throughout history, people—women, minorities, slaves, workers—thought to have lesser cognitive abilities were also thought to be easier to influence. Relatedly, it was supposed that making people stupid—by stopping them from thinking—would make them suggestible. This idea underlies, for instance, subliminal advertising—in which a message is presented outside of conscious awareness, for example because it is flashed very quickly—or brainwashing—in which the victim’s ability to think is supposed to be wiped out by lack of sleep, harsh conditions, or downright torture. Fortunately, none of this works. Neither subliminal advertising nor brainwashing has ever changed anyone’s mind. Instead, to influence people we must make them think more, give them grounds for trusting us, and reasons to think we’re right.

7) Most false beliefs are largely harmless

Voltaire famously quipped that “those who can make you believe absurdities can make you commit atrocities” (in fact, his translator had a hand in this, but the expression proved popular). If true, this would be quite terrifying, as people do believe in a number of absurdities. For example, a few years ago, a sizable minority of Americans seemed to believe that the basement of the Comet Ping Pong pizzeria, in Washington, DC, was used by Democratic operatives to abuse children (the conspiracy theory known as pizzagate). One guy did commit, if not an atrocity, at least something extremely stupid: storming the place, gun blazing, demanding that the children be freed. Voltaire was right about him. But not about the 99.999% of the people who, even though they supposedly believed small children were being assaulted with impunity, did nothing at all—or, if they were really outraged, left a one-star review on the Google page of the restaurant, not quite an appropriate answer to suspected child molestation. Most popular false beliefs—rumors, urban legends, conspiracy theories—are like pizzagate: people say they believe them, and they do, but the beliefs do not influence the rest of their thoughts, or their behavior. This helps explain why people accept such beliefs: as the beliefs are largely inconsequential (at least for those who hold them), the stakes are low, and their vigilance is relaxed.

8) Most false beliefs serve some goal

If false beliefs often have little consequence on people’s behavior, there’s one glaring exception: their verbal behavior. People will loudly share views in spite of not behaving in line with those views at all. Take 9/11 truthers who believe the CIA is powerful, nefarious, and out to get them… and are as vocal as they can be about their opinions. That makes no sense. When you live with an actually powerful and nefarious intelligence agency in your country, you shut up about it, or you end up dead. If people share such beliefs, it is likely because their behavior serves some goal. Indeed, sharing beliefs, even false beliefs, can serve a variety of social ends. We can justify a common course of action—as when people share rumors of atrocities before an ethnic riot. We can entertain our audience with thrilling urban legends or salacious rumors. We can even commit to a fringe group by saying things so apparently absurd or evil that almost everyone is sure to reject us, thus proving to this fringe group that we really are in with them (think flat earthers). When you hear someone profess some absurd belief, it’s often more productive to wonder not how gullible they are, but what goal they might be trying to accomplish.

9) People aren’t hopelessly pigheaded

People aren’t gullible. Mass persuasion—from authoritarian propaganda to advertising—fails massively. Does this mean, then, that people are merely pigheaded? No. People aren’t pigheaded; they are rationally skeptical. In the absence of good reasons to change their minds, they don’t. But when the reasons are there, people do change their minds. In our everyday lives, we are constantly influenced by what our friends, family members, and colleagues tell us. That’s because we know them, we know they don’t want to con us, we know in which domains they are more likely to be right, and we have time to exchange arguments.

10) We need more information, not less

A common fear in our online age is that of information overload. We are bombarded with such a mass of information that, the fear goes, we become unable to sort through it and end up taking in a bunch of nonsense. While the information overload is undoubtedly present, its consequences are the opposite. Because we don’t have the time, the motivation, or even the extra information required to properly evaluate most of the information we encounter, we revert to a state of rational skepticism. For instance, I might have been more persuaded by a newspaper article if I had known the author, realized she had good intentions, appreciated the depth of her knowledge of the issue, and had time to exchange arguments with her. By orders of magnitude, the largest problem with the current informational environment isn’t the information we accept which we should have rejected, but the information we reject which we should (had we had more information) have accepted.

Hugo Mercier is a cognitive scientist at the Jean Nicod Institute in Paris and the coauthor of The Enigma of Reason. He lives in Nantes, France. Twitter @hugoreasoning