
If you’re worried that Russian bots are brainwashing the world, take a deep breath

A new book on cognitive psychology explains that people aren’t nearly as gullible or easily swayed as the media would have you think.

February 23, 2020

Hugo Mercier is a cognitive scientist at the Jean Nicod Institute in Paris and the author of a new book from Princeton University Press, Not Born Yesterday: The Science of Who We Trust and What We Believe, which examines the science of what people believe and don’t believe. I asked him a series of questions about the book.

HF: Many scholars and experts argue that people are extremely gullible (themselves of course excepted). Why do you think they are wrong?

HM: The idea that people are easy to manipulate, swayed by prestige and charisma, is widespread not only among laypeople, but also intellectuals. Yet the experts in each relevant discipline pretty much all concur that people are, in fact, very difficult to persuade. Political scientists know how ineffectual political campaigns are. Advertising researchers know that most ads have tiny effects. Historians know that authoritarian propaganda is reviled and mocked by the population (if they can get away with it). Psychologists have catalogued the cues people (as young as preschoolers) take into account when evaluating what they’re told — plausibility of the content, quality of the arguments, competence and honesty of the source. Ironically, these scholars often believe that people are gullible in areas beyond their own domain of expertise. One of my goals in the book is to reveal a consistent pattern: People aren’t gullible; they have all the cognitive mechanisms required to carefully evaluate communication, and mass persuasion attempts — from propaganda to advertising — nearly always fall on deaf ears. I’m not denying that people get many things wrong: The book also argues that gullibility isn’t a good explanation for the popularity of most unfounded beliefs.

HF: You say that the critical question for why crazy-seeming beliefs spread “is not why people accept them, but why people profess them.” Why do people sometimes publicly claim that they believe grotesquely improbable things if they don’t really believe them?

HM: To believe in something can mean two things. I believe there’s a computer in front of me. This is an intuitive belief, which guides my behavior. I believe that I’m revolving at a fantastic speed around the center of the earth. This is a reflective belief, which has no impact on my behavior. Many unfounded beliefs are reflective. Take Pizzagate, the debunked conspiracy theory claiming that high-level Democrats were abusing children in the basement of a D.C. pizzeria. According to polls, millions of Americans were supposed to believe it. Yet only one — Edgar Maddison Welch — behaved as if he intuitively believed children were being abused, storming the pizzeria with a big gun and demanding that the children be freed. Others did nothing or, at worst, left one-star reviews with a bizarre mix of straightforward accusations (the staff kept eyeing my kids) and mundane complaints (not only are they pedophiles, but the pizza was cold). Such behavior can only be explained if the belief in Pizzagate is held reflectively.

In many ways, people holding unfounded reflective beliefs is less of a problem than if they held such beliefs intuitively: witness the difference between leaving a stupid review online and spending a few years in jail (Welch’s current fate). Still, why profess such apparently stupid ideas as Pizzagate, flat earth or countless other conspiracy theories? Counterintuitively, holding such beliefs might be a way for people to show (some) others that they can be relied on. When making statements that many find inane and offensive, we burn our bridges, making it difficult to cooperate with — work with, be friends with — the people we’ve offended: Most Democrats, I imagine, wouldn’t want to work with a believer in Pizzagate. By sending a strong signal that we can’t cooperate with much of the population, we also send a signal to the rest that we can be relied on, since we have no choice but to cooperate with them. Burning bridges is one of the reasons I explore in the book for why people profess crazy beliefs.

HF: You argue that there’s little evidence that people are becoming more ideologically polarized, for example, by social media. However, social media may be increasing the level of perceived polarization. Why is this so?

HM: Americans haven’t polarized in every dimension: They are still broadly centrist, and on most issues they don’t hold increasingly extreme views. But they are more consistent in their political views (sorting), and they dislike people from the other party more (affective polarization). One of the appealing explanations for the rise of such polarization is social media. Each political camp has vocal supporters who make provocative statements — I would argue, because they are trying to burn bridges. This tiny minority can become highly visible on social media, in particular as their provocations are used by those they oppose as examples of how crazy the other side is. We already have a problem of perceived polarization, with people on each side holding widely exaggerated views of the other side (e.g., Republicans believe Democrats are much more opposed to free trade than they actually are), and the use of social media seems to amplify the gap between perception and reality.

However, a recent reality check has called into question the role of social media in the rise of affective polarization. Social media use has risen in many countries over the past decade, yet affective polarization has increased in only a few of them. As a result, we might look for another culprit to explain the rise in affective polarization in the United States.

HF: What implications does the research you describe have for arguments that Russian bots or Cambridge Analytica psychometrics explain why Donald Trump won the 2016 presidential election?

HM: There is a convergence of evidence showing that people are very hard to persuade en masse. Improvements in technology, from the first clumsy propaganda attempts to modern, analytics-fueled political campaigns, have done nothing to alter this conclusion. Yet at every stage, new technologies have been decried as terrifying tools of mind control. These trends will continue: new technologies will arise (targeted advertising, deepfakes, etc.), there will be a hue and cry, yet getting people to accept uncongenial views, or to engage in costly behaviors, will remain tremendously difficult. Cambridge Analytica and the Russian bots fit this pattern: a lot of panicky talk, but no evidence that they had any significant impact on the election.