
Politicians reject evidence that conflicts with their beliefs. And if you give them more evidence, they double down.

October 5, 2017
President Trump has tweeted that he won the popular vote and that he was the victim of voter fraud. However, there is no evidence to back up this claim. (Jabin Botsford/The Washington Post)

We’d like to think that politicians make their decisions based on facts, and that they will make better decisions if they have access to scientific evidence and systematic evaluations of whether policies are working. However, psychological experiments show that people can be selective in how they deal with information about hotly disputed political topics. Prior beliefs and emotions often trump facts.

Our new research looks at how politicians deal with evidence, asking whether their ideologies affect their interpretation of evidence, and whether more evidence — if it is provided — will persuade them, or instead just lead them to double down on their previous beliefs.

We conducted a survey of 954 Danish local politicians. In Denmark, local politicians make decisions about crucial services such as schools, day care, elder care and various social and health services. Depending on their ideological beliefs, some politicians think that public provision of these services is better than private provision. Others think just the opposite. We wanted to see how these beliefs affected the ways in which politicians interpreted evidence.

Ideology can blind politicians to the facts.

Our research involved a number of “survey experiments” in which we varied the questions that we asked politicians. In our first test, we asked the politicians to evaluate parents’ satisfaction ratings for a public and a private school. We had deliberately set up the comparison so that one school performed better than the other. We then divided the politicians into two groups. One group got the data, but without any information as to whether the school was public or private; the schools were simply labeled “School A” and “School B.” The other group got the exact same data, but instead of “School A” and “School B,” the schools were labeled “Public School” and “Private School.”

If politicians are influenced by their ideologies, we would expect the first group to interpret the information about “School A” and “School B” correctly, while the second group would be influenced by their ideological beliefs about private versus public provision of welfare services in ways that might lead them to make mistakes.

This is exactly what we found. Most politicians given data about “School A” and “School B” interpreted the information correctly. However, when they were asked to interpret data about a “Public School” and a “Private School,” they often misinterpreted it, making the evidence fit their desired conclusion. This is not surprising; many other studies have shown much the same thing. But what happens if the politicians are presented with more evidence?

More evidence can hurt instead of helping.

Our second experiment gave the politicians data about the success of post-surgery rehabilitation. As in the first experiment, one provider performed better than the other, and the providers were again labeled either “Public provider of service” or “Private provider of service.” To investigate the influence of adding more evidence, we varied how much information supported the judgment that one provider was better than the other: different groups of politicians received one, three or five pieces of information, all supporting the same conclusion. We expected that the sheer amount of evidence would persuade the politicians to let information trump their ideological beliefs when the information was at odds with their prior beliefs.

To our great surprise, that did not happen. Instead, more information reinforced prior attitudes rather than undermining them: the politicians became even more prone to interpret the information in a way that supported their prior beliefs. In other words, the more evidence there was, the more the politicians were influenced by their prior attitudes rather than by the conclusion the evidence pointed to.

This is a depressing finding. To find out whether this was true just of politicians, or of people in general, we carried out the same survey experiments on a representative sample of 1,000 Danish citizens. This replication yielded roughly the same results. The good news and the bad news is that politicians are only human.

This has important implications for politics.

Even if politicians and ordinary voters have the same unconscious biases, we might worry more about politicians, since they are the ones who must make important decisions. People want policymaking to be evidence-based, but is that even possible?

Most developed democracies spend a lot of money to provide politicians with enough information that they can make informed decisions. Yet if politicians are constrained by ideological straitjackets, then one might argue that the money is wasted, as the information will be misinterpreted anyway. We think that this is too cynical a view.

After all, our study showed that politicians were perfectly capable of interpreting information when they did not hold strong prior attitudes toward the issue at hand. At the least, this suggests that evidence matters in decisions about non-politicized issues. Furthermore, political attitudes can change over time, for example, when there is overwhelming evidence that we were wrong. Although some stubbornly continue to deny global warming, mounting evidence has persuaded millions of people. The same is true for politicians. Few politicians (at least in Denmark) can maintain the credibility they need to get elected if they fail to acknowledge unambiguous evidence over a sustained period. Thus, although we may not expect evidence to change minds in the short term, we do believe that it eventually affects political attitudes.

Even so, these findings suggest that politics is not a rational process in which measurement and information automatically improve decision-making. This is a point that politicians, bureaucrats, researchers and other political stakeholders should remember when trying to introduce facts into politics.

Casper Mondrup Dahlmann works at the Ministry of Finance in Denmark.
Niels Bjørn Grund Petersen is a PhD student at Aarhus University.

This article is one in a series supported by the MacArthur Foundation Research Network on Opening Governance that seeks to work collaboratively to increase our understanding of how to design more effective and legitimate democratic institutions using new technologies and new methods. Neither the MacArthur Foundation nor the Network is responsible for the article’s specific content. Other posts in the series can be found here.
