Q&A The Misinformation Age: How False Beliefs Spread

New book acts as guide for understanding and mitigating misinformation in a media-frenzied age


From a simple “like” on social media to a damning headline of false facts on national news, contributors to the rise of fake news, unknowing and otherwise, can be found everywhere from high school desks to bullpens at some of the most reputable news agencies. In their new book, UCI logic and philosophy of science professors Cailin O’Connor and James Weatherall explain how fake news comes to be, who some of the biggest culprits are, and how we can all work to kill it before it grows legs.

Q: A lot of people have argued that persistent false beliefs are best explained by individual failures: cognitive biases, blind spots, things like that. But is that the only, or the most important, explanation of why we see so many well-meaning and well-informed people holding false beliefs?

CO: This is one of the core points of the book. Most people, including academics, who have thought about false beliefs and fake news assume that the main problem has to do with our psychological biases. We accept new information that fits our current beliefs, things like that. But we want to push back on this idea. We think that to really understand why false beliefs can persist and even spread, you need to recognize that there is a deep social aspect to what we believe. Think about where virtually all of our beliefs, true and false, come from: someone told you something. Almost everything you believe you get from others. So now think about social media and how people’s social interactions influence the way they get info. Who are they trying to impress? And now think about fake news and propaganda from governments and industry. Fake news works because propagandists know how to take advantage of social ties and connections to promote the beliefs they want people to hold.

Q: We have seen a startling amount of polarization in the U.S. in recent years, concerning not only opinions and values, but facts themselves. What explains this and what can we do about it?

CO: One example of this that we look at in the book is the chronic Lyme disease debate. Everyone agrees that Lyme disease afflicts a huge number of people every year. But it’s very controversial whether Lyme disease is always cured by a dose of antibiotics, or whether in some cases there is a chronic form of the disease that can recur long after it has been treated. Lots of people are trying to figure out what’s happening, and two groups have emerged with very strong, opposing beliefs. They can’t both be right, and they both produce a lot of studies. But they don’t seem to influence one another. The issue, we argue, is trust. Basically, each group only trusts evidence coming from those who share their beliefs. If you have similar beliefs to someone, you trust them. Once you have that sort of situation, you end up with people in very different camps only listening to the people who are like them, even when everybody involved wants to figure out the truth, as in the case of chronic Lyme disease.

JW: One really important aspect of what Cailin just said is that we are not assuming that people in the Lyme disease case are only listening to evidence that supports their current beliefs, though of course this might also be happening. Instead, we argue that they are only listening to evidence from people who believe what they believe, and that, because of the sorts of questions those people end up asking, they only get exposed to a limited amount of evidence. Basically, if you think it is settled that chronic Lyme disease doesn’t exist, then you aren’t going to keep doing studies that try to prove it does exist, and you’re going to think that the sorts of people who do those sorts of studies probably have a screw loose somewhere.

JW: Another thing we argue in the book is that there are actually several ways in which you can explain polarization, all of which could be right. But different explanations of polarization suggest different solutions, and in some cases these solutions oppose one another. As Cailin just explained, trust can play a role. But another explanation of polarization is that people are only exposed to a limited range of evidence. In that case, the natural solution is to expose people to more perspectives. But the trust explanation conflicts with that. If you suddenly put two groups with very different beliefs in contact with one another, and they don’t trust each other, they can end up even more polarized. So if trust is the basic issue, what you need is people who are recognized as trustworthy by the community to bring people along.

Q: How do journalists contribute to misinformation and false beliefs? In what ways do standard journalistic practices inadvertently mimic propaganda?

CO: Until the ’80s, there was a federal policy called the fairness doctrine, which required broadcasters to present both sides of an argument equally. But on scientific matters of fact, that creates a problem, because there is usually more evidence in favor of true claims than false ones, so presenting both sides evenly gives the misleading evidence undue weight. When journalists try to be even-handed in this way, the public ends up seeing too much of the wrong evidence. It’s inherently misleading.

JW: As we said before, an extremely effective propaganda tool is to try to distort the total evidence that the public sees. Essentially, you take scientific evidence and expose people to just some parts of it. This isn’t good. And treating scientific issues using something like the fairness doctrine mimics this sort of propaganda, by distorting the total body of evidence.

CO: This is how journalists, in trying to do due diligence, unwittingly spread false ideas. 

Read the full Q&A online at socs.ci/misinformationage2019.


© UC Irvine School of Social Sciences - 3151 Social Sciences Plaza, Irvine, CA 92697-5100 - 949.824.2766