Consider this troubling fact: an analysis of blogs that focus on denying climate change found that 80% of those blogs got their information from a single primary source. Worse, this source—a single individual who claims to be a “polar bear expert”—is no expert at all; they possess no formal qualifications and have conducted no research on polar bears.

No matter what side of the debate you are on, this should concern you. Arguments built on false premises don’t help anyone, regardless of their beliefs or political leanings. But does this sort of widespread misinformation—involving the mere repetition of information across many sources—actually affect people's beliefs?

To find out, we asked research participants to read news articles about topics that involved different governmental policies. In one experiment, participants read an article claiming that Japan's economy would not improve, as well as one or more articles claiming that it would. Unbeknownst to our participants, they were divided into three groups. The first group read four articles claiming the economy would improve—and importantly, each of these articles cited a different source. The second group read the same four articles, except in this case, all of the articles cited the same source (just like the climate-change-denying blogs all cite the same "expert"). And to establish a point of comparison for the other groups, the third group of participants read just one article claiming the economy would improve.

You might expect that when information is corroborated by multiple independent sources, it is much more likely to sway people's opinions. If this is the case, then people who read four articles that each cited a unique source should be very certain that Japan's economy will improve. Furthermore, if people pay attention to the original sources of the information they read, then those who read four articles that all cited exactly the same source should be less certain (because that information was repeated but not corroborated). And, finally, people who read only one article claiming Japan's economy will improve should be the least certain.

Indeed, people were much more likely to believe that Japan's economy would improve when they heard it from four unique sources than when they read only one article. But, surprisingly, people who read four articles all citing the same source were just as confident in their conclusions as those who read four independently sourced articles. Our participants seemed to pay attention only to the number of times the information was repeated, without considering where the information came from. Why?

Perhaps people assume that when one person is cited repeatedly, that person is the one, true expert on that topic. But in a follow-up study, we asked other participants directly: all else being equal, would you rather hear from five independent sources or from a single source? Unsurprisingly, most believed that more sources were better. But here's the shocking part: when those same participants then completed the task described above, they still believed repeated information that all came from the same source just as much as independently sourced information—even though they had just said they would prefer multiple independent sources.

These results show that as we form opinions, we are influenced by the number of times we hear information repeated. Even when many claims can be traced back to a single source, we do not treat these claims as if we had heard them only once; we act as if multiple people had come to the same conclusion independently.  This “illusion of consensus” presumably applies to virtually all the information that we encounter, including debates about major societal issues like gun control, vaccination, and climate change. We need to be aware of how simple biases like these affect our beliefs and behavior so that we can be better community members, make informed voting decisions, and fully participate in debates over the public good.

We wanted to finish this article with a fun fact about the massive amount of information that people are exposed to each day. So we asked Google, "How much information do we take in daily?" and got a lot of answers. Try it for yourself. You'll see source after source saying that you consume about 34 gigabytes of information each day. You may not know exactly how much 34 gigabytes is, but it sure sounds like a lot, and several sources told you so.

What's the problem? All of these sources circle back to a single primary source. And, despite our best efforts, we weren't able to find an original source that could back up these claims at all. If we hadn't been careful, it would have been easy to walk away believing something that might not be true—and only because that information was repeated several times.

So, next time you hear a rumor or watch the news, think about it. Take one moment and ask, simply, "Where is this information coming from?"


For Further Reading:

Yousif, S. R., Aboody, R., & Keil, F. C. (2019). The illusion of consensus: A failure to distinguish between true and false consensus. Psychological Science, 30, 1195–1204.

Sami Yousif is a graduate student at Yale University. His research primarily focuses on how we see, make sense of, and navigate the space around us. In his spare time, he also studies how we (metaphorically) navigate a world of overabundant information.

Rosie Aboody is a graduate student at Yale University. She studies how we learn from others. Specifically, she’s interested in how children and adults decide what others know, who to learn from, and what to believe.

Frank Keil is a professor of psychology at Yale University and the director of the Cognition and Development lab. At the most general level, he is interested in how we come to make sense of the world around us.