Much of the information we get on the internet, on Facebook, from friends, or in the news is third-hand and second-rate. Put another way, most of what you read and hear is, at best, like the childhood game of telephone.

To use a recent example, the Washington Post published a thorough postmortem on how hydroxychloroquine (as a treatment for COVID-19) went from an experimental treatment supported by questionable evidence to "a game changer," in the words of Donald Trump. While the initial data on hydroxychloroquine may have merited further studies (which were done and showed no evidence of benefit), they did not merit an endorsement to use the treatment in people. That endorsement had many adverse effects, including known side effects of the medication and medication shortages for people with conditions that hydroxychloroquine is proven to treat.

Why does this happen and what can you do to avoid getting caught up in the hype?

Let’s say a scientist does an experiment and gets an interesting result. This is our source data. They publish the results, accurately, we hope, in a scientific journal. From this point on, the true results are at risk of distortion. This can happen when: a) the scientist’s university spins the results to sound more exciting for its newsletter or website; b) the scientist tweets an exciting-sounding summary of the results; c) a local reporter reads “a” or “b” and decides to add some hype to make it a more interesting story; d) a larger media outlet reads “c” and sees a potential miracle cure; e) members of a Facebook group who care about the topic read “d” and spin it to support their cause…

As with the game of telephone, the facts can get distorted with every pass. This can happen by accident among well-intentioned reporters. After all, science is sometimes complicated. But there are many players in this game of telephone who are reliably unreliable: driven by conscious or unconscious biases, they can be counted on to get things wrong in ways that serve their interests. Scientists want the science to sound exciting to build their reputations, universities want the science to sound exciting to attract donors, and media want the news to sound exciting to sell advertising. We move from misinformation to disinformation when individuals make the news sound better than it is to sell a product, create false hope, or minimize the seriousness of a bad situation; or make the news sound worse than it is to create fear or support a pet conspiracy. The end result is that this second-, third-, and eighth-hand information is often worse than worthless.

When people mistake third-hand information for an original source, they become vulnerable to accepting information that has no real source at all. As a classic example, we have all heard that we only use 10% of our brains. In fact, the vast majority of our brain is active almost all of the time, even during sleep. So where did this myth come from?

In a 1907 article, William James, one of the founders of psychology, wrote that “we are making use of only a small part of our possible mental and physical resources”; he said this in the context of people being held back by norms and routines, not the limits of our brains. Over time, the idea evolved as it was passed around, and in 1936 Lowell Thomas wrote, without any citation, that “Professor William James of Harvard used to say that the average man develops only ten percent of his latent mental ability.” Since that time, a wide range of articles have spread this myth while varying in how much of the brain is supposedly used (0.01 to 20%), what is meant by “used” (e.g., mental capacity, brain potential), and who discovered this “fact.” Some provide no sources, while others claim support from “common knowledge,” “experts,” “neurobiologists,” and even “Soviet physiologists”* (see footnote below for citation). Many more recent myths (e.g., that vaccines cause autism) have spread even when the original source of the myth was known and debunked.

By accepting third-hand information as true, we can become unwitting participants in the spread of unreliable information. Often this occurs when we accept third-hand information that supports our view of the world and reject third-hand information that conflicts with what we want to believe (confirmation bias). To think with the mind of a scientist, we must fact-check not only things we disagree with, but anything we don’t truly know (which is almost everything), particularly when the news strikes us as too good to be true.

* Higbee, Kenneth L., and Samuel L. Clay. “College Students’ Beliefs in the Ten-Percent Myth.” The Journal of Psychology, vol. 132, no. 5, 1998, pp. 469–476.