Illusory truth

In 1977, a group of researchers set out to test an interesting hypothesis: if a false statement is repeated to a person enough times, that person will grow to trust it and ultimately believe it is true.

To test it, they surveyed a large group of students at Villanova and Temple Universities. Over the course of several weeks, the students were shown a series of statements and asked to assign each one a “belief score” (the statements were obscure facts the students were unlikely to have any prior knowledge of). The survey ran in rounds: a student would receive one set of statements, and then a few weeks later they would get another.

What the students didn’t know was that some of the statements would be repeated in the second and third rounds. In effect, they were rating their belief in statements they had already read two weeks earlier.

What happened next was fundamental to our understanding of marketing, the news, and even our understanding of truth at large (here’s the original study if you’re curious).

Across the three surveys, the students’ belief scores for statements that were never repeated stayed essentially flat. Each of those statements was new and unfamiliar every time, so there was nothing to push the ratings up or down over the several weeks of surveys.

For the statements that did repeat, however, belief scores increased by an average of 12%. Mere repetition of those false statements caused the students to place more belief and trust in their validity.

This happens because our brains, in large part, judge whether something is true based on what’s called processing fluency: the amount of effort needed to interpret new information. If I were to tell you a fun fact you’ve never heard, that fact would require more brain power to process than one you’ve heard before. If I told you the same fact again tomorrow, it would no longer be new and unique, so it would require less brain power to process. You would say, “yes, I know, you told me that yesterday.”

Although this example may seem obvious, it shines a light on a mental phenomenon that underpins our political biases, religious views, and relationships on the most fundamental level: the illusory truth effect.

The illusory truth effect simply states that if we’re repeatedly exposed to a falsehood, we will begin to believe it’s true. Read what I’ve just written carefully: we will begin to believe it’s true. Not we might begin to believe it’s true. Not we are more likely to believe it’s true. We will begin to believe something is true if it’s repeated enough times to us.

By itself, the illusory truth effect isn’t too dangerous. There aren’t many everyday situations in which a falsehood is deliberately and persistently repeated to us. But there are two situations that can abuse the illusory truth effect to the point of being destructive.

The first situation is when this effect is combined with strong (“high-valence”) emotions. Repeated, emotionally charged falsehoods can ensnare mentally unstable people, who lack the capacity to escape the resulting downward spiral of anger and falsehoods. Unable to stop and fact-check, they become trapped.

The second situation is when the illusory truth effect is combined with confirmation bias. Confirmation bias is the uplifting feeling you get when you read a horoscope: we tend to seek out, interpret, and remember information that lines up with our existing beliefs, and to avoid information that doesn’t. Combining this bias with the illusory truth effect causes the same downward spiral of mistruth described above, but on a mass scale. Invent a lie that will be popular, broadcast it to the world, and suddenly you have an entire group of people convinced that your lie is true (and willing to go to great lengths to defend it). This is the singular force behind fake news, political propaganda, and conspiracy theories.

Nobody is immune to these mental effects. Be careful what you read or watch and how often you read or watch it. Expose yourself to multiple points of view. Actively question the intentions of your information sources. Just as importantly, don’t assume everyone is lying, as this will lead to the same delusion that underpins rampant conspiracy theories.

The illusory truth effect, confirmation bias, and other mental phenomena exist for a reason. They are there to protect us and to help our brains and bodies operate at their full capacity. But they can be abused as well; it’s up to us to keep that abuse in check and always think for ourselves.