
Why are people so incredibly gullible?

David Robson
BBC
March 24, 2016
If you ever need proof of human gullibility, cast your mind back to the attack of the flesh-eating bananas. In January 2000, a series of chain emails began reporting that imported bananas were infecting people with “necrotizing fasciitis” – a rare disease in which the skin erupts into livid purple boils before disintegrating and peeling away from muscle and bone.
According to the email chain, the FDA was trying to cover up the epidemic to avoid panic. Faced with the threat, readers were encouraged to spread the word to their friends and family.
The threat was pure nonsense, of course. But by 28 January, the concern was great enough for the US Centers for Disease Control and Prevention to issue a statement decrying the rumour.
Did it help? Did it heck. Rather than quelling the rumour, the statement only poured fuel on the flames. Within weeks, the CDC was hearing from so many distressed callers it had to set up a banana hotline. The facts became so distorted that people eventually started to quote the CDC as the source of the rumour. Even today, new variants of the myth have occasionally reignited those old fears.
The banana apocalypse may seem comical in hindsight, but the same cracks in our rational thinking can have serious, even dangerous, consequences.
We may laugh at these far-fetched urban myths – as ridiculous as the ongoing theory that Paul McCartney, Miley Cyrus and Megan Fox have all been killed and replaced with lookalikes. But the same cracks in our logic allow the propagation of far more dangerous ideas, such as the belief that HIV is harmless and vitamin supplements can cure AIDS, that 9/11 was an ‘inside job’ by the US government, or that a tinfoil hat will stop the FBI from reading your thoughts.
Why do so many false beliefs persist in the face of hard evidence? And why do attempts to deny them only add grist to the rumour mill? It's not a question of intelligence – even Nobel Prize winners have fallen for some bizarre and baseless theories. But a series of recent psychological advances may offer some answers, showing how easy it is to construct a rumour that bypasses the brain’s deception filters.
According to conspiracy theorists, the actress Megan Fox has died and been replaced by lookalikes - not once, but twice (Credit: Getty Images)
One, somewhat humbling, explanation is that we are all “cognitive misers” – to save time and energy, our brains use intuition rather than analysis.
As a simple example, quickly answer the following questions:
“How many animals of each kind did Moses take on the Ark?”
“Margaret Thatcher was the president of what country?”
Between 10 and 50% of study participants presented with these questions fail to notice that it was Noah, not Moses, who built the Ark, and that Margaret Thatcher was the prime minister, not the president – even when they have been explicitly asked to note inaccuracies.
Known as the “Moses illusion”, this absentmindedness illustrates just how easily we miss the details of a statement, favouring the general gist in place of the specifics. Instead, we normally just judge whether it “feels” right or wrong before accepting or rejecting its message. “Even when we ‘know’ we should be drawing on facts and evidence, we just draw on feelings,” says Eryn Newman at the University of Southern California, whose forthcoming paper summarises the latest research on misinformation.
Based on the research to date, Newman suggests our gut reactions hinge on just five simple questions:
Does a fact come from a credible source?
Do others believe it?
Is there plenty of evidence to support it?
Is it compatible with what I believe?
Does it tell a good story?
Crucially, our responses to each of these points can be swayed by frivolous, extraneous details that have nothing to do with the truth.
Consider the questions of whether others believe a statement or not, and whether the source is credible. We tend to trust people who are familiar to us, meaning that the more we see a talking head, the more we will begrudgingly start to believe what they say. “The fact that they aren’t an expert won’t even come into our judgement of the truth,” says Newman. What’s more, we fail to keep count of the number of people supporting a view; when that talking head repeats their idea on endless news programmes, it creates the illusion that the opinion is more popular and pervasive than it really is. Again, the result is that we tend to accept it as the truth.
Sticky nuggets
Then there’s the “cognitive fluency” of a statement – essentially, whether it tells a good, coherent story that is simple to imagine. “If something feels smooth and easy to process, then our default is to expect things to be true,” says Newman. This is particularly true if a myth easily fits with our expectations. “It has to be sticky – a nugget or soundbite that links to what you know, and reaffirms your beliefs,” agrees Stephan Lewandowsky at the University of Bristol in the UK, whose work has examined the psychology of climate change deniers.
A slick presentation will instantly boost the cognitive fluency of a claim, while raising its believability. In one recent study, Newman presented participants with an article (falsely) saying that a well-known rock singer was dead. The subjects were more likely to believe the claim if the article was presented next to a picture of him, simply because it became easier to bring the singer to mind – boosting the cognitive fluency of the statement. Similarly, writing in an easy-to-read font, or speaking with good enunciation, has been shown to increase cognitive fluency. Indeed, Newman has shown that something as seemingly inconsequential as the sound of someone’s name can sway us: the easier it is to pronounce, the more likely we are to accept their judgement.
In light of these discoveries, you can begin to understand why the fear of the flesh-eating bananas was so infectious. For one thing, the chain emails were coming from people you inherently trust – your friends – increasing the credibility of the claim, and making it appear more popular. The concept itself was vivid and easy to picture – it had high cognitive fluency. If you happened to distrust the FDA and the government, the thought of a cover-up would have fitted neatly into your worldview.
It's true: we would rather hide our heads in the sand than listen to evidence questioning our beliefs, even if the facts are solid (Credit: Getty Images)
That cognitive miserliness can also help explain why those attempts to correct a myth have backfired so spectacularly, as the CDC found to their cost. Lab experiments confirm that offering counter-evidence only strengthens someone’s conviction. “In as little as 30 minutes, you can see a bounce-back effect where people are even more likely to believe the statement is true,” says Newman.
The problem, she says, emerges from our deeply flawed memories. Correcting the facts “would work very well if we could play back our memories as if they were recorded on video, but years of research show the memory is not perfect – we fill in gaps and we lose information,” she says.
Fraying beliefs
As a result of these frailties, we are instantly drawn to the juicier details of a story – the original myth – while forgetting the piddling little fact that it’s been proven false. Worse still, by repeating the original myth, the correction will have increased the familiarity of the claim – and as we’ve seen, familiarity breeds believability. Rather than uprooting the myth, the well-intentioned correction has only pushed it deeper.
A debunked myth may also leave an uncomfortable gap in the mind. Lewandowsky explains that our beliefs are embedded in our “mental models” of the way the world works; each idea is interlinked with our other views. It’s a little like a tightly bound book: once you tear out one page, the others may begin to fray as well. “You end up with a black hole in your mental representation, and people don’t like it.” To avoid that discomfort, we would often rather cling to the myth before our whole belief system starts unravelling.
Fortunately, there are more effective ways to set people straight and make the truth stick. For a start, you should avoid repeating the original story (where possible) and try to come up with a whole alternative to patch up the tear in their mental model. “If I tell you the Moon is not made of cheese, then you find it difficult to give up on the belief – but if I say it’s not cheese but rock, you say ‘OK, fine’, because you still have an idea of what the Moon is like,” explains Lewandowsky.
Andrew Wakefield (pictured) falsified elements of research that wrongly linked autism to MMR vaccines, leading him to be struck off the medical register (Credit: Getty Images)
Newman agrees it’s a helpful strategy. For instance, when considering the fears that MMR vaccines may be linked to autism, she suggests it would be better to build a narrative around the scientific fraud that gave rise to the fears – rather than the typical “myth-busting” article that unwittingly reinforces the misinformation. Whatever story you choose, you need to increase the cognitive fluency with clear language, pictures, and good presentation. And repeating the message, a little but often, will help to keep it fresh in their minds. Soon, it begins to feel as familiar and comfortable as the erroneous myth – and the tide of opinion should begin to turn.
At the very least, staying conscious of these flaws in your thinking will help you to identify when you may be being deceived. Both Newman and Lewandowsky point out that there is a flurry of misinformation flying around the forthcoming US presidential election, as seen in Donald Trump’s claims that Mexican immigrants bring sexual violence and drug trafficking and Hillary Clinton’s opinion that Isis are using videos of Trump to recruit terrorists. (Neither statement held up to fact-checking.)
It’s always worth asking whether you have thought carefully about the things you are reading and hearing. Or are you just being a cognitive miser, persuaded by biased feelings rather than facts? Some of your dearest opinions may have no more substance than the great banana hoax of the year 2000.
--
David Robson is BBC Future’s feature writer. He is @d_a_robson on Twitter.
http://www.bbc.com/future/story/20160323-why-are-people-so-incredibly-gullible?ocid=global_future_rss
