
Science Pinpoints Why People Fall For Fake News -- And What We Can Do About It

Alex Jones' site, infowars.com, has been a source of fake news stories that suggest, for example, that no children were killed at the 2012 school shooting in Newtown, Connecticut.
Lucas Jackson / Reuters

“Fake news” ― stories without even a kernel of truth, often made up by nefarious agents or cynical profiteers ― appeared to play a major role in the 2016 presidential campaign.

There are no signs that these fictionalized articles, spread mostly on the internet via social media, are going away anytime soon. In fact, they’re a prominent feature of what some have dubbed the “post-truth era” ― a time when the general public (or even a certain leader of the free world) can’t seem to agree on basic facts, let alone reach consensus on tackling a problem.

Unsurprisingly, scientists have a major stake in making sure that facts ― objectively derived from the scientific method, reasoning and other Enlightenment principles ― don’t lose their relevance to the public. Not only do their livelihoods depend on experimentation and scientific discovery, but many of today’s disputed facts have widespread implications for health and safety.

Three new research papers tackle this problem, showing how to reach people who hold anti-scientific views and how to help people sort fake news from fact. Here’s what we can learn from each of them.

1. People have ulterior motives for holding anti-science or anti-fact beliefs.

Underlying motives ― what psychology professor Matthew Hornsey of the University of Queensland calls “attitude roots” ― may not be obvious, even to the people who hold them.

In a theoretical paper presented on Saturday, Hornsey hypothesized that people who want to challenge unscientific beliefs ― say, public health communicators or political leaders ― should focus on attitude roots when repeated explanations of the facts don’t work.

“Rather than focusing on what people are saying, it might be better to focus on what their motivations are for what they’re saying,” Hornsey said. “Then you work backwards from there, constructing arguments that work with their underlying motivations, not against them.”

So how are you supposed to gauge a person’s underlying motivations for an unscientific belief? First, ask why they believe it. Second, never assume that the other person is unreasonable or unprincipled ― that’s a common mistake, and it usually results in hurt feelings and even more entrenched beliefs. The onus to reach out, Hornsey said, is on people who accept the existence of an objective reality.

“You have to provide a path for people to change their minds without feeling humiliated or defeated,” he said.

Hornsey’s idea is untested, but it offers an intriguing way forward for science communicators who find that simply presenting the evidence isn’t making a dent with their intended audience.

2. Education can protect against scientific misinformation.

Researchers from Cambridge, Yale and George Mason universities recently demonstrated that people can be “inoculated” against scientific misinformation if they are first educated about the strategies and tactics that partisan groups use to distort the truth. Participants who learned how such groups spread falsehoods were better able to distinguish accurate information about climate change from misinformation, and according to the study, the effect held for people across the political spectrum.

The researchers recruited a nationally representative sample of more than 2,000 people online and divided them into six groups: a control group; a second group that received the accurate message that “97 percent of scientists agree on man-made climate change”; a third group that received the false statement that “there is no convincing scientific evidence of human-caused global warming”; and a fourth group that received both messages.

The fifth and sixth groups received the accurate message, followed by a “vaccine” message: either a general explanation of how some partisan groups mislead the public, or specific details about the Oregon Petition, which falsely claimed to carry the signatures of more than 31,000 experts agreeing there is “no convincing scientific evidence” that human activity causes climate change. Only then did they receive the false statement on climate change.

The scientists found that the fifth and sixth groups, which had received the “vaccine” against misinformation, ended up with greater belief in the scientific consensus around climate change. Group 5, which received only the general message about how misinformation can spread, increased its acceptance of the climate consensus by 6.5 percentage points. Group 6, which was educated specifically about the fraudulent Oregon Petition, increased its acceptance by 13 points.

In contrast, Group 3, which received only the false statement, saw belief in the scientific consensus fall 9 points. For those in Group 4, who heard the accurate statement followed by the false statement, the two messages almost completely canceled each other out.

“We found that inoculation messages were equally effective in shifting the opinions of Republicans, independents and Democrats in a direction consistent with the conclusions of climate science,” lead author Sander van der Linden, director of the Cambridge Social Decision-Making Lab, said in a statement about the experiment.

3. Fake news isn’t as popular or as influential as it may seem.

In a recent survey of more than 1,200 people, weighted to be nationally representative, researchers from New York University and Stanford found that only 15 percent of respondents reported having seen fake news articles (stories verified as fake because they had been debunked on Snopes.com or PolitiFact.com) during the 2016 election, and only 8 percent said they had believed those stories.

The researchers also made up a series of “placebo” fake news headlines to see if survey respondents would say they had seen and believed them, even though the stories had been created for the purposes of the survey and were not circulated online. These placebo headlines included, “Clinton Foundation staff were found guilty of diverting funds to buy alcohol for expensive parties in the Caribbean,” and “Leaked documents reveal that the Trump campaign planned a scheme to offer to drive Democratic voters to the polls but then take them to the wrong place.”

About 14 percent of survey respondents said they recognized these placebo fake stories, and 8 percent said they had believed them ― indicating that a small minority of the population is simply willing to buy into any story that fits their worldview.
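One back-of-the-envelope way to read those two figures together (our arithmetic, not a number reported in the paper): if about 14 percent of respondents “remembered” headlines that never circulated, then much of the 15 percent who recalled real debunked stories may reflect the same false recall, leaving genuine remembered exposure closer to 15 − 14 = 1 percentage point.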

This may seem bleak. But previous research has shown that most Americans get news from social media, that fake news was shared more widely than real news on Facebook, and that fake news had an overwhelmingly pro-Trump slant. Taken together, those findings led some pundits to suggest that President Donald Trump owes a debt of gratitude to the proliferation and believability of fake news among his voters.

The new survey results, in contrast, suggest that news from social media ― and especially fake news from social media ― was not a dominant source of information during the 2016 election. The researchers calculated that the average voting-age American saw and remembered 0.92 pro-Trump fake stories and 0.23 pro-Clinton fake stories.

“In summary, our data suggest that social media were not the most important source of election news, and even the most widely circulated fake news stories were seen by only a small fraction of Americans,” the researchers wrote. “For fake news to have changed the outcome of the election, a single fake news story would need to have convinced about 0.7 percent of Clinton voters and non-voters who saw it to shift their votes to Trump, a persuasion rate equivalent to seeing 36 television campaign ads.”
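To put that figure in perspective (an illustrative reading of the survey’s numbers, not a calculation from the paper itself): the average adult saw and remembered roughly 0.92 + 0.23 ≈ 1.15 fake stories. If each story persuaded 0.7 percent of the Clinton voters and non-voters who saw it, the resulting shift would be very roughly 1.15 × 0.007 ≈ 0.8 percent of voters, which is comparable to Trump’s winning margins in the decisive states, all of which were under one percentage point.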

While there’s still a lot to learn about the role of “fake news” and “alternative facts” in American public discourse, it’s important to keep one thing in mind: Fake news can have real, life-or-death consequences, especially when it comes to acceptance of evidence-based health and science recommendations.

“Attitudes towards scientific discovery used to lie largely outside politics, but now attitudes towards science have become wrapped up in the same culture wars that used to be reserved for things like abortion and gun control,” Hornsey said. “This is a frightening development ― anti-vaccination movements cost lives. Climate change skepticism will cost lives.”

