Warnings can alert consumers to "fake" news
Being reminded about the existence of misinformation disguised as legitimate news can boost news readers’ ability to identify articles that are “fake” or false, according to researchers at the University of Georgia Grady College of Journalism and Mass Communication.
The research suggests that social media platforms could play a role in preventing readers from falling prey to misinformation designed to look like news. In the study, led by Bartosz Wojdynski, Jim Kennedy New Media Professor at Grady College, subjects were given four science articles to read—two legitimate news articles and two fictional stories. After the subjects read the articles, the researchers asked them a series of questions.
The study found that readers who were given a warning that science misinformation exists were more discerning in classifying news articles as true or false. Warned readers also rated the two false articles as less credible than unwarned readers did.
“It is important to occasionally remind news consumers to use their critical thinking skills,” Wojdynski said. “Since many articles are found through social media, this study shows the impact that even an automatic disclaimer on Facebook could have in reminding people to use their best judgement when they look at webpages.”
The study, authored by Wojdynski and Grady College doctoral students Matt Binford and Brittany Jefferson, was recently published in “Open Information Science.”
Online misinformation designed to look real is often referred to as “fake news,” although in recent years politicians and celebrities have adopted that phrase to dismiss factual news stories that are not to their liking, Wojdynski said.
Authors of misinformation or spam content typically style their content to resemble news to capture clicks or make readers think it is real, Wojdynski said. Because of this, many readers assume that because a story looks like news, it is factual.
The study, titled “Looks Real, or Really Fake? Warnings, Visual Attention and Detection of False News Articles,” used sophisticated eye-tracking equipment to examine which page elements participants viewed while evaluating each story. Elements included the headline, source information, the date the story was published, author information, internal story links and external page links. This information was compared with the post-survey questions to determine what role, if any, these design elements played in helping to identify the credibility of the article.
The study found that participants spent more time looking at links to other content published by the website than at bylines and timestamps. Time spent viewing two page areas—those containing links to the publisher's other content and those containing identifying information such as the banner and URL—predicted correctly classifying one of the articles as false.
“This is important because a lot of literature talks about people looking at the URL and the source information, but this research shows that subjects are also trying to get a sense of who the publisher is by looking at what kinds of other stories they publish through external links,” Wojdynski said.
Post-survey questions measured the level of agreement that participants had with questions about credibility like “This article told the whole story” and “I found this article to be believable.”
The results showed that participants who received the warning—that not all content that looks like news is credible—detected false news at significantly higher rates than those who did not receive the warning.
“This study makes a case for how, although media literacy interventions are our best way to influence how people evaluate online sources,” Wojdynski said, “a simple reminder to be discerning can have an impact.”
This research will also set the foundation for future eye-tracking research examining how subjects use the web to verify information in determining whether an article is credible or not.

Date: February 4, 2020
Author: Sarah Freeman, email@example.com
Contact: Bart Wojdynski, firstname.lastname@example.org