After accusations that it had been passive in the face of fake news during November’s election, Facebook announced a new policy on Thursday to combat the viral spread of misinformation.
“We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we’re approaching this problem carefully,” read a statement by Adam Mosseri, the VP of Facebook’s News Feed.
One of these steps hands power to the people: Facebook users will now be able to report news as a hoax by clicking in the upper right-hand corner of a post.
Facebook is also enlisting third-party fact-checkers from the International Fact-Checking Network to review popular stories. If the fact-checkers find a story’s content questionable, they will mark it as “disputed,” and that tag will be displayed on the post to inform readers. Disputed stories will then be deprioritized by Facebook’s algorithm and appear lower in the News Feed.
Other moves are less obvious and may prove controversial even for news organizations that do not publish fake news: “We’ve found that if reading an article makes people significantly less likely to share it, that may be a sign that a story has misled people in some way,” Mosseri writes. “We’re going to test incorporating this signal into ranking, specifically for articles that are outliers, where people who read the article are significantly less likely to share it.”
Lastly, Facebook plans to study posts and ads that use clickbait-style lures to draw people to their sites, only to greet them with a mess of ads.
“It’s important to us that the stories you see on Facebook are authentic and meaningful,” Mosseri says. “We’re excited about this progress, but we know there’s more to be done. We’re going to keep working on this problem for as long as it takes to get it right.”