Facebook reveals sweeping new 'remove, reduce, inform' plan to tackle misinformation and 'problematic content' on its platform, including altering the News Feed algorithm and more fact-checking measures

  • Facebook is rolling out a slew of new policies that aim to prevent misinformation
  • The firm will now limit the reach of Groups that repeatedly share false content
  • Additionally, the News Feed will reduce the rank of sites that get far more traffic from Facebook than from the rest of the web 

Facebook is doubling down on its efforts to prevent the spread of misinformation on its platform and some of its other apps, including Instagram and Messenger. 

In a nearly 2,000-word blog post, Facebook unveiled a slew of new policies that the company will put into place to clamp down on false news stories, images and videos. 

The plan, titled 'remove, reduce and inform,' addresses one of the major criticisms against Facebook concerning the continued presence of harassment, hate speech and false content on its site.  

Facebook is introducing new changes to how it tackles misinformation on its site, starting with new policies for Groups, News Feed ranking and its work with third-party fact checkers

HOW IS FACEBOOK TACKLING FALSE INFORMATION? 

As part of a sweeping new plan, titled 'remove, reduce and inform,' Facebook is launching its toughest measures yet to tackle misinformation. 

Among the changes are: 

  • Expanding its partnership with the Associated Press and third-party fact checkers to 'debunk false and misleading information'
  • Changing the News Feed algorithm to reduce the rank of sites that receive far more clicks from Facebook than links from the rest of the web, using a new metric called 'Click-Gap'
  • Some content will include a 'Trust Indicator' button that provides more context on a particular publication 
  • Reducing the reach of Groups that repeatedly post false information and putting more responsibility on administrators for policy violations
  • Adding verified badges on Messenger and labeling messages that have been forwarded
  • Reducing the reach of posts on Instagram that may contain false content but don't violate its community guidelines 

Guy Rosen, Facebook's vice president of integrity, and Tessa Lyons, Facebook's head of News Feed integrity, broke down the policy changes in a lengthy post.  

Facebook has been using the 'remove, reduce and inform' strategy since 2016, but is now applying it in new ways.

'This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share,' the post states. 

Facebook will now bring the hammer down on groups that 'repeatedly share misinformation' by limiting how widely their content is shared in the News Feed. 

This change affects all users globally starting today. 

'When people in a group repeatedly share content that has been rated false by independent fact-checkers, we will reduce that group’s overall News Feed distribution,' the firm stated.  

As part of the move, users can also remove their posts and comments from a group, even if they've left the group.

Group moderators will also be held to a higher standard of responsibility for the types of content that populate their pages. 

'When reviewing a group to decide whether or not to take it down, we will look at admin and moderator content violations in that group, including member posts they have approved, as a stronger signal that the group violates our standards,' the firm wrote in the blog post. 

A new feature called Group Quality will also show administrators a breakdown of the content flagged in the group, so they can gain greater insight into what kinds of content are considered false news.

The plan, titled 'remove, reduce and inform,' addresses one of the major criticisms against Facebook concerning the continued presence of hate speech and false content on its site

The News Feed is also being overhauled to an extent, with the launch of a new 'Click-Gap' metric that will help inform its algorithms on how to rank a certain post.

This will impact how stories are ranked in the News Feed, demoting sites that receive far more of their traffic from Facebook than from links elsewhere on the web.

For example, if a certain site is getting a disproportionate amount of traffic from Facebook's News Feed compared to the rest of the web, it will be demoted in the News Feed. 

'Click-Gap looks for domains with a disproportionate number of outbound Facebook clicks compared to their place in the web graph,' the company said.

'This can be a sign that the domain is succeeding on News Feed in a way that doesn’t reflect the authority they’ve built outside it and is producing low-quality content.'

As Wired pointed out, this could be bad news for smaller websites that aren't necessarily popular outside of Facebook and craft content or headlines that are meant to manipulate the site's algorithms. 
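Facebook has not published the formula behind Click-Gap, but the idea described above can be illustrated with a short sketch: compare the clicks a domain gets from the News Feed with a proxy for its standing in the wider web graph, such as inbound links. The domain names, counts and threshold below are entirely hypothetical.

```python
# Illustrative sketch only; Facebook's actual Click-Gap metric is not public.

def click_gap_ratio(facebook_clicks: int, inbound_web_links: int) -> float:
    """Ratio of News Feed-driven clicks to a domain's inbound links,
    a rough proxy for its authority in the wider web graph."""
    return facebook_clicks / max(inbound_web_links, 1)

# Hypothetical domains: one established outlet, one Facebook-dependent site
domains = {
    "established-news.example": {"facebook_clicks": 50_000, "inbound_web_links": 40_000},
    "clickbait-farm.example": {"facebook_clicks": 80_000, "inbound_web_links": 200},
}

THRESHOLD = 10.0  # hypothetical cut-off for 'disproportionate' Facebook traffic

for domain, stats in domains.items():
    ratio = click_gap_ratio(stats["facebook_clicks"], stats["inbound_web_links"])
    verdict = "demote in News Feed" if ratio > THRESHOLD else "no change"
    print(f"{domain}: ratio {ratio:.1f} -> {verdict}")
```

Under this toy model, the established outlet's traffic is roughly in line with its web presence, while the Facebook-dependent site's large gap would flag it for demotion.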

Some content on the News Feed will also include a 'Trust Indicator' button that provides further context around a publication. 

A few updates are arriving on Facebook's family of apps like Instagram and Messenger.

Now, in a move that mirrors WhatsApp, Messenger will label forwarded messages and include a context button to 'provide more background information on shared articles.'

Additionally, Instagram will now reduce the reach of posts that contain false information but don't violate the app's community standards. 

'For example, a sexually suggestive post will still appear in Feed if you follow the account that posts it, but this type of content may not appear for the broader community in Explore or hashtag pages,' the firm said.

CEO Mark Zuckerberg has been criticized for his failure to reduce the spread of fake news, hateful posts and other negative content. Since then, he has launched a number of steps

Facebook is also working closely with news organizations and outside experts to tackle misinformation more quickly. 

Among the updates are that it will continue to work with the Associated Press to 'debunk false and misleading video misinformation and Spanish-language content appearing on Facebook in the U.S.' 

In doing so, the firm says it can clean up false information from the site without Facebook having to be the arbiter of truth or make judgments around news content.     

The firm will continue to consult with academics, journalists and other parties to develop new ways to tackle misinformation. 

Facebook acknowledged that it will never be able to hire enough fact-checkers and moderators to keep an eye on all the content posted on its site.

'Our professional fact-checking partners are an important piece of our strategy against misinformation, but they face challenges of scale: There simply aren't enough professional fact-checkers worldwide and, like all good journalism, fact-checking takes time,' the company said.
