How Facebook Is Working to Stop Misinformation and False News

Source: Facebook Media Blog

We know people want to see accurate information on Facebook – and so do we.
False news is harmful to our community: it makes the world less informed and erodes trust. It’s not a new phenomenon, and all of us — tech companies, media companies, newsrooms, teachers — have a responsibility to do our part in addressing it. At Facebook, we’re working to fight the spread of false news in three key areas:
  • disrupting economic incentives because most false news is financially motivated;
  • building new products to curb the spread of false news; and
  • helping people make more informed decisions when they encounter false news.


Disrupting Economic Incentives
When it comes to fighting false news, one of the most effective approaches is removing the economic incentives for traffickers of misinformation. We’ve found that a lot of false news is financially motivated. These spammers make money by masquerading as legitimate news publishers and posting hoaxes that get people to visit their sites, which are often mostly ads.
Some of the steps we’re taking include:
  • Better identifying false news through our community and third-party fact-checking organizations so that we can limit its spread, which, in turn, makes it uneconomical.
  • Making it as difficult as possible for people posting false news to buy ads on our platform through strict enforcement of our policies.
  • Applying machine learning to assist our response teams in detecting fraud and enforcing our policies against inauthentic spam accounts (a rough sketch of this kind of approach follows this list).
  • Updating our detection of fake accounts on Facebook, which makes spamming at scale much harder.
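The post doesn’t describe how these machine learning systems actually work. Purely as a hypothetical sketch of the general technique — supervised classification of accounts from behavioral features — the toy example below uses invented feature names, weights and training data; none of it reflects Facebook’s real signals or models.

```python
# Illustrative sketch only: a toy supervised classifier for scoring
# accounts as likely spam. The features and training data are invented
# for demonstration; Facebook's actual signals are not public.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-account features:
# [posts_per_day, fraction_of_posts_with_links, account_age_days, distinct_domains_shared]
X_train = np.array([
    [80.0,  0.95,    3, 1],   # bursty, link-heavy, brand-new account -> spam
    [120.0, 0.99,    1, 1],   # spam
    [2.0,   0.10,  900, 6],   # ordinary account
    [0.5,   0.05, 2000, 3],   # ordinary account
])
y_train = np.array([1, 1, 0, 0])  # 1 = inauthentic spam, 0 = legitimate

model = LogisticRegression().fit(X_train, y_train)

def spam_score(features):
    """Return the model's estimated probability that an account is spam."""
    return model.predict_proba(np.array([features]))[0, 1]

# A day-old account posting 60 link-heavy posts per day scores high:
print(spam_score([60.0, 0.9, 2, 1]))
```

In practice, as the post notes, any such model would assist human response teams rather than act on its own.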


Building New Products
We’re building, testing and iterating on new products to identify and limit the spread of false news. We cannot become arbiters of truth ourselves — it’s not feasible given our scale, and it’s not our role. Instead, we’re working on better ways to hear from our community and work with third parties to identify false news and prevent it from spreading on our platform.
Some of the work includes:
  • Ranking Improvements: We’re always looking to improve News Feed by listening to what the community tells us. One signal we’ve found: if reading an article makes people significantly less likely to share it, that may be a sign the story has misled them in some way. We’re continuing to test this signal and others in News Feed ranking in order to reduce the prevalence of false news content; a rough sketch of how such signals might be combined follows this list.
  • Easier Reporting: We’ve always relied on our community to determine what is valuable and what is not. We’re testing ways to make it easier to report a false news story if you see one on Facebook, which you can do by clicking the upper right-hand corner of a post. Stories flagged as false by our community may then show up lower in your feed.
  • Working with Partners: We believe providing more context can help people decide for themselves what to trust and what to share. We’ve started a program to work with independent third-party fact-checking organizations. We’ll use the reports from our community, along with other signals, to send stories to these organizations. If the fact-checking organizations identify a story as false, it will get flagged as disputed and there will be a link to a corresponding article explaining why. Stories that have been disputed also appear lower in News Feed.
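The post names the inputs to this ranking work — the read-versus-share gap, community reports, and fact-checker disputes — but not how they are weighted. As a rough sketch under invented field names, weights and thresholds, combining them into a single demotion factor might look like this; nothing below reflects Facebook’s actual ranking code.

```python
# Illustrative sketch only: combining the ranking signals the post
# describes (read-but-don't-share gap, community reports, fact-checker
# disputes) into one demotion multiplier. All numbers are invented.
from dataclasses import dataclass

@dataclass
class StorySignals:
    readers: int            # people who clicked through and read the article
    reader_shares: int      # of those readers, how many shared it afterward
    non_readers: int        # people who saw the post but did not click through
    non_reader_shares: int  # shares by people who never read the article
    community_flags: int    # "this is false news" reports from users
    disputed: bool          # flagged as false by fact-checking partners

def demotion_multiplier(s: StorySignals) -> float:
    """Return a factor in (0, 1] used to scale a story's ranking score down."""
    multiplier = 1.0
    if s.readers > 0 and s.non_readers > 0:
        read_share_rate = s.reader_shares / s.readers
        blind_share_rate = s.non_reader_shares / s.non_readers
        # The post's intuition: if reading the article makes people much
        # less likely to share it, the story may have misled them.
        if blind_share_rate > 0 and read_share_rate < 0.25 * blind_share_rate:
            multiplier *= 0.5
    # Each community report nudges the story down slightly, with a floor.
    multiplier *= max(0.3, 1.0 - 0.02 * s.community_flags)
    # A dispute from fact-checking partners demotes the story strongly.
    if s.disputed:
        multiplier *= 0.2
    return multiplier

# Example: a disputed, heavily reported story that readers decline to share.
story = StorySignals(readers=1000, reader_shares=10, non_readers=5000,
                     non_reader_shares=600, community_flags=40, disputed=True)
print(demotion_multiplier(story))  # small multiplier -> shown lower in feed
```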


Helping People Make More Informed Decisions
Though we’re committed to doing everything we can to reduce the spread of false news to as close to zero as possible, we also need to address the problem when people do encounter hoaxes. To that end, we’re exploring ways to give people more context about stories so they can make more informed decisions about what to read, trust and share, as well as ways to give people access to more perspectives on the topics they’re reading about.
Some of the work we’ve been focused on includes:
  • Facebook Journalism Project: We are committed to collaborating with news organizations to develop products together, providing tools and services for journalists, and helping people get better information so they can make smart choices about what they read. We are convening key experts and organizations already doing important work in this area, such as the Walter Cronkite School of Journalism and Mass Communication at Arizona State University, and have been listening and learning to help decide what new research to conduct and projects to fund. Working with the News Literacy Project, we are producing a series of public service announcements (PSAs) to help inform people on Facebook about this important issue.
  • News Integrity Initiative: We’ve joined a group of over 25 funders and participants — including tech industry leaders, academic institutions, non-profits and third-party organizations — to launch the News Integrity Initiative, a global consortium focused on helping people make informed judgments about the news they read and share online. Founding funders of this $14-million fund include Facebook, the Craig Newmark Philanthropic Fund, the Ford Foundation, the Democracy Fund, the John S. and James L. Knight Foundation, the Tow Foundation, AppNexus, Mozilla and Betaworks. The initiative’s mission is to advance news literacy, to increase trust in journalism around the world and to better inform the public conversation. The initiative, which is administered by the CUNY Graduate School of Journalism, will fund applied research and projects, and convene meetings with industry experts.


We need to work across industries to help solve this problem: technology companies, media companies, educational organizations and our own community can come together to help curb the spread of misinformation and false news. By focusing on the three key areas outlined above, we hope we will make progress toward limiting the spread of false news — and toward building a more informed community on Facebook.