In December 2019, as Facebook was bracing for the looming chaos of the 2020 election, a post appeared on its internal discussion site. “We are responsible for viral content,” the title declared. The author walked through the ways in which Facebook’s algorithmic design helps low-quality content go viral, concluding with some recommendations. Among them: “Rather than optimizing for engagement and then trying to remove bad experiences, we should optimize more precisely for good experiences.”
That might sound obvious—optimize for good experiences. And yet Facebook’s lack of interest in doing so is a persistent theme in The Facebook Papers, internal documents revealed by Frances Haugen, the former employee turned whistleblower who recently testified before Congress. The files, first reported on by the Wall Street Journal, were included in disclosures made to the Securities and Exchange Commission by Haugen and provided to Congress in redacted form by her legal counsel. The redacted versions were reviewed by a consortium of news organizations, including WIRED.
They reveal Facebook’s own employees agonizing over their conviction that the platform’s central algorithms reward outrage, hatred, and viral clickbait, while its content moderation systems remain deeply inadequate. The documents are also full of thoughtful suggestions for how to correct those flaws. Which means there is good news for Facebook and Mark Zuckerberg in the files, if they choose to see it: a blueprint for how to fix some of the company’s biggest problems.
Quite a few Facebook employees seem to agree that the company has failed to pursue any positive value besides user engagement. Sometimes this is framed explicitly, as in a document published in 2020 with the title “When User-Engagement ≠ User-Value.” After explaining why keeping users glued to Facebook or Instagram isn’t always good for them, the author considers possible solutions. “A strong quality culture probably helps,” they conclude, in what reads as dry understatement. The author goes on to cite the example of WhatsApp—which Facebook acquired in 2014—as a company that built a successful platform not by testing features to optimize for engagement but by making “all their product decisions just based on their perceptions of user quality.”
In other files, researchers only indirectly acknowledge how little attention company leadership pays to factors besides engagement when making product changes. It’s treated as so obvious a fact that it doesn’t require explanation—not just by the authors, but in the extensive discussions with fellow employees that follow in the comments section. In a discussion thread on one 2019 internal post, someone suggests that “if a product change, whether it’s promoting virality, or increasing personalization, or whatever else, increases the severe harms we’re able to measure (known misinfo, predicted hate, etc.), we should think twice about whether that’s actually a good change to make.”

In another 2019 post, a researcher describes an experiment in which Facebook’s recommendations sent a dummy account in India “into a sea of polarizing, nationalistic messages,” including graphic violence and photos of dead bodies. The author wonders, “Would it be valuable for product teams to engage in something like an ‘integrity review’ in product launches (eg think of all the worst/most likely negative impacts that could result from new products/features and mitigate)?”
It’s almost a cliché at this point to accuse Facebook of ignoring the impact its products have on users and society. The observation hits a little harder, however, when it comes from inside the company.