
Facebook wants you to know that it's the most transparent social network. How do you know that? Because Facebook says so. "Transparency is an important part of everything we do at Facebook," the company says on its website.
In fact, it has an entire section on its website that it says is meant to "give our community visibility into how we enforce our policies, respond to data requests and protect intellectual property, while monitoring dynamics that limit access to Facebook technologies."
It's probably worth mentioning that, as a general rule, if you have to try this hard to convince people that you're being transparent, there's a pretty good chance you aren't. Otherwise, we'd already know. I don't know anyone who thinks that Facebook is transparent about anything.
This is, after all, the company that fought bare-knuckled against Apple over its requirement that developers ask users for permission before tracking their activity across apps and websites. Why was this such a big deal for Facebook? Because the company knows that when people are made aware of how much of their personal data is being collected and given a choice about whether they want to be tracked, very few will opt in.
So, Facebook very much wanted to find a way to pressure Apple into dropping the requirement that developers be honest about how they use your data and ask permission first. Trying to hide the fact that your entire business model is based on monetizing user data doesn't sound very transparent.
Still, as part of that promise, the company last week released its quarterly reports, designed to make Facebook "by far the most transparent platform on the internet," according to Guy Rosen, the company's vice president of integrity.
Among those, Facebook released a "Widely Viewed Content Report" that the company says is designed to help "provide clarity around what people see in their Facebook News Feed, the different content types that appear in their Feed and the most-viewed domains, links, Pages and posts on the platform during the quarter."
The plan was for Facebook to highlight the most popular content shared on the social platform. At a time when the company is increasingly criticized for serving as a digital Petri dish where misinformation and divisive content are born and amplified, Facebook had a strong motivation to show that its most-viewed posts were about benign things like recipe websites and former football players.
And, the report the company released showed that the most-viewed Page on Facebook belongs to UNICEF. That's not very controversial. The most widely viewed post was one of those memes that tells users that the first three words they see "are your reality," whatever that means.
It's all pretty boring, which I suppose is the point if Facebook is trying to downplay the spread of Covid-19 misinformation and anti-vaccination content. The accompanying blog post even highlighted that "Prevalence of hate speech has decreased for three quarters in a row since we first began reporting it. This is due to improvements in proactively detecting hate speech and ranking changes in News Feed."
Except, it turns out, this wasn't the first report of its kind that Facebook had put together. The New York Times reported on Friday that Facebook had buried an earlier version that it feared would make it look bad. I reached out to Facebook but did not immediately receive a response.
When Facebook recognized that the most popular post earlier in the year linked to an article about a doctor who died after receiving the Covid-19 vaccine, it realized that would be bad for its narrative. So, it did what a giant company does when faced with information that might damage its credibility: It put the report in a drawer and hoped the rest of us would forget that it promised to be "transparent."
"We considered making the report public earlier," a Facebook spokesperson told the New York Times, "but since we knew the attention it would garner, exactly as we saw this week, there were fixes to the system we wanted to make."
The thing is, if you only share information that you want people to know, that's the literal opposite of being transparent. Hiding unflattering information is about as not transparent as it gets.
It also undermines any credibility Facebook has in making its point, which seems to be that it isn't the vast trough of sewage that many claim it is. But, how would anyone know? Facebook might point to its Q2 report, but why would anyone believe it? The company has already shown it's willing to play with the facts if they don't line up with the story it wants to tell.
Believe it or not, it gets worse. In a Twitter thread, Facebook spokesperson Andy Stone did his very best to muddy the waters around the criticism Facebook received, using phrases like "getting criticism isn't unfair." If it's fair criticism, just say that, and own it.
Except he doesn't own it. The rest of the thread goes on about how the criticism is "not unfair," and about whether The New York Times updated a story about the same doctor. He seems to be making the case that The New York Times was trafficking in misinformation, but that's completely irrelevant to the question of whether Facebook tried to hide a report that would make it look bad.
When you say that you're the most transparent, or the most anything, and it's clear that you aren't, it's not only laughable, it's kind of insulting. That doesn't seem like a great strategy for winning the trust of your users. If you make a promise, you have to keep it, even when you don't like the outcome. Anything else isn't just not transparent; it's not acceptable.