Last week, Facebook released its very first quarterly report covering the most viewed domains, links, pages, and posts in the US across the second quarter of the year.
But it wasn't the first report the social platform had prepared: According to those in the know, Facebook had previously held off on publishing a report from the first quarter of 2021.
Why? Because the most viewed link in the first three months of the year was an article on a polarizing topic (the death of a doctor who had received a COVID vaccine two weeks earlier), and Facebook seems to have hoped to avoid the appearance of further polarizing its US users. Instead, the company is now dealing with criticism over its own lack of transparency.
Why Facebook Flinched
The news comes from the New York Times, which viewed internal emails from Facebook executives as well as the report itself.
“In that report,” the Times says, “the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.”
Before the report was publicly released, some executives began discussing over email what potential “public relations problems” could ensue. The Times names Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, among them. The report was then shelved.
Facebook has confirmed that it did indeed shelve the report. On Saturday, it released a version of the first-quarter report, complete with the most viewed link in question.
Spokesperson Andy Stone has tweeted that the social media giant “ended up holding it because there were key fixes to the system we wanted to make” — implying, though not outright stating, that the reason wasn't — or wasn't only — to avoid the public appearance of contributing to polarizing Americans.
We’re guilty of cleaning up our house a bit before we invited company. We’ve been criticized for that; and again, that’s not unfair.
— Andy Stone (@andymstone) August 21, 2021
Facebook has faced more than a little bad press over the last half a decade, from intentionally sharing massive amounts of users' personal information with third-party apps to leaking a similarly huge trove of data, unintentionally that time.
Granted, the PR hasn't been uniformly bad: Facebook also recently took measures to help users in Afghanistan lock down their personal information, given the upheaval in the country.
Algorithms built to maximize engagement will readily promote anything that heightens emotion. Vaccine information, misinformation, and misinterpretations of valid information were certainly engaging plenty of people across the early months of 2021.
But the absence of useful information can be just as harmful as the presence of misleading information. Facebook's decision to withhold a report that could make it look bad is exactly that kind of absence, regardless of any key fixes it intended to implement.