Facebook Shelved a Report, Allegedly Feared It Made Them Look Bad

Facebook held back the report because it wanted to make "key fixes to the system," a spokesperson says.
Adam Rowe

Last week, Facebook released its very first quarterly report covering the most viewed domains, links, pages, and posts in the US across the second quarter of the year.

But it wasn't the first report the social platform had prepared: According to people familiar with the matter, Facebook had previously held off on publishing a similar report covering the first quarter of 2021.

Why? Because the most viewed link in the first three months of the year was an article on a polarizing topic (the death of a doctor who had received a COVID vaccine two weeks earlier), and Facebook appears to have hoped to avoid the appearance of further polarizing its US users. Instead, it's now dealing with criticism over its own lack of transparency.

Why Facebook Flinched

The news comes from the New York Times, which viewed internal emails from Facebook executives as well as the report itself.

“In that report,” the Times says, “the most-viewed link was a news article with a headline suggesting that the coronavirus vaccine was at fault for the death of a Florida doctor. The report also showed that a Facebook page for The Epoch Times, an anti-China newspaper that spreads right-wing conspiracy theories, was the 19th-most-popular page on the platform for the first three months of 2021.”

Before the report was publicly released, some executives began discussing over email what potential “public relations problems” could ensue. The Times names Alex Schultz, Facebook’s vice president of analytics and chief marketing officer, among them. The report was then shelved.

Facebook's Response

Facebook has confirmed that it did indeed shelve the report. On Saturday, the company released a version of the first quarter report, complete with the most viewed link in question.

Spokesperson Andy Stone has tweeted that the social media giant “ended up holding it because there were key fixes to the system we wanted to make” — implying, though not outright stating, that the reason wasn't — or wasn't only — to avoid the public appearance of contributing to polarizing Americans.

Facebook has faced more than a little bad press over the last half-decade, from intentionally sharing massive amounts of users' personal information with third-party apps to once again leaking huge amounts of data, unintentionally that time.

Granted, it's not all bad PR: Facebook has also recently taken measures to help users in Afghanistan keep their personal information more private, given the upheaval in the country.

Lost Face

Algorithms that optimize for engagement can easily end up promoting whatever stirs the strongest emotions. Vaccine information, misinformation, and misinterpretations of valid information were certainly engaging plenty of people in the early months of 2021.

But the absence of useful information can be just as harmful as the presence of misleading information. Facebook's decision to shelve a report that could make it look bad created exactly that kind of absence, regardless of any key fixes it intended to implement.


Adam is a writer at Tech.co and has worked as a tech writer, blogger and copy editor for the last decade. He's also a Forbes Contributor on the publishing industry (and Digital Book World 2018 award finalist) and has appeared in publications including Popular Mechanics and IDG Connect. When not glued to TechMeme, he loves obsessing over 1970s sci-fi art.
