In a long-overdue move from the world's foremost social media platform, Facebook is finally going to warn users about sharing articles that are outdated.
Facebook has gone from social connection app to a massive purveyor of global news in a matter of years. Along the way, elections have been decided, activist movements have been spurred, and family gatherings have been ruined, all thanks to Facebook's unflinching commitment to free speech and letting people share what they choose.
Fortunately, the company is now trying to make the way users share news a bit more informed. It's doing this by adding a warning notification that appears when users attempt to share an article that's too old to be relevant.
Facebook Announces Context Warning
In a blog post from Facebook today, the social media company announced that it would be launching “a notification screen that will let people know when news articles they are about to share are more than 90 days old.”
“To ensure people have the context they need to make informed decisions about what to share on Facebook, the notification screen will appear when people click the share button on articles older than 90 days,” read the blog post.
The post went on to explain that, in true hands-off Facebook fashion, users will still be able to post the article if they wish, once they've acknowledged that it's more than three months old.
Why is Facebook Adding This Update?
Facebook is always looking to update its platform, but this update is about more than making the big F look prettier. This update should, ideally, help Facebook to combat the aggressive spread of misinformation on the site. Plus, according to its blog post, there's plenty of appetite for it:
“Over the past several months, our internal research found that the timeliness of an article is an important piece of context that helps people decide what to read, trust and share,” read the post. “News publishers in particular have expressed concerns about older stories being shared on social media as current news, which can misconstrue the state of current events.”
The change seems like an incredibly obvious one, given the increasingly prevalent misinformation found on Facebook. But has the social media giant really only been paying attention to this over the “past several months”?
How Did This Take So Long?
We all know that Facebook has had its fair share of “fake news” problems, even before the controversies of the 2016 election. From problematic Facebook groups to suspicious troll bots, the platform has been plagued with a wide range of issues for more than half a decade. So, why did something as simple as prominently displaying an article's date take so long to add?
The most likely answer is that Mark Zuckerberg hasn't felt motivated to make such changes – by his own definition, the platform is still perfectly successful. Sharing misinformation is still sharing, which means Facebook is doing its job: getting people to connect online.
As Zuckerberg is happy to point out, Facebook isn't an “arbiter of truth.” It's merely a vessel for free speech, no matter how destructive or obviously untrue that speech may be – a point of view that has to change if we want social media to ever return to its former glory.
Unfortunately, until the social media platforms take a clear stance on their role in the global discussion of free speech versus misinformation, the best we're going to get is articles that actually have the date displayed.