YouTube and Reddit are beginning to take disinformation on their platforms seriously. YouTube claims to have taken down over 1 million videos for spreading dangerous COVID misinformation, while Reddit has banned a popular subreddit centered around COVID.
After Twitter began tagging questionable COVID and election-related posts, a new standard is hopefully emerging across social media, pushing platforms to police harmful lies about important issues.
These businesses obviously never want to delete content if they can help it, since doing so risks alienating users and inviting accusations of censorship. However, after massive pushback from frustrated users, these platforms are finally making moves to limit the dangerous disinformation being spread.
What Happened with Reddit?
One of the main tenets of Reddit is the idea that users are free to create their own communities, known as subreddits, within the website. For example, fans of the Legend of Zelda video game series could create their own subreddit where they could discuss the games, share fan art, etc.
One such subreddit was known as “No New Normal,” which was made to vent frustrations over COVID restrictions, share news on the virus, and generally discuss the past year or so. However, as time went on, No New Normal became a hub for anti-vaccination sentiment.
As Reddit users became aware of this, various major subreddits began a blackout, protesting No New Normal's existence and demanding that Reddit delete the subreddit. It took a few days of back and forth, but Reddit finally caved and closed No New Normal.
While some people consider this to be the bare minimum, it is at least a step towards limiting the share of harmful disinformation on the platform.
What Happened with YouTube?
YouTube has also experienced its fair share of anti-vaccination sentiment, though the problem isn't centralized in one place the way Reddit's No New Normal was. That hasn't stopped YouTube, however, which still claims to have taken down over 1 million videos of COVID disinformation.
It's worth noting that YouTube regularly takes down all kinds of videos that are unsuited to its platform. Whether these videos involve pornography, gore, or some other controversial subject, YouTube will usually step in to remove them. However, this is the first time YouTube has made a concerted public effort to remove disinformation specifically.
This is an ongoing crusade, unlike Reddit's one-off deletion of a subreddit, and will likely continue for years after the pandemic has fully subsided. While YouTube is focusing on COVID disinformation for now, hopefully the effort will expand to areas like climate change denial, stopping the platform from becoming a vessel for fake news.
The Future of Social Media
It's fair to say that public opinion of social media has been tumultuous over the past couple of years. Throughout the 45th presidency and the COVID pandemic, a great deal of disinformation was spread in bad faith to sway public opinion.
However, as public perception has soured, social media platforms are being forced to combat disinformation, with Twitter adding warning labels to questionable Tweets and Facebook deleting groups centered around anti-science rhetoric.
Hopefully this will lead to renewed faith in social media, as we become better able to trust what we're shown.