Last week, Facebook announced plans to address anti-vaccination content on its platform. Facebook won't be deleting anything outright, but it will deprioritize “anti-vax” content in search results and ban it from ads and from its “recommend” promotional feature.
Facebook's actions are just the latest in a series of responses from tech companies, following public unease over their perceived promotion of conspiratorial anti-vaccination videos, books, and articles.
As well as Facebook, the likes of Amazon, YouTube and Pinterest are all taking various actions to hide, downgrade or otherwise control the spread of anti-vax content.
Why Is Anti-Vaccination Content a Problem?
The rise of anti-vaccine views has happened online, but it has real-world effects. If even a small portion of the population stops vaccinating, there's a very real risk of seeing the return of once-dormant illnesses. As a result, a whole host of old-timey Downton Abbey-era diseases, including measles and polio, could return in force. A recent measles outbreak in Washington state resulted in 71 confirmed cases.
Misinformation, such as the inaccurate claim that vaccinations can cause autism, spreads easily across social networks including YouTube, Amazon, and Facebook. This isn't due to any manual effort on the part of the companies. Rather, it's largely due to algorithms designed to amplify the content that is most engaging, rather than most accurate or useful.
As it turns out, conspiracy theorists tend to spend a lot of their free time clicking on the next suggested YouTube video or book recommendation. This only teaches the algorithm to quickly suggest rabbit trails of conspiracy to anyone else searching similar terms.
Last week, I wrote to Facebook and Google to express my concern that their sites are steering users to bad information that discourages vaccinations, undermining public health.
The search results you get for “vaccines” on Facebook are a dramatic illustration. pic.twitter.com/ZrEQfVTaRo
— Adam Schiff (@RepAdamSchiff) February 20, 2019
What makes anti-vax arguments so engaging to start with? Well, fear and mistrust are highly engaging, and the unprecedented access to information on the internet can easily be used to surface and promote fear-mongering claptrap. This sort of behavior predates the internet itself: the invention of the printing press “hugely augmented” the spread of a 1487 text used to justify fifteenth-century witch trials.
Why Are Social Media Companies Acting Against Anti-Vax Content?
The companies are responding to the uptick in public attention that stems from certain lawmakers and the news outlets covering them.
“We are on the verge of a public health crisis,” New York Assemblywoman Patricia Fahy said, following her sponsorship of a bill to allow minors the chance to consent to vaccinations, regardless of their parents' opinions.
In addition, California Representative Adam Schiff sent letters to the CEOs of Google, Facebook, and Amazon, urging them to “consider additional steps [they] can take to address this growing problem.”
What Are Tech Companies Doing?
Are the tech giants listening? They are, though their responses have been mixed. Here's a rundown:
YouTube

The massive Google-owned video service announced in January its intention to curb the promotion of conspiracy theory videos. Among its efforts, a new link to Wikipedia will appear when users search a relevant phrase like “MMR vaccine.”
In addition, the company says it will downrank any misleading content. Perhaps most usefully, the service demonetized anti-vax videos in February by removing the option to add programmatic ads to videos advocating against vaccines.
Critics have stated, however, that anti-vaccine content remains easy to find on the site.
Amazon

Following a big news cycle on the anti-vax content that “thrives” on its site, Amazon has removed some misleading documentaries from its Prime video content. However, critics have noted that Amazon is leaving many anti-vax books available for purchase.
That said, it has begun to remove a few additional book titles in the days since pulling the videos, so its push to address misinformation may continue, if slowly, for the near future.
Facebook

As mentioned earlier, the social network giant has publicly vowed to crack down on anti-vax misinformation. Facebook's actions won't go far enough for some: at present, it plans to deprioritize the content rather than remove it altogether.
Pinterest

Finally, there's Pinterest. In a rare bit of good news, the image-centric social platform has actually been held up as an example of misinformation handled correctly.
In 2016, the service hit a low point – a scientific study determined that 75% of vaccine-related Pinterest posts were negative. Pinterest then updated its guidelines to explicitly ban “promotion of false cures for terminal or chronic illnesses and anti-vaccination advice.”
More importantly, however, Pinterest instituted a “data void” for vaccine-related terms. Anyone who searches the site for “vaccine” or “vaccines” gets a curt “Sorry, we couldn't find any Pins for this search” message.
It's a simple solution. But such a sweeping measure is something every other tech company mentioned in this article has balked at, fearing it would amount to censorship.
While the larger question of tech giants' censorship power likely falls into the “complicated issue with no easy answers” category, the question of banning widely debunked conspiracies like anti-vaxxers' beliefs seems a little less complex. The answer might just be to take a bit of inspiration from Pinterest.