YouTube has announced a shake-up to the way it promotes videos across its channels, with the news that it will be making a concerted effort to recommend fewer conspiracy videos about controversial subjects. In an effort to clean up its video recommendations, YouTube is clamping down on what it defines as “borderline content”.
While these videos will still be available to those who want them, or anyone who searches for them, YouTube’s algorithm won’t push them as actively as it has in the past, especially to viewers who haven’t shown any interest in them.
The change follows a long line of tweaks to the video website’s suggested content, which have previously tackled misleading video titles and an over-abundance of recommendations on the same subject.
What’s the Problem with YouTube’s Recommendations?
YouTube has a wide range of content that covers every plausible avenue of interest. And not all of it is positive.
The problem stems from the fact that unless a video explicitly violates the service’s guidelines, it gets to stay on the platform. These guidelines prohibit sexual content, harmful or dangerous videos, graphic violence and harassment, among other things.
That still leaves plenty of room for other videos that viewers may find offensive, or simply don’t want to see.
The issue is that in its effort to show a breadth of content, YouTube will sometimes suggest videos that are controversial, or not based on truth. These are most typically classed as conspiracy videos.
YouTube wants to stem the distribution of videos that show deceptive content, but don’t necessarily contradict its guidelines. In a statement on its site, YouTube calls out “flat earth” videos, 9/11 theories and phony cures as examples.
How Is YouTube Addressing “Borderline Content”?
YouTube has stated that it is going to tweak its algorithms to ensure that anything it classes as “borderline content” – that is, videos that don’t contradict its guidelines, but come close – isn’t actively recommended.
This means that when you’re hopping onto YouTube to check out the latest Ariana Grande track, or a recipe for pecan pie, you won’t be presented with titles such as ‘FLAT EARTH THEORY EXPLAINED’ or ‘GOVERNMENT ALIEN EXPERIMENT EXPOSED!’ (because these videos always use all-caps, naturally).
YouTube hopes that in making this move, its users will trust its recommendations more. In cracking down on such recommendations, it will also be less actively responsible for the spread of false information.
The videos will be flagged through a combination of machine learning and human evaluators, who will decide what makes a video ‘false’ or factually misleading.
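To picture what this might look like under the hood, here’s a minimal sketch in Python of a recommendation list being filtered against a set of flagged videos. Everything here – the function name, the data, and the idea that flags arrive as a simple set – is a hypothetical illustration, not YouTube’s actual system.

```python
# Hypothetical sketch: excluding flagged "borderline" videos from a
# recommendation list. Names and data are invented for illustration.

def filter_recommendations(candidates, borderline_flags):
    """Drop any candidate video that has been flagged as borderline.

    candidates: video IDs, ranked by some recommendation model.
    borderline_flags: set of video IDs flagged by classifiers/human review.
    """
    # Flagged videos stay on the platform and remain searchable;
    # they are simply left out of the promoted list.
    return [video for video in candidates if video not in borderline_flags]


ranked = ["pecan-pie-recipe", "FLAT-EARTH-EXPLAINED", "ariana-grande", "ALIEN-EXPOSED"]
flagged = {"FLAT-EARTH-EXPLAINED", "ALIEN-EXPOSED"}
print(filter_recommendations(ranked, flagged))  # ['pecan-pie-recipe', 'ariana-grande']
```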
This isn’t the first time YouTube has responded to complaints about its recommendations. Previously, it took a stand against clickbait videos with misleading titles and poor content by taking the like/dislike ratio into account, rather than raw viewer numbers. This meant that these types of videos were less lucrative to make, as they were no longer being promoted by YouTube.
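As a rough illustration of how ratio-based ranking blunts clickbait, the toy scoring function below prefers a well-liked small video over a heavily disliked viral one. The formula and the numbers are assumptions made up for this example; YouTube hasn’t published its actual scoring method.

```python
# Toy example: scoring by like/dislike ratio instead of raw view counts.
# The formula is invented for illustration, not YouTube's real ranking.

def engagement_score(likes, dislikes):
    """Return the share of votes that are likes (0.0 to 1.0)."""
    total_votes = likes + dislikes
    if total_votes == 0:
        return 0.0
    return likes / total_votes


# A viral clickbait video with a poor ratio loses to a smaller, well-liked one.
print(engagement_score(likes=10_000, dislikes=40_000))  # 0.2: viral but heavily disliked
print(engagement_score(likes=4_500, dislikes=500))      # 0.9: smaller but well received
```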
YouTube has stated that the borderline content crackdown will affect fewer than 1% of the total videos on YouTube, and that it will be rolled out in the United States first, before other countries follow.
Can I Still Check Out My Favorite Tin-Foil-Hat YouTuber?
Yes. If you’re #TeamFlatEarth or have some strong opinions about the heat at which steel beams melt, then you’ll still be catered for. YouTube isn’t actively removing these videos (as long as they stay within its existing guidelines). It’s just no longer promoting them.
Not only that, but if you actively seek out and show an interest in this kind of video content, you’ll still receive targeted recommendations for it in your searches and channels.
However, the move could well mean that, with a smaller audience watching such videos, they become a lot less appealing for creators to make in the first place.