A broken tool is behind a spate of post and account deletions, says Instagram’s head.
The post from Adam Mosseri was prompted by users sharing concerns about “enforcement issues” and after “Threads Moderation Failures” became a trending topic.
Mosseri attributed the problems to a tool used by the platform’s human moderators. The news comes as TikTok announces lay-offs as part of its move towards AI moderation.
Acknowledging Mistakes
The post from Mosseri said that his team is looking into the issues reported and “have already found mistakes and made changes.”
He said the biggest issue was missing context in some conversations, which led to benign comments being deleted and accounts being blocked. The Verge writer Umar Shakir reported that his own account was deleted because Threads decided he was underage.
Mosseri wrote: “…our reviewers (people) were making calls without being provided the context on how conversations played out, which was a miss. We’re fixing this so they can make the better calls and we can make fewer mistakes.”
Only Human?
The Instagram boss added: “We’re trying to provide a safer experience, and we need to do better.”
Moderation is a sticky topic for many major social platforms at the moment. TikTok is opting to move towards AI moderation while facing heavy criticism from 14 attorneys general, who allege that children are routinely exposed to sexualized and emotionally damaging material on the platform.
With the lawsuits ongoing, it will be interesting to see whether TikTok presents its move towards AI moderation as a solution to these problems, or whether it is simply a brutal cost-cutting exercise.