Online trolling became a thing years ago, and it has never gone away. If anything, it’s a bigger problem than ever: A Pew Research report out last March detailed the opinions of some 1,537 “technology experts, scholars, corporate practitioners and government leaders,” who came to the general consensus that the tone of the Internet as a whole was either staying the same (42 percent) or getting worse (39 percent). The survey was conducted in 2016, and plenty of outlets agree that levels of online abuse have only risen since.
What does all that mean for a lowly tech startup hoping to create a community platform? You’re going to need a crash course in managing the trolls.
Luckily, The Coral Project is here to help. It’s a journalism-oriented collaboration between Mozilla, The New York Times, and The Washington Post designed to help outlets build healthy communities, and it recently published an article on MediaShift about managing abusive commenters.
Look at a New User’s First Few Comments
The Coral Project said:
“Set the first few comments from any new user to go to pre-moderation (if your system allows)”
By vetting a newcomer’s first few comments before they go live, you’ll be able to separate the trolls from the genuine contributors immediately.
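If your platform is homegrown, that rule is simple to wire in. Here’s a minimal sketch in TypeScript, assuming a hypothetical comment pipeline where `approvedCommentCount` tracks how many of a user’s comments moderators have already approved:

```typescript
// Minimal sketch of first-comment pre-moderation.
// `User` and `Comment` are hypothetical types, not from any specific platform.

interface User {
  id: string;
  approvedCommentCount: number; // comments already approved by a moderator
}

interface Comment {
  authorId: string;
  body: string;
  status: "pending" | "published";
}

const PREMOD_THRESHOLD = 3; // pre-moderate a user's first three comments

function submitComment(user: User, body: string): Comment {
  const needsReview = user.approvedCommentCount < PREMOD_THRESHOLD;
  return {
    authorId: user.id,
    body,
    status: needsReview ? "pending" : "published",
  };
}
```

Once a user clears the threshold, their comments publish instantly; until then, everything they write lands in the moderation queue.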
Get Users to Flag Abuse
While onboarding users, encourage them to report any bad behavior they notice later on. Keep your messaging simple and clear to get the best results; the article suggests the project’s own tool, Talk, for handling the process.
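Talk handles this out of the box, but if you’re building your own, the mechanics are straightforward. Here’s a generic sketch (not Talk’s actual API) where a hypothetical `REVIEW_THRESHOLD` of flags routes a comment to moderators:

```typescript
// Generic sketch of comment flagging. Names and thresholds are
// illustrative, not taken from Talk or any other real tool.

type FlagReason = "abusive" | "spam" | "off-topic" | "other";

interface Flag {
  commentId: string;
  reporterId: string;
  reason: FlagReason;
}

const flags: Flag[] = [];
const REVIEW_THRESHOLD = 2; // hypothetical: two flags trigger a review

function flagComment(commentId: string, reporterId: string, reason: FlagReason): void {
  // Ignore duplicate reports from the same user on the same comment.
  if (flags.some(f => f.commentId === commentId && f.reporterId === reporterId)) {
    return;
  }
  flags.push({ commentId, reporterId, reason });

  const count = flags.filter(f => f.commentId === commentId).length;
  if (count >= REVIEW_THRESHOLD) {
    sendToModerationQueue(commentId); // hypothetical hook into your mod tooling
  }
}

function sendToModerationQueue(commentId: string): void {
  console.log(`Comment ${commentId} queued for moderator review`);
}
```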
Highlight the Good Comments
Prioritizing useful, constructive comments guides the conversation away from those who rely on abuse and outrage in order to be heard.
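One common way to do this is to let editors pin their picks and let upvotes break ties. A sketch, assuming hypothetical `editorsPick` and `upvotes` fields on each comment:

```typescript
// Sketch of surfacing constructive comments. The `editorsPick` and
// `upvotes` fields are assumptions about your data model.

interface RankedComment {
  id: string;
  body: string;
  upvotes: number;
  editorsPick: boolean; // set by a community editor
}

function sortForDisplay(comments: RankedComment[]): RankedComment[] {
  return [...comments].sort((a, b) => {
    // Editor-picked comments always float to the top.
    if (a.editorsPick !== b.editorsPick) return a.editorsPick ? -1 : 1;
    // Otherwise, most-upvoted first.
    return b.upvotes - a.upvotes;
  });
}
```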
Point Targeted Users to Support Networks
Creating a resource list will help users who might otherwise feel alone and unsupported if they are singled out for online abuse.
“Create a list of places you can point users to – e.g. Crash Override, Heartmob, Trollbusters – to get support if they are being targeted,” the article says.
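On the technical side, this can be as simple as a static data structure your moderation tools surface to a targeted user. A sketch using the organizations named above (descriptions are paraphrased; verify and link to each group’s current site):

```typescript
// Simple data structure for the support resources named in the article.
// Descriptions are paraphrased assumptions; confirm before shipping.

interface SupportResource {
  name: string;
  description: string;
}

const supportResources: SupportResource[] = [
  { name: "Crash Override", description: "Resource center for targets of online abuse" },
  { name: "Heartmob", description: "Community support for people facing online harassment" },
  { name: "Trollbusters", description: "Support for writers and journalists under attack" },
];
```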
Reply with Empathy
The first weapon a community editor should pull out? Empathy.
“Respond with empathy to those who cross the line,” Coral Project recommends, adding that “they might not have understood the community guidelines. If appropriate and possible, consider giving them a time out from posting instead of banning them for life. However, sometimes a user acts in a way that is deliberately abusive by repeatedly targeting one or more people, and banning them from your community has no effect.”
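In code, that advice translates to an escalation policy: time-outs first, a permanent ban only for repeat offenders. A sketch with a hypothetical escalation schedule:

```typescript
// Sketch of escalating enforcement: time-outs before permanent bans.
// The schedule below is a hypothetical policy, not Coral's recommendation.

interface Suspension {
  userId: string;
  expiresAt: number | null; // null means a permanent ban
}

const TIMEOUT_SCHEDULE_HOURS = [24, 72, 168]; // 1 day, 3 days, 1 week

function suspend(userId: string, priorOffenses: number): Suspension {
  if (priorOffenses >= TIMEOUT_SCHEDULE_HOURS.length) {
    // Repeated, deliberate abuse: permanent ban.
    return { userId, expiresAt: null };
  }
  const hours = TIMEOUT_SCHEDULE_HOURS[priorOffenses];
  return { userId, expiresAt: Date.now() + hours * 60 * 60 * 1000 };
}

function isSuspended(s: Suspension, now = Date.now()): boolean {
  return s.expiresAt === null || now < s.expiresAt;
}
```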
Check out the rest of the project’s research for advice on more specific ways to handle community abuse, from abuse aimed at a single user, to situations in which your own team is targeted, to ongoing abuse in general. If your community stands out as a healthy one in a world of increasing online abuse, you just might gain the edge you need to thrive.
Read more advice on social media at TechCo