Top 5 Tips for Dealing With Abuse in Your Comments Section

In the years since online trolling became a thing, it has never gone away. If anything, it’s a bigger problem than ever: A Pew Research report out last March detailed the opinions of some 1,537 “technology experts, scholars, corporate practitioners and government leaders,” who came to the general consensus that the tone of online discourse would either stay about the same (42 percent) or get worse (39 percent) over the next decade. The survey was conducted in 2016, and plenty of outlets agree that levels of online abuse have only risen since.

What does all that mean for a lowly tech startup hoping to create a community platform? You’re going to need a crash course in managing the trolls.

Luckily, The Coral Project is here to help. It’s a journalism-oriented collaboration between Mozilla, The New York Times, and The Washington Post that’s designed to help outlets learn how to build a community, and it recently published an article on MediaShift about managing abusive commenters.

Look at a New User’s First Few Comments

The Coral Project said:

“Set the first few comments from any new user to go to pre-moderation (if your system allows)”

By cracking down on the first comments, you’ll be able to separate the trolls from the genuine articles immediately.
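The article doesn’t prescribe an implementation, but the pre-moderation rule is simple enough to sketch. Here’s one way a platform might route a new user’s first few comments to a review queue — the threshold value and all function names are illustrative, not from The Coral Project:

```python
PREMOD_THRESHOLD = 3  # hold a user's first few comments for review

def needs_premoderation(approved_comment_count: int) -> bool:
    """New users go to the moderation queue until they have
    a small track record of approved posts."""
    return approved_comment_count < PREMOD_THRESHOLD

def submit_comment(user, text, queue, published):
    """Send a comment to the mod queue or publish it directly."""
    if needs_premoderation(user["approved_comments"]):
        queue.append((user["name"], text))       # moderator reviews first
    else:
        published.append((user["name"], text))   # trusted user posts live
```

With this setup, a brand-new account’s comments land in the queue while an established user’s go live immediately — which is exactly the separation the tip describes.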

Get Users to Flag Abuse

While onboarding users, encourage them to report any bad behavior they notice. Keep your messaging simple and clear to get the best results — the article suggests the project’s own tool, Talk, for handling the process.
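If you’re rolling your own flagging system rather than adopting Talk, the core of it is just a report counter that escalates once enough distinct users flag the same comment. A minimal sketch — the threshold and names are assumptions for illustration:

```python
from collections import defaultdict

FLAG_THRESHOLD = 2  # distinct reports before a comment is hidden for review

flags = defaultdict(set)  # comment_id -> set of reporter ids

def flag_comment(comment_id: str, reporter_id: str) -> bool:
    """Record a report; return True once enough distinct users have
    flagged the comment to hide it pending moderator review."""
    flags[comment_id].add(reporter_id)  # a set ignores duplicate reports
    return len(flags[comment_id]) >= FLAG_THRESHOLD
```

Using a set per comment means one angry user mashing the report button doesn’t count twice.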

Highlight the Good Comments

Finding a way to prioritize useful, constructive comments will guide the conversation away from abusers, who rely on outrage in order to be heard.
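In practice, “prioritize the good comments” often comes down to a sort order that surfaces editor picks and well-received posts first. A hedged sketch of one possible ranking (the field names are hypothetical):

```python
def ranked(comments):
    """Surface constructive comments: editor picks first,
    then by upvote count."""
    return sorted(
        comments,
        key=lambda c: (c.get("featured", False), c.get("upvotes", 0)),
        reverse=True,
    )
```

A real system would fold in more signals (recency, reply quality, moderator history), but even this simple ordering pushes drive-by outrage below the fold.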

Point Targeted Users to Support Networks

Creating a resource list will help users who are singled out for online abuse feel supported rather than alone.

“Create a list of places you can point users to – e.g. Crash Override, Heartmob, Trollbusters – to get support if they are being targeted,” the article says.

Reply with Empathy

The first weapon a community editor should pull out? Empathy.

“Respond with empathy to those who cross the line,” Coral Project recommends, adding that “they might not have understood the community guidelines. If appropriate and possible, consider giving them a time out from posting instead of banning them for life. However, sometimes a user acts in a way that is deliberately abusive by repeatedly targeting one or more people, and banning them from your community has no effect.”
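The “time out instead of lifetime ban” suggestion maps neatly onto a temporary suspension check. Here’s a minimal sketch of how that might work, assuming a simple in-memory store (a real platform would persist this):

```python
import time

timeouts = {}  # user_id -> unix timestamp when the timeout expires

def give_timeout(user_id: str, hours: float) -> None:
    """Temporarily suspend posting instead of banning for life."""
    timeouts[user_id] = time.time() + hours * 3600

def can_post(user_id: str) -> bool:
    """Users with no timeout on record (or an expired one) may post."""
    return time.time() >= timeouts.get(user_id, 0)
```

The ban then becomes the escalation path the article describes for deliberately abusive repeat offenders, not the first response.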

Check out the rest of the project’s research for advice on more specific ways to handle community abuse, from abuse of a single user to a situation in which your own team is targeted, to ongoing abuse in general. If your community stands out as a healthy one in a world of increasing online abuse, you just might gain the edge you need to thrive.

Read more advice on social media at TechCo


Written by:
Adam is a writer at Tech.co and has worked as a tech writer, blogger and copy editor for more than a decade. He was a Forbes Contributor on the publishing industry, for which he was named a Digital Book World 2018 award finalist. His work has appeared in publications including Popular Mechanics and IDG Connect, and his art history book on 1970s sci-fi, 'Worlds Beyond Time,' was a 2024 Locus Awards finalist. When not working on his next art collection, he's tracking the latest news on VPNs, POS systems, and the future of tech.