As part of a larger conversation about the future of Twitch Extensions, Twitch cofounder and COO Kevin Lin recently shared his thoughts on the battle against toxic behavior in the video game world.
For Lin, the answer lies in community.
“You’ll also see broadcasters who have very carefully managed their community and communicated what they will allow in their channel,” he told GamesIndustry.biz. “You’ll find they’re quite conversational, they’re pleasant, but it does take a lot of work as a broadcaster. As that perpetuates through more and more broadcasters as they grow, the communities will self-police – we’ve seen that many times.”
Twitch’s approach to toxicity has been to give those communities the power they need to self-police with tools like AutoMod. Users can appoint moderators to their channels, but AutoMod seeks to do much of that work for them, aggressively shutting down inappropriate or harassing chat. Viewers engaging in toxic behavior are still out there, but in any individual channel, it’s easier to keep them quiet.
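AutoMod’s internals aren’t public, so the following is a purely illustrative sketch of the general idea described here: messages are scored for problematic content, and a channel-level sensitivity “dial” controls how aggressively flagged messages are held for moderator review. The term list, weights, and thresholds below are all hypothetical.

```python
# Purely illustrative sketch; AutoMod's real implementation is not public.
# A hypothetical channel filter that scores messages against a term list
# and holds any message whose score meets the channel's threshold.

# Hypothetical term weights; a real system would use a learned model.
TERM_WEIGHTS = {"insult": 3, "slur": 5, "spamword": 1}

def score_message(text: str) -> int:
    """Sum the weights of flagged terms found in the message."""
    return sum(TERM_WEIGHTS.get(word, 0) for word in text.lower().split())

def should_hold(text: str, sensitivity: int) -> bool:
    """sensitivity is the 'dial': 0 holds nothing, higher values
    lower the score threshold and hold more messages for review."""
    if sensitivity <= 0:
        return False
    threshold = {1: 5, 2: 3, 3: 2, 4: 1}[min(sensitivity, 4)]
    return score_message(text) >= threshold
```

The point of the dial is that the same scoring function serves both a lightly moderated channel (high threshold) and a tightly managed one (low threshold), so each broadcaster can tune enforcement to their own community.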
“Initially we just had tools like choosing moderators in your chat, who would then basically get admin rights to purge lines of chat or drop the banhammer and kick people out of chat entirely. That worked to some extent for the smaller channels, but for the larger channels it’s very difficult,” Lin explained.
“Then we built AutoMod and released it last year, which you can dial up and down. It’s a machine learning tool that helps you remove unsavoury stuff from chat. You can dial that all the way up if you want to manage your community more tightly, and we’re constantly iterating on that tool. We also work really closely with game companies – they face the same problems we do.
“We prefer to keep things positive. Our goal at Twitch is not only to help creators but also to connect people around the world and help them understand different cultures, backgrounds and upbringings. So we always keep an eye towards that.
“It’s not great that this is a common experience we all have on the internet, but we’re doing what we can to reduce that.”
No social network is perfect, or perfectly free from harassment. But as Lin says, we have the chance to build communities within those networks where we can help keep each other safe.