Toxicity often seems like a foregone conclusion in online communities, just something that happens when a game becomes too big or popular. But it doesn’t have to be that way.
During a presentation at the Game Developers Conference in San Francisco this week, expert panelists talked about what creators can do to help prevent their communities from succumbing to abuse and harassment. User experience researcher and veteran social media moderator Katherine Lo began by debunking a key myth about online behavior: the long-held belief that someone acts like an asshole because they’re hiding behind a fake name.
But as seen with sites like Facebook (where Lo showed a graphic comment from someone who had his family in his profile picture), that’s not always the case.
“Real-name policies don’t necessarily enforce good behavior,” said Lo.
She argued that some anonymous groups (like certain subreddits with clear policies and moderators who actually enforce them) can still foster healthy discussion, and that made-up names still offer a layer of protection to the people most vulnerable to abuse. Instead of following generalized procedures for curbing toxicity, Lo said that developers need to invest in experienced community managers and customer support specialists who have the skill sets needed to deal with those problems.
The sooner they join the team, the sooner they can help the studio come up with good strategies and tools.
“Community managers need to be deeply integrated into both the design and policy-making processes,” said Lo.
For Nicole Lazzaro, the chief executive officer of consulting firm XEODesign, part of the problem with abuse comes down to a game’s mechanics, which encompass all the different actions available to the player. She advised attendees to think hard about the type of emotions that their gameplay elicits and to use their design plans to help reduce toxicity. Above all, players need to feel safe.
“You can’t design emotions directly. But it’s the emotions coming from those mechanics that drastically change how your game is played. Emotions are what players are there for. If there is no emotion, there is no game,” said Lazzaro.
This is especially true for virtual reality experiences, where emotions are stronger and more personal due to immersive environments and life-sized avatars. Lazzaro cited the example of a VR game where avatars could intersect with one another, which left players feeling “very violated.” At minimum, dev teams can introduce something like a personal bubble shield that prevents other players from entering someone’s personal space without permission.
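As a rough sketch of how such a bubble might work (this is only an illustration, not Lazzaro’s or any particular engine’s implementation; the avatar fields, radius value, and permission list are all assumptions), the game can compare avatar positions every frame and push an intruder back to the edge of the bubble unless the owner has explicitly granted permission:

```python
import math
from dataclasses import dataclass, field

@dataclass
class Avatar:
    name: str
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    # Players this avatar has explicitly invited inside the bubble (assumption).
    allowed: set = field(default_factory=set)

BUBBLE_RADIUS = 1.2  # meters; arbitrary example value

def enforce_personal_bubble(owner: Avatar, other: Avatar) -> None:
    """Push `other` back to the edge of `owner`'s bubble unless permission was granted."""
    if other.name in owner.allowed:
        return
    dx, dy, dz = other.x - owner.x, other.y - owner.y, other.z - owner.z
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    # Skip the degenerate case where both avatars share the exact same point.
    if 0.0 < dist < BUBBLE_RADIUS:
        scale = BUBBLE_RADIUS / dist
        other.x = owner.x + dx * scale
        other.y = owner.y + dy * scale
        other.z = owner.z + dz * scale
```

In a networked game this kind of check would presumably run on the server, so a modified client couldn’t simply ignore it.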
They can also address what Lazzaro called “empathy hurdles” at an early point in a game by introducing mechanics that reward players for good behavior (like forming friendships and trusting one another).
“If there are a lot of emotion hurdles or empathy hurdles that a [banned] player has to go through in order to start a new account, that’s going to help keep [the bad] people out,” she said.
Tara Brannigan, head of community and customer service at German publisher Flaregames, agreed that developers need to take responsibility for the safety of their players, saying that a toxic community will only drive potential customers away.
“I’ll defend this to my dying breath: The majority of your players are great, non-toxic people. … You owe it to them to try to make a great experience for everybody,” said Brannigan.
She suggested that developers should take the time to understand what their community strategy will be months before launch, especially with regard to how they punish harmful behavior. Even if social interaction is limited to external channels like forums and social media, teams should take at least three months to plan everything out — even longer if they don’t have any experience managing a community. At this point, they also need to consider worst-case scenarios (like death threats, rape threats, and child pornography), which will better prepare them for tackling lower-level infractions.
Integrated solutions, like the one in Flaregames’ Zombie Gunship Survival, can also be effective. For banned players to return to the game and the community, they either have to wipe their progress and reinstall the app, or contact the company and explain why they were being so abusive in the first place. This method dramatically reduced the amount of toxicity in the game.
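Brannigan didn’t go into implementation details, but the flow she described amounts to a gate at login: a banned account either starts over from scratch or goes through a support appeal before it is allowed back in. The account fields and function below are hypothetical, purely to illustrate the shape of that check:

```python
from enum import Enum, auto

class ReentryPath(Enum):
    ALLOWED = auto()             # account in good standing, or appeal approved
    WIPE_AND_REINSTALL = auto()  # start over with zero progress
    SUPPORT_APPEAL = auto()      # explain the behavior to the company first

def check_reentry(account: dict) -> ReentryPath:
    """Decide how a returning player gets back in (hypothetical account fields)."""
    if not account.get("banned", False):
        return ReentryPath.ALLOWED
    if account.get("appeal_approved", False):
        return ReentryPath.ALLOWED
    # Banned with no approved appeal: the player either wipes their progress
    # and reinstalls, or contacts support to explain the abusive behavior.
    if account.get("wants_fresh_start", False):
        return ReentryPath.WIPE_AND_REINSTALL
    return ReentryPath.SUPPORT_APPEAL
```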
“Community is now, more than ever, part of the actual game design. It’s not this thing you just staple onto the end. It’s an extension of the experience,” said Brannigan.