GamesQuest 8: Dealing with Toxic Player Behavior
I'm afraid of playing with strangers online because of the chance that people will shout at me. So what can game developers do to deal with toxic behavior? Listen to this episode of GamesQuest or read the excerpt below.
Links from the episode:
Toxic Behavior (The Psychology of Video Games)
Battling Toxicity in Online Video Games (Gamasutra)
What Can Community Managers Learn From Game Designers? (Forbes)
Transcript excerpt
You can experience toxic behavior in any kind of multiplayer game, but not every kind of game has the same potential for it. For example, in a round of Dota 2 you can be matched up with complete strangers and then spend 30 minutes or more trying to win together. It can be frustrating when your teammates don’t act the way you think they should. On a sports team you know those people face to face, and even there you can experience profanity and aggressive behavior towards teammates and opponents. So it’s not that surprising to have the same problems in games, where players can usually hide behind online anonymity.
Even in regular sports this kind of behavior was accepted for a long time as just “part of the game”. Only in the last few years have we finally seen moves to address these issues and work against harmful behavior such as violence and racism.
The same is true for games, and most of my information on the topic comes from an interview on The Psychology of Video Games podcast with Dr. Jeffrey Lin, who was at the time leading Riot Games’ Player Behavior Team. The interview is three years old, but I think a lot of its conclusions are still valid today.
Of course it helped that with League of Legends they have a massive player base to experiment with. They took a very analytical approach and arrived at some key findings that are relevant to anybody who wants to make an open online experience part of their game.
1. You cannot get rid of toxic players, but you can educate them.
Some might think anybody who crosses a certain line should just be banned from the game. Apart from the obvious question of where to draw that line, it is also nearly impossible on a technical level to keep those players from returning, and of course it is lost revenue down the line as well. The more sustainable option is to educate your players. As Lin’s research shows, a high percentage of players will not become repeat offenders if they get told off just once. I’m in my 30s and I tend to think about players like myself, but a lot of players are still kids or adolescents who really might not know better. It’s nice to think that everybody grows up and gets proper values from their families and schools, but that is just not always the case and probably never was. For young players like this it is important that they actually understand what is acceptable behavior and what is not.
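To make that a little more concrete, here is a minimal sketch of an escalating penalty ladder where a first offense gets a warning instead of a ban. The steps, names, and thresholds are purely my own assumptions for illustration, not how Riot or anybody else actually does it.

```python
from dataclasses import dataclass

# Hypothetical penalty ladder: steps and names are illustrative only.
# A first confirmed offense triggers a warning with an explanation;
# only repeat offenders escalate toward bans.
PENALTY_LADDER = ["warning", "chat_restriction", "temporary_ban", "permanent_ban"]

@dataclass
class PlayerRecord:
    player_id: str
    offense_count: int = 0

def apply_penalty(record: PlayerRecord) -> str:
    """Return the next penalty step for a confirmed offense and record it."""
    step = min(record.offense_count, len(PENALTY_LADDER) - 1)
    record.offense_count += 1
    return PENALTY_LADDER[step]

# A first-time offender is warned, not banned.
player = PlayerRecord("player_123")
print(apply_penalty(player))  # -> "warning"
print(apply_penalty(player))  # -> "chat_restriction"
```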
2. Feedback has to be as prompt and as precise as possible.
Whether you get banned for something you did or you are reporting someone else for toxic behavior, timing is essential here. A penalty has a lot more impact if you actually understand what you did wrong. If you get feedback weeks later, it is way more likely that you are just outraged at being penalized for something you probably don’t even remember anymore. For the same reason it is important that the feedback is as precise as possible, containing the actual offending chat or voice messages and, again, a clear explanation of what is not acceptable. Of course this is not easy to achieve with thousands of matches going on at any given time and has to be properly planned and managed early on.
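As a rough illustration of what “precise feedback” could mean in practice, here is a hypothetical notification payload that bundles the verbatim offending chat lines with a plain-language explanation and is issued right after the match. All field names and the example rule text are my own assumptions.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PenaltyNotice:
    player_id: str
    match_id: str
    offending_messages: list[str]  # the verbatim chat lines that triggered the penalty
    explanation: str               # which rule was broken, in plain language
    issued_at: datetime

def build_penalty_notice(player_id: str, match_id: str,
                         flagged_chat: list[str]) -> PenaltyNotice:
    # Issue the notice immediately after the match, not weeks later.
    return PenaltyNotice(
        player_id=player_id,
        match_id=match_id,
        offending_messages=flagged_chat,
        explanation="Insulting teammates violates the code of conduct.",
        issued_at=datetime.now(timezone.utc),
    )
```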
Feedback is also important for the people who report toxic behavior. If you know that somebody actually follows up on your report and you get a notification of what eventually happens, you are more likely to report again. The opposite outcome could be that a player frustrated by toxic behavior just quits the game, because it seems that the developers don’t even care about it.
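Closing that loop for the reporter can be very lightweight; something along the lines of this hypothetical sketch would already tell them their report mattered:

```python
# Hypothetical follow-up message for the reporting player.
def notify_reporter(reporter_id: str, outcome: str) -> str:
    return (f"Thanks for your report. We reviewed it and took action: {outcome}. "
            "Reports like yours help keep matches fair.")

print(notify_reporter("reporter_42", "the player received a chat restriction"))
```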
3. Positive feedback may not always be helpful.
Going into this episode I actually wanted to talk more about positive reinforcement, as I thought the usual route of just punishing the bad actors seemed very destructive and rewarding good behavior would be the better option. However, Lin’s research shows that it is not that easy, because different groups of people react differently to rewards. When you look at online behavior on a scale from negative through neutral to positive, it is the neutral players who respond the most to perks for good behavior. The toxic group can only be incentivized with very specific rewards that are valuable to that person; otherwise rewards have no effect at all. And for people who already show good behavior it can even have the opposite effect, as they learn to act positively only in response to rewards. They become less intrinsically motivated.
Still, positive feedback can work, but it has to be quite nuanced. Riot tries that in League with its Honor System, in which you vote for who was the friendliest player or the best leader, regardless of whether you won or not. You can level up your honor throughout a season, but it resets at the start of the next, thus gamifying a positive attitude.
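To show that level-up-and-reset loop in isolation, here is a small hypothetical sketch of seasonal honor tracking. The vote thresholds and names are invented for illustration and are not Riot’s actual Honor System.

```python
from collections import defaultdict

LEVEL_THRESHOLDS = [0, 10, 25, 50, 100]  # illustrative vote counts for honor levels 0..4

class HonorTracker:
    def __init__(self) -> None:
        self.votes: dict[str, int] = defaultdict(int)

    def record_vote(self, player_id: str) -> None:
        """Called when a teammate honors this player after a match."""
        self.votes[player_id] += 1

    def level(self, player_id: str) -> int:
        count = self.votes[player_id]
        return max(i for i, t in enumerate(LEVEL_THRESHOLDS) if count >= t)

    def start_new_season(self) -> None:
        """Honor resets at the start of each season."""
        self.votes.clear()

tracker = HonorTracker()
for _ in range(12):
    tracker.record_vote("friendly_player")
print(tracker.level("friendly_player"))  # -> 1
tracker.start_new_season()
print(tracker.level("friendly_player"))  # -> 0
```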
Positive Experience Starts with Community Management
Finally, I want to say that a positive in-game experience starts outside the game, with community management that focuses on a positive environment for everybody: one that puts down ground rules early on, enforces them, and encourages supportive behavior between community members. Just like in the game, this is ideally set up in advance but has to be adjusted along the way.