Since there has been multiplayer, there has been toxicity. Players hurl foul and derogatory language at each other with little to no repercussions, and with few systems in place that could, at the very least, dampen the amount of abuse taking place. In an effort to curtail this toxic behavior, a slew of companies are banding together to form a new alliance, the Fair Play Alliance. The Fair Play Alliance’s goals are as follows:

The Fair Play Alliance is a coalition of gaming professionals and companies committed to developing quality games. We provide an open forum for the games industry to collaborate on research and best practices that encourage fair play and healthy communities in online gaming. We envision a world where games are free of harassment, discrimination, and abuse, and where players can express themselves through play.

It’s a noble endeavor, as solving toxicity in games not only improves the general environment, it also makes those games more enjoyable for players on the whole. Toxicity, however, is insidious; while it can certainly be curtailed through the development of in-game systems to combat it, there are limits to what those systems can actually accomplish. Game systems won’t change human behavior or the culture that reinforces those behaviors, but developers can be aware of those issues when designing a game. Rather than accept toxicity as the price of making a multiplayer game and create systems to combat it after the fact, developers should design games from the ground up that actively discourage toxic behavior through how the game naturally functions.

Systems

Multiplayer games have employed a series of different systems over the years in an effort to curtail toxicity within those titles. The simplest and most prominent of these is the language filter. Any terms or words deemed by the developers to be out of bounds are simply blocked, either replaced with gibberish or removed from the conversation entirely. The effectiveness of this system, however, is extremely limited for two main reasons. The first is that language is constantly evolving; new words, phrases, and even meanings are in a constant state of change. A phrase that was deemed inoffensive at one point in time can quickly take on an entirely different, offensive meaning. For developers, it’s impossible to stay on top of such changes, so words constantly fall through the cracks. The second reason is that it’s impossible to police all forms of communication; voice chat and various other channels can’t be regulated in any meaningful fashion, and there the language filter fails.
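To make that limitation concrete, here is a minimal sketch of a blocklist-style filter in Python. The term list, function name, and censoring rule are all hypothetical, not any particular game’s implementation; the point is that any word not already on the list passes straight through, which is exactly why evolving slang and anything said over voice chat slip past it.

```python
import re

# Hypothetical blocklist; real filters maintain far larger, constantly updated lists.
BLOCKED_TERMS = {"badword", "slur"}

def filter_message(message: str) -> str:
    """Replace any blocked word with asterisks, leaving everything else untouched."""
    def censor(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLOCKED_TERMS else word
    return re.sub(r"\b\w+\b", censor, message)

# A word not yet on the list (new slang, a creative misspelling, anything spoken
# in voice chat) passes through unchanged -- the filter only catches what it knows.
print(filter_message("that was a badword move"))  # "that was a ******* move"
```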

Blocking language only goes so far, however; because there are ways to be toxic beyond talking, other systems are required. Take Massively Multiplayer Online (MMO) games, for example: the sheer number of interactions, content, and systems allows for a slew of different ways players can troll and cause havoc. World of Warcraft, for instance, had a series of different ways players could troll other players. One such way (at one point in time) was to have so many players standing on top of an NPC that no one could access it. There were also times when players would purposely wreak havoc on groups by dragging mobs from other areas to kill them, or by joining a dungeon or raid group only to ruin everyone’s experience. All of these and more are situations that the developers on some level expected, but had no real means of building a system that could automatically handle. So a report system was built in, allowing players to report those who were causing issues or ruining others’ experiences. The problem with this system is that it’s a reactive rather than a proactive response. Even if a player was punished, by being suspended or banned, it was only after the damage had been done. While that might serve as a deterrent to others, those who really wished to cause problems would still do so.
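A report system can be sketched just as simply. The version below is a hypothetical threshold-based queue, not Blizzard’s actual system; it exists only to illustrate the reactive nature of the approach: nothing happens until enough reports arrive, which is necessarily after the griefing has already occurred.

```python
from collections import defaultdict

# Hypothetical threshold; real systems also weigh reporter credibility, severity, and history.
REVIEW_THRESHOLD = 5

reports: dict[str, list[str]] = defaultdict(list)

def report_player(player_id: str, reason: str) -> bool:
    """Record a report and return True once the player should be queued for human review.

    The flag can only fire after the behavior has already happened, which is
    why report systems are reactive rather than proactive.
    """
    reports[player_id].append(reason)
    return len(reports[player_id]) >= REVIEW_THRESHOLD
```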

Xbox’s Gamer Reputation System

As one solution to this problem, some developers (such as Microsoft) started developing a “gamer reputation.” These systems essentially measure your standing, as a player, within the community. If you were someone who trolled a lot, others would be able to report your poor behavior, subsequently lowering your score. As your score dropped, you would be restricted from certain aspects of online play, such as being banned from voice chat or being unable to join certain games.

Riot Games, creator of League of Legends, employs a similar system that is more focused on the positives, termed “honor.” In short, the honor system rewards players for recognizing other players for a job well done. After each match, players can nominate others for special awards that add to those players’ honor. The more honor one accumulates, the more bonuses one receives. However, should someone fall below the standard level of honor everyone starts at, it becomes much more difficult to earn rewards, and other systems come into play that further separate that person from others.
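Mechanically, both approaches reduce to a score with privileges gated behind thresholds. The sketch below is a hypothetical illustration of that pattern, not Microsoft’s or Riot’s actual implementation; the cutoffs, weights, and restriction names are invented for the example.

```python
from dataclasses import dataclass

# Hypothetical cutoffs and weights; actual reputation and honor systems use proprietary scoring.
VOICE_CHAT_CUTOFF = 40
MATCHMAKING_CUTOFF = 20

@dataclass
class PlayerReputation:
    score: int = 100  # everyone starts in good standing

    def apply_report(self, weight: int = 5) -> None:
        """Reports from other players lower the score."""
        self.score = max(0, self.score - weight)

    def commend(self, weight: int = 2) -> None:
        """Commendations (or honor) from other players raise it."""
        self.score = min(100, self.score + weight)

    def restrictions(self) -> list[str]:
        """Translate the current score into gameplay restrictions."""
        restricted = []
        if self.score < VOICE_CHAT_CUTOFF:
            restricted.append("voice chat")
        if self.score < MATCHMAKING_CUTOFF:
            restricted.append("ranked matchmaking")
        return restricted
```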

All of these solutions, and more, have had varying degrees of success. But they are all focused on one aspect: creating new systems or altering existing ones in an effort to fix the toxicity in these titles. They all exist outside of the gameplay of the title itself, and thus ignore one of the major sources of toxicity: the design of the game.

Design

No matter how many systems you put in place for a game, if the actual gameplay is designed in such a way as to promote toxicity, it will fester. Titles like League of Legends are highly competitive, team-oriented, and built on the premise that you need a deep understanding of how the game functions to succeed. Each of these points factors into creating an environment that promotes toxicity. Because the game is highly competitive, players are high-strung, as wins and losses matter more than in other titles. If a player has a poor team (or is part of that poor team), it affects their standing, and they will likely become frustrated or upset. This feeds directly into the issues surrounding a team-oriented game: you can’t accomplish the goal by yourself, so you must work with others to win. Humans, especially in a team environment, have a nasty habit of placing the blame for failure on something or someone. For a team, this means someone will receive the brunt of that criticism, which can easily lead to toxicity. Finally, there is an elitism associated with a game that requires a certain understanding before a player can truly be successful. For new players to understand the game, they need to play, but others don’t want to play with those who don’t yet understand it, and those newcomers become the scapegoat for any failing of the team.

These three aspects of the game are some of the main sources that breed toxicity. It is not just these aspects alone, however; other factors compound with them to make them even worse. In the case of League of Legends, all three problem designs are further exacerbated by how long each match takes. Since every match is such a high investment for the player, winning becomes all the more important, which intensifies each of the design elements that cause toxicity.

Creating systems on top of these designs serves only as a band-aid, when the true source of toxicity in a game’s community lies in the design of the gameplay itself. This can be seen in titles such as Monster Hunter or Final Fantasy XIV. Looking at the three main design issues that promote toxicity, Monster Hunter has answers for each of them. In terms of competition, Monster Hunter simply has no direct one-on-one competition. There is no benefit to working against another person in your party, especially because in Monster Hunter your fates are tied together when hunting a target: if there are three deaths in your group, the hunt is over for everyone. Because there is no direct competition, there are not the same worries that come with a team in League of Legends. Teammates need fostering and help to become better and truly assist you in your hunts; if they don’t improve, you can still complete the task, but it will be that much more difficult. This feeds into the final point, other players’ knowledge. If there is no direct competition, and working together only yields better results, then it’s advantageous for players to help others, especially when it benefits them as well.

Monster Hunter and games like it highlight a distinct aspect that is largely ignored when considering how to remove toxicity from games: the design itself. Monster Hunter is considered to have one of the friendliest communities, and for good reason; it has created an environment that promotes working with other players through the gameplay itself, giving players reasons to help others so those players can do the same. If a developer wants to truly combat toxicity, it needs to come from the design itself. Adding extra systems to a game will certainly assist in dealing with it, but for toxicity to actually be solved, games need to incorporate those solutions as part of the game design, not just as secondary systems placed on top in hopes of fighting it.