Here’s what it’ll take to clean up esports’ toxic and sexist culture

History of harassment

Recent stories from The New York Times, Wired, Insider and others have highlighted how pervasive sexism, racism, homophobia and other forms of discrimination are in online spaces. However, these issues are hardly new. Similar problems arose in 2014’s GamerGate, a Twitter-based campaign of harassment against female gamers, designers and journalists.

Sexism was also common before GamerGate. For example, professional gamer Miranda Pakozdi quit her team following sexual harassment from her coach in 2012; the coach, Aris Bakhtanians, famously stated that “sexual harassment is part of [the fighting game] culture” and that it could not be removed.

Others have suggested that the anonymity of online game spaces, combined with gamers’ competitive natures, increases the likelihood of toxic behavior. Survey data from the Anti-Defamation League suggests that at least 37% of female gamers have faced gender-based harassment.

However, positive online communities exist, and a study by lawyer and former Microsoft user experience designer Rebecca Chui found that anonymous online communities are not inherently toxic. Rather, a culture of harassment requires community norms that allow for it. This suggests that online bad behavior can be addressed effectively. The question is how.

Players’ coping strategies

In my interview-based research with female gamers, I’ve found that players have many strategies for avoiding or managing online harassment. For instance, some play only with friends or avoid using voice chat to hide their gender. Other gamers become highly skilled at their favorite games, using that skill to shut down harassment. Research by other media scholars, such as Kishonna Gray and Stephanie Ortiz, has found similar results across race and sexuality.

These strategies have significant downsides, however. For example, ignoring toxicity or brushing it off allows it to persist. Pushing back against harassers often results in further harassment.

They can also put the burden of challenging harassment on the victim, rather than on the perpetrator or community. This can drive victims out of online spaces. As my interviewees gained responsibilities in their jobs or families, for instance, they no longer had the time or energy to manage harassment and stopped gaming. My research suggests that game companies need to intervene in their communities to keep players from having to go it alone.

How companies can intervene

Game companies are becoming increasingly invested in community management strategies. Large publisher Electronic Arts held a community management summit in 2019, and companies like Microsoft and Intel are developing new tools for managing online spaces. A group of game development companies even recently formed the Fair Play Alliance, a coalition working to address harassment and discrimination in gaming.

It’s important that interventions be rooted in the experiences of players, however. Right now, many companies intervene through practices like banning or blocking harassers. For instance, the live-streaming platform Twitch recently banned several prominent streamers following allegations that they had committed sexual harassment.

This is a start, but harassers who are blocked or banned often create new accounts and return to their previous behaviors. Blocking also manages harassment after it occurs, rather than stopping it at the source. Blocking should therefore be combined with other approaches.

First, companies should expand the tools they provide players to manage their online identities. Many participants avoided voice chat to limit gender harassment. This at times made it difficult to compete. Games like Fortnite, League of Legends and Apex Legends, however, have instituted “ping” systems, where players can communicate essential game information rapidly, without requiring voice. Similar tools could be built into many other online games.

Another option my interviewees suggested is to make it easy for players to group with friends, so they have someone on their side to guard against harassment. Grouping mechanisms work particularly well when matched to the needs of their specific game. For instance, in games like Overwatch and League of Legends, players need to take on different roles to balance their team. Abuse can occur when randomly assigned teammates all want to play the same character.

Overwatch recently introduced a new grouping system that allows players to choose their roles, then be matched with players who have chosen different roles. This appears to reduce abusive in-game chat.

Finally, companies should work to change their basic cultural norms. For example, League of Legends publisher Riot Games once instituted a “Tribunal” system where players could view incident reports and vote on whether the behavior was acceptable in the League community.

Although Riot Games unfortunately closed the Tribunal shortly after its release, including community members in any solution is a good idea. Companies should also develop clear community guidelines, encourage positive behavior through tools like in-game accolades, and respond to ongoing issues rapidly and decisively.

If esports continue to expand without game companies addressing the toxic environments in their games, abusive and exclusionary behaviors are likely to become entrenched. To avoid this, players, coaches, teams, leagues, game companies and live-streaming services should invest in better community management efforts.

This article is republished from The Conversation by Amanda Cote, Assistant Professor of Media Studies/Game Studies, University of Oregon, under a Creative Commons license. Read the original article.

Story by The Conversation

An independent news and commentary website produced by academics and journalists.
