Abuse can happen anywhere on the internet, and unfortunately it does every day. It’s on your Twitter feed, over the forums you visit, and in the comment threads of the stories you read. It’s also in your games. Not even our beloved hobby is a guaranteed haven away from online bullying.
You may have experienced it yourself: that one player in the chat who gets angry and aggressive, calling people names and filling the screen with lines of typed abuse. If you’re on voice-chat, it could be shouted insults filled with curses, or sneered commands from someone convinced they’re superior. It’s not pleasant at all, and it can be genuinely upsetting.
But why do some people feel they can act this way online when, outside of games, it’s common knowledge that this is utterly unacceptable? We spoke to some experts to learn about the psychology behind online abuse.
“Frustration is conceivably one key reason, in which players deem the failure of the task to [be down to] another player or players,” suggests Dr Linda Kaye, a psychology lecturer at Edge Hill University, when asked why some players can get abusive.
Jamie Madigan is a games psychologist and author of Getting Gamers: The Psychology of Video Games and Their Impact on the People Who Play Them. He has a different way of explaining why players can become bullies: “Primarily because they're jerks with poor impulse control. Let's not lose sight of that,” he says. But he believes there’s something else at play, too.
“Deindividuation is a psychological state when you feel like your identity is more part of a crowd than yours individually,” explains Madigan. “When experiencing deindividuation, we tend to look more to contextual information about the situation and the behavior of other people to inform us as to what to do or what's appropriate. If we see others behaving badly or if the cues provided by the situation suggest it's okay to be nasty, then we're more likely to do that.”
“Lots of things can lead to deindividuation, including anonymity (your identity is secret), being part of a crowd (your identity is unlikely to stand out, even if it's not strictly secret), and having interactions mediated by the internet or a video game,” he adds.
Madigan’s point about reading other people’s behaviour online certainly explains how things can go rapidly downhill for a player once they’ve been identified as a target. If the players with the strongest voices seem to be picking on another player because they’re ‘dragging the team down’, it can sometimes feel like there’s pressure to pick a side. Do you defend the supposed ‘weak link’, or join in with the abuse to conform? After all, you’re just another faceless voice in the crowd, right?
Dr. Kaye agrees that anonymity is a key factor: “Gamers may often be representing themselves away from their “everyday” identity and thus be relatively anonymous to others. Research in psychology shows that the more anonymous we are to others, the more we disclose, and often this may translate into our negative behaviours in this particular context.”
She also notes that there are other issues to consider, too. “One factor here is how accountable we are with the people we are engaging with. That is, in some online contexts, we are interacting with people who we may not encounter otherwise, and any rudeness to them may not have any “real world” consequences.”
This suggests that, for some players, no consideration is given to how their behaviour may affect the person behind the digital avatar. How often have you considered that the people you’re playing with are actual real human beings with families, fears, hopes, and aspirations? It’s too easy to see them as little more than NPCs with silly usernames. And an NPC won’t log off in floods of tears if we ridicule them for making a mistake.
There’s also the idea that, on the internet and in games, the general consensus on what is ‘normal’ behaviour isn’t quite as rigid as it is offline. “As is the case in any context or environment, there are always a set of social norms which we follow which help guide our behaviour,” says Dr. Kaye. “It is perhaps also the case that the ‘norms’ in gaming contexts are more flexible than in the ‘real world’ and therefore people are not as guided to behave in what we deem ‘acceptable’ ways.”
It’s arguable that, over more than a decade of online play, we’ve normalised various forms of abuse. The word “n00b” is slung around as ‘banter’, but perhaps it’s just a gateway to stronger levels of abuse.
So if the norms of the online world are so hazy, is it just a case of emphasising more rigid rules if we want to eradicate abuse? It’s certainly worth trying.
“Developers could take the lead in establishing social cues for good behavior instead of bad,” suggests Madigan. “Emphasize cooperation and sportsmanship from the get-go. Or pair friends up with each other whenever possible, so that camaraderie and friendliness are the default attitudes that get spread to the others in a group.”
Dr. Kaye agrees: “Game companies may benefit from focusing on ways to adapt the social norms of their game communities. One excellent example here is League of Legends which includes the Tribunal feature in which negative play is made accountable by members.”
Dr. Kaye suggests another approach, too: “[developers could] focus on broadening players’ identities within their gaming communities. By finding ways of helping players affiliate themselves with the gaming community more broadly, psychologically, we know this can reduce prejudice towards other members,” she explains. “Therefore, game companies may benefit from finding ways of building in activities or features which allow members to work collaboratively with others, as this may help shape players’ perceptions on shared affiliation with others rather than an “us versus them” mentality.”
As noted, League of Legends is making active strides towards better community behaviour with the Tribunal system, but it feels one-of-a-kind. That’s a shame, because a peer-review approach that allows players to make decisions is an excellent idea: every second you’re playing, you’re being witnessed by players who could have the power to punish you. That seems to be the most logical way to help encourage a better culture.
And while punishing bad behaviour is important, preventing it from ever happening is even more valuable. As Madigan says, more co-operative elements that help build relationships would be worthwhile. Ideally these additions would be based more on being present than on being skilful: if you can offer aid without messing it up, no matter how small the contribution, then you’re likely to be valued.
Until a day comes when abuse is eradicated from games, remember: if you’re having problems with a badly behaved player, you should consider reporting them. Make use of any mute functions, too. You’re there to have fun, and no one should take that away from you.
Have you been the victim of online abuse? Do you think our experts’ ideas could solve the problem? Let us know in the comments. And if you’re interested in the psychology of games, you can help Dr Linda Kaye out in her studies with this MMO survey.