Rainbow Six Siege is auto-banning players for toxic language in chat

Image: Rainbow Six Siege operator Maestro

Ubisoft’s Rainbow Six Siege team have been promising a crackdown on abusive language in in-game chat since March, and it appears as though they’ve tightened up the rules. Players have been experimenting with the chat ban system and posting the results to the Rainbow Six subreddit, saying they’ve triggered half-hour and two-hour bans by using toxic terms.

It’s not exactly clear which words currently trip the banhammer in Rainbow Six Siege, but some of the subreddit’s findings are pretty unsurprising. Homophobic and misogynistic slurs have gotten players banned right away, but more generalized vulgarities like ‘fuck’ seem to be fine.

Rainbow Six Siege is on our list of the best FPS games on PC.

Redditor u/enforcerdestroyer posted a screengrab of the Rainbow Six Siege menu screen, with a red banner across the top that reads “Banned for toxic behavior,” and a timer counting down from 27 minutes. Enforcerdestroyer said the ban hit after they used a homophobic slur in chat as a test.

In March, Rainbow Six Siege community manager Craig Robinson said Siege was planning a crackdown on toxic behavior within the player community. At the time, the team was looking at bans that scaled from two days up to fifteen days, and then to permanent bans for egregious offenses.

Now, Rainbow Six Siege’s Code of Conduct details shorter bans: 30 minutes for a first offense, two hours for a second, and two hours for a third. After three offenses, the rules say Ubisoft will investigate the conduct and determine whether to apply a permanent ban to a user’s account.
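For illustration, that escalation boils down to a simple lookup. The sketch below is hypothetical - Ubisoft hasn't published how its system works - and only the tier durations come from the Code of Conduct as reported here; every name and structural choice is an assumption.

    from datetime import timedelta

    # Hypothetical sketch of the escalation tiers reported above; Ubisoft's
    # actual system is not public, so names and structure here are assumptions.
    AUTO_BAN_TIERS = {
        1: timedelta(minutes=30),  # first offense: 30-minute ban
        2: timedelta(hours=2),     # second offense: two-hour ban
        3: timedelta(hours=2),     # third offense: two-hour ban
    }

    def auto_ban_duration(offense_count):
        """Return the automatic ban length, or None once manual review is required."""
        if offense_count > 3:
            return None  # beyond three offenses, Ubisoft investigates and may ban permanently
        return AUTO_BAN_TIERS[offense_count]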

The auto-bans only apply to text chat, of course: voice communications still fall under the game's code of conduct, but there is no automated system in place for moderating them.

Players who have received bans can contact Ubisoft's customer support to appeal their case.

Comments
1N07 · 4 days ago

It would be nice to know exactly how it works. I don't think this is a very good idea.

A combination of this system with player reporting would probably be good.

Say if the system detects a ban-listed word AND that user is reported for toxicity within maybe 1 hour of using the word.

That way people wouldn't get instantly banned if they use a bad word in a less offensive context that no one actually felt bad about.
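In code, that suggestion amounts to roughly the check below. This is purely a hypothetical sketch of the commenter's idea, not anything Ubisoft has described, and the one-hour window is the commenter's own number.

    from datetime import datetime, timedelta

    REPORT_WINDOW = timedelta(hours=1)  # the "maybe 1 hour" window suggested above

    def should_auto_ban(flagged_word_time: datetime, report_times: list[datetime]) -> bool:
        """Ban only if a flagged word was used AND the player was reported
        for toxicity within the window after using it."""
        return any(
            flagged_word_time <= report <= flagged_word_time + REPORT_WINDOW
            for report in report_times
        )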

DedderInside2edgy4me

Yeah, uh, that'd help, considering saying the censored version of the n-word for meme purposes gets you banned lmao

ojinyx · 3 days ago

Ubisoft making a move like this, despite the apparent good intent, relies way too much on assumption and treats context as irrelevant. This could be a positive thing if we were living in a time when words like these had a truly serious impact on others, but in most (though not all) cases the context has moved on from derogatory and been ushered into an age of comedy. Not everyone is going to casually accept that using slurs is commonplace for the community, but it is what it is.

Banning players for saying words is ridiculous, and to put it plainly, games meant for children have better chat protection systems than Ubisoft's. Roblox does a better job of stopping toxicity than a triple-A dev studio.

How sad is that?
