Let’s face it, guys: toxic behavior is a serious problem in the gaming world. It seems like you can’t go into any random lobby without being bombarded with sexist insults and racial slurs. It’s not even isolated to small communities or angry trolls who hide out in their parents’ basements. Some of our most notable pro gamers, our Michael Jordans and Babe Ruths, are getting banned from games that they actually pay their bills with because they were acting like total douchebags. Big-name studios like 343 Industries are instituting strict punishments for toxic behavior in huge AAA titles like Halo 4, and yet this garbage bin of assholery continues to go largely unchecked in almost every online game in existence. Well, Microsoft thinks that it finally has the answer.
About a month ago, Microsoft revealed the new Xbox One reputation system, which would “reward healthy participation while reducing troublemakers and cheaters.” The system would show players’ reputation right next to their Gamerscore, allowing users to preemptively block other users with a history of trash talking, rage quitting, or other disruptive behaviors. Hopefully, this would act as a deterrent, as toxic users would find it harder and harder to get into any game at all.
At the time of its announcement, Microsoft did not have many details about this new system to share. This spawned a cavalcade of skeptical comments from gamers who figured that the worst trolls of Xbox Live would simply find a way to exploit the system. In a recent blog post on Xbox.com, Michael Dunn, program manager on Xbox Live, explained a bit more about how the reputation system would work:
So, how are we doing this? We are simplifying the mechanism for Xbox One – moving from a survey option to more direct feedback, including things like “block” or “mute player” actions into the feedback model. The new model will take all of the feedback from a player’s online flow, put it in the system with a crazy algorithm we created and validated with an MSR PhD to make sure things are fair for everyone.
Ultimately, your reputation score will determine which category you are assigned – "Green = Good Player," "Yellow = Needs Improvement" or "Red = Avoid Me." Looking at someone’s gamer card you’ll be able to quickly see their reputation. And, your reputation score is ultimately up to you. The more hours you play online without being a jerk, the better your reputation will be; similar to the more hours you drive without an accident, the better your driving record and insurance rates will be.
Unfortunately, that still lends itself to some pretty big holes. If “online time” is what allows your reputation to recover, yellow- or red-flagged players can simply leave their consoles online to climb back toward green. And if positive feedback improves your rating, then guilds of trolls (and yes, they exist) can simply vote one another up whenever someone gets a complaint.
Dunn says that their system is designed to work around exploits like this:
The algorithm is sophisticated and won’t penalize you for a few bad reports. Even good players might receive a few player feedback reports each month and that is OK. The algorithm weighs the data collected so if a dozen people suddenly report a single user, the system will look at a variety of factors before docking their reputation. We’ll verify if those people actually played in an online game with the person reported – if not, all of those players’ feedback won’t matter as much as a single person who spent 15 minutes playing with the reported person. The system also looks at the reputation of the person reporting and the alleged offender, frequency of reports from a single user and a number of other factors.
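To make Dunn’s point concrete, the kind of weighting he describes — discounting reports from people who never actually played with the accused, and factoring in the reporter’s own reputation — can be sketched in a few lines. This is purely illustrative; every function name, weight, and threshold below is invented for demonstration and has nothing to do with Microsoft’s actual algorithm.

```python
# Illustrative sketch of weighted report scoring -- NOT Microsoft's
# actual algorithm. All weights and thresholds here are made up.

def report_weight(minutes_played_together, reporter_reputation):
    """Weigh a single complaint.

    Reports from players who never shared a match with the accused
    barely count; reports from low-reputation accounts are discounted.
    `reporter_reputation` is assumed to be a value in [0, 1].
    """
    if minutes_played_together <= 0:
        base = 0.1  # drive-by report from a stranger: nearly worthless
    elif minutes_played_together >= 15:
        base = 1.0  # a real shared session counts in full
    else:
        base = minutes_played_together / 15.0  # partial session, partial weight
    return base * reporter_reputation

def score_reports(reports):
    """Total weighted complaints against one player.

    `reports` is a list of (minutes_played_together, reporter_reputation)
    tuples, one per complaint filed.
    """
    return sum(report_weight(mins, rep) for mins, rep in reports)

# A dozen sudden reports from strangers who never played with the accused...
brigade = [(0, 0.5)] * 12
# ...versus one report from someone who spent 15 minutes in a match with them.
genuine = [(15, 0.9)]

print(score_reports(brigade))  # 0.6 -- twelve strangers, heavily discounted
print(score_reports(genuine))  # 0.9 -- one genuine report outweighs them all
```

Under these made-up numbers, a dozen-person brigade scores lower than a single legitimate report, which is exactly the behavior Dunn claims the real system is tuned for.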
He also noted that the system will change as people find ways to exploit it.
However, the biggest loophole in the system is that it takes no active role in punishing offenders. It merely assumes that other players will shun red-flagged users. While this may be the case in Call of Duty or Halo, which have thriving online communities, games with smaller player bases, such as fighting games, indie titles, and racing games, won’t be as lucky. Users will be forced to keep playing with red-flagged players just to find a match at all, which renders the whole system moot.
It’s nice that Microsoft is taking an active role in trying to make its online community better, but no algorithm is smart enough to outdo humans who can reverse-engineer it. If Microsoft isn’t willing to just come out and issue flat bans to disruptive players across the board, then it’s unlikely this new reputation system will fix the toxic-gamer problem.
Then again, Dunn wrote in his blog post that “most Xbox Live players are polite online and know how to socially adjust to people they’re playing with. But not everyone does this. And, it can be challenging to pick up on social cues when you are connected online and not face-to-face in the same room,” so maybe Microsoft is just too disconnected from the reality of the situation to do much about it. I don’t mean to criticize you, Dunn, but anyone who calls me a “sh$%t bagging c#$%ckweasel” isn’t just failing to pick up on social cues.
Angelo M. D'Argenio
Senior Contributing Writer