Call Of Duty is now listening in on your voice chats to ban toxic players

Ever get the feeling someone is listening to you...? Well, turns out they are.

Yet another online game is taking extra measures to combat the rising tide of toxicity that (unfortunately) goes hand-in-hand with gaming. Call of Duty will begin using AI software to detect when someone is being abusive in voice chat.

The tool in question, ToxMod, is billed as “the only proactive voice chat moderation solution purpose-built for games” and is used to “identify in real-time and enforce against toxic speech – including hate speech, discriminatory language, harassment and more”. All those times you wondered if someone was listening in? Well, your paranoia is now well founded, because someone actually is.

ToxMod has been used by online games before, though only much smaller ones, meaning Call of Duty marks the first time the software has been deployed to this many players. But while this is a massive step forward in tackling toxic behaviour, the software itself won’t issue any punishments, as Activision is keen to clarify. “Call of Duty’s Voice Chat Moderation system only submits reports about toxic behaviour, categorized by its type of behaviour and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations,” reads the Q&A section for voice chat moderation.

Furthermore, any flagged abuse will be judged not just on the specific words used but also on the context in which they were said, as explained by Modulate, the company behind the tech: “While the n-word is typically considered a vile slur, many players who identify as black or brown have reclaimed it and use it positively within their communities… If someone says the n-word and clearly offends others in the chat, that will be rated much more severely than what appears to be reclaimed usage that is incorporated naturally into a conversation.”

What could possibly go wrong? we ask, while side-eyeing Instagram and its incredibly questionable moderation rules, which often target the wrong users. As much as we’d like to think all will go well, moderation has plenty of pitfalls, especially when people are asked to use their best judgement.

If you’re feeling a little dejected, maybe because you know this software is going to catch you out, just remember that you can watch Nicki Minaj step on enemies to your heart’s content. There, don’t you feel all better now?

The new moderation tool will roll out in its beta stage across North America from 31 August onwards. As for when it’ll reach all users, Activision will update players in due course. Just behave, play nicely and you shouldn’t run into any issues.


Featured Image Credit: Activision

Topics: Call Of Duty, Call Of Duty Modern Warfare, Call of Duty: Modern Warfare II, Activision, PlayStation, Xbox, PC