Activision is turning to AI to make your Call of Duty matches a bit less toxic

A new voice moderation tool, powered by AI, has been added to Call of Duty to protect players from “toxic or disruptive behaviour they may encounter.” This tool is said to help increase Activision’s ability to “identify and enforce against bad behaviour that has gone unreported.”

The initial rollout of the voice chat moderation tool begins today in North America for Call of Duty: Modern Warfare 2 and Call of Duty: Warzone. It’ll roll out globally (excluding Asia) on November 10, when Call of Duty: Modern Warfare 3 arrives.

How exactly does this tool work, and what behaviour will it be looking out for? In a comprehensive FAQ shared on Activision’s website, the developer explains that voice chat moderation is “managed and operated by Activision and uses the AI-powered model ToxMod from Modulate.” It is focused on “detecting harm within voice chat versus specific keywords”, and players who violate Call of Duty’s Code of Conduct will be punished for their behaviour.

If you’re worried that flaming your friends during a round of Warzone will get you punished, the developer has confirmed that the tool does allow for “trash-talk and friendly banter.” As you’d expect, however, hate speech, sexism, and other types of discrimination will not be tolerated.

And if you don’t like the tool? Well, the developer suggests that those who do not wish to have their voice chat moderated should simply “disable in-game voice chat in the settings menu.” Problem solved. Activision has also specified that ToxMod’s job is only to flag voice chat; the tool itself will not be dishing out punishments to problematic players.

“Call of Duty’s Voice Chat Moderation system only submits reports about toxic behavior, categorized by its type of behavior and a rated level of severity based on an evolving model. Activision determines how it will enforce voice chat moderation violations.”

So, if you’re just mucking about in Call of Duty having a good time with your friends, this tool won’t affect you. And if you’re worried your trash-talk might get flagged, anything that ToxMod flags should, according to Activision’s FAQ on the tool, be judged by a human before any punishments are dished out.
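For readers curious how a “flag, then human review, then enforce” workflow like the one described above fits together, here is a minimal, purely illustrative sketch in Python. Nothing in it comes from Activision’s or Modulate’s actual systems, which aren’t public; the class names, categories, and severity threshold are assumptions made only to show the shape of the pipeline the FAQ describes.

```python
# Hypothetical sketch: the detection model only submits categorized,
# severity-rated reports; enforcement is a separate human decision.
# Names, categories, and thresholds are illustrative assumptions.

from dataclasses import dataclass, field
from enum import Enum
from typing import Optional


class Category(Enum):
    HATE_SPEECH = "hate_speech"
    HARASSMENT = "harassment"
    FRIENDLY_BANTER = "friendly_banter"  # explicitly allowed, never escalated


@dataclass
class VoiceReport:
    player_id: str
    category: Category
    severity: float  # assumed 0.0-1.0 score from the detection model


@dataclass
class ReviewQueue:
    pending: list = field(default_factory=list)

    def submit(self, report: VoiceReport) -> None:
        # The AI side stops here: it files a report, it never punishes.
        if report.category is not Category.FRIENDLY_BANTER:
            self.pending.append(report)

    def human_decision(self, report: VoiceReport, confirmed: bool) -> Optional[str]:
        # Enforcement only happens after a human reviewer confirms the flag.
        if not confirmed:
            return None
        return "voice_chat_ban" if report.severity >= 0.8 else "warning"


if __name__ == "__main__":
    queue = ReviewQueue()
    queue.submit(VoiceReport("player123", Category.HATE_SPEECH, 0.9))
    queue.submit(VoiceReport("player456", Category.FRIENDLY_BANTER, 0.3))
    for report in queue.pending:  # only the hate-speech report reaches review
        print(report.player_id, queue.human_decision(report, confirmed=True))
```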

For its part, Modulate states that ToxMod “was born to understand all the nuance of voice. It goes beyond transcription to consider emotion, speech acts, listener responses, and much more.” If that holds up, you can hopefully expect to see less toxicity in your Call of Duty matches.


Call of Duty: Modern Warfare 3 will host two open beta weekends during October. Will you be playing?
