‘Call of Duty’ relies on artificial intelligence to solve its biggest problem

Gaming | September 6, 2023

Toxic players are one of the biggest issues in Call of Duty. The game offers mechanisms for reporting and muting them, but those measures don’t always work. That’s why Activision has begun testing a tool that uses artificial intelligence to combat abusive behavior.

In a blog post, Activision announced that it will implement a new large-scale real-time voice chat moderation system. The company will use ToxMod, a tool developed by Modulate that employs AI to identify discriminatory language, hate speech, harassment, and other toxic behaviors.

ToxMod works in three steps. First, it classifies and processes voice chat data to determine whether a conversation needs attention. If the conversation meets certain parameters, the system then analyzes tone, context, and perceived emotion to determine the type of behavior. This is done through machine learning models that, according to Modulate, pick up emotional and nuanced signals to differentiate between a joke and genuine misconduct.

“These classification models cannot understand everything about a conversation at a glance, but they can look for telltale signs of anger, distress, aggression, and even subtler, more sinister intentions,” Modulate states in one of its documents. Finally, the tool escalates the most toxic voice chats so moderators can take appropriate action.
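
Modulate has not published ToxMod’s internals, but the triage-analyze-escalate flow described above can be pictured roughly as follows. This is a minimal Python sketch, not Modulate’s actual API: every name and number in it (triage_score, analyze_clip, the 0.5 and 0.85 cutoffs) is a hypothetical stand-in.

```python
import random
from dataclasses import dataclass

# Illustrative thresholds -- not Modulate's actual values.
TRIAGE_THRESHOLD = 0.5     # step 1: does this clip need attention at all?
ESCALATE_THRESHOLD = 0.85  # step 3: severe enough for a human moderator?

@dataclass
class ToxicityReport:
    clip_id: str
    category: str    # e.g. "harassment" or "hate_speech"
    severity: float  # 0.0 (benign) to 1.0 (severe)

def triage_score(audio: bytes) -> float:
    """Stand-in for step 1: a cheap first-pass classifier."""
    return random.random()

def analyze_clip(audio: bytes) -> tuple[str, float]:
    """Stand-in for step 2: models weighing tone, context, and emotion."""
    return "harassment", random.random()

def send_to_moderation_queue(report: ToxicityReport) -> None:
    """Stand-in for step 3: hand the clip to human moderators."""
    print(f"Escalated {report.clip_id}: {report.category} ({report.severity:.2f})")

def moderate_clip(clip_id: str, audio: bytes) -> ToxicityReport | None:
    # Step 1: classify and filter raw voice data; most chatter stops here.
    if triage_score(audio) < TRIAGE_THRESHOLD:
        return None

    # Step 2: deeper analysis of tone, context, and perceived emotion,
    # e.g. telling a joke between friends from genuine abuse.
    category, severity = analyze_clip(audio)

    # Step 3: only the most toxic chats are escalated to human moderators.
    if severity < ESCALATE_THRESHOLD:
        return None
    report = ToxicityReport(clip_id, category, severity)
    send_to_moderation_queue(report)
    return report

if __name__ == "__main__":
    moderate_clip("clip-001", b"raw voice data")
```

The design point worth noting is the funnel shape: a cheap first pass discards most of the audio, so the heavier models, and ultimately the human moderators, only ever see a small fraction of it.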

Call of Duty will combat toxicity in all languages

ToxMod understands 18 languages, including Spanish, and can follow the full context of a conversation even when it mixes two or more of them. This matters because in Call of Duty it’s common to hear racist insults hurled in languages other than English.

Modulate uses language models, artificial intelligence, and human experts who are native speakers of each language and can identify certain harmful actions. “Fighting toxic behavior has more nuances and requires fluency in a language and the culture of the language’s country of origin, as well as the subculture and psychology of gaming and online behavior in general,” the company says.

In a previous post, Modulate mentions that ToxMod detects specific risk categories, such as violent radicalization or child harassment. This is possible thanks to detection algorithms that identify repeated patterns of behavior. For example, if a player uses extremist language once, it’s categorized as an offense, but if they repeat it or intensify it over several weeks, they are added to a risk category for moderators to assess.
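
The escalation logic Modulate describes can be read as a per-player sliding-window counter: one detection stays an ordinary offense, while repeated detections within the window move the account into a risk category for human assessment. Here is a minimal sketch under invented parameters (the four-week window and threshold of three are illustrative; Modulate does not publish its actual values):

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Illustrative parameters -- not Modulate's actual values.
RISK_WINDOW = timedelta(weeks=4)  # how far back the pattern check looks
REPEAT_THRESHOLD = 3              # detections that trigger a risk flag

# player_id -> timestamps of detected offenses (e.g. extremist language)
offense_log: dict[str, list[datetime]] = defaultdict(list)

def record_offense(player_id: str, when: datetime) -> str:
    """Log one detection and decide whether it remains a single offense
    or the repeated pattern now warrants moderator assessment."""
    offense_log[player_id].append(when)

    # Keep only detections inside the look-back window.
    recent = [t for t in offense_log[player_id] if when - t <= RISK_WINDOW]
    offense_log[player_id] = recent

    # A one-off is an ordinary offense; repetition over several weeks
    # moves the player into a risk category for human review.
    if len(recent) >= REPEAT_THRESHOLD:
        return "risk_category"
    return "single_offense"

if __name__ == "__main__":
    now = datetime.now()
    for week in range(3):
        status = record_offense("player-42", now + timedelta(weeks=week))
        print(f"week {week}: {status}")
```

A real system would presumably also weigh severity, since the description mentions behavior that intensifies as well as behavior that repeats.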

AI is not infallible

It’s important to note that ToxMod is not a definitive solution to abusive behavior in Call of Duty. The tool will be integrated into the game’s broader moderation system, which includes text-based filtering in 14 languages as well as a mechanism for reporting players who break the rules. The final decision on whether to penalize an offender will rest with the anti-toxicity team.
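
That division of labor, in which automated systems flag and humans decide, amounts to a human-in-the-loop review queue. As a rough sketch, with every name invented for illustration:

```python
from dataclasses import dataclass
from enum import Enum

class Source(Enum):
    VOICE_AI = "voice"        # ToxMod voice chat flags
    TEXT_FILTER = "text"      # text-based filtering (14 languages)
    PLAYER_REPORT = "report"  # in-game player reports

@dataclass
class Flag:
    player_id: str
    source: Source
    details: str

review_queue: list[Flag] = []

def file_flag(flag: Flag) -> None:
    """Automated systems and player reports only enqueue flags;
    nothing here issues a penalty on its own."""
    review_queue.append(flag)

def review(flag: Flag, violates_rules: bool) -> str:
    """The anti-toxicity team makes the final call on each flag."""
    return "penalized" if violates_rules else "no_action"

if __name__ == "__main__":
    file_flag(Flag("player-42", Source.VOICE_AI, "harassment in voice chat"))
    for flag in review_queue:
        print(review(flag, violates_rules=True))
```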

The first beta will begin today with integration into Call of Duty: Modern Warfare II and Call of Duty: Warzone in North America. It will later be expanded to the general public with the release of Call of Duty: Modern Warfare III on November 10. Activision confirmed that moderation will start in English, with Spanish and other languages being added at a later date.

