Call of Duty Takes Aim at Voice Chat Toxicity, Details Year-to-Date Moderation Progress

Call of Duty’s anti-toxicity team details efforts to reduce disruptive in-game behavior

Call of Duty is taking the next leap forward in its commitment to combat toxic and disruptive behavior with in-game voice chat moderation, beginning with the launch of Call of Duty®: Modern Warfare® III on November 10. Activision is teaming with Modulate to deliver global, real-time voice chat moderation at scale, starting with this fall’s upcoming Call of Duty blockbuster.

Call of Duty’s new voice chat moderation system uses ToxMod, the AI-powered voice chat moderation technology from Modulate, to identify toxic speech in real time and enforce against it, including hate speech, discriminatory language, harassment, and more. This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which include text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system.

An initial beta rollout of the voice chat moderation technology will begin in North America on August 30 within the existing games, Call of Duty: Modern Warfare II and Call of Duty: Warzone™, followed by a full worldwide release (excluding Asia) timed to the launch of Call of Duty: Modern Warfare III on November 10. Support will begin in English, with additional languages to follow at a later date.

Read the Call of Duty Voice Chat Moderation Q&A

Since the launch of Modern Warfare II, Call of Duty’s existing anti-toxicity moderation has restricted voice and/or text chat for more than 1 million accounts detected violating the Call of Duty Code of Conduct. Consistently updated text and username filtering technology has improved real-time rejection of harmful language.

An examination of data from previously announced enforcement shows that 20% of players did not reoffend after receiving a first warning. Those who did reoffend were met with account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions. This positive impact aligns with our strategy of working with players by providing clear feedback on their behavior.

As part of the collaboration with our partner studios, the anti-toxicity team also helped add Malicious Reporting to the Call of Duty Security and Enforcement Policy to combat a rise in false in-game reporting.

Read more about the Malicious Reporting policy update

This commitment to the game and the community from our players is incredibly important, and we are grateful for their efforts in combating disruptive behavior. We ask Call of Duty players to continue to report any disruptive behavior they encounter as we work to reduce and limit its impact across Call of Duty.

Teams across Call of Duty are dedicated to combating toxicity within our games. Utilizing new technology, developing critical partnerships, and evolving our methodologies are key to this ongoing commitment. As always, we look forward to working with our community to continue to make Call of Duty fair and fun for all.
