Activision has revealed that it’s implementing a brand-new tool to further its efforts to combat toxic and inappropriate online voice chat in its modern Call of Duty games.
Beginning now with a North American beta in Call of Duty: Modern Warfare II and Call of Duty: Warzone, with a full roll-out planned alongside the launch of Call of Duty: Modern Warfare III on November 10, the new feature comes via a collaboration with Modulate and their AI-powered voice chat moderation technology, ToxMod.
ToxMod can identify things like hate speech, discriminatory language and harassment in real time and apply enforcements on the spot. Activision hopes it will work alongside its existing moderation systems, which filter text-based chat across 14 languages and allow for detailed player-submitted reporting. At launch, the new AI-powered voice moderation will only support English, but the hope is that it'll not only pick up on specific words and phrases but also holistically detect disruptive and harmful behaviour.
We're taking the next leap forward in our commitment to combat toxicity with in-game voice moderation launching day-and-date with #MW3
The moderation beta will launch today for North America (English only). Learn more: https://t.co/FsBVNk2LbN
— Call of Duty (@CallofDuty) August 30, 2023
The Call of Duty team says it has already restricted voice and/or text chat for over 1 million accounts since the launch of last year's Call of Duty: Modern Warfare II, which is frankly a staggering number while simultaneously being sadly unsurprising. It advises that players who don't wish to have their voice chat moderated by this new AI-powered system can simply disable voice chat in the game, which is a hilariously diplomatic way of saying "If you can't play an online video game without literally harassing other people, just don't talk."
There’s a comprehensive FAQ available on the new Call of Duty Voice Chat Moderation tech, which you can read here.