Can AI Make ‘Call of Duty’ Matches Less Toxic? We’re About to Find Out
Game developers are looking for ways to clamp down on abusive language online—and the massively popular shooter franchise Call of Duty is notorious among gamers for toxic players, as a 2022 study reinforced. Now publisher Activision is trying to do more about it, with help from AI.
Call of Duty is turning to artificial intelligence to moderate voice chats in its online player community with the launch of a new ToxMod feature, Activision said Wednesday. The feature, developed in collaboration with Boston-based AI startup Modulate, is now live in North America in the games Call of Duty: Modern Warfare II and Call of Duty: Warzone.
ToxMod identifies and takes action against toxic speech, including harmful language, hate speech, discriminatory language, and harassment, all in real time. It will expand globally (excluding Asia) on November 10 with the launch of the latest Call of Duty title, Modern Warfare III.
“This new development will bolster the ongoing moderation systems led by the Call of Duty anti-toxicity team, which includes text-based filtering across 14 languages for in-game text (chat and usernames) as well as a robust in-game player reporting system,” Activision said in a statement.
Decrypt contacted Activision for comment and additional details about the new feature, but a representative declined to provide further information. Modulate did not immediately respond to Decrypt’s questions.
Gaming is no stranger to toxic behavior. Activision said that since last November’s release of Call of Duty: Modern Warfare II, over 1 million accounts have been restricted for violating the game franchise’s code of conduct. The publisher claimed that its existing filtering tools have made a “positive impact” in curbing abusive speech.
“Twenty percent of players did not re-offend after receiving a first warning,” Activision said in a post. “Those who did re-offend were met with account penalties, which include but are not limited to feature restrictions (such as voice and text chat bans) and temporary account restrictions.”
Despite the rollout of the new AI feature, Activision asked Call of Duty players to continue reporting disruptive behavior.
“This type of commitment to the game and the community from our players is incredibly important, and we are grateful to our community for their efforts in combating disruptive behavior,” Activision said.