One of the biggest video game franchises has revealed that its new AI voice moderation software detected more than two million “toxic” chats. Activision, the developer behind Call of Duty, announced it would begin using an AI moderation tool in August 2023.
Since then, it has been investigating and curtailing hate speech on the platform as part of a beta phase. In a post, Activision said that it did not tolerate “bullying or harassment, including derogatory comments based on race, gender identity or expression, sexual orientation, age, culture, faith, mental or physical abilities, or country of origin, or the amplification of any person, agenda, or movement that promotes discrimination or violence based on the above.”
The system was initially introduced in North America and then rolled out globally when Modern Warfare III launched, adding Spanish and Portuguese moderation alongside English. Voice moderation in all three languages is now active worldwide, excluding Asia. At the time of writing, it is online in Call of Duty: Modern Warfare II, Modern Warfare III, and Call of Duty: Warzone.
“More than 2 million accounts have seen in-game enforcement for disruptive voice chat, based on the Call of Duty Code of Conduct,” it said, adding that only one in five users reported the behavior, “showing an unfortunate trend that players often do not report in-game instances to our Disruptive Behavior team.”
“Active reporting is still critical so that players can raise any negative situation they encounter,” the company stated.
On the upside, it reported an 8% reduction in repeat offenders since the rollout of in-game voice chat moderation, while there was a 50% reduction in players exposed to severe instances of online abuse.
Consequences of violating rules of conduct
Activision warned that anyone detected to have violated the code of conduct would be globally muted from voice and text chat and restricted from other social features. It said: “Repeat offenders are restricted further, muting them from communication in both voice and text channels within Call of Duty HQ.”
The company added that it plans to incorporate additional languages into the voice moderation system in the future.
“Call of Duty is dedicated to combating toxicity within our games and will empower our teams to deploy and evolve our moderation technology to fight disruptive behavior, whether it be via voice or text chat. We understand this is ongoing work, but we are committed to working with our community to make sure Call of Duty is fair and fun for all,” it reiterated.
Call of Duty: Modern Warfare II and Call of Duty: Modern Warfare III were two of the ten best-selling video games of 2023, according to sales data.
Featured image: Canva / Call of Duty
The post “Call of Duty uses AI to detect 2 million toxic voice chats” by Suswati Basu was published on 01/24/2024 by readwrite.com