Activision Blizzard Inc. ATVI has partnered with Modulate to address the toxic behavior often found in the voice chats of its first-person shooter franchise Call of Duty.
The company is using an AI-powered tool called ToxMod, designed to identify and flag negative behavior such as hate speech, discrimination and harassment in real time, The Verge reported.
See Also: Xbox Cracks Down On Gaming Toxicity: 8 Strikes Policy For Gamers
The initial beta rollout of ToxMod started in North America and is active in Call of Duty: Modern Warfare II and Call of Duty: Warzone.
A global release, excluding Asia, is planned for Nov. 10, coinciding with the launch of Call of Duty: Modern Warfare III.
The exact workings of ToxMod were not extensively detailed in Modulate's press release, but the tool appears to analyze voice chat conversations, weighing elements such as tone, emotion and volume to gauge the level of toxicity.
It is not intended to take direct action against players; instead, it reports incidents to Activision's human moderators. This human-in-the-loop approach was considered necessary because of concerns about bias in AI systems, particularly in recognizing different accents and racial identities.
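ToxMod's internals are proprietary, but the reported design, combining several voice signals into a toxicity score and then escalating to human moderators rather than acting automatically, can be illustrated with a minimal sketch. All names, weights, and thresholds below are hypothetical assumptions for illustration, not Modulate's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class VoiceClipFeatures:
    """Hypothetical per-clip signals; ToxMod's real feature set is not public."""
    hate_speech_prob: float  # 0..1, from an assumed speech classifier
    anger_score: float       # 0..1, inferred emotional intensity
    relative_volume: float   # 0..1, loudness relative to the speaker's baseline

def toxicity_score(f: VoiceClipFeatures) -> float:
    """Combine signals into a single 0..1 score (illustrative weights)."""
    raw = 0.6 * f.hate_speech_prob + 0.25 * f.anger_score + 0.15 * f.relative_volume
    return min(1.0, raw)

def triage(f: VoiceClipFeatures, threshold: float = 0.7) -> dict:
    """Escalate to a human moderator instead of auto-punishing the player."""
    score = toxicity_score(f)
    action = "escalate_to_moderator" if score >= threshold else "no_action"
    return {"score": round(score, 3), "action": action}

# A clearly hostile clip is queued for human review; a benign one is not.
print(triage(VoiceClipFeatures(0.9, 0.8, 0.5)))  # escalate_to_moderator
print(triage(VoiceClipFeatures(0.1, 0.2, 0.3)))  # no_action
```

The key design point the article describes is the final step: the AI only flags, and a human makes the enforcement decision, which mitigates the risk of biased automated punishment.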
Read Next: Survey Reveals Over Half Of Game Users Encounter Extremist Ideologies, 36% Experience Harassment
Photo: Activision Blizzard via Steam.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.