In a recent blog post on callofduty.com, Microsoft Corp.'s MSFT Activision Blizzard and its associated studios addressed in-game conduct within the Call of Duty franchise.
The focus was on the implementation of a new automated chat monitoring system designed to detect and penalize players engaging in aggressive or offensive behavior during gameplay.
The post highlighted that over two million accounts have faced consequences for disruptive voice chat, indicating a significant prevalence of offensive language and behavior within the Call of Duty community.
While acknowledging that some degree of trash talk is inherent to gaming, Activision reported a 50% reduction in players exposed to severe instances of disruptive voice chat since Modern Warfare III launched on Nov. 10, 2023.
Promising a continued effort to combat toxicity, Activision announced forthcoming updates featuring additional moderation systems and more severe punishments.
"Our tools will continue to evolve and expand over time, including the addition of new languages to our voice moderation system in future updates," the publisher wrote.
The blog also urged players to report malicious activity, but a recent change complicates this: a new rule penalizes false reports without specifying thresholds, raising concerns that automated systems could misinterpret legitimate reports as spam reporting.
Image credits: Miguel Lagoa on Shutterstock.
© 2024 Benzinga.com. Benzinga does not provide investment advice. All rights reserved.