Activision releases a Disruptive Behavior Progress Report ahead of Black Ops 6’s launch detailing chat moderation efforts

What you need to know 

  • Activision, the Microsoft-owned publisher of the blockbuster first-person shooter franchise Call of Duty, has released a research paper, produced in partnership with the California Institute of Technology, on disruptive player behavior and moderation efforts.
  • The Disruptive Behavior team has shared that it will continue to use ToxMod, a proactive AI-powered voice moderation tool that was implemented in Call of Duty: Modern Warfare 3 (2023).
  • Activision reports the current anti-toxicity strategy has effectively reduced voice toxicity by 43% among English, Spanish, and Portuguese-speaking players.
  • The strategy will expand for Black Ops 6 to include support for French and German at launch on October 25.
  • Text-based moderation by Community Sift has blocked over 45 million disruptive messages since November 2023. Text moderation expands to 20 languages with Black Ops 6.

As a tech enthusiast who has spent countless hours immersed in the virtual worlds of gaming, I have witnessed firsthand the darker side of online multiplayer games – toxicity and harassment that can make the experience less enjoyable for many players. The research paper by Activision and Caltech, focusing on disruptive behavior and moderation efforts in Call of Duty: Modern Warfare 3, is a step in the right direction towards creating a more positive gaming environment.


A recent study conducted by Activision in collaboration with Caltech showcases the impact of anti-toxicity measures implemented in Call of Duty: Modern Warfare 3. With Call of Duty: Black Ops 6 set to debut in two weeks, Activision’s Disruptive Behavior team has shared a blog post explaining how these moderation efforts will extend to the forthcoming installment in the competitive and successful franchise.

Playing Call of Duty online often involves banter that can escalate into harsh, disrespectful behavior, a problem that’s become so prevalent it’s almost a running joke within the gaming community. Statements like “You wouldn’t last in a Call of Duty match” are commonly used when discussing offensive language, sexism, and general toxicity in online gaming environments. Interestingly, research conducted by Activision suggests that this widespread toxicity not only harms the game but also sets a damaging precedent for speech on other platforms by promoting hate speech, racism, and sexism.

Although Activision didn’t disclose exact player figures, the research paper credits Call of Duty’s enormous player base and Activision’s ability to collect gameplay data at scale with making it possible to analyze the impact of disruptive behavior in competitive gaming. The study accounted for in-group favoritism and hostility toward outsiders, which can inflate reports; the psychological “bystander effect,” which can lead to toxic conduct being underreported; and the strategies problematic players employ to evade moderation.

After releasing Modern Warfare 3 in 2023, Activision partnered with Modulate to introduce ToxMod, an AI-assisted voice moderation system aimed at reducing harmful and abusive chat in game lobbies and matches. The tool’s rollout coincided with a 43% decrease in voice toxicity among English-, Spanish-, and Portuguese-speaking players in Modern Warfare 3. Activision will broaden ToxMod’s use with Call of Duty: Black Ops 6, where it will be active at launch with added support for French and German. The feature is not available in Asia, however.

Activision reports that around 80% of players who received chat restrictions (pop-up warnings and temporary communication bans) did not engage in further disruptive behavior. Across Modern Warfare 3 and Warzone, the number of repeat offenders fell by approximately 67%.
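The escalation described above, a warning first and then temporary communication bans for repeat offenses, can be pictured as a tiered enforcement ladder. The following sketch is purely illustrative; the thresholds, penalty names, and `PlayerRecord` structure are invented for clarity and are not Activision's actual implementation.

```python
# Illustrative sketch of a tiered enforcement ladder: each confirmed
# offense moves a player one step up the ladder. The penalty names and
# tiers are hypothetical, not Activision's actual policy.
from dataclasses import dataclass

LADDER = ["popup_warning", "chat_ban_48h", "chat_ban_14d", "account_review"]

@dataclass
class PlayerRecord:
    offenses: int = 0

    def record_offense(self) -> str:
        """Return the penalty for the latest confirmed offense."""
        # Cap at the final tier once the ladder is exhausted.
        penalty = LADDER[min(self.offenses, len(LADDER) - 1)]
        self.offenses += 1
        return penalty

player = PlayerRecord()
print(player.record_offense())  # popup_warning
print(player.record_offense())  # chat_ban_48h
```

The appeal of a ladder like this is that most players never climb past the first rung, which matches the reported figure that roughly 80% stop after a warning or short ban.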

In Call of Duty, players communicate over both voice and text chat. In August 2024, text moderation partner Community Sift expanded its coverage from the initial 14 languages to 20. Since November 2023, this partnership has blocked over 45 million offensive messages in the game’s chat. Supported languages include English, French, Italian, German, Spanish, Portuguese, Russian, Polish, Japanese, Korean, Arabic, Chinese (Traditional and Simplified), Turkish, Dutch, Thai, Vietnamese, Indonesian, Finnish, and Romanian.

Disrespectful behavior isn’t confined to in-game chat; players also display unacceptable behavior through their gamertags and clan tags. To address this, Call of Duty has introduced a new system that analyzes reports of offensive usernames and clan tags and flags the most serious cases for review by Activision’s moderation teams.

Activision’s research reveals the importance of robust moderation strategies in preventing recurring misconduct. Removing cheaters from leaderboards significantly reduced re-offenses, with just 0.25% of players cheating again. However, forcing changes to offensive usernames, restricting features through communication bans, and banning clan tags were less successful: 16–18% of troublesome players repeated their misconduct within a month. In certain instances, stringent rules triggered the “reactance phenomenon,” where some players responded to regulation by becoming even more hostile, while others learned to be more covert and evade detection.

Improving the management of disruptive behavior in Call of Duty continues to be a challenging endeavor. According to Activision’s research, enhancing the reporting system within the game, increasing transparency with players, ensuring fairness in moderation, and encouraging positive actions can significantly reduce exposure to toxicity for both players and moderators.

The highly anticipated Call of Duty: Black Ops 6 will be released on October 25 across Xbox, PC, and PlayStation platforms. To help ensure a positive experience at launch, ToxMod and Community Sift are expanding their moderation systems with support for additional languages.

2024-10-10 22:39