Chat behavior in multiplayer games is difficult to moderate because it happens in real-time. In many cases, players must rely on others in the same multiplayer chat to report inappropriate behavior and try to get offending players banned. Starting today, ToxMod, an AI service, will be able to identify toxic speech in real-time and enforce against it.
The speech that can be moderated by this AI in real-time includes racist comments and other discriminatory language. Right now, the system is in beta and is available only in Call of Duty: Modern Warfare II and Call of Duty: Warzone. However, it will launch fully alongside the release of Call of Duty: Modern Warfare III in December.
Curbing toxicity is a difficult feat, but one that Infinity Ward and Activision have been working toward for many years. Hopefully, these AI efforts will find and report toxic players much faster than before. If so, we can only hope that more players will feel safe enough to jump in on the action when the newest installment launches later this year.
Stay tuned to @CODTracker on Twitter for full Call of Duty coverage.
If you're new to Call of Duty Tracker, you should know that we also offer free services such as player statistics, leaderboards, and a cosmetics database. Consider checking us out and let us know on Twitter what you'd like to see next!