Call of Duty voice chat is famously not great. Activision is hoping AI can fix that.
According to a company blog post on Wednesday, the next major Call of Duty release will use AI to observe and report on toxic voice chat in multiplayer matches. Specifically, Activision is deploying ToxMod, a bit of AI tech developed by a company called Modulate, to make sure people aren't super racist while gunning each other down in team deathmatch.
For what it's worth, ToxMod is interesting if nothing else. As noticed by PC Gamer, Modulate claims ToxMod can listen to conversational context to determine whether or not something is truly hate speech.
ToxMod also claims to be able to detect grooming language used by white supremacist groups. Modern Warfare III will be by far the biggest game ToxMod has been deployed in so far, so it'll be quite a test. One thing to note, however, is that the software itself does not punish players for hate speech. Instead, it merely reports violations of the Call of Duty code of conduct, and humans at Activision take further action.
Of all the different potential uses for AI, this one is perhaps less offensive than most of the rest. After all, it's not trying to replicate or even replace human creativity; instead, it's potentially stopping people from having to listen to hate speech. But that also means there could be real limitations to the tech, or ways to circumvent its filters.
We'll all find out over the next few months, as every Call of Duty player becomes a guinea pig.