GTA Online AI Voice Chat Moderation Explained By Developer

The AI voice chat moderation service built into GTA Online has been explained by its developer. It has been a little over a year since Rockstar Games first started testing this new way of monitoring and banning players in GTA Online.

At launch, it was met with quite a bit of criticism from some vocal gamers. However, since it was fully enabled across all forms of GTA Online and Red Dead Online in April this year, few people have talked about it.

Now Modulate, the company behind the software, called ToxMod, has written an article about its partnership with Rockstar Games, including what it does for GTA Online. Rockstar also re-shared the article on Twitter, where it drew quite a few annoyed responses from gamers and non-gamers alike. As noted above, there was very little talk about ToxMod until Modulate and Rockstar themselves mentioned it.

Working With Rockstar Games

Modulate begins its article by saying that it is “committed to working closely with the team at Rockstar Games to protect the millions of players enjoying GTA Online from unwanted toxicity and harassment.”

Why It Is Used

As for why it and its systems were brought in by Rockstar, Modulate explains that the GTA Online community is as large and active as any other game’s. That scale makes it difficult for Rockstar to moderate its enormous player base on its own. GTA V has sold over 205 million copies to date.

What Does It Actually Do?

Relying on reports from other players is not enough to keep GTA Online a safe space, states Modulate. ToxMod is used to help “prevent bullying and toxic behavior”.

When ToxMod was fully launched in GTA Online and Red Dead Online last year, Rockstar Games also issued a new set of Community Guidelines that all players must follow, alongside an updated Terms of Service. Modulate has tuned ToxMod to enforce GTA Online’s Community Guidelines, which of course differ from those of other games.

From there, ToxMod alerts Rockstar’s GTA Online moderation team the moment “problematic voice chat interactions” occur, rather than waiting for player reports. Rockstar’s moderators can then decide whether to take action on the potential conduct violations. This has helped Rockstar Games with “enforcing the guidelines and encouraging positive player behavior within the community”.

Utilizing advanced machine learning technology, ToxMod understands the nuances of player conversations, distinguishing between trash talk and intentional harm and harassment aimed at other players. This means that moderation teams can quickly intervene before those toxic interactions can escalate.

What are your thoughts on ToxMod since it became part of GTA Online and Red Dead Online? Let us know below in the comments.

To stay up to date on all GTA Online news, be sure to check back on RockstarINTEL and sign up to our newsletter for a weekly roundup of all things Rockstar Games.
