Modulate raises $30 million to clean up game voice chat with AI

Modulate has raised $30 million to expand its AI product, ToxMod, which scans voice chat using machine learning to identify harmful players in online games.

ToxMod uses artificial intelligence to highlight issues in online games that should be addressed by human moderators. Toxicity is a problem that will only get worse with the metaverse, the universe of interconnected virtual worlds depicted in novels such as Snow Crash and Ready Player One.

In a recent interview with GamesBeat, Mike Pappas, the CEO of Modulate, said the industry has absolutely needed to solve this challenge. "This is a large-scale market need, and we were eager to demonstrate that we've actually built the software to meet this requirement," he said.

Everblue Management, Hyperplane Ventures, and others participated in the round, and Mika Salmi, the managing partner of Lakestar, will join Modulate's board.

Modulate's ToxMod is a proactive voice moderation algorithm that attempts to capture not only overt toxicity (hate speech, adult language), but also more subtle harms such as child grooming, violent radicalization, and self-harm. The AI has been trained on more than 10 million hours of audio.

Modulate, based in Cambridge, Massachusetts, wants to revolutionize the way game designers approach the ongoing battle against online toxicity. The investment, according to Pappas, is a validation of the company's mission.

Pappas said the company's primary focus is proactive voice moderation. "This is a way to demonstrate that you can actually fulfill your duty of care and correct all of the bad behavior across your platform in a more comprehensive way," he said.

ToxMod employs advanced machine learning techniques to analyze not just what each player is saying, but how they are saying it, including their emotion, volume, prosody, and other elements. This is critically important, as what may be harmful in one context might be genuine support in another.

ToxMod claims to be able to distinguish between these kinds of situations through its nuanced understanding of voice, while allowing everyone else to choose their own style of gameplay. The company says ToxMod detects offenses with 99% accuracy (which continually improves over time) and lets moderation teams respond to incidents 25 times faster.
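
To make the idea concrete, here is a minimal, hypothetical sketch of how lexical and acoustic signals might be blended into a single escalation decision. Modulate's actual models are proprietary; every name, weight, threshold, and word list below is invented for illustration:

```python
from dataclasses import dataclass

# Hypothetical sketch only: ToxMod's real models are proprietary.
# Feature names, weights, thresholds, and the word list are invented.

@dataclass
class VoiceClip:
    transcript: str           # what the player said
    anger: float              # 0..1 estimated emotion from an upstream model
    loudness: float           # 0..1 normalized volume
    hostility_prosody: float  # 0..1 clipped, shouted delivery

FLAGGED_TERMS = {"idiot", "trash"}  # placeholder lexicon

def toxicity_score(clip: VoiceClip) -> float:
    """Blend lexical and acoustic signals into one 0..1 risk score."""
    lexical = any(term in clip.transcript.lower() for term in FLAGGED_TERMS)
    acoustic = 0.5 * clip.anger + 0.2 * clip.loudness + 0.3 * clip.hostility_prosody
    return min(1.0, (0.6 if lexical else 0.0) + 0.4 * acoustic)

def should_escalate(clip: VoiceClip, threshold: float = 0.7) -> bool:
    """Only clips above the threshold are queued for human moderators."""
    return toxicity_score(clip) >= threshold

# The same words land differently depending on delivery:
banter = VoiceClip("you're trash, gg", anger=0.1, loudness=0.3, hostility_prosody=0.1)
attack = VoiceClip("you're trash", anger=0.9, loudness=0.9, hostility_prosody=0.8)
print(should_escalate(banter), should_escalate(attack))  # False True
```

The acoustic term captures exactly the distinction Pappas describes: the same words shouted in anger score higher than when delivered as friendly banter.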

Salmi said in an interview that he first saw the company about a year and a half ago. "We saw it as a team with world-class technology. That's what we invest in," he said.

The main issue was whether they could commercialize it. Salmi said they have done that. And Pappas said the company has a number of unannounced large customers using it.

"Evidently no one else out there has it. We were looking for this sort of technology for a long time and nothing came close," Salmi said.

Dennis Fong, CEO of GGWP, which analyzes text chat, said that human moderators at game companies can only process a small percentage of these reports. GGWP also works on tracking players' reputations over the long term.

Companies may engage in different strategies with players who are only occasionally toxic versus those who engage in it much more frequently. These so-called reputation scores can travel with players.
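
As a rough illustration of how a portable reputation score might escalate responses for repeat offenders, here is a hedged sketch; the thresholds, penalty sizes, and action names are assumptions, not GGWP's or Modulate's actual scheme:

```python
from dataclasses import dataclass

# Hypothetical sketch: thresholds, penalties, and action names are
# invented; no vendor has published this exact scheme.

@dataclass
class PlayerReputation:
    player_id: str
    score: float = 1.0  # 1.0 = clean record; offenses pull it toward 0

    def record_offense(self, severity: float) -> None:
        self.score = max(0.0, self.score - severity)

    def recommended_action(self) -> str:
        if self.score > 0.8:
            return "warn"         # occasional toxicity: light touch
        if self.score > 0.4:
            return "mute_24h"     # repeat behavior: temporary restriction
        return "review_for_ban"   # chronic offender: escalate to humans

rep = PlayerReputation("player#1234")
rep.record_offense(0.1)
print(rep.recommended_action())  # warn
for _ in range(4):
    rep.record_offense(0.1)
print(rep.recommended_action())  # mute_24h
```

Because the score lives with the player rather than with a single match, the same record can justify a gentle warning for a first offense and a harsher response for habitual abuse.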

"The main concern for us was how do we illuminate what's going on in the first place," Pappas said. "We begin with understanding the environment and how it develops, where it is happening, how players are interacting, and how do we collaborate with our customers in developing education campaigns."

If players are punished, they must understand why. If toxicity occurs amid allegations of cheating, that's important to know. Modulate is also contemplating how to preserve the mental health of moderators who must deal with all of the abuse.

It makes sense for game companies to try to resolve these problems in the context of their own games before players take their conversations to rival applications.

ToxMod offers proactive moderation that allows platform and game moderators to make informed decisions to protect players from harassment, toxic behavior, and even more perverse harms.

Pappas said the company is making sure that things like trash talk, which may be acceptable in mature Call of Duty games, are not misclassified as racial slurs. The goal is to make moderators more effective across the platform.

Human moderators may sift through the results and identify false positives, and the system can learn from that. They can begin taking action immediately, according to Pappas. Sometimes it can be difficult to understand the conversation because human language is complex. And each game has its own set of guidelines.
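
A hedged sketch of that human-in-the-loop step might look like the following: moderator verdicts on flagged clips become labeled training data, and a per-game escalation threshold drifts as false positives accumulate. The class, step sizes, and bounds are invented for illustration:

```python
# Hypothetical feedback loop: details are invented, not Modulate's system.

class FeedbackLoop:
    def __init__(self, threshold: float = 0.7):
        self.threshold = threshold
        self.labeled: list[tuple[str, bool]] = []  # (clip_id, was_truly_toxic)

    def record_verdict(self, clip_id: str, is_toxic: bool) -> None:
        """Store the human decision and nudge the flagging threshold."""
        self.labeled.append((clip_id, is_toxic))
        if is_toxic:
            # Confirmed harm: flag slightly more aggressively next time.
            self.threshold = max(0.5, self.threshold - 0.005)
        else:
            # False positive: raise the bar to cut moderator noise.
            self.threshold = min(0.95, self.threshold + 0.01)

loop = FeedbackLoop()
loop.record_verdict("clip-001", is_toxic=False)
loop.record_verdict("clip-002", is_toxic=True)
print(round(loop.threshold, 3))  # 0.705
```

Keeping the threshold per game is one way to encode the point that each title has its own guidelines: trash talk that clears the bar in one community might be flagged in another.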

No company in the world has built a data set specifically designed to focus on real social voice chats online, according to Pappas. "That has allowed us to improve our models, which far outperform any of the public big-company transcription models out there," he said.

Pappas said Modulate had good connections with venture capitalists and had an easier time raising funds at a moment when doing so was difficult, even for game companies. Salmi said he's glad to have found a company like Modulate.

With only 27 people, the company has hit its milestones, which speaks to the power of artificial intelligence.
