Microsoft is taking stronger action against toxicity in multiplayer Xbox games by introducing a new feature that lets players report abusive or inappropriate voice chat. Available to Xbox Series X/S and Xbox One players, the feature enables them to capture a 60-second video clip of the offending chat and submit it for moderation.
Dave McCarthy, corporate vice president of Xbox Player Services, explained in a blog post that the reporting tool is designed to cover a wide range of in-game interactions between players. It works across thousands of games with in-game multiplayer voice chat, including Xbox 360 backward-compatible titles.
Microsoft has prioritized ease of use and minimal disruption to gameplay in the design of this feature. When a player captures a clip for reporting, it stays on their Xbox for 24 online hours, that is, hours during which the console is online. They can submit it immediately or wait until they finish their gaming session. The console reminds them before the 24 hours run out, and if they decide not to report the clip, it is deleted automatically.
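To make that lifecycle concrete, here is a minimal Python sketch of the retention logic as described above: a clip is held locally on the console, a reminder fires as the 24-online-hour limit approaches, and an unreported clip is deleted. The class, the tick-based clock, and the one-hour reminder window are illustrative assumptions, not Microsoft's actual implementation.

```python
from dataclasses import dataclass

ONLINE_HOURS_LIMIT = 24      # clip lifetime, counted in online hours
REMINDER_THRESHOLD = 23      # hypothetical: warn the player near expiry

@dataclass
class CapturedClip:
    """A locally stored voice-chat clip awaiting the player's decision."""
    clip_id: str
    online_hours_elapsed: float = 0.0
    submitted: bool = False

    def submit(self) -> None:
        # The player opts in; only now would the clip leave the console.
        self.submitted = True

    def tick(self, hours_online: float) -> str:
        """Advance the clip's online-time clock and return its state."""
        if self.submitted:
            return "submitted"   # handed to the safety team for review
        self.online_hours_elapsed += hours_online
        if self.online_hours_elapsed >= ONLINE_HOURS_LIMIT:
            return "deleted"     # expired unreported: removed, never uploaded
        if self.online_hours_elapsed >= REMINDER_THRESHOLD:
            return "reminder"    # prompt the player before the clip expires
        return "held"            # still stored only on the player's console

clip = CapturedClip("clip-001")
print(clip.tick(23.5))  # "reminder" (nearing the 24-online-hour limit)
print(clip.tick(1.0))   # "deleted" (never reported, so it is discarded)
```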
Submitted clips are accessible only to the player who captured them. Microsoft says Xbox does not save or upload any voice clips unless the player initiates a report. Clips cannot be downloaded, shared, or modified, and they are used exclusively for moderation. After the safety team reviews a report, the player receives a notification indicating whether any action was taken against the offending player.
According to an Xbox spokesperson, the safety team will analyze the clips using a combination of AI and human moderators, who will review both the audio and video to determine whether a player has violated Xbox's community standards. However, if the inappropriate voice chat comes from players on other platforms via cross-platform play, the safety team will not take action: the reactive voice moderation feature is designed specifically to report Xbox players to the Xbox Safety Team.
Microsoft’s initiative to combat toxic voice chat at a platform-wide level is commendable. The PlayStation 5 has offered a similar feature since its launch in 2020, and several game studios, such as Riot and Blizzard, have adopted comparable approaches within their own games. Riot announced in 2021 that it would record Valorant voice communications, and Blizzard introduced automatic transcription and analysis of Overwatch 2 voice chat following a player’s report.
Initially, Xbox’s voice reporting feature will be available to Xbox Insiders in the Alpha and Alpha Skip-Ahead rings in English-speaking markets such as the US, Canada, Great Britain, Ireland, Australia, and New Zealand. Microsoft encourages these Insiders to provide feedback to help improve the feature. The company plans to invest further in voice moderation and intends to support more languages. Xbox will also share data and updates on voice chat moderation in its twice-yearly transparency report.
Overall, Microsoft’s effort to address toxicity in multiplayer Xbox games through the voice reporting feature demonstrates its commitment to creating a safer gaming environment. By allowing players to report abusive voice chat and leveraging AI and human moderation, Xbox aims to foster a more inclusive and positive gaming community.