Microsoft is taking further steps to address toxicity in multiplayer Xbox games with a new feature. Xbox Series X/S and Xbox One players can now capture a 60-second video clip of abusive or inappropriate voice chat and submit it for review by moderators.
The feature has been designed to support a wide range of in-game interactions. It is compatible with thousands of games that offer in-game multiplayer voice chat, including Xbox 360 backward-compatible titles. According to Dave McCarthy, Xbox Player Services corporate vice-president, the tool was created with ease of use in mind, ensuring minimal impact on gameplay.
When a player captures a clip for reporting, it will be stored on their Xbox for a period of “24 online hours.” The player has the option to submit the clip immediately or wait until the gaming session is finished. A reminder will be provided before the 24-hour timeframe expires. If the player chooses not to report the clip, it will be automatically deleted from their Xbox.
McCarthy emphasized that no one else will have access to the clip unless it is submitted for review. Xbox does not save or upload voice clips without the player initiating the reporting process. Clips captured through the tool will not appear in recent captures and cannot be downloaded, shared, or modified. They will be used solely for moderation purposes. After the safety team has reviewed a report, the player will receive a notification regarding any action taken against an abusive player.
The safety team at Xbox will employ a combination of AI and human moderators, utilizing a range of moderation tools to analyze the submitted clips. The moderators will review the audio and video content to determine if any violations of community standards have occurred.
Initially, the reactive voice reporting system allows players to report up to three individuals simultaneously. If a moderator cannot identify the individual responsible for the reported voice chat and link it to the reported Xbox Live player, the report will be closed as unactionable. In such cases, no enforcement action will be taken, and the captured video will be deleted within 90 days.
It’s worth noting that this feature is specifically for reporting Xbox players to the Xbox Safety Team. Voice chat from players on other platforms will not be subject to action by the safety team.
This system-wide approach by Xbox to address toxic voice chat is a positive development. The PlayStation 5 introduced a similar feature when it launched in 2020, and other game studios have adopted comparable approaches. Riot, for example, implemented a voice recording system in Valorant where communications are only reviewed when a report is filed.
Initially, the voice reporting feature will be available to Alpha and Alpha Skip-Ahead Xbox Insiders in English-speaking markets, including the US, Canada, Great Britain, Ireland, Australia, and New Zealand. Microsoft encourages insiders to provide feedback to help improve the feature. The company plans to continue investing in voice moderation and to expand support for more languages. Xbox will also provide data and updates on voice chat moderation in its biannual transparency report.