Xbox Explains Its Evolutionary Approach to Creating Safer Gaming Experiences


On November 7, 2023, Xbox released its third transparency report, providing detailed insight into its initiatives to create safer gaming experiences. The report highlights the critical role of human oversight combined with ongoing advances in artificial intelligence, strengthening the foundation of Xbox's safety efforts to date.

With the platform and its community constantly evolving, safety work requires greater depth, speed, and agility to protect players from potential toxicity.

Xbox is continually adapting to changes in the industry and making growing use of artificial intelligence (AI), which plays an increasing role in accelerating content moderation worldwide. The team is innovating by applying AI responsibly, combining human oversight with AI's emerging capabilities to deliver safer player experiences.

Currently, Xbox uses existing AI models, such as Community Sift and Turing Bletchley v3, to detect toxic content. Community Sift, an AI-powered content moderation platform, filters billions of human interactions per year. Xbox aims to improve these systems by better understanding the context of interactions, scaling to more players, extending the capabilities of human moderators, and reducing their exposure to sensitive content.
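The idea of proactive moderation described above can be illustrated with a toy sketch. Community Sift's real system is proprietary and far more sophisticated; the term list, severity scores, and threshold below are invented purely to show the concept of scoring a message and blocking it before it ever reaches other players.

```python
# Hypothetical severity scores (0-1) per flagged term -- illustrative only.
FLAGGED_TERMS = {"insult": 0.6, "threat": 0.9, "spam": 0.3}
BLOCK_THRESHOLD = 0.5  # messages scoring at or above this never reach players

def moderate(message: str) -> tuple[str, float]:
    """Return ("blocked" | "allowed", severity score) for a chat message."""
    words = message.lower().split()
    # The message's severity is that of its worst flagged term.
    score = max((FLAGGED_TERMS.get(w, 0.0) for w in words), default=0.0)
    return ("blocked" if score >= BLOCK_THRESHOLD else "allowed", score)

print(moderate("this is a threat"))    # high-severity term -> blocked
print(moderate("good game everyone"))  # clean message -> allowed
```

A real system would replace the word list with trained models that weigh context, language, and intent, and would route borderline cases to human moderators rather than making a hard yes/no call.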

The article highlights key findings from the report, including a 39% increase in blocked images thanks to the new Turing Bletchley v3 foundation model. The platform's proactive safety measures block content before it ever reaches players, with 87% of total enforcement actions coming from proactive moderation.

There is an increased emphasis on fighting harassment, with a 95% increase in proactive enforcement actions for harassment and bullying. A new voice reporting feature is also being launched to capture and report in-game voice harassment.

The article also discusses player behavior after a sanction. Analytics indicate that the majority of players do not violate community standards again after a sanction, and many go on to engage positively with the community. The new sanction system aims to help players understand the severity of each sanction, its cumulative effect, and the total impact on their account.

Beyond the transparency report, the team works with partners to promote safety across the gaming industry. Minecraft has partnered with GamerSafer to promote safe servers, and safety toolkits for Xbox games have been launched in Japan and Singapore, providing parents with resources to manage their children's gaming experience.

By introducing safety measures such as the voice reporting feature and the sanctions system, Xbox aims to create a community where everyone can have fun without fear or intimidation. Every player has a role to play in creating a positive environment, and Xbox continues to innovate to enhance safety and respectful interaction between players.

The full report can be found here.

Source: Microsoft

Featured Image Credits: Third Xbox Transparency Report Shows Our Evolving Approach to Creating Safer Gaming Experiences

About Marc Shakour

A former video game programmer, columnist, teacher, and competitor, Marc has long been immersed in the world and industry of video games. He now helps newcomers discover new universes, worlds, and fantastic creatures.

View all posts by Marc Shakour