The eSafety Commission issued legally enforceable notices to Mojang parent Microsoft and other major platforms demanding details on moderation staffing, safety systems, and how they stop grooming and radicalization in games played by millions of kids.

Minecraft has come under formal regulatory pressure in Australia. The eSafety Commission sent legally binding notices to major platform owners, including Microsoft, requiring detailed information about their child-safety practices.
What exactly regulators are demanding
The notices ask for specifics on safety systems, staffing levels for moderation, and procedures to detect and prevent sexual exploitation and violent extremism. Failure to comply can lead to fines or court action. Nine in ten Australian children aged eight to seventeen play online games, turning titles like Minecraft into major social spaces.
eSafety Commissioner Julie Inman Grant highlighted two main risks: grooming that leads to contact offenses and the embedding of terrorist or extremist narratives directly inside gameplay. The regulator sees games as the new frontier after restricting social media access for minors.
Broader crackdown and Roblox parallels
This action follows Australia's 2025 ban on social media for users under sixteen, though many children continued accessing banned apps. Roblox faces over 140 related lawsuits in the United States and recently reached multimillion-dollar settlements with two states while rolling out age-specific accounts.
For the Minecraft community, the notices raise practical questions about chat filtering, reporting tools, Realms moderation, and server hosting rules. Multiplayer has always been one of the game's biggest draws, but increased scrutiny could force visible changes to how players interact.
No immediate response from Microsoft or Mojang has been detailed in coverage of the notices. The deadline for compliance and any potential policy shifts will likely shape discussion in the coming weeks among server owners, parents, and creators.
- Notices issued April 22, 2026, are legally enforceable
- Focus includes both sexual predation and radicalization risks
- Part of a post-social-media-ban effort to protect minors
- Nine out of ten Australian kids aged 8–17 play online games