Australia made global headlines by banning social media access for anyone under 16, with the law taking effect December 10, 2025. Platforms like Instagram, Snapchat, X, Facebook, and TikTok face fines up to $50 million if they don’t prevent underage users from creating accounts. But there’s a massive hole in this supposedly comprehensive child protection law: gaming platforms are completely exempt. Kids can’t use Instagram, but they can spend unlimited hours in Roblox, Fortnite, or Minecraft chatting with strangers. Parents and experts are calling out this bizarre inconsistency, and even the government admits gaming poses similar risks.
The Ban That Shook Big Tech
The Online Safety Amendment passed Parliament on November 29, 2024, and became enforceable December 10, 2025. It’s the world’s first national law completely banning children under 16 from major social media platforms. The legislation puts the burden on platforms, not parents or kids, to implement age verification systems and prevent underage account creation. Non-compliance carries penalties up to AUD $49.5 million, roughly $33 million USD.
Communications Minister Michelle Rowland framed the law as protecting children from algorithmic curation, psychological manipulation designed to encourage endless engagement, cyberbullying, and predatory content. Nine platforms initially fell under the ban, including Instagram, Facebook, Snapchat, TikTok, X (formerly Twitter), and Reddit. Twitch was added as the tenth shortly after, when regulators realized the livestreaming platform primarily functions as social media despite its gaming focus.
Why Gaming Got The Exemption
Gaming platforms were explicitly excluded from the age-restricted definition through subsidiary legislative rules. The government’s rationale hinges on three criteria used to determine what counts as a social media platform: whether the primary purpose is facilitating online social interaction, whether users can interact with each other, and whether user-generated posts are allowed. According to the eSafety Commissioner, gaming was exempted because its main function doesn’t revolve around social media-style interaction.
Minister Rowland defended the exemption by claiming games are already regulated under Australia’s National Classification Scheme, and adding age-based regulation would create “unnecessary regulatory overlap.” She acknowledged risks exist in gaming, stating: “We are not saying that risks don’t exist on messenger apps or online gaming. While users can still be exposed to harmful content by other users, they do not face the same algorithmic curation of content and psychological manipulation to encourage near endless engagement.”

The Classification System Defense
The National Classification Scheme argument is weak at best. The scheme rates the content within games (violence, sex, drug use), not the social features surrounding them. Australia has had an R18+ category for games since 2013, but a classification says nothing about who a child can talk to. A game rated G can still have unrestricted voice chat with strangers, and the rating won't reflect that.
This creates an absurd situation where a 14-year-old can’t post a photo on Instagram because of algorithmic manipulation concerns, but can spend eight hours daily in Fortnite’s Creative mode chatting with random adults while the game’s engagement systems encourage continuous play through battle passes, daily challenges, and FOMO-inducing limited-time events. If algorithmic manipulation is the concern, gaming platforms perfected those dark patterns years before social media copied them.
Parents And Experts Call It Out
Dr. Joanne Orlando, a researcher who studies children's technology use, questioned the exemption's logic in interviews with the BBC. She finds it peculiar that gaming platforms were excluded, noting: "Gaming and social media are so intertwined that it's challenging to differentiate between them. Those who spend excessive time gaming often also engage heavily with social media, where they can watch other gamers or live stream their gaming sessions."
The concerns extend beyond just time spent. The Australian Federal Police have warned that chatrooms on gaming platforms can be breeding grounds for radicalization and child exploitation. Roblox specifically has faced numerous reports of predators using the platform to groom children. In 2023, Bloomberg documented multiple cases of adults using Roblox’s chat features to target minors for sexual exploitation. Yet Roblox remains completely unrestricted under Australia’s new law.
The Addiction Argument
Critics argue gaming can be just as addictive as social media, if not more so. Games use sophisticated reward schedules, progression systems, and social pressure to keep players engaged. Kids aren’t just playing games anymore. They’re attending virtual concerts in Fortnite, building social worlds in Minecraft, and maintaining friend groups primarily through gaming platforms. The line between “game” and “social platform” barely exists.
A parent interviewed by the BBC expressed frustration: "My son can't use Instagram to see what his friends are doing, but he can spend all night in Roblox chatting with people I've never met and will never know anything about. How is that safer?" The sentiment captures the core problem with gaming's exemption. Parents lose visibility into their children's online social lives because those interactions have moved from regulated social media to unregulated gaming platforms.
Roblox Is Being Watched Closely
Minister Anika Wells confirmed the eSafety Commissioner is monitoring Roblox closely, emphasizing that the social media ban is “not a cure; it’s a treatment plan” that will “continue to evolve.” This suggests regulators know the gaming exemption is problematic but weren’t ready to fight that battle during initial implementation. Roblox presents a particularly difficult case because it’s explicitly a social platform where the primary activity is interacting with other users in shared virtual spaces.
The government maintains they’ll continuously evaluate the list of banned platforms. Twitch’s addition shortly after the law passed demonstrates they’re willing to expand coverage. But adding platforms reactively creates a whack-a-mole problem. By the time regulators identify a gaming platform functioning primarily as social media, a new generation of kids has already migrated there specifically to escape the regulations on traditional platforms.

The Discord And Messaging Exemption
Besides gaming, the law also exempts messaging apps. Discord announced it would restrict new account creation for under-16s in Australia starting December 10, suggesting it sees itself falling under the social media category rather than messaging. This is smart positioning because Discord absolutely functions as social media with servers, communities, and content sharing beyond just direct messaging.
The messaging exemption makes more sense than gaming because one-to-one communication with known contacts poses fewer risks than algorithmically curated public feeds. WhatsApp, Signal, and traditional SMS don't have the same psychological manipulation mechanics as TikTok's For You page. But Discord demonstrates how blurry these categories are. Is it messaging when you DM a friend? Social media when you join a 10,000-person server? Both exist on the same platform.
Education Platforms Also Exempt
The law carves out exceptions for services that “significantly function to support the health and education of users.” This means Google Classroom, YouTube (without accounts), and similar educational platforms remain accessible. This exemption makes practical sense because remote learning requires digital tools. But again, enforcement gets messy. YouTube without an account is fine. YouTube with an account for commenting is banned. Kids will definitely understand and comply with that distinction.
The health and education exemption could also potentially cover mental health apps, telemedicine platforms, and educational games. Defining what “significantly functions” for health or education means will inevitably lead to edge cases where platforms argue they should be exempt while regulators disagree. Every borderline case becomes a potential legal battle with millions in fines at stake.

Age Verification Is The Real Problem
Beyond which platforms are covered, the law’s practical implementation faces significant challenges around age verification. The legislation requires platforms to take “reasonable steps” to prevent underage users from creating accounts, but doesn’t specify what reasonable means. This ambiguity leaves platforms uncertain about compliance requirements while the government works out technical standards during a 12-month implementation period.
Proposed verification methods range from uploading government ID to biometric facial scanning. Privacy advocates rightfully worry about companies collecting sensitive identification documents from everyone just to use social media. Data breaches happen constantly, and now platforms would hold massive databases of ID scans perfect for identity theft. Discord’s recent experiments with facial scanning in Australia and the UK for age verification generated immediate backlash over privacy concerns.
VPNs And Workarounds
Tech-savvy teenagers will immediately realize VPNs let them bypass geographic restrictions. Connect to a US server and, as far as the platform can tell, you're no longer an Australian user subject to Australian enforcement. The same kids being protected by this legislation are exactly the demographic most likely to know how to circumvent it. This creates a perverse outcome where compliant kids follow the rules while technically proficient ones freely access everything anyway.
Even without VPNs, kids will share older siblings' accounts, lie about birthdates during signup, or use parents' accounts to access restricted platforms. Social media companies have dealt with age restrictions for years, and underage users consistently find ways around them. The difference is that the penalty is now an AUD $49.5 million fine instead of a mere terms-of-service violation, which should, in theory, encourage more aggressive enforcement. Whether that financial threat actually changes behavior remains to be seen.
Global Implications
Other countries are watching Australia’s experiment closely. The UK, France, and several US states have proposed similar age restrictions. If Australia’s implementation goes smoothly, expect a wave of copycat legislation globally. If it becomes an enforcement nightmare with kids easily circumventing restrictions, other governments might pursue different approaches like parental controls or educational programs instead of blanket bans.
Tech companies are particularly concerned because fragmented regulations across jurisdictions create compliance nightmares. Building separate systems for Australia versus Europe versus California versus Texas gets expensive and complicated quickly. This is why industry groups lobbied hard against Australia’s law, not just because of the restrictions but because it sets precedent for a world where every country has different age verification requirements.
Frequently Asked Questions
When did Australia’s social media ban take effect?
December 10, 2025. The law passed Parliament on November 29, 2024, and became enforceable the following December.
Which platforms are banned for under-16s?
Instagram, Facebook, Snapchat, TikTok, X (Twitter), Reddit, and Twitch. The government maintains a list that can be updated to include additional platforms.
Why is gaming exempt from the ban?
The government claims gaming’s primary purpose isn’t social media-style interaction and games are already regulated under the National Classification Scheme. Critics say this reasoning is flawed.
What happens if platforms don’t comply?
Fines up to AUD $49.5 million (approximately $33 million USD) for platforms that don’t take reasonable steps to prevent underage account creation.
Can kids still use YouTube?
Yes, but without accounts. They can watch videos but can’t comment, subscribe, or engage with social features that require account creation.
Are messaging apps affected?
No. WhatsApp, Signal, and similar one-to-one messaging apps are exempt. Discord voluntarily restricted under-16 account creation despite the exemption.
How will age verification work?
Not finalized yet. Platforms have 12 months to implement systems. Options include ID verification, biometric scanning, or other methods deemed “reasonable.”
Is Roblox banned?
No. Despite functioning primarily as a social platform, Roblox is classified as gaming and remains exempt. The eSafety Commissioner is monitoring it closely.
The Gaming Exemption Undermines Everything
If Australia’s goal was protecting children from algorithmic manipulation, predatory content, cyberbullying, and psychological harm from social platforms, exempting gaming makes the entire law incoherent. Games use the same engagement tactics as social media. They host the same toxic communities. They expose children to the same stranger-danger risks. The only difference is the interface. Instead of scrolling a feed, kids are running around a 3D world. The harms are identical.
The most generous interpretation is that regulators understood gaming should be included but lacked the political will to fight the gaming industry simultaneously with social media companies. Taking on Meta, TikTok, and X was ambitious enough. Adding Epic Games, Mojang, and Roblox Corporation to the list of enemies would have made passage even harder. So they compromised, planning to expand coverage later once the framework was established.
The cynical interpretation is that gaming companies lobbied more effectively than social media platforms, or that regulators genuinely don’t understand how modern games function. Either explanation is concerning. If it’s political compromise, kids remain exposed to identical harms in unregulated gaming spaces. If it’s ignorance about gaming, the law was written by people who don’t understand the digital landscape they’re regulating.
Parents watching this unfold are justifiably confused and frustrated. They can’t explain to their children why Instagram is dangerous but Fortnite is fine when both involve chatting with strangers, both use engagement mechanics to maximize time spent, and both expose kids to inappropriate content. The inconsistency undermines the law’s credibility and makes enforcement by parents harder because the reasoning doesn’t make intuitive sense.
Australia deserves credit for attempting meaningful child protection legislation in the digital age. Most countries talk about protecting children online while doing nothing substantive. But good intentions don’t excuse poorly designed policy. The gaming exemption is a massive hole that will likely result in kids migrating their social lives from regulated platforms to unregulated gaming spaces, achieving the opposite of the law’s intended effect. Until regulators close that loophole, Australia’s groundbreaking social media ban remains more symbolic than effective.