The New Laws Protecting Your Child Online — And What They Still Don’t Cover (2026)

If you’re a parent trying to figure out what the law actually does to keep your kids safe online, you’re not alone. The landscape is shifting fast. In the US, COPPA has been around since 1998 but just got its first major update in decades. The UK’s Online Safety Act became enforceable in March 2025 and has now been in full operation for a year. Australia went further than most expected by outright banning social media for under-16s in late 2024, with implementation now underway. Keeping up with these laws has become essential for parents, but here’s the uncomfortable truth about child online safety: these regulations help, but they don’t solve everything.

Some people argue that with all these new laws, parents can finally relax. Platforms will handle it. Government enforcement will catch the bad actors. Kids are protected by default now. This view misses something fundamental. Regulations set floors, not ceilings. They tell platforms the minimum they must do, not the maximum they should do. And enforcement? It’s inconsistent at best. The gaps that remain are yours to fill.

Think of it like climbing a mountain. The trail markers (regulations) tell you where the path is supposed to go. They warn you about cliffs and suggest where to cross streams. But they don’t carry your pack. They don’t check the weather. They don’t make sure you brought enough water. Those decisions are yours.

What Do COPPA, the UK Online Safety Act, and Australia’s Under-16 Ban Require From Platforms?

Overview of COPPA Requirements in the US

COPPA—the Children’s Online Privacy Protection Act—has been the baseline for child privacy in America since Bill Clinton signed it into law in 1998. The core requirement: websites and apps directed at children under 13, or that knowingly collect data from children that young, must get verifiable parental consent before gathering personal information. They also have to post clear privacy policies and give parents control over what data gets collected and shared.

The FTC updated COPPA rules in 2024, tightening requirements around targeted advertising to children and expanding what counts as “personal information” in an era of biometric data and precise geolocation. But COPPA still only covers under-13s. A 14-year-old is legally treated as an adult for data collection purposes—which, if you have a teenager, you know is absurd.

Key Obligations of the UK Online Safety Act

The UK took a different approach. The Online Safety Act received Royal Assent in October 2023, with illegal harms safety duties becoming enforceable on March 17, 2025. Now, one year into active enforcement, platforms operating in the UK must assess and mitigate risks of harm to children from content itself. That means illegal content, yes, but also content that’s legal for adults but harmful for kids—think eating disorder content, self-harm tutorials, or violent material.

Platforms are required to use “proportionate systems and processes” to prevent children from encountering this content. Age verification or age estimation technology is now required for services likely to be accessed by minors. The kicker: platforms face significant fines and, in theory, criminal liability for senior managers who don’t comply. In August 2025, Ofcom fined 4chan £20,000 for alleged non-compliance—the first major enforcement action under the Act—with additional fines accruing at £100 per day.

Australia’s Under-16 Age Ban and Its Enforcement

Australia made headlines in November 2024 with the Online Safety Amendment (Social Media Minimum Age) Act 2024, which prohibits social media platforms from allowing users under 16. No exceptions for parental consent. No “well, they said they were 18” loopholes—at least in theory. Platforms must take “reasonable steps” to verify ages, with technical standards still being refined during the implementation period.

The law is now in its implementation phase as of early 2026, with platforms actively working to comply with verification requirements. The intent is clear: Australia decided that below a certain age, children simply shouldn’t be on these platforms at all. Whether that’s fully enforceable remains to be seen as the rollout continues.

The table below summarizes the core requirements and current enforcement status of each major regulation:

| Regulation | Age Threshold | Core Requirement | Enforcement Status (March 2026) |
| --- | --- | --- | --- |
| COPPA (US) | Under 13 | Parental consent for data collection | Active, updated 2024 |
| UK Online Safety Act | All minors | Risk mitigation, age assurance | Fully enforceable since March 2025 |
| Australia Social Media Ban | Under 16 | Platform access prohibition | Implementation phase, 2025–2026 |

Real compliance varies wildly. Some platforms have invested heavily in age verification—others still rely on self-declaration checkboxes that any kid can click through. The UK’s approach of requiring “proportionate” measures sounds reasonable until you realize that platforms interpret this standard differently, often in ways that favor minimal friction over robust protection.

What Practical Protections Do These Regulations Give Parents Right Now?

Here’s what you can actually use today. Under COPPA, if your child is under 13 and using a service that collects their data, you have legal grounds to demand that data be deleted. You can refuse consent for targeted advertising. You can request access to what’s been collected. Most major platforms have compliance portals for exactly this—though finding them often requires digging through privacy settings.

In the UK, parents can report content that shouldn’t be accessible to children. Ofcom has set up complaint mechanisms, and platforms face real pressure to respond. If your child encounters harmful content on a major platform, there’s now regulatory backing for your complaint—and with enforcement active for a full year, platforms are taking these obligations more seriously.

Australia’s protections are still developing as the under-16 ban rolls out. Parents there can currently rely on existing parental controls and platform-specific features while age verification systems become operational. The expectation is that by late 2026, major platforms will have compliant systems in place.

The limitation? Enforcement remains reactive, not proactive. Regulators investigate after harm occurs, after complaints pile up, after some journalist writes an exposé. They’re not watching your kid’s feed in real time. No law can be.

Frequently Asked Questions

What protections does COPPA really offer?

COPPA requires websites and apps to get parental consent before collecting personal data from children under 13. It also gives parents the right to review, delete, and control what data is collected. The 2024 updates expanded protections around targeted advertising and biometric data, but the law still doesn’t cover teenagers.

How does the UK Online Safety Act affect my child’s favorite apps?

If the app operates in the UK, it must now assess risks of harm to children and implement measures to mitigate them. This includes age verification, content filtering, and reporting mechanisms. With enforcement active since March 2025, popular platforms are under ongoing Ofcom scrutiny, and we’ve already seen the first fines issued for non-compliance.

Can Australia’s under-16 ban prevent all harmful interactions?

No. The ban covers social media platforms but not messaging apps, gaming services, or other online spaces. Enforcement depends on age verification technology that’s still being refined during the current implementation phase. It’s a significant step, but gaps remain that parents will need to address through their own oversight and ongoing conversations with their kids.