The UK government is taking major steps to improve children's safety online, as new rules under the Online Safety Act come into force. Online platforms and pornography websites must now implement stricter age checks and child protection measures or face serious consequences.
Campaigner and crossbench peer Beeban Kidron has urged the government to “detoxify the dopamine loops” that drive addictive behavior in children on social media. Kidron is calling on Technology Secretary Peter Kyle to use his powers under the Online Safety Act to introduce new codes of conduct addressing both disinformation and addictiveness engineered into online content.
“These platforms are investing billions to make their services addictive,” Kidron said. “It’s not a ‘nanny state’ move — it’s about protecting under-18s using powers that already exist.”
Kidron’s campaign group, 5Rights, highlighted features such as like counters, notifications, and expiring content (for example, Instagram Stories) as mechanisms that exploit dopamine responses to hook users — especially children.
Deadline for Online Platforms to Act
Friday marked a critical deadline for major tech companies like Facebook, Instagram, YouTube, TikTok, and X (formerly Twitter) to comply with new child safety requirements. These include shielding children from explicit content, harmful material, and online abuse.
X announced it would restrict access to adult material by default unless users verify they are over 18 using facial recognition or official ID, and said it plans to introduce facial age estimation tools. Other platforms, including Discord, Reddit, and Bluesky, have introduced or pledged similar age-gating measures.
Strict Penalties for Non-Compliance
Ofcom, the UK’s communications regulator, warned that platforms prioritizing engagement over child online safety will face enforcement. Penalties include fines of up to 10% of global turnover and potential prosecution of tech executives. Meta, for example, could face penalties of up to $16.5 billion if found in breach.
Ofcom requires platforms to:
• Rapidly remove dangerous content
• Offer easy reporting tools for children
• Deploy content filters in recommendation algorithms
• Conduct age checks for harmful content access
The regulator also launched a wide-ranging monitoring program targeting platforms popular with young users, such as Roblox, Snap, and YouTube.
Pornography Sites Must Comply
Pornography websites, including Pornhub, must also implement strict age verification. Acceptable methods include:
• Facial age estimation using live photos
• Photo ID matching (e.g., passport + selfie)
• Credit card or mobile network age checks
• Secure digital identity wallets
Meta and TikTok say they already meet the age-checking requirements through teen-specific settings and content filters. TikTok also announced additional age-gating features starting Friday.
The UK’s Online Safety Act represents a global benchmark in holding tech companies accountable for child online safety, with real consequences for those who fail to act.
