UK Online Safety Act 2026
A detailed guide to your new legal protections under the 2026 safety standards.
THE LAW
The 2026 update to the UK Online Safety Act has shifted the burden of proof onto the tech giants. Platforms are now legally classified as 'Publishers' of the content they promote via algorithms, making them liable for the harm caused by those algorithms.
YOUR NEW RIGHTS
1. The 48-Hour Takedown: For any content involving harassment, bullying, or 'deepfake' non-consensual imagery, platforms must remove the content within 48 hours of a report or face massive fines.
2. Mandatory Age Verification: Apps with high social interactivity must now use secure ID verification or 'Face Estimation' technology. This prevents children under 13 from accessing 'adult-lite' content.
3. Duty of Care for Algorithms: If an algorithm is found to be intentionally promoting eating disorder or self-harm content to minors, the platform's executives can now face personal legal action.
4. The CEOP Link: Every major app in the UK must now feature a direct, one-click link to the Child Exploitation and Online Protection Command.
HOW TO USE THIS
If you report content and the platform ignores you, keep a screenshot of your report. Under the 2026 law, you can escalate this to Ofcom, which now has the power to block the app's access to the UK market until the platform complies.