
The legislation, rolling out through 2025, imposes systemic duties on online platforms (including non-UK providers) to tackle illegal and harmful content, with particular emphasis on content affecting children. Ofcom's enforcement powers include fines of up to £18 million or 10% of global annual turnover, whichever is greater, service blocking orders, and criminal liability for senior managers. The Act's extraterritorial scope means global companies must comply if UK users are at risk.
While not directly regulating disinformation, the Act mandates risk-based safety measures and age-appropriate protections. Ofcom's codes of practice offer a 'safe harbour' compliance route: providers that follow the recommended measures are treated as meeting their duties, while those adopting alternative measures must show they are at least as effective.
With investigations already underway, companies face pressure to over-remove content, which risks chilling free expression. The authors urge proactive compliance and close monitoring of Ofcom's evolving guidance.