
The legislation empowers Ofcom to levy fines of up to 10% of annual turnover (or up to £18 million, whichever is higher) for violations of the regime.

The Online Safety (née Harms) Bill has been years in the making as UK policymakers have grappled with how to respond to a range of online safety concerns. In 2019 these efforts manifested as a white paper with a focus on rules for tackling illegal content (such as terrorism and CSAM), but also with an ambition to address a broad sweep of online activity that might be considered harmful, such as violent content and the incitement of violence; the encouragement of suicide; disinformation; cyberbullying; and adult material being accessed by children. The effort then morphed into a bill that was finally published in May 2021. […]
In a brief statement, the UK’s new web content sheriff gave no hint of the complex challenges that lie ahead, merely welcoming the bill’s passage through parliament and stating that it stands ready to implement the new rulebook. “Today is a major milestone in the mission to create a safer life online for children and adults in the UK. Everyone at Ofcom feels privileged to be entrusted with this important role, and we’re ready to start implementing these new laws,” said Dame Melanie Dawes, Ofcom’s CEO. “Very soon after the Bill receives Royal Assent, we’ll consult on the first set of standards that we’ll expect tech firms to meet in tackling illegal online harms, including child sexual exploitation, fraud and terrorism.”

Beyond specific issues of concern, there is an overarching worry about the scale of the regulatory burden the legislation will place on the UK’s digital economy, since the rules apply not only to major social media platforms: scores of far smaller and less well-resourced online services must also comply or risk big penalties.