Access Alert: The Online Safety Bill awaits Royal Assent – what does this mean for businesses operating in the UK?

The UK’s Online Safety Bill has passed the final stages of parliamentary scrutiny and will shortly come into effect following Royal Assent. The legislation represents the biggest change to intermediary liability in the UK in 20 years: businesses operating in the UK will be subject to a new regulatory regime addressing illegal and harmful content online. It empowers the government and Ofcom (the UK communications regulator) to set detailed requirements that tech companies must fulfil in order to meet a duty of care toward their users, backed by fines of up to GBP 18 million or 10% of global turnover, whichever is greater.

Discussion of the Online Safety Bill dates back nearly seven years: the government’s Internet Safety Strategy green paper, first published in 2017, examined companies’ responsibility to keep users safe and prevent online harms. The bill has been repeatedly paused under the eight secretaries of state charged with overseeing it since that time, with Michelle Donelan returning to office in July 2023.

Primary obligations fall on tech businesses that provide user-to-user services (Category 1), search services (Category 2A), and adult content providers displaying pornographic content (Category 2B). These are referred to as regulated service providers and will be tracked in an Ofcom registry. Ofcom will make recommendations to the Secretary of State on the thresholds for designation, which will be set in secondary legislation shortly after the bill passes. Businesses concerned about their potential exposure as regulated service providers should follow the development of this secondary legislation closely.

The scope is wide, applying to a variety of services such as social media and online games. The targeted functionalities are also broad, covering actions such as posting images, uploading videos, and sending direct or private messages. Ofcom has indicated that while most tech businesses will not be designated as regulated service providers, those in scope will still need to undertake risk assessments.

The bill addresses two main types of content: illegal content and content that is harmful to children. The primary legislation establishes online safety requirements, such as designing systems with effective risk management processes, incorporating a higher standard of protection for children, and deploying controls such as age verification technology.

In the final rounds of amendment, the government resolved – or perhaps merely postponed – tension over requirements for encrypted services, which challenge the UK operations of businesses such as WhatsApp and Signal. The government clarified that Ofcom would only require companies to scan their networks once technology capable of doing so has been developed, which many experts believe could take years. Critics argue that such technology cannot exist without undermining user privacy. The government has assured that it will enforce these measures only as a last resort, with stringent privacy safeguards in place.

How does the Online Safety Bill differ from the Digital Services Act?

The EU and UK have adopted broadly different approaches to content moderation regulation, varying largely in scope, structure, and enforcement. Compared to the DSA, the Online Safety Bill implements a more complex and tougher regime: fines for non-compliance are higher, and the regime imposes different obligations for different types of illegal content. While the Online Safety Bill designates specific regulated service providers, all in-scope services are subject to risk assessment obligations and will be held accountable by Ofcom.

What next?

Following Royal Assent, which has no set date but is likely within the next couple of weeks, Ofcom will launch a consultation phase on the enforcement of the bill. This is a prime opportunity for businesses to help shape the implementation of the regime and inform secondary legislation. Within the first 100 days, Ofcom has committed to publishing draft codes of practice on illegal content harms, along with draft guidance on illegal content risk assessments, transparency reporting, and enforcement.

Access Partnership continues to engage closely in the implementation of the Online Safety regime. Please contact Michael Laughton at [email protected] or Jessica Birch at [email protected] to understand more about what this new content moderation regime means for your business.
