Access Alert: The Online Safety Bill awaits Royal Assent – what does this mean for businesses operating in the UK?

The UK’s Online Safety Bill has passed the final stages of parliamentary scrutiny and will come into effect shortly after Royal Assent. The legislation represents the biggest change to intermediary liability in the UK in 20 years. Businesses operating in the UK will be subject to a new regulatory regime addressing illegal and harmful content online. The legislation empowers the government and Ofcom (the UK communications regulator) to set detailed requirements that tech companies must fulfil in order to meet a duty of care toward their users, backed by fines of up to GBP 18 million or 10% of global turnover, whichever is greater.

Discussion of the Online Safety Bill dates back nearly seven years: the government’s Internet Safety Strategy green paper, which examined companies’ responsibility to keep users safe and prevent online harms, was first published in 2017. The bill has been repeatedly paused under the eight secretaries of state who have overseen it since that time, with Michelle Donelan returning to office in July 2023.

Primary obligations fall on tech businesses that provide user-to-user services (Category 1), search services (Category 2A), and services displaying pornographic content (Category 2B). These are referred to as regulated service providers and will be tracked in an Ofcom registry. Ofcom will recommend the relevant designation thresholds to the Secretary of State, and these will be introduced in secondary legislation shortly after the bill. Businesses concerned about their potential exposure as regulated service providers should follow the development of this secondary legislation closely.

The scope is wide, applying to a variety of services such as social media and online games. The targeted functionalities are also broad, including posting images, uploading videos, and sending direct or private messages. However, Ofcom has indicated that while most tech businesses will not be designated as regulated service providers, they will still need to undertake risk assessments.

The bill addresses two main types of content: content that is illegal and content that is harmful to children. The primary legislation establishes online safety requirements, such as designing systems with effective risk management processes, incorporating a higher standard of protection for children, and utilising controls such as age verification technology.

In the final rounds of amendment, the government resolved – or perhaps merely postponed – tension over requirements for encrypted services, which challenge the UK operations of businesses such as WhatsApp and Signal. The government clarified that Ofcom would only require companies to scan their networks once technology capable of doing so has been developed, which many experts believe could take years. Critics argue that such technology cannot exist without undermining user privacy. The government maintains that it will only enforce these measures as a last resort, with stringent privacy safeguards in place.

How does the Online Safety Bill differ from the Digital Services Act?

The EU and UK have adopted broadly different approaches to content moderation regulation, varying largely in scope and enforcement. Compared to the EU’s Digital Services Act (DSA), the Online Safety Bill implements a more complex and tougher regime: the fines for non-compliance are higher, and the regime distinguishes between obligations for different types of illegal content. While the Online Safety Bill identifies specific regulated service providers, all services are subject to risk assessment obligations and will be held accountable by Ofcom.

What next?

Following Royal Assent, which has no fixed timeline but is likely within the next couple of weeks, Ofcom will launch a consultation phase on the enforcement of the bill. This is a prime opportunity for businesses to engage in shaping the implementation of the regime and informing secondary legislation. Within the first 100 days, Ofcom has committed to publishing draft codes of practice on illegal content harms, draft guidance on illegal content risk assessment, transparency reporting, and enforcement.

Access Partnership continues to engage closely in the implementation of the Online Safety regime. Please contact Michael Laughton at [email protected] or Jessica Birch at [email protected] to learn more about what this new content moderation regime means for your business.
