Access Alert: The Online Safety Bill awaits Royal Assent – what does this mean for businesses operating in the UK?

The UK’s Online Safety Bill has passed through the final stages of parliamentary scrutiny and will shortly come into effect, following Royal Assent. This legislation represents the biggest change to intermediary liability in the UK for 20 years. Businesses operating in the UK will be subject to a new regulatory regime to address illegal and harmful content online. The legislation empowers the government and Ofcom (the UK communications regulator) to set detailed requirements for tech companies to fulfil in order to meet a duty of care toward their users and is backed by fines of up to GBP 18 million or 10% of global turnover, whichever is greater.

Discussion of the Online Safety Bill dates back nearly seven years: the government's Internet Safety Strategy green paper, first published in 2017, examined companies' responsibility to keep users safe and prevent online harms. The bill has been paused repeatedly under the eight secretaries of state who have overseen it since then, most recently Michelle Donelan, who returned to office in July 2023.

Primary obligations fall on tech businesses that provide a user-to-user service (Category 1), search services (Category 2A), and adult content providers displaying pornographic content (Category 2B). These are referred to as regulated service providers and will be tracked in an Ofcom registry. Ofcom will make recommendations to the Secretary of State on the thresholds for designation, which will be introduced in secondary legislation shortly after the bill. Businesses concerned about their potential exposure as a regulated service provider should follow the development of this secondary legislation closely.

The scope is wide and applies to a variety of services, such as social media and online games. The targeted functionalities are also broad, covering actions such as posting images, uploading videos, and sending direct or private messages. However, Ofcom has indicated that while most tech businesses will not fall within scope as regulated service providers, they will still need to undertake risk assessments.

The bill addresses two main types of content: illegal content and content that is harmful to children. The primary legislation establishes online safety requirements, such as designing systems with effective risk management processes, incorporating a higher standard of protection for children, and using controls such as age verification technology.

In the final rounds of amendment, the government resolved – or perhaps merely postponed – the tension over requirements for encrypted services, which challenges the UK operations of businesses such as WhatsApp and Signal. The government clarified that Ofcom would only require companies to scan their networks once technology capable of doing so is developed, which many experts believe could take years. Critics argue that such technology cannot exist without undermining user privacy. The government has given assurances that it will enforce these measures only as a last resort and with stringent privacy safeguards in place.

How does the Online Safety Bill differ from the Digital Services Act?

The EU and UK have adopted broadly different regulatory approaches to content moderation, diverging largely in scope, structure, and enforcement. Compared to the DSA, the Online Safety Bill implements a more complex and tougher regime: the fines for non-compliance are higher, and the regime sets different obligations for different types of illegal content. While the Online Safety Bill identifies specific regulated service providers, all services are subject to risk assessment obligations and will be held accountable by Ofcom.

What next?

Following Royal Assent, for which no timeline has been given but which is likely within the next couple of weeks, Ofcom will launch a consultation phase on the enforcement of the bill. This is a prime opportunity for businesses to engage in shaping the implementation of the regime and informing secondary legislation. Within the first 100 days, Ofcom has committed to publishing draft codes on illegal content harms, along with draft guidance on illegal content risk assessments, transparency reporting, and enforcement.

Access Partnership continues to engage closely in the implementation of the Online Safety regime. Please contact Michael Laughton at [email protected] or Jessica Birch at [email protected] to learn more about what this new content moderation regime means for your business.
