Access Alert: The Online Safety Bill awaits Royal Assent – what does this mean for businesses operating in the UK?

The UK’s Online Safety Bill has passed the final stages of parliamentary scrutiny and will come into effect shortly after receiving Royal Assent. The legislation represents the biggest change to intermediary liability in the UK in 20 years: businesses operating in the UK will be subject to a new regulatory regime designed to address illegal and harmful content online. It empowers the government and Ofcom (the UK communications regulator) to set detailed requirements that tech companies must fulfil to meet a duty of care toward their users, backed by fines of up to GBP 18 million or 10% of global turnover, whichever is greater.

Discussion of the Online Safety Bill dates back nearly seven years: the government’s Internet Safety Strategy green paper, first published in 2017, examined companies’ responsibility to keep users safe and prevent online harms. The bill has been repeatedly paused under the eight secretaries of state who have overseen it since then, most recently Michelle Donelan, who returned to office in July 2023.

Primary obligations fall on tech businesses that provide user-to-user services (Category 1), search services (Category 2A), and adult content providers displaying pornographic content (Category 2B). These are referred to as regulated service providers and will be tracked in an Ofcom registry. Ofcom will recommend the relevant designation thresholds to the Secretary of State, to be introduced in secondary legislation shortly after the bill. Businesses concerned about their potential exposure as regulated service providers should follow the development of this secondary legislation closely.

The scope is wide, applying to a variety of services such as social media and online games. The targeted functionalities are also broad, covering actions such as posting images, uploading videos, and sending direct or private messages. Ofcom has indicated, however, that while most tech businesses will not qualify as regulated service providers, they will still need to undertake risk assessments.

The bill addresses two main types of content: content that is illegal and content that is harmful to children. The primary legislation establishes online safety requirements, such as designing systems with effective risk management processes, incorporating a higher standard of protection for children, and deploying controls such as age verification technology.

In the final rounds of amendment, the government resolved – or perhaps merely postponed – tension over requirements for encrypted services, which challenge the UK operations of businesses such as WhatsApp and Signal. The government clarified that Ofcom would only require companies to scan their networks once technology capable of doing so has been developed, which many experts believe could take years. Critics argue that no such technology can exist without undermining user privacy. The government maintains that it will enforce these measures only as a last resort, with stringent privacy safeguards in place.

How does the Online Safety Bill differ from the Digital Services Act?

The EU and the UK have adopted markedly different approaches to content moderation regulation, varying largely in scope, structure, and enforcement. Compared to the DSA, the Online Safety Bill implements a more complex and tougher regime: fines for non-compliance are higher, and the regime imposes different obligations for different types of illegal content. While the Online Safety Bill designates specific regulated service providers, all in-scope services are subject to risk assessment obligations and will be held accountable by Ofcom.

What next?

Following Royal Assent, for which there is no set timeline but which is likely within the next couple of weeks, Ofcom will launch a consultation phase on enforcement of the bill. This is a prime opportunity for businesses to help shape the implementation of the regime and inform secondary legislation. Within the first 100 days, Ofcom has committed to publishing draft codes of practice on illegal content harms, together with draft guidance on illegal content risk assessment, transparency reporting, and enforcement.

Access Partnership continues to engage closely in the implementation of the Online Safety regime. Please contact Michael Laughton at [email protected] or Jessica Birch at [email protected] to learn more about what this new content moderation regime means for your business.
