The Online Safety Bill – Status Report

  • by Professor Lorna Woods, Professor of Internet Law
  • 1 December 2022
  • 11 minute read

This blog was originally posted on LSE Media & Communications.

The Online Safety Bill returned to the headlines this week after the government announced some ‘improvements’ to the Bill, whose progress has been on hold since July. Professor Lorna Woods of Essex University explains here the difference between the original Bill and the proposed changes, and what is likely to happen next.

The Online Safety Bill is returning to Parliament. While it will finish its interrupted Report Stage on 5 December based on the model introduced by the Johnson regime, the Sunak regime promises changes to the Bill to offer a “triple shield” of protection online in lieu of current protections for adults. So, what does the Bill do and how do the new proposals change things? To understand the change, a review of the original Bill is necessary. What follows is a summary of the Bill, not an assessment of whether it is good, bad or ugly.

Also note that it seems that the Bill will finish its report stage in its July 2022 version. The Government will then – in a highly unusual step – send the Bill back to committee to give MPs the chance to scrutinise amendments implementing the most significant of the Sunak-era policy changes. Then it will go to the Lords, most likely in January. Consequently, the precise detail of what is proposed is not yet available. What we do know is found in a press release from DCMS and a Written Ministerial Statement, as well as a couple of amendments already tabled.

The Bill is complex, partly because it does three distinct but interconnected things:

  • it establishes a regulatory regime for “user to user” and search services (Part 3);
  • it imposes duties on providers of published pornographic content (Part 5); and
  • it creates new criminal offences relating to communications (Part 10).

While it seems the pornography provisions will not be changed, the other two aspects of the Bill are to be amended.

The Original Bill

The Regulatory Regime (Part 3)

The Bill as introduced imposes obligations on two types of services: “user to user” services (essentially social media) and search. Search engines are under less onerous obligations than user-to-user services. The obligations are in essence structured around risk assessment requirements and obligations to mitigate risks, these latter obligations being termed ‘safety duties’. More detail on how to do a risk assessment and how to comply with the safety duties will be found in guidance and codes produced by the regulator, Ofcom. The duties distinguish between different categories of content:

  • illegal content;
  • content harmful to children; and
  • content harmful to adults.

In general, there are stricter duties in relation to illegal content than in relation to content harmful to children (with fraudulent advertising dealt with separately and subject to slightly less stringent rules than the other criminal rules), which in turn are stricter than the duties in relation to content harmful to adults. While the illegal content duty and the harmful-to-children duty both require the service provider to take proportionate measures to effectively mitigate and manage risks of harm, the duty in relation to content harmful to adults is only to summarise in the terms of service the findings of the most recent risk assessment. Moreover, only a sub-set of user-to-user services (those deemed higher risk and placed in ‘Category 1’) will be subject to the duties in relation to content harmful to adults. These Category 1 providers also have obligations in relation to democratic content and journalistic content. All services have duties in relation to freedom of expression and privacy.

Each of these content categories has a sub-set designated as ‘priority’, in relation to which the safety duties specify more detailed rules. A Written Ministerial Statement from July identified likely categories in addition to the priority illegal categories already listed in the Bill. The distinction between the categories of content can be seen in relation to priority content too: for illegal content there is an emphasis on ensuring that such content is not promoted and is taken down swiftly when the provider is notified about it. For children, the rules focus on ensuring that children do not encounter priority content.

By contrast, the rules in relation to adults are essentially about transparency – the provider should tell users how it responds to priority harms to adults. This includes the possibility of telling users it chooses to do nothing.  It means that headlines dealing with the proposed changes to the Bill, such as “Plan to make big tech remove harmful content axed” are wrong, simply because that was not required in the first place.

The duties in relation to legal but harmful content also include a requirement to provide ‘user empowerment tools’. As part of this general obligation in relation to priority content harmful to adults, the Bill specifies that users should be able to choose not to interact with unverified accounts (and services should provide tools for users to verify themselves).

Providers will be required to engage in transparency reporting delivered to Ofcom, which will then provide its own general transparency report based on the information supplied.

The Pornography Regime (Part 5)

A separate section deals with pornography providers – this relates to porn created or commissioned by the service provider, rather than user generated content. It contains no content-based duties (eg there is no obligation to take down illegal content on notification) but requires the providers “to ensure that children are not normally able to encounter” pornographic content. (For more detail on how the current regime deals with pornography see here.)

The Criminal Offences (Part 10)

The criminal offences added by the Online Safety Bill are not strictly part of the regime, though they will add to the scope of ‘illegal content’. They aim to tidy up some issues with communications-related offences identified by the Law Commission, and include a cyberflashing offence.

The Amendments

Much of the Bill will remain unchanged. The key areas of change involve:

  • the protections for adults, to be replaced by a “triple shield”;
  • additional criminal offences; and
  • transparency and other provisions.

“Triple Shield”

This comprises the following principles for protection of adults:

  • illegal content must be removed;
  • content that breaches the platform’s own terms of service must be taken down; and
  • adult users must be given tools allowing them greater control over the content they see (user empowerment).

It seems likely that these are the amendments that will be considered when the Bill goes back to Committee.

More Crimes

The Written Ministerial Statement indicates a number of offences will be introduced:

The weekend saw news reports of the introduction of a “down-blousing” offence as well as one covering deepfake porn, but neither was listed in this statement – these seem to fall within the bailiwick of the Ministry of Justice. Rather, the statement noted that separate from the Bill there would be “additional laws to tackle a range of abusive behaviour including the installation of equipment, such as hidden cameras, to take or record images of someone without their consent.” There were no details as to timing on this.

The Bill included two new communications offences (cls 151 and 152); it seems cl 151 will be removed. The threatening communications offence (cl 153) will stay. Neither the Malicious Communications Act 1988 nor section 127 of the Communications Act 2003 will be repealed, but the overlap between them and the new false and threatening communications offences will be removed, though it is not clear how or where.

Transparency and Other Provisions

Service providers are currently obliged to report to Ofcom through transparency reporting, but not to the general public; with the exception of content harmful to adults, providers have not been obliged to publish the outcomes of their risk assessments. The Government is now proposing that the largest platforms should publish summaries of their risk assessments for content that is harmful to children, so as to give parents greater information about the relative safety of the services their children use. Whether the obligation to publish such a summary remains in relation to content harmful to adults is unclear. It also does not make sense that summaries of risk assessments for illegal content should not be required, if the intention is to give users (or the parents of users) enough information to make informed choices. We will not see the drafting on this until the Bill reaches the Lords.

Ofcom will also be given the power to require platforms to publish details of enforcement action taken against them, in addition to the existing provisions giving Ofcom the power to publish such details itself (cl 129). This clause will be considered at Report stage (see NC51) and seems to apply across all enforcement actions, although the press release discusses it in the context of protections for children.

More detail will also be required as to how platforms enforce minimum age limits, if specified (this presumably is a separate issue from age verification to show a platform is not accessed by children). These amendments will be tabled when the Bill returns to the Commons.

The Written Ministerial Statement also adds the Children’s Commissioner, the Victims’ Commissioner and the Domestic Abuse Commissioner as statutory consultees for Ofcom when developing guidance and codes. Given that these Commissioners’ remits cover England, it might be expected that the amendments to the Bill will also include the respective commissioners in other parts of the UK.

Next Steps

The next set of amendments will start to implement these policy changes, but the most significant will be discussed when the Bill returns to Committee stage in the Commons. Given this phased roll-out of amendments, in which the Government may well seek to amend its own previous amendments, tracking progress may prove complex. The difficulties will be compounded by the possibility of linked measures (eg on deepfake porn and on hidden cameras) being introduced through different legislative vehicles.