Secretary of State’s powers and the draft Online Safety Bill

  • By William Perrin, Trustee, Carnegie UK; Professor Lorna Woods, Professor of Internet Law, University of Essex; and Maeve Walsh, Carnegie Associate
  • 14 September 2021
  • 6 minute read

In the first of a new series of blog posts on the draft Online Safety Bill, we focus on the exceptional powers given to the Secretary of State, why these powers are problematic, and how to amend the Bill to bring it in line with established practice in media regulation and the UK’s international commitments on free speech.

The draft Online Safety Bill gives too many powers to the Secretary of State over too many things.[1] This is a rare point of unity between safety campaigners, who want tough legislation to address hate crime, mis/dis-information and online abuse,[2] and radical free speech campaigners, who oppose much of the Bill.

The independence of the regulator from Government is fundamental to meeting the UK’s international commitments on free speech in media regulation. This boundary between the respective roles of Government and regulator is well established in most Western democracies. The United Kingdom is party to a Council of Europe declaration which states that national rules for a broadcasting regulator should:

“Avoid that regulatory authorities are under the influence of political power.”

The United Kingdom was also party to a 2013 joint statement on freedom of expression by the Organisation for Security and Co-operation in Europe (OSCE), of which the UK is a participating State, the Office of the United Nations High Commissioner for Human Rights, the Organisation of American States and the African Commission on Human and Peoples’ Rights. In that statement, made at a time of great international regulatory change driven by the move to digital transmission, the United Kingdom agreed that:

“While key policy decisions regarding the digital terrestrial transition need to be taken by Government, implementation of those decisions is legitimate only if it is undertaken by a body which is protected against political, commercial and other forms of unwarranted interference, in accordance with international human rights standards (i.e. an independent regulator).”

The United Kingdom has been a leading exemplar of the independent-regulator approach. In the Communications Act 2003, Parliament set OFCOM a list of objectives for its standards codes and then left OFCOM to set those codes without further interference, or even a requirement to report back to Parliament. This is a good demonstration of the balance referred to in the OSCE statement: Parliament and Government set high-level objectives in legislation and then do not interfere in how the regulator conducts its day-to-day business.

With the Digital Economy Act 2017, Parliament agreed that Government could direct OFCOM, but that power is limited so as to exclude OFCOM’s content rules. The powers of direction in the Wireless Telegraphy Act 2006 similarly do not touch content.

Unfortunately, the draft Online Safety Bill deviates from these sound principles and allows the Secretary of State to interfere with OFCOM’s independence on content matters in four principal areas. The draft Bill gives the Secretary of State relatively unconstrained powers to:

  • set OFCOM’s strategic priorities by Statutory Instrument (clauses 109 and 57);
  • designate priority content by Statutory Instrument (clauses 41 and 47);
  • direct OFCOM to amend its codes of practice to reflect Government policy (clause 33); and
  • give OFCOM guidance on the exercise of its functions and powers (clause 113).

The UK Government has not explained why the Secretary of State needs these powers. We propose that the relevant provisions of the draft Online Safety Bill be amended to create a more conventional balance between democratic oversight and regulatory independence, underpinning freedom of expression.[3]

Parliament and Government set OFCOM’s initial priorities

Parliament and Government, working with the traditional checks and balances, should be able to set broad priorities for OFCOM’s work on preventing harm. We understand that OFCOM would welcome initial prioritisation, as would regulated companies. Victims’ groups also want reassurance that the harms that oppress them will be covered by the legislation, and Parliament will want to be confident in what OFCOM will do with the powers being delegated to it.

However, the Secretary of State’s powers should not cross the line drawn in the Digital Economy Act by permitting the Government to direct OFCOM on content matters through Statutory Instruments (SIs). Clauses 109 and 57 do so on strategy (albeit with some Parliamentary oversight in clause 110), and clauses 41 and 47 do so on priority content. These extensive powers enable detailed Government influence over the implementation of policy, potentially reaching decisions that affect content, and undermine OFCOM’s independence.

A better balance can be struck between Parliament and the executive in setting priorities while maintaining OFCOM’s independence. We suggest examining the issue in two parts: regime start-up, and response to issues during operation. The draft Bill should be amended so that:

  • the Secretary of State can periodically (every three years) give OFCOM an indication of the Government’s strategic priorities for internet safety, but this should not cut across into content, nor into OFCOM’s day-to-day administration.

Parliament and government then respect OFCOM’s independence

The draft Online Safety Bill envisages continuing control in the hands of the Executive beyond high-level strategic direction. Clauses 33 and 113 affect OFCOM’s role in implementing policy; the OSCE statement is particularly clear that this is an area in which there should be no Government interference. Yet both clauses cross that boundary emphatically, and there is no attempt to provide for Parliamentary scrutiny or control of these powers. The Secretary of State’s powers to direct OFCOM to amend its codes to reflect Government policy (clause 33) and to give guidance on the exercise of OFCOM’s functions and powers (clause 113) are simply egregious and should be deleted.

We are working on detailed amendments to the draft Bill to give effect to the above; we will publish them very shortly and would be grateful for feedback: [email protected]

[1] See table here.

[2] See Professor Lorna Woods quoted here.

[3] See Professor Lorna Woods’ paper on how the balance between user safety and the protection of fundamental rights, including freedom of expression, can be struck: “A Statutory Duty of Care and Fundamental Freedoms” (2019).