Who should regulate to reduce harm in social media services?

  • by William Perrin, trustee of Good Things Foundation, Indigo Trust and 360 Giving and a former senior civil servant in the UK government, and Professor Lorna Woods, University of Essex.
  • 10 May 2018

We have set out in a series of blog posts a proposal for reducing harm from social media services in the UK (see end for details about the authors).   The harm reduction system will require new legislation and a regulator.  In this post we set out our first thoughts on the regulator, how to fund it and the simple legislation required to implement this proposal.

At the outset of this work we described a ‘regulatory function’.  Our approach was to start with the problem – harm reduction – and work forwards from that, as opposed to starting with a regulator and its existing powers and trying to fit the problem into the shape of its activities. Our detailed post on comparative regulatory regimes gave some insight into our thinking.  We now address two linked questions: whether a regulator is needed at all, and if so, what sort of regulator it should be.

The Need for a Regulator

The first question is whether a regulator is needed at all if a duty of care is to be created.

Is it not sufficient that individuals may seek redress in the courts in relation to this overarching duty (as opposed to bringing an action over an individual piece of content)? At least two significant pieces of legislation based on duties of care have no ‘regulator’ as such – the Occupiers’ Liability Act 1957 and the Defective Premises Act 1972.  By contrast, the Health and Safety at Work Act 1974 does rely on a regulator, now the Health and Safety Executive (HSE). A regulator can address asymmetries of power between the victim and the party causing the harm.  It is conceivable for a home owner to sue a builder, for a person to sue over harm suffered in a building, or for a person to sue a local authority over harm at a playground. However, there is a strong power imbalance between an employee and their boss, or even between a trade union and a multinational. A fully functioning regulator compensates for these asymmetries.  In our opinion there are profound asymmetries between a user of a social media service and the company that runs it, even where the user is a business, and so a regulator is required to compensate for the users’ relative weakness.

What Sort of Regulator?

Assuming a regulator is needed, should it be a new regulator built from the ground up or an existing regulator on which the powers and resources are conferred?  Need it be a traditional regulator, or would a self- or co-regulatory body suffice?  We do not at this stage rule out a co-regulatory model, although our preliminary conclusion is that a regulator is required.  As we shall see below, instances of co-regulation in the communications sector have run into problems.  Self-regulation works best when the public interest to be served and the interests of the industry coincide.  This is not the case here.

Whichever model is adopted, the important point is that the regulator be independent (and its members comply with the Nolan Principles). The regulator must be independent not only from government but also from industry, so that it can make decisions based on objective evidence (and not under pressure from other interests) and be viewed as a credible regulator by the public.  Independence means that it must have sufficient resources, as well as relevant expertise.

A completely new regulator created by statute would take some years to become operational.  OFCOM, for instance, was first proposed in the Communications White Paper in December 2000 and created by a paving Act of Parliament in 2002, but did not vest and become operational until 29 December 2003, at a cost of £120m (2018 prices).  In our view harm reduction requires more urgent (and less expensive) action.

We therefore propose extending the competence of an existing regulator.  This approach has a number of advantages: it spreads the regulator’s overheads further, draws upon existing expertise within the regulator (both in terms of process and substantive knowledge) and allows a faster start. We suggest the following (co-)regulators be considered: the Advertising Standards Authority (ASA), the British Board of Film Classification (BBFC), the Health and Safety Executive (HSE) and the Office of Communications (OFCOM), all of which have long-proven regulatory ability.

The BBFC seems to have its hands full with its role as the age verification regulator under the Digital Economy Act 2017.  The launch date has been missed for reasons that are unclear, and in our view this removes it from consideration.  This also raises the question of how well delegated responsibilities work; OFCOM has recently absorbed responsibilities in relation to video on demand, rather than continuing to delegate them to ATVOD. The ASA regulates some content online, including material on social media platforms, but this is limited to advertisements (including sponsorship and the like). Overall the ASA focusses quite tightly on advertising, and adding the substantial task of grappling with harm on social media services more broadly could damage its core functions.  The HSE has a strong track record in running a risk-based system to reduce harm in the workplace, including to some extent emotional harm.  It has a substantial scientific and research capability, employing over 800 scientists and analysts.  However, our judgement is that harm reduction in social media services requires a regulator with deep experience of, and specialism in, online industries, which is not where the HSE’s strengths lie.

Our recommendation is to vest the powers to reduce harm in social media services in OFCOM.  OFCOM has over 15 years’ experience of digital issues, including regulating harm and protecting young people in broadcasting, a strong research capability, proven independence, a consumer panel, and resilience in dealing with multinational companies. OFCOM is of a size (£110m-£120m annual income and around 790 staff) where, with the correct funding, it could support an additional organisational unit to take on this work without unbalancing the organisation.

The regulator could be funded by a small fraction of the revenue the Treasury plans to raise from taxing the revenues of internet companies.  The relative costs of large regulators suggest that the required resource would be in the low tens of millions of pounds.

Simple legislation to pass quickly

Action to reduce harm on social media is urgently needed.  We think that there is a relatively quick route to implementation in law: a short bill before Parliament would create a duty of care and appoint, fund and give instructions to a regulator.

We have reviewed the very short Acts that set up duties of care far more profound than those needed to regulate social media services – the Defective Premises Act 1972 is only seven sections and 28 clauses (unusually, it was a private member’s bill drafted by the Law Commission); the Occupiers’ Liability Act 1957 is slightly shorter. The central clauses of the Health and Safety at Work Act 1974 creating a duty of care and a duty to provide safe machines are brief.

For social media services, a duty of care and key harms are simple to express in law, requiring fewer than ten clauses, or fewer still if the key harms are set out as sub-clauses. A duty for safe design would require a couple of clauses.  Some further clauses amending the Communications Act 2003 would appoint OFCOM as the regulator and fund it for this new work. The largest number of clauses might be needed for definitions and for the parameters of the list the regulator has to prepare.  We speculate that an overall length of six sections totalling thirty clauses might do it.  This would be very small compared to the Communications Act 2003, which runs to 411 sections, thousands of clauses in the main body of the Act and 19 Schedules of further clauses.

This makes for a short and simple bill in Parliament that could slot into the legislative timetable, even one crowded by Brexit legislation. If the government did not bring legislation forward, a Private Peer’s/Member’s Bill could be considered.

We are considering drafting such a bill to inform debate and test our estimate.

About this blog post

This blog is the seventh in a programme of work on a proposed new regulatory framework to reduce the harm occurring on and facilitated by social media services.  The authors William Perrin and Lorna Woods have vast experience in regulation and free speech issues.  William has worked on technology policy since the 1990s, was a driving force behind the creation of OFCOM and worked on regulatory regimes in many economic and social sectors while working in the UK government’s Cabinet Office.  Lorna is Professor of Internet Law at University of Essex, an EU national expert on regulation in the TMT sector, and was a solicitor in private practice specialising in telecoms, media and technology law.  The blog posts form part of a proposal to Carnegie UK Trust and will culminate in a report later in the Spring.