Safer Internet Day 2021, where have we got to?
- by Maeve Walsh, Carnegie Associate
- 9 February 2021
- 5 minute read
“Online harms regulation is long overdue,” I wrote this time last year as we waited for the UK Government’s response to the Online Harms White Paper and anticipated the progress of Lord McNally’s Private Member’s Bill (supported by Carnegie UK Trust) “to bring some much needed urgency to this issue, by giving Ofcom new powers to prepare for the implementation of a duty of care system”.
“Let’s hope we don’t have to wait until Safer Internet Day 2021 for them to do so.”
So, on Safer Internet Day 2021, where have we got to? The pandemic – and long periods of time out of school and online at home – has increased the risk of the most serious harms to children, particularly the most vulnerable, and further emphasised the need for urgent regulatory action to protect them. Meanwhile, in Westminster, the DCMS Secretary of State and Digital Minister have changed and the Government has published both its interim response (February 2020) and full response (December 2020) to the White Paper. The latter confirmed Ofcom as the new regulator for online harms, and legislation (now called the “Online Safety Bill”) is expected “later this year”. Lord McNally’s Bill, however, still awaits its Second Reading and, as things stand, Ofcom will not be exercising any of its new regulatory powers until the Government’s Bill receives Royal Assent – which could now be well into 2022.
Looking beyond the UK, we can see the Digital Services Act in Brussels, the Online Safety and Media Regulation Bill in Ireland and proposals for internet regulation being drafted in Canada. All in all, a step forward from last year. But the major exception is the United States, home to so many of these services, where, despite positive noises, it isn’t yet clear how the new Biden Administration will act.
There is much to welcome in the UK Government’s proposals, particularly the proposed approach built on the need for robust and continuous risk assessment of the design and safety of the systems and processes online platforms use to deliver their services. This approach draws heavily on our work on a statutory duty of care for online harm reduction that bites at the systemic level, rather than at the level of individual pieces of content.
The protection of children is at the heart of the UK Government’s proposals. However, we have concerns about the way that the threshold for the general duty is framed (i.e. significant physical and psychological harm) and what that will mean in practice in relation to harms to children and other vulnerable users. We also want to see further detail on how this requirement on companies will work and who will independently scrutinise the companies’ assessments:
“All companies will be required to assess the likelihood of children accessing their services. If they assess that children are likely to access their services, they will be required to provide additional protections for children using them.” (Para 2.15)
The publication of the (for now, voluntary) code of practice to address child sexual exploitation and abuse will provide companies and the regulator with an impetus to take more stringent action on the most harmful, illegal activity ahead of the introduction of the Bill. Furthermore, this is an area where Ofcom might consider taking a more active role – for example, chairing meetings to oversee the industry action being taken under the voluntary code and, in effect, adopting a “practice” role ahead of receiving its statutory powers. (Its new powers to enact the new provisions for video-sharing platforms, in effect, give it a road map for this already; at the very least, the introduction of the new Online Safety Bill shouldn’t lead to a diminution in the level of protection for children.)
The requirement for companies whose services are likely to be accessed by children to carry out a child safety risk assessment is also the right approach, and we look forward to this being built on “safety by design” principles that can be applied consistently between this particular area of harm and all the others within scope of the legislation. If Ofcom wanted to go further in its “practice” role, the development and implementation of child risk assessments could be one of the priority areas in which to start developing and testing elements of the regime before it receives its full powers, following the example of the Medical Research Council in response to the Warnock review of human embryology.
Safer Internet Day is – every year – a great opportunity for educators, parents, carers and children to reflect on their online lives and share resources, information and tools to help young people and children make the most of the opportunities of the online world, while managing the risks. The online world continues to move fast and it remains difficult for people to keep up. The pandemic has thrown the very real risks of serious harms to children into stark relief. But, as I wrote last year:
“the tech companies funding the educational outreach [for schools and families] are the same companies who could take much more significant steps – eg designing their systems with a view to reducing the risk of harm to users – to prevent those harms happening in the first place, regardless of how new or innovative the product or service is, or who it is intended to be used by”.
The UK Government has signalled that this will become a regulatory requirement against which Ofcom can hold them to account. The clock is now ticking for the tech companies to get their system design and risk assessment approaches in place to meet this statutory requirement.
So … let’s hope we don’t have to wait until Safer Internet Day 2022 – or the passage of the Online Safety Bill into law (whichever is sooner) – for them to do so.