The draft Online Safety Bill: the view of the Joint Committee

The government’s draft Online Safety Bill has been subjected to pre-legislative scrutiny by a joint committee of MPs and peers: an unusual procedural step. Following the publication of the committee’s report, its chair, Damian Collins, outlines the key findings and recommendations.

On 14 December the Joint Committee on the Draft Online Safety Bill, which I chair, published its final report on the government’s plans to ‘make the UK the safest place in the world to be online’.

Keen followers of Westminster will know that a pre-legislative joint committee of the House of Lords and House of Commons is a rare creature, typically brought into existence no more than once in the lifetime of a parliament. When there are high levels of interest in a draft bill across all parties and both chambers, such a committee can prove a useful tool for stress-testing its most critical clauses. Given that the Communications Act 2003, which established Ofcom, was subject to the same scrutiny under the chairmanship of Lord (David) Puttnam, it is fitting that the next major reform in media regulation should have followed the same path.

For me this started in 2018, when I chaired a House of Commons inquiry into Disinformation and ‘Fake News’, followed by another in 2019 into Immersive and Addictive Technologies. These inquiries were conducted by the Digital, Culture, Media and Sport Select Committee; they called out big tech companies for behaving like ‘digital gangsters’ with users’ privacy and safety, and recommended that the UK set up an independent regulator to hold them to account for any harms they caused.

Fast-forward to 2021, and the government set out to do just that, publishing a draft Online Safety Bill in the spring and setting up a Joint Committee in the summer to scrutinise the proposed legislation. Composed of some of parliament’s most experienced members on tech policy, media regulation, civil liberties and business governance, we set straight to work. Over the last five months we have held 30 hours of public evidence sessions and read more than 200 pieces of written evidence. We have spoken with over 50 witnesses: ministers, academics, civil society campaigners, industry executives, whistleblowers, and many other parliamentarians from the UK and abroad. After many hours of closed deliberations, we unanimously agreed on 127 recommendations.

Overall, it was clear to us that the bill needed to be restructured, to make it clearer for the regulator as well as for the services in scope, and stronger too, to ensure that platforms follow UK law, not just terms of service written in Silicon Valley.

The bill should state up front what this law will mean and the overall safety objectives that both the regulator and the service providers should follow. In essence, Ofcom will be put in charge of making sure that platforms’ systems and processes comply with existing UK law, so that their algorithms do not serve up content that would clearly never be allowed offline. Ofcom should also make sure that those platforms do not endanger public health or national security, that they give children greater protection than adults, that their business models and systems are safe by design rather than as an afterthought, and that they uphold users’ freedom of expression and right to privacy.

That way, platforms will have a proactive (not just reactive) duty to mitigate content and activity that fuels not only the crimes of terrorism, child sexual abuse and fraud, but also:

  • discrimination against people with protected characteristics listed in equalities legislation;
  • disinformation about the administration of an election, the intimidation of a political candidate, and failure to declare payment for a political advertisement, all offences in the new Elections Bill;
  • the facilitation of human trafficking, which the Nationality and Borders Bill is making a life-sentence offence.

We have also recognised that, in addition to existing or soon-to-be-updated legislation, there is a clear need to create new communications offences for harms we could scarcely have imagined as little as 10 years ago. Based on the Law Commission’s report on modern communications offences, and the evidence we received from the Epilepsy Society, Professor Clare McGlynn of Durham University, and Ian Russell, father of Molly Russell, we believe that the Online Safety Bill should create new offences for: maliciously sending flashing images to someone with epilepsy (Zach’s Law); cyberflashing, which we were told affects 76% of girls aged 12 to 18; and promoting self-harm and suicide. With these new offences in law, platforms would be required under the Online Safety Bill to make sure their systems and processes do not facilitate or promote such content or activity.

Throughout our inquiry, we knew that we had to strike the right balance between the objective of protecting users’ safety and the right to freedom of expression, which is safeguarded under Article 10 of the European Convention on Human Rights. The draft bill already imposes a duty on platforms to uphold users’ freedom of speech, by being transparent about their content policies and by providing clear methods of redress for anyone who wishes to contest content being taken down. We have added another level to that: if you believe you have exhausted all of the possible complaints processes and still have a case to make, you should be able to take it to a new Online Safety Ombudsman. We also believe that individuals should be able to seek redress in the courts against a service provider, if they can show that they have suffered as a consequence of that company failing to meet the obligations created for it by the Online Safety Bill.

We have also recommended that the proposed exemption for news media organisations in the draft bill be made automatic. The news media is already subject to the courts and to its own frameworks of self-regulation, with clear editorial liability. By its very nature it relies on the immediacy of its content to survive, so it is only right that news publisher content should not be moderated, restricted or removed unless it clearly constitutes a criminal offence. The draft bill also sought to specifically protect journalistic content and content of democratic importance, and whilst we sympathised with the motivation to protect such high-value speech, it was our view that these protections were better encompassed in the well-established concept of content in the ‘public interest’: a test that any media lawyer will recognise, and one that we believe will better protect citizen journalists.

If the government accepts our revised model, Ofcom would have the power to set the safety standards that the bill expects online service providers to meet. It would also have extensive auditing powers to gather the information it requires to ensure that companies are complying with what is expected of them. Specifically, Ofcom would draw up a list of all the risks found across the platforms and search engines in scope: addictive algorithm design, infinite scrolling, one-click sharing, AI moderation, end-to-end encryption and anonymity, to name some of the risky features we heard evidence on. On that basis it would create risk profiles of service providers, from the highest-risk platforms to the lowest. According to its assigned risk profile, each platform would then have to carry out an internal risk assessment of its own service, identifying which offences could potentially arise from its design features and how it will mitigate them. All of this it would have to demonstrate in regular transparency reports that Ofcom could audit.

To help every service in scope meet these requirements, Ofcom will issue mandatory Codes of Practice: on tackling terrorist content and child sexual exploitation and abuse; on promoting safety by design, freedom of expression and digital literacy; and on age assurance where a platform is accessible to children. Ofcom would also set out how existing offences in UK law apply online, and what service providers would be expected to do to mitigate the existence and distribution of content that promotes them. And if platforms refuse to play ball, not only should they face fines of up to 10% of global turnover, as the draft bill proposes, but a designated, director-level Safety Controller should also be held criminally liable for non-compliance with the bill.

As to next steps, the government will issue a response to the Committee within two months, and send a revised bill to the floor of the House of Commons. It will then undergo the usual process of parliamentary scrutiny. I believe our unanimous report is a true reflection of views held on all sides of the Lords and Commons on what is an incredibly complex issue: I heartily recommend that the government accept the recommendations in full.

About the author

Damian Collins MP is Chair of the Joint Committee on the Draft Online Safety Bill.