The Joint Committee on the Draft Online Safety Bill: steps in the right direction for democracy

The government’s draft Online Safety Bill does little to protect democracy from damage caused by online actors, despite a previous commitment to take action. Alex Walker argues that this was an error. Here, he analyses the December report of the parliamentary joint committee tasked with examining the bill. A post in early February will critique the conclusions and recommendations of the DCMS select committee, which published its report earlier this week.

In December, the joint committee tasked with scrutinising the government’s draft Online Safety Bill published its report, the conclusions of which were outlined by its Chair, Damian Collins, on this blog. The committee recommended significant overarching changes to the draft bill, which represents the first major attempt in the UK at online regulation.

Since its publication in May 2021, the draft bill has been subject to extensive criticism, including on this blog. In previous posts, I’ve highlighted that it fails to address online threats to democracy. The government’s 2019 Online Harms white paper acknowledged the seriousness of this issue and set out measures to tackle it, but those proposals were later abandoned.

Encouragingly, the committee noted the government’s change of direction and concluded, contrary to the government’s revised position, that online harms to democracy should be tackled by legislation. Whilst the committee’s recommendations have their own limitations, if adopted they would protect democratic processes from online harm better than the draft bill does at present.

Evidence of online harms to democracy

The committee took evidence from a wide range of experts and stakeholders, receiving over 200 written submissions and hearing from more than 50 witnesses. Many witnesses described the damage being done to democracy online. The committee’s report reflects this, noting the role of algorithms in artificially inflating the reach of dangerous disinformation and referencing the 2021 insurrection at the US Capitol as an example of what can happen when election-related conspiracy theories are allowed to flourish. Furthermore, it cites the findings of the Disinformation and ‘fake news’ and Russia inquiries that digital platforms have been used by hostile state actors to try to influence UK electoral processes. The committee recognised that online disinformation has the capacity to harm democracy, national security and society more broadly.

In spite of this evidence, the government opted to focus only on individual harms, due to concerns about freedom of expression. The committee concluded that this would leave serious online harms unchallenged. It outlined an alternative approach which would tackle disinformation whilst minimising unjustified infringements on online speech.

Safety by design and enforcement of terms and conditions

The government’s present strategy for dealing with disinformation that harms society and democracy is through media literacy. The committee, however, said that ‘[the] viral spread of misinformation and disinformation poses a serious threat to societies around the world. Media literacy is not a standalone solution.’ The committee put forward recommendations for improving the government’s media literacy strategy and Ofcom’s statutory duties in this area, but agreed that this alone was insufficient. To tackle the wider harms caused by disinformation it also recommended platform design changes, better enforcement of terms and conditions and, for some types of content, targeted statutory provisions.

The committee received evidence from Facebook whistleblower Frances Haugen and others that the design of platforms can exacerbate the risk of harm. Algorithms are often designed to maximise engagement and target users based on what they are most likely to interact with, even if, for example, that content promotes harmful conspiracy theories. Haugen argued that it would be more effective to focus on the amplification effects of these design features rather than on individual pieces of content. The committee agreed that ‘safety by design’ should be an important component of the legislation. Companies should be required to make non-content based modifications that would help minimise wider harms on their platforms, particularly those caused by the viral spread of disinformation.

The committee also noted that many social media companies already have policies on disinformation, but that these are often inconsistently applied. It recommended that a statutory provision be included in the bill requiring companies to consistently apply their terms and conditions.

Statutory protection against election-related disinformation

The committee’s response to the challenge posed by harmful disinformation focused on safety by design, consistent enforcement of terms and conditions, and media literacy. Importantly, however, it concluded that for some categories of disinformation more specific statutory measures would be necessary. In particular, the committee argued that disinformation surrounding elections posed a sufficient risk to democracy that it should be addressed directly in legislation.

It proposed that ‘election material that is disinformation about election administration, has been funded by a foreign organisation targeting voters in the UK or fails to comply with the requirement to include information about the promoter of that material in the Elections Bill’ should be one of the categories of specified illegal content. On this basis, the revised bill should include ‘a statutory requirement on providers to have in place proportionate systems and processes to identify and mitigate reasonably foreseeable risks of harm’ arising from this kind of activity and content. It envisaged Ofcom’s mandatory code of practice requiring ‘dedicated teams for election periods’ and ‘the use of fact checking in proportion to reach and risk’.  

Narrow and specific versus broad and flexible

The committee’s recommendations regarding election-related disinformation tie in with one of the wider changes advocated in its report. The government’s draft bill contains a broad requirement that companies in scope set out how they will deal with content that is legal, but potentially harmful to adults. This provision has been criticised for delegating too much power to companies to decide what falls within this category. It has also been suggested that it might encourage overzealous moderation of legal content, which would limit freedom of speech online.

The committee recommended that this broad requirement be removed. Instead, regulated online service providers should be required to mitigate the harms arising from specific categories of content and activity. The categories, which are set out in the report, should reflect ‘specific areas of law that are recognised in the offline world’ as this means ‘society has recognised they are legitimate reasons to interfere with freedom of speech rights.’

A bill revised along these lines would provide a greater degree of certainty for users and companies as to what harmful content should be addressed. This would help guard against overzealous moderation and allay some of the concerns about freedom of expression. But as the committee itself acknowledged, it would create a narrower regulatory requirement than that contained in the draft bill and – in tying the categories in question to other areas of law – it would also be less flexible.

Further areas of uncertainty

Mandating technology companies to mitigate the harm caused by election-related disinformation is a positive development. However, the committee was unclear about where the provision should be included. The report first states that the government should ‘address the issue of disinformation which aims to disrupt elections’ via the Elections Bill, which was approved by the Commons on 17 January and has now moved on to the Lords. It seems unlikely that the government would be minded to insert a new disinformation provision at this stage.

However, the report then goes on to say that the Elections Bill should be amended to include the measure if ‘the government decides that the Online Safety Bill is not the appropriate place to do so’. This muddies the waters as to which piece of legislation the committee believes is the most appropriate vehicle for its proposal, which is unhelpful. It increases the likelihood that the measure will fall by the wayside – neither added to the Elections Bill by amendment in the Lords nor inserted into the Online Safety Bill when it comes to be redrafted. Without this statutory underpinning, it is uncertain whether election-related disinformation would still be addressed in Ofcom’s code of practice, which sets out the steps companies must take to fulfil their duties. Without such steps, UK elections will remain vulnerable.

Furthermore, much disinformation that damages democracy would fall outside the scope of the proposal. Although it would guard against disinformation alleging election fraud, for example, there are many kinds of false claims that circulate online and erode trust in democracy but do not relate specifically to the administration of elections. In taking a purely election-focused view of democracy, the committee’s approach would leave many types of damaging content beyond its scope.

Additionally, the committee focused primarily on countering disinformation at the expense of more positive measures to provide the public with better information online. These could have included a requirement that platforms promote authoritative news sources, as mentioned in the 2019 white paper, or the establishment of a ‘democratic information hub’, as recommended by the Unit’s Doing Democracy Better report. 

Secretary of State powers

Despite these limitations, the committee’s proposals would move the bill towards offering some protection to democracy from the destabilising effects of disinformation. The committee’s recommended restructuring would also help address the criticism that the draft bill allows for insufficient parliamentary input. As it stands, the draft bill is light on detail – with Ofcom and the Secretary of State delegated significant powers to fill in the gaps. The committee’s proposed changes – which would involve much more detail on the face of the bill – would help address this issue, allowing for parliamentary debate on a more precise piece of legislation.

The committee also recommended that clause 33, which has been especially controversial, be changed. As currently drafted, it gives the Secretary of State the power to direct Ofcom to modify its codes of practice so they are in line with government policy. This would seriously undermine the independence of the regulator and leave the door open to a government changing the rules to its own advantage. In the committee’s version of the bill, the government would only be able to direct Ofcom to change its codes of practice in relation to national security and public safety issues.

Joint Committee on Digital Regulation

It is vital that Ofcom be properly independent from government in performing its regulatory role. However, with such significant powers it is important that the regulator also be held accountable and its work overseen. The committee recommended an ongoing parliamentary Joint Committee on Digital Regulation to fulfil this function. It suggested this would fill a gap in the existing committee landscape, whilst also providing a focus for greater scrutiny of Ofcom in relation to its new remit.

Conclusion

Highlighting the value of pre-legislative scrutiny, the committee has put together a viable reworking of the draft Online Safety Bill that addresses several of its defects. If the government wants to improve the effectiveness of the legislation it should consider accepting the committee’s proposals.

In particular, it is vital that election-related disinformation be addressed in statute and that companies be required to tackle it on their platforms. This would protect democracy from the most immediate and serious of online harms. However, in order to strengthen democracy against the full range of challenges posed by digital technology, further interventions will be necessary.

This is the first in a two-part series of posts by Alex on the draft Online Safety Bill. The second will cover the conclusions and recommendations of the Digital, Culture, Media and Sport Select Committee’s report on the draft bill, which was published on 24 January. To be notified when the second post goes live, sign up for updates in the left sidebar.

About the author

Alex Walker is Communications Manager and Researcher at The Constitution Society.
