The draft Online Safety Bill: abandoning democracy to disinformation

The draft Online Safety Bill published in May is the first significant attempt to safeguard the public from online harms through legislation. However, as Alex Walker explains, the government’s current proposals are a missed opportunity to address online harms to democracy and could even make tackling disinformation more difficult.

In May, the government published its draft Online Safety Bill, which is currently undergoing pre-legislative scrutiny by a committee of both Houses. It is also the subject of an inquiry by the Digital, Culture, Media and Sport (DCMS) Sub-committee on Online Harms and Disinformation. Published two years after the Online Harms white paper, the draft bill represents the first major attempt in this country to regulate the online environment and the major companies that dominate it. Given the significance of the bill, the parliamentary attention it is currently receiving is welcome. Nevertheless, as much of the evidence given to parliament points out, the draft bill has significant weaknesses. In September, Constitution Unit Deputy Director Alan Renwick and I submitted evidence to the DCMS Sub-committee inquiry. We highlighted the draft bill’s failure to address online harms to democracy. There is a danger that in its present form the bill will make it more difficult to tackle disinformation that damages and undermines democracy.

Abandoning the field: from the Online Harms white paper to the draft Online Safety Bill

As previously documented, in the course of the development of the online safety regime, measures to strengthen democracy in the face of new challenges posed by digital technology have been dropped from the proposals. The Online Harms white paper, published in April 2019, was explicit that various types of online activity could harm democracy. It referenced concerted disinformation campaigns, deepfakes, and micro-targeting. The white paper set out a number of actions that it expected would appear in the regulator’s Code of Practice. These included: using fact-checking services, especially during election campaigns; limiting the visibility of disputed content; promoting authoritative news sources and diverse news content; and processes to tackle those who misrepresent their identity to spread disinformation.

In many areas, the white paper’s position chimed with the findings of a major inquiry into disinformation conducted by the DCMS select committee over the previous eighteen months.

But the publication of the draft Online Safety Bill in May confirmed that the government has opted for a much more limited approach. Only disinformation that could have a significant adverse physical or psychological impact on an individual is now in scope. In choosing this approach, the government ignored the recommendations of the House of Lords Democracy and Digital Technologies Committee, which proposed that certain service providers should have a duty of care towards democracy.

The emphasis has shifted decisively: rather than acknowledging that online platforms bear responsibility for the impact their technology has on democracy, the government now proposes to leave political content entirely unregulated, regardless of the broader democratic consequences.


Updating campaign regulation for the digital era

John Pullinger, chair of the Electoral Commission, argues that digital campaign regulations need an ‘overhaul’ to make the electoral process more transparent and accessible to voters, thereby increasing confidence in the system in a manner that doesn’t discourage parties, candidates and campaigners from taking part in elections. He also calls on the UK’s parliaments to show that they do not tolerate online activities that undermine democracy.

Digital channels are transforming our democracy. Action now can harness that transformation to make political campaigns better. Without the right action, our democracy may not be resilient in the face of the challenges posed by the digital era. But there is nothing unique to elections in this. It applies in the same way to how technological change is affecting so many aspects of our lives. And we can respond in the same way.

Voters can already be sceptical about what they see on social media and practise the art of asking questions: who is telling me this? Can I be sure it is really from them? Why are they telling me this? Can I believe what they are saying? How can I check it out? Parties, candidates and campaigners can already use digital tools like imprints to show where information is coming from.

Other voices can already accentuate the positive and shame the bad. Social media platforms, news organisations, influencers and fact checkers increasingly see this as central to their own reputation. A platform is not neutral: it has values, and it shows its true colours by how it acts. Platforms that stand on the sidelines risk being seen as complicit in undermining democracy; those that stand tall can provide a vital public service that will enhance their brand.
