How the new Sub-Committee on Disinformation can help strengthen democracy in the digital age

In April 2019 the Commons Digital, Culture, Media and Sport select committee established a sub-committee to continue its inquiry into disinformation and data privacy in the digital age. Michela Palese considers the motivations underlying the establishment of this sub-committee, its stated priorities, and how it can help confront the challenges and threats to our democratic processes arising from online campaigning.

Last month the Digital, Culture, Media and Sport (DCMS) select committee launched a new Sub-Committee on Disinformation. Its task is to become ‘Parliament’s institutional home’ for matters concerning disinformation and data privacy: a focal point that will bring together those seeking to scrutinise and examine threats to democracy.

The new sub-committee promises to offer an ongoing channel through which to gather evidence on disinformation and online political campaigning, and to highlight the urgent need for government, parliament, tech companies and others to take action so as to protect the integrity of our political system from online threats.

Damian Collins, chair of the DCMS committee, explained that the sub-committee was created because of:

‘concerns about the spread of disinformation and the pivotal role that social media plays. Disinformation is a growing issue for democracy and society, and robust public policy responses are needed to tackle it at source, as well as through the channels through which it is shared. We need to look principally at the responsibilities of big technology companies to act more effectively against the dissemination of disinformation, to provide more tools for their users to help them identify untrustworthy sources of information, and to provide greater transparency about who is promoting that content.’

The sub-committee follows up on the significant work conducted as part of the DCMS committee’s long-running inquiry into Disinformation and ‘Fake News’, whose final report was published in February 2019.

This inquiry ran for 18 months, held 23 oral evidence sessions, and took evidence from 73 witnesses: its final report contained a series of important conclusions and recommendations.

Among these, the report called on the government to look at how UK law should define ‘digital campaigning’ and ‘online political advertising’, and to acknowledge the role and influence of unpaid campaigns and Facebook groups both outside and during regulated campaign periods. It also advocated the creation of a code of practice around the political use of personal data, which would offer transparency about how people’s data are being collected and used, and about what messages users are being targeted with and by whom. It would also mean that political parties would have to take greater responsibility with regard to the use of personal data for political purposes, and ensure compliance with data protection and user consent legislation.

One of the report’s key conclusions was that current electoral law is not fit for purpose and does not reflect the changes in campaigning techniques brought about by the digital revolution. In particular, the DCMS committee recognised the lack of transparency about who is behind online adverts. In this regard, both the DCMS committee’s interim and final reports on disinformation called for electoral law to be strengthened and updated for the digital age, including through extending the rules on imprints (a disclosure stating who has paid for or promoted an advert) to online election material and creating an online advertising repository; recommendations which Dr Alan Renwick and I echoed in our report on improving information during election and referendum campaigns.

What should the sub-committee’s priorities be?

The sub-committee’s primary objective is to continue the DCMS committee’s ‘rigorous scrutiny of democratic accountability, and to play our part in protecting individuals from the insidious onslaught of disinformation and digital disruption.’ But what will its other priorities be and what else should it consider?

1. The sub-committee will consider new, specific instances of disinformation campaigns that lack transparency or are deliberately misleading.

One of the primary benefits of the sub-committee is that it will allow for the ongoing monitoring of developments in the field of disinformation and digital campaigning, and provide a constant, long-term parliamentary focus in this area. The sub-committee has already begun looking into Facebook ads run by anonymous campaign groups such as Mainstream Network and Britain’s Future, following up on evidence obtained as part of its inquiry into disinformation. As online campaigning is a rapidly changing area – with campaigners already beginning to circumvent social media companies’ new transparency and advertising requirements (such as Facebook’s online ads library) – the sub-committee will allow parliament – and the wider public – to be kept updated on these developments.

2. The sub-committee will analyse and respond to the government’s Online Harms White Paper.

In April 2019, the government published its long-awaited white paper on online harms, in which it set out plans for a package of measures aimed at improving and guaranteeing citizens’ safety online and at tackling online harms such as child abuse, terrorism, cyberbullying, and disinformation. The white paper accepted the DCMS committee report’s recommendation that social media companies should have a legal liability to take down harmful content hosted on their platforms. To this end, it proposed a new statutory duty of care towards users, which would make tech companies responsible for their users’ safety and for tackling harm caused by content or activity on their services. Compliance with this duty of care would be overseen by a new independent regulator. The white paper and accompanying consultation offer the sub-committee an important opportunity to help shape this statutory duty of care and the new regulator’s powers.

Somewhat disappointingly, and despite the wide range of harms in its scope, the white paper had little to say on the regulation of online political campaigning and on improving transparency around political advertising. Indeed, both the DCMS committee chair and Information Commissioner Elizabeth Denham lamented the lack of space given to these topics. For example, Damian Collins said:

‘The White Paper does not address the concerns raised by the Select Committee into the need for transparency for political advertising and campaigning on social media. We understand that this will soon be addressed separately by the Cabinet Office. Again, it is vital that our electoral law is brought up to date as soon as possible, so that social media users know who is contacting them with political messages and why. Should there be an early election, then emergency legislation should be introduced to achieve this.’

Beyond responding to the white paper, the sub-committee should continue to gather evidence, provide information and exert pressure on the government to take action on disinformation, on the regulation of online political campaigning, and on improving transparency around political ads.

3. The sub-committee will hold regular sessions with regulators and other experts and stakeholders.

As part of its inquiry into disinformation, the DCMS committee took evidence from a variety of experts and stakeholders, including the Electoral Commission and the Information Commissioner’s Office (ICO), which provided valuable insight into the complicated and interconnected issues surrounding disinformation, online political campaigning and foreign interference in elections and referendums. The new sub-committee will continue to provide a way for parliament to engage and collaborate with regulators, academics, civil society organisations, and other practitioners.

The sub-committee has already held an evidence session with the Information Commissioner, during which she updated MPs on the ICO’s investigation into the Mainstream Network campaign group. The Commissioner raised concerns about how personal data was being collected by anonymous campaign groups – potentially in breach of data protection legislation – and about the limits of Facebook’s new political advertising transparency tools.

The sub-committee has also announced that it will be holding an evidence session with Culture Secretary Jeremy Wright in May. A new standing order also means that the sub-committee will be able to invite members of other select committees to attend meetings and ask questions, thus allowing for collaboration and information-sharing, which is particularly important given the interconnected and cross-departmental nature of the issues involved.

4. The sub-committee will continue to collaborate with other national parliaments as part of the ‘International Grand Committee on Disinformation and “Fake News”’.

Given the globalised nature of social media, the DCMS committee’s inquiry into disinformation required it to work closely with other national parliaments. This transnational collaboration led to the establishment of an ‘International Grand Committee’ in autumn 2018, comprising members from nine legislatures around the world (Argentina, Belgium, Brazil, Canada, France, Ireland, Latvia, Singapore, and the UK). The Grand Committee held an evidence session in November 2018, at which Facebook’s Vice President of Policy Solutions, the Information Commissioner and Deputy Information Commissioner Steve Wood all provided evidence. The Grand Committee had invited Facebook CEO Mark Zuckerberg to appear, but he ‘refused twice saying he was “not able to be in London” and “not able to accept the invitation.”’

The Grand Committee’s second meeting will take place in Ottawa, Canada, on 28 May 2019 and invitations to appear before it have been sent to the CEOs of the main tech platforms, including Facebook, Google, Apple, Amazon, Twitter and Snapchat.

5. The sub-committee should put pressure on tech companies to take action on disinformation and online campaigning, and to provide evidence on how their platforms (and their algorithms in particular) work. 

The DCMS committee has repeatedly criticised tech companies’ – and in particular Facebook’s – lack of transparency and unwillingness to cooperate with its inquiry and with investigations conducted by the ICO and others. In its interim report of July 2018, the committee stated:

‘What we found, time and again, during the course of our inquiry, was the failure on occasions of Facebook and other tech companies, to provide us with the information that we sought. We undertook fifteen exchanges of correspondence with Facebook, and two oral evidence sessions, in an attempt to elicit some of the information that they held, including information regarding users’ data, foreign interference and details of the so-called ‘dark ads’ that had reached Facebook users. Facebook consistently responded to questions by giving the minimal amount of information possible, and routinely failed to offer information relevant to the inquiry, unless it had been expressly asked for.’

In its final report, the DCMS committee reiterated this criticism, saying that tech companies ‘should not be allowed to behave like “digital gangsters” in the online world, considering themselves to be ahead of and beyond the law.’

In light of the tech companies’ recent attempts to self-regulate and cooperate with national governments and parliaments, the new sub-committee will offer an avenue through which to monitor the action they take on disinformation and online campaigning, and to (hopefully) obtain more information on how these platforms work and how they are used by political campaigners.

6. The sub-committee should have as long-standing a status and as wide a remit as possible.

While the DCMS committee’s inquiry into disinformation was very much focused on the 2016 EU referendum and that same year’s presidential election in the United States, the new sub-committee should consider the issue of disinformation more comprehensively, focusing on the bigger societal problems and considering the broader implications for the integrity, credibility and transparency of our elections and referendums, and by extension of our democratic system as a whole. In order to do this properly, the sub-committee should be given sufficient time to gather evidence and testimonies, hold hearings, and collaborate with external stakeholders.

The sub-committee should also expand its remit beyond current technologies, engaging in horizon-scanning to understand how election rules and democratic processes can be future-proofed. 

Concluding Remarks

In recent years and months, a variety of voices have advocated for action to be taken to regulate online political campaigning and tackle disinformation, including the Electoral Commission, the ICO, the DCMS committee, the Electoral Reform Society (whose report on this issue I co-edited), and academics. Yet, beyond a limited consultation on extending imprints to online election material, the government has taken little action to address these concerns directly and in a comprehensive manner.

As Damian Collins expressed in his statement in Westminster Hall on the launch of the sub-committee, there is a ‘genuine danger to democracy and to society in the deliberate and malicious targeting of disinformation at citizens largely using social media to influence what they see and to influence their opinions of politics, society and institutions.’

Apart from the local elections taking place in England this week, the UK could soon be facing European Parliament elections, second referendums on both Brexit and Scottish independence, and even a general election, while our election and online campaign rules remain unfit for purpose in the digital age. The DCMS sub-committee will serve as an important reminder of the need to act to protect our democracy from online harms.

About the author

Michela Palese is Research and Policy Officer at the Electoral Reform Society. She was previously Research Assistant (McDougall Fellow) at the Constitution Unit, where she co-authored the report on Improving Discourse During Election and Referendum Campaigns alongside the Constitution Unit’s Deputy Director, Dr Alan Renwick.