The House of Lords Democracy and Digital Technologies Committee has published a report on how democracy can be strengthened as technology evolves, endorsing Unit Deputy Director Alan Renwick’s key recommendation of a democratic information hub. Alex Walker offers an analysis of the report.
On 29 June, the House of Lords Select Committee on Democracy and Digital Technologies published a major report, following its inquiry into the effects of digital technology on democracy. The report focuses on how the practices of many large digital technology platforms risk feeding an erosion of trust in democracy, and sets out a regulatory framework designed to restore faith in the system. Importantly, it goes beyond this to look at improving digital skills and using technology to aid democratic engagement.
The committee’s recommendations on fact-checking, digital imprints, libraries of online political advertising, and promotion of digital literacy echo those of many earlier analyses, including those of the Electoral Commission and the Independent Commission on Referendums, as well as the Unit’s Doing Democracy Better report, published last year. Drawing on one of the core proposals of Doing Democracy Better, Alan Renwick, the Unit’s Deputy Director and an author of that report, along with co-author Michela Palese and Joe Mitchell (then of Democracy Club), gave written evidence to the committee setting out the case for an independent democratic information hub. The committee fully endorsed the proposal.
What follows is a summary of the main issues and recommendations contained in the report.
Democracy, trust and digital technology
The committee took a broad view of democracy, defining it in terms of shared institutions and values. Technology has transformed the political landscape, with online platforms such as Facebook, Twitter and Google now the main forums of political debate and sources of information. Whilst this has enabled increased participation, it has also transferred considerable power to these corporations. The title of the report reflects what the committee saw to be the major, worrying consequence of these developments – an undermining of trust in democracy.
Digital technology has in many ways exacerbated a pre-existing crisis of trust in politicians and provided a means by which those who are hostile to democracy can undermine it. The committee argues that it is imperative we rethink how the online world is governed and seek to make it more conducive to democracy. To this end, the report sets out six principles the committee believes to be essential to a healthy democracy: informed citizens; accountability; transparency; free and fair elections; inclusive public debate; and an active citizenry. Their recommendations aim to shift digital technology towards supporting these principles.
Informed citizens
One of the report’s primary concerns is with what the committee’s Chair, Lord Puttnam, has referred to as ‘a pandemic of “misinformation” and “disinformation”’ online. The report cites research from Full Fact into the 2019 election showing that misinformation diminishes trust in democracy and makes people less likely to vote.
The committee began by looking at the subject of online political advertising, highlighting that political adverts are not covered by the Committee of Advertising Practice’s non-broadcast code, which regulates online commercial advertising. The report proposes a regulatory committee on political advertising to establish an enforceable code of practice that bans fundamentally inaccurate political adverts.
However, the committee acknowledges that the problem of misinformation and disinformation is much broader than political campaigns. The coronavirus pandemic has made abundantly clear the tangible dangers of false information. On this front, the report examines the increasingly important role of fact checkers. Whilst it is welcome that companies such as Facebook have introduced third-party fact checking initiatives, the committee observes that the current fact checking infrastructure in the UK is not up to the scale of the task and that online platforms lack a consistent approach to dealing with misinformation. The committee recommends a system of funding to promote independent fact checking organisations, and a code of practice that stops online platforms from recommending content identified as misinformation to new audiences.
Crucially, the report agrees with the argument made in Doing Democracy Better that confronting misinformation is not a sufficient strategy – promoting quality information is also vital. The committee considers how official statistics might be more effectively communicated to the public and how parliamentary expertise might be better used. On the latter point, it references the oral evidence given by Unit Deputy Director Alan Renwick, who highlighted that the House libraries ‘produce great, impartial information but that more could be done for them to feed into wider democratic processes’.
Accountability
That powerful institutions and organisations are held accountable is a core feature of democracy. Technology platforms have often claimed that they are not responsible for the political content hosted on their sites, arguing that they do not want to curtail freedom of expression. Yet, as the report points out, it is increasingly clear that platforms do shape democratic debate online. The algorithms that categorise platform users and promote content to drive advertising revenue determine what content thrives online. Furthermore, through their content moderation policies digital platforms set the limits of online conversation.
The government’s Online Harms White Paper, published in April 2019, acknowledged the active role of technology companies and stated that they should be made responsible for harms to the individual experienced on their platforms. It proposed a statutory duty of care, with Ofcom as the regulator.
The committee proposes that this duty of care should extend to actions that undermine democracy, arguing that generic harms to democracy also impact the rights of individuals. However, it also recognises that much harmful content is legal and that banning it may limit free expression. In answer to this, the report recommends that technology platforms only be responsible for harmful content that their systems recommend to a large number of users.
This should be set out in a code of practice, enforced by Ofcom, which would be given fining powers of up to 4% of global turnover. In addition, the committee suggests an independent ombudsman to which the public can appeal content moderation decisions, and a parliamentary committee of both Houses to oversee the ombudsman and Ofcom’s work on the issue.
Transparency
The report considers evidence for a number of further ways in which it is purported that technology platforms damage democratic discussion, from the proliferation of ‘filter bubbles’ to the promotion of outrageous content by algorithmic design. Whilst it does not discount these risks, the committee observes that the evidence it heard was mixed, a problem that researchers have often attributed to a lack of data.
In order to hold powerful technology platforms properly to account, there needs to be greater transparency about their practices. The report recommends that Ofcom be given the power to compel technology companies to facilitate research that is in the public interest through data sharing. Furthermore, it proposes greater transparency regarding the algorithms used by platforms, with Ofcom required to conduct regular audits.
Free and fair elections
It is abundantly clear from the committees and commissions that have reported on the subject that electoral law has not kept up with the pace of technological change. The current legal framework was designed for a more analogue age, and, as the committee argues, its insufficiency undermines trust in the electoral process.
One issue that has been raised repeatedly in these discussions, including by the Unit, is that of imprints on digital campaign materials. Whilst offline election advertisements are required to show who is behind the material, no parallel requirement is made of online adverts, making them less transparent. The committee helpfully points out that the Political Parties, Elections and Referendums Act 2000 (PPERA) contains the delegated power to extend imprints to online election material. It recommends that the government make this change via secondary legislation without delay.
Since 2010, the Electoral Commission has had the power to issue fines of up to £20,000. However, with a total of £41.6 million being spent at the 2017 election, there is a worry that a £20,000 fine could be considered merely a cost of doing business. On this basis, the committee recommends raising the maximum fine the Electoral Commission can levy to £500,000, or 4% of the total campaign spend, whichever is greater. A number of other extensions to the power and remit of the Electoral Commission are proposed, including oversight of local candidate spending and further information gathering powers.
Several additional measures put forward are designed to restore trust in online election-related activity. These include greater scrutiny of small donations, a comprehensive, publicly accessible library of online political advertising, and a statutory code of practice governing the use of personal data by political campaigns.
Active citizens: digital media literacy
The committee is clear that ensuring a healthy democracy requires more than purely preventative measures. Equipping citizens with the knowledge and skills they need to be active participants is also crucial. The extent to which citizenship education in the UK has been successful in providing adequate political literacy is questionable. What is clear is that additional skills are now needed to prepare citizens to take part in an increasingly online environment.
Foremost amongst these is digital media literacy, which the committee defines as being able to distinguish fact from fiction, understand digital platforms and influence decision-makers online. The report examines the examples of Finland and Estonia, both of which have made concerted efforts to impart the critical skills needed for digital citizenship. The report describes the UK curriculum, by comparison, as representing ‘a chronic lack of ambition’.
There is at present a diverse and diffuse range of digital media literacy initiatives, many of which are led by civil society groups. The government has not taken a leading role on this front, which might be partly explained by a lack of clarity about departmental responsibility. The report makes two recommendations aimed at improving this situation. Firstly, there should be a large-scale programme set up to map, review and evaluate existing initiatives. Secondly, the Department for Education, informed by the above review, should work to embed critical digital media literacy across the curriculum.
Inclusive debate: democracy-enhancing technology
Although much of the report focuses on mitigating the damaging effects of online platforms, the committee also considers the ways in which technological innovations can help support democracy. It is positive about digital tools that help citizens better engage with the representative system, such as TheyWorkForYou, but warns against seeing technology as a silver bullet or a way of doing public engagement on the cheap.
The committee also praises recent citizens’ assemblies as ‘a positive, UK-based example of engagement blending digital and face-to-face participation.’ It highlights in particular the Innovation in Democracy Programme (IiDP), which used digital tools to feed into the offline deliberation process. Coronavirus has led to increased experimentation with online deliberative engagement. Whilst important lessons can be learnt from this, the committee believes that methods of non-digital engagement, such as face-to-face deliberation, should not be abandoned.
The report comments on the work being done by mySociety and Democracy Club in aggregating electoral information. However, it notes that too much of the legwork here is being done by small civic technology companies, with no centralised, official source for this information.
To fill this gap, the committee adopted the proposal put forward by Alan Renwick, Michela Palese and Joe Mitchell for an independent democratic information hub. The hub would be a publicly funded, coordinated brand with pathways to different forms of relevant information. It would include basic information such as where to vote and who the candidates are. As it becomes an established, trusted brand it could build on this to include analysis from other independent institutions, such as the Institute for Fiscal Studies and the Office for Budget Responsibility. As well as this public-facing function, the committee envisages the information hub as a centre for policymakers and civil society organisations to connect and share best practice in digital democracy.
About the author
Alex Walker is a former research volunteer at the Constitution Unit.