What new challenges does the changing nature of campaigning pose for referendum regulation?

Earlier this year, the Constitution Unit established an Independent Commission on Referendums to review the role of referendums in British democracy – whose work will be discussed at a public seminar next week. In this blogpost, Alan Renwick and Jess Sargeant examine some of the difficult questions the commission will have to consider. Their focus is on the way in which political campaigning has changed since 2000, when the current legislation regulating referendums was enacted.

The UK’s current legislation regulating the conduct of referendums – the Political Parties, Elections and Referendums Act (PPERA) 2000 – was designed and introduced almost two decades ago. Since then, technological innovations have led to new ways of campaigning and communicating. These changes create new challenges for referendum regulation. While most of these challenges are not unique to referendums – they apply equally to elections – one key task of the Independent Commission on Referendums is to assess how well the existing rules work in the context of new digital developments and to consider solutions to some of the problems posed by the modern world. This blog post explores just some of those challenges.

Financial regulation doesn’t reflect the modern world

Increasingly, political campaigners are using social media to communicate with voters. We know this because we can observe political adverts on Facebook, Twitter, and even Instagram during elections and referendum campaigns. However, we have very little information about how much money they are spending to do so. This is because financial regulation of political campaigns, first designed in 2000, has yet to be updated to reflect the nature of campaigning in the modern world.

Registered referendum campaign groups are required to submit returns of referendum expenses. The purpose of these transparency requirements is to allow campaign spending to be scrutinised by both the Electoral Commission and the public. The requirements apply equally to expenses incurred for online and for offline campaigning. However, the way this spending is reported makes scrutiny of online spending difficult. There is no separate category for spending on social media: such spending is reported under either ‘advertising’ or ‘unsolicited material sent to voters’. Furthermore, within these categories online spending is identifiable only where it was incurred directly with a platform such as Facebook, Twitter, or YouTube. Spending through agencies remains opaque, with no breakdown of how the money is used. In this area, it could be argued that the transparency requirements are rendered meaningless.

Another problem is that the largest costs associated with online campaigning may not be captured by current financial regulations at all. Groups wanting to spend over £10,000 in a referendum campaign must register with the Electoral Commission as permitted participants. During the ten-week regulated referendum period they must adhere to spending limits. The limits for the 2016 EU referendum were as follows:

[Table: campaign spending limits for permitted participants in the 2016 EU referendum]

But the major costs – such as creating databases or profiling citizens for micro-targeting – may be incurred outside the regulated period. Groups with comprehensive, expensive databases collected prior to the 10-week period could gain an advantage over opponents, undermining the level playing field that spending limits are designed to achieve.

‘Dark ads’ and micro-targeting 

Social media can be an important tool for democratic engagement, allowing parties and campaigners to communicate with wider, and usually younger, audiences. However, concerns have been raised about the use of big data to profile UK voters and target them with highly tailored messaging. This concern is amplified by the fact that most micro-targeted ads are not public. Facebook’s ‘dark ads’ are only visible to Facebook, the advertiser, and the person being targeted. As a result, it is difficult for any claims or promises made by campaigners in these advertisements to be publicly scrutinised or debated.

It can be argued that micro-targeting is simply an old campaign technique in a new form. For example, Claire Bassett, Chief Executive of the Electoral Commission, has said:

‘Targeting is not new for political campaigning; parties and candidates have always focused their efforts on different demographics or swing voters. For decades, certain streets in a constituency may have received one version of a leaflet, with another half a mile away getting a different one. What we are seeing now is an evolution of this, deploying a different medium, albeit with increased scale, speed and specificity.’

However, a key difference between online and offline political campaign communication is the requirement to identify the source. The Political Parties, Elections and Referendums Act 2000 requires all printed referendum material to include an imprint stating the details of the printer and promoter of the material, and the person on whose behalf the material is promoted. Because the legislation refers specifically to printing, it does not apply to online referendum material, making it much harder to identify who is disseminating campaign messages. This can undermine transparency and make it much more difficult to hold campaign groups to account.

‘Fake news’ and disinformation

The invention of the internet and social media means that anyone can publish content that may be viewed by millions of people. As a result, the importance of traditional gatekeepers of information – journalists and established news organisations – has declined. Whilst gatekeepers are expected to make efforts to establish the legitimacy and accuracy of the information they publish, the average citizen has few obligations in this regard. False or inaccurate information has therefore been able to proliferate on the internet, in a phenomenon often termed ‘fake news’. There is concern that such disinformation can interfere with opinion formation: voters may make decisions based on false or inaccurate information. Damian Collins MP, chair of the Digital, Culture, Media and Sport Committee, which is currently undertaking an inquiry into fake news, has called it ‘a threat to democracy’.

Disseminators of disinformation often pose as informational ‘news’ sources rather than as campaigners with the stated intention of persuading the public to vote for a particular outcome. Such fake news sources lack the accountability of registered referendum campaigners, and the information they spread is more likely to be completely fictitious and deployed for non-democratic purposes.

Fake news can be amplified by fake profiles designed to distort discourse and debate, and to influence voters. There are no requirements to confirm your identity when using the internet or creating a social media profile, so it is much easier than it once was to pose as someone else. As early as 2002, concerns were raised that companies might be creating fake profiles to shape debate on issues whilst appearing to be neutral third parties – a practice known as ‘astroturfing’. More recently, there have been allegations that the Russian government has been seeking to corrupt democratic votes around the world: Theresa May suggested in November that it was ‘seeking to weaponise information’ and ‘[d]eploying its state-run media organisations to plant fake stories and photo-shopped images in an attempt to sow discord in the West and undermine our institutions’. One group of researchers found that 419 suspended Twitter accounts that had been tweeting about Brexit were run from the Russia-based Internet Research Agency, though a recent study by the Oxford Internet Institute concluded that the importance of bots in influencing political debate is limited and that only a tiny proportion were of Russian origin.

The ‘new’ gatekeepers are the tech companies on whose platforms fake news spreads. Some efforts have been made to tackle the phenomenon. Facebook now employs fact checkers who can flag false stories as ‘disputed’, although the effectiveness of these efforts has been questioned. Twitter has begun suspending suspected bot accounts. However, as private companies they have limited incentives to act, since fake news can itself be a source of revenue.

Next steps

None of these problems has an easy solution, and most are the subject of their own in-depth inquiries and commissions. It will not be the job of the Independent Commission on Referendums to solve them all. Nonetheless, its recommendations will need to take account of the new environment in which referendums operate.

This post draws on background research prepared for the Independent Commission on Referendums and does not represent the views of the Commission or any of its members.

To hear more about the Commission and its work, come to our seminar, to be held at UCL at 1pm on 17 January 2018: register here.

The Commission is keen to hear your thoughts on the questions within its remit. If you would like to express your views, please do so by filling in a public consultation form. The deadline for submissions is 15 February 2018.

About the authors

Dr Alan Renwick is the Deputy Director of the Constitution Unit. 

Jess Sargeant is a Research Assistant for the Independent Commission on Referendums, based at the Constitution Unit.
