Fact-checking and the EU referendum

The EU referendum was the most fact-checked referendum of all time, yet voters were badly misinformed on key issues. In this post Zander Goss and Alan Renwick consider the effectiveness of fact-checking during the referendum. They conclude that, although fact-checkers were unable to overcome rampant misinformation, fact-checking must be embraced. Some suggestions are offered for how fact-checkers might better cut through to voters in future.

The claim: Despite the referendum on EU membership being the most fact-checked referendum of all time, many voters were badly misinformed.

The verdict: TRUE. It is extremely unlikely any other referendum has ever been as extensively fact-checked as this one. Sadly, misinformation was rampant even as voters went to the polls. No one is certain how to make fact-checking more effective, but there are many ideas which merit further research.

 

Fact-checking was a prominent feature of the EU referendum. Indeed, this was likely the most fact-checked referendum to date not only in the UK but anywhere in the world. Nevertheless, polling evidence suggests that widespread misperception of the EU and related issues such as immigration and so-called ‘benefit tourism’ remained – a Financial Times commenter even suggested after the vote that the UK had become a ‘post-factual democracy’. This post looks at the extent and nature of fact-checking in the UK and asks whether anything could be done to increase its impact. We are not yet ready to provide answers, but we seek to identify issues that deserve further discussion.

What is fact-checking and who are the fact-checkers?

Fact-checking is a form of journalism often credited as arising from ‘ad watches’ in the early 1990s, which assessed claims in American political advertising. Fact-check teams exercise editorial judgement to select verifiable assertions made by politicians and thoroughly analyse them, thereby informing voters and helping them to hold politicians accountable. The practice has grown dramatically since the founding of pioneers such as FactCheck.org in 2004 and PolitiFact.com in 2007. Duke University Reporters’ Lab’s 2016 fact-checking census found a 50 per cent increase in fact-checking sites worldwide in the year to 15 February 2016, listing 96 active projects in 37 countries.

This post focuses on four UK fact-checking sites contained in the Reporters’ Lab’s database: BBC Reality Check, Channel 4 FactCheck, Full Fact, and The Conversation UK. We exclude the Guardian’s Reality Check site, which is also included in Duke’s database, because of the newspaper’s public support for remaining in the EU. As broadcasters, the BBC and Channel 4 are of course required to remain impartial. Full Fact is a fact-checking charity using donations from individuals and charitable trusts to support fact-checks written by researchers and journalists. The Conversation UK publishes articles written by academics and edited by journalists in an effort to bring expert knowledge more directly to the public. BBC Reality Check was introduced for the 2015 General Election and revived for the EU referendum. The other three fact-checking units had operated for some time before the referendum was called.

[Cartoon: ‘PolitiFact’] Source: xkcd (http://xkcd.com/1712/)

How much fact-checking was there?

Full Fact and BBC Reality Check were by far the most prolific fact-checkers over the 125 days between David Cameron’s announcement of the referendum on 20 February and the close of polls on 23 June. In total, the four websites produced 454 fact-checks.

Fact-checkers for the 2016 referendum on the UK’s membership of the EU

Organisation        | Number of checks | Notes
Full Fact           | 212*             | Referendum fact-checks collected in a ‘Europe’ section and subdivided by topic
BBC Reality Check   | 207*             | All referendum fact-checks in a single uncategorised feed
The Conversation UK | 18               | Referendum fact-checks collected in a ‘Brexit’ section, but not otherwise categorised
Channel 4 FactCheck | 17               | Referendum fact-checks not separated from other fact-checks, but keyword search available

* Includes videos and live fact-checks

By contrast, during the 2014 Scottish independence referendum, Full Fact posted just under twenty fact-checks. PolitiFact California has produced only four fact-checks across two referendum questions since its launch in November 2015. PolitiFact Florida has published 17 checks on six ballot questions since March 2010, whilst the Georgia affiliate offered 14 checks on one referendum question in 2012. Having reviewed the fact-checking available in a range of recent referendums in the US and around the world, we are confident the UK referendum on EU membership made political history by being the most fact-checked referendum to date.

Was fact-checking effective?

Bill Adair, founder of PolitiFact and Director of Duke’s Reporters’ Lab, has stated that his ‘goal is not to get politicians to stop lying … [but] to give people the information they need to make decisions’. For the EU referendum, Full Fact specifically aimed to inform voters and assist journalists, but the organisation’s long-term goals also include ‘prevent[ing] inaccurate claims from being made in the first place’. Adair and Full Fact may have slightly different priorities, but both statements reveal two distinct ways fact-checking can affect political discourse: correcting misperceptions and deterring misrepresentations. It is much too early to assess the impact of fact-checking during the EU referendum in detail: that will require careful analysis of a range of sources. But some preliminary observations can be offered.

On the one hand, the fact-checkers clearly had impressive reach. Full Fact has published an after-action report on its work during the campaign. Its web traffic doubled overnight when the referendum was called – and continued to grow thereafter. It provided recurring fact-check segments for Good Morning Britain (ITV), Victoria Derbyshire (BBC), Sky News, and LBC, and its fact-checks were featured in many of the main newspapers. It also had significant exposure over social media, hosted a Wikipedia ‘edit-a-thon’, and was mentioned in the Commons. The BBC’s fact-checks were used in other BBC journalism online, on radio, and on television, and @BBCRealityCheck now has over 22,000 followers on Twitter. Identifying the impact of Channel 4 FactCheck is harder, but The Conversation UK reported considerable interaction on social media and some republication in traditional media.

On the other hand, much misinformation clearly remained. Ipsos MORI polling showed that misperceptions remained rife among voters. The Leave campaign’s infamous claim that the UK sent the EU £350 million per week (or some variant on this claim) was fact-checked by The Conversation UK (28/4), received a ‘Fiction’ rating by Channel 4 (19/4), and appeared in five separate fact-checks by the BBC (7/3, 11/3, 15/4, 22/4, 15/6) and a whopping 13 posts by Full Fact between 25 February and 22 June. Initially, these fact-checks explained why the £350 million figure failed to tell the full story, but by mid-April, and as Brexiteers persisted with the claim, the tone at Full Fact and the BBC shifted to declare that Leave’s figure was simply wrong. Despite the unequivocal verdicts in these fact-checks, by mid-June 47 per cent of British adults still thought that the £350 million per week claim was true (39 per cent said it was false, and 14 per cent were unsure).
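The dispute over the £350 million figure was, at bottom, simple arithmetic about gross versus net annual contributions. As an illustrative sketch only – using the 2015 ONS figures quoted in the comments on this post, which were themselves contested – converting annual sums to weekly ones shows how far the gross and net figures diverge:

```python
# Illustrative sketch of the gross-vs-net arithmetic behind the £350m claim.
# The figures below are the 2015 ONS numbers quoted in the comments on this
# post; they are assumptions for illustration, not settled amounts.
GROSS_ANNUAL = 19.6e9  # £ billed to the UK by the EU in 2015 (gross)
NET_ANNUAL = 10.9e9    # £ after the UK rebate and EU payments to UK projects

def pounds_per_week(annual_pounds: float) -> float:
    """Convert an annual sum in pounds to millions of pounds per week."""
    return annual_pounds / 52 / 1e6

print(f"Gross: about £{pounds_per_week(GROSS_ANNUAL):.0f}m per week")  # ~£377m
print(f"Net:   about £{pounds_per_week(NET_ANNUAL):.0f}m per week")    # ~£210m
```

On these assumed figures, the gross contribution works out at roughly £377 million a week while the net contribution is roughly £210 million – which is why fact-checkers consistently judged that the £350 million headline figure failed to tell the full story.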

The failure of such extensive fact-checking to curb public misperception or continued misrepresentations by politicians raises the question of whether anything could have been done to increase the impact of this work.

Corrective effects: visual ratings and readership

In the US and around the world, roughly four out of five fact-checkers use ratings systems. These are usually depicted as graphics illustrating how accurate the alleged fact is, such as PolitiFact’s six-point Truth-O-Meter™ scale from ‘True’ to ‘Pants on Fire’ (from the playground taunt ‘Liar! Liar! Pants on fire!’) and the Washington Post Fact Checker’s Pinocchio scale (wholly accurate claims receive a ‘Geppetto Checkmark’, while anything less than completely true receives between one and four Pinocchios). In the UK, however, only Channel 4 uses a rating system, from Fact to Fiction, with a medium rating for claims which are dubious, cannot be verified, or are a thorough mix of fact and fiction.

Of Channel 4’s 17 fact-checks touching on the referendum, only five posts included visual ratings. Channel 4 issued ten ratings in total (two posts contained multiple ratings for different claims), comprising three ‘Facts’, four ‘Fictions’, and three medium ratings. Remain tallied one ‘Fact’ and one medium, while Leave racked up two ‘Facts’, two mediums, and all four ‘Fictions’.

Channel 4’s visual ratings scale. Source: Channel 4 FactCheck (http://blogs.channel4.com/factcheck/)

Whether the use of visual ratings makes a difference to the impact of fact-checking is unclear. The corrective effect of similar rating systems was tested in a 2015 online survey experiment. The researchers found that participants were more likely to accurately assess a claim if they read a fact-check, and that a majority of participants preferred having visual ratings. However, fact-checks were equally effective regardless of whether they included a visual rating. Further research is needed to investigate whether the same holds true in real-world settings, especially in the UK. We might expect visual ratings to make fact-checking conclusions more memorable over time, an effect which the experimental design could not pick up.

A companion study found that interest in fact-checking articles was significantly greater among respondents with high levels of political knowledge than among those with low knowledge. The learning effects of fact-checking were likewise greater for highly politically aware respondents than for less aware respondents. If there are fears about fact-checking being a playground for the political junkie, then it is interesting to note that the BBC Reality Check site presented a chronological feed of fact-checks. Presumably, a chronological feed would be better suited to those who actively followed the news and campaigns than to the casual voter seeking information.

Deterrent effects: could the fact-checkers hold politicians accountable?

Regardless of whether visual ratings have a corrective effect on voter information, they may yet discourage politicians from making flawed claims. It is becoming clear in the US – where 81 per cent of fact-checkers use rating systems – that politicians are responding to the rise of fact-checking. Jeb Bush provided a notable example in 2014 when he hedged the statistics he relied on for fear of being ‘PolitiFacted’. Results from a field experiment in 2012 suggest that reminding politicians of the risks to their reputation if their claims are labelled false or misleading may deter them from fibbing. American state legislators who were sent letters to this effect in the months leading up to the 2012 general election were markedly less likely subsequently to receive negative assessments from fact-checkers than their peers who were sent vaguer letters or no letters at all. Further research on this point would be valuable for improving the level of debate in any future referendum, especially as there have been concerns over how to hold responsible those politicians who made misleading claims during the campaign (which may be a symptom of the increasing personalisation of politics in the UK).

It may be that UK fact-checkers could increase the deterrent effect of their posts if their verdicts came with some standard rating and were indexed more systematically. Whereas PolitiFact California’s webpage, for example, can be sorted by people, election, subject matter, or rating (showing every claim and individual who has been labelled ‘Pants on Fire’), Full Fact was alone amongst UK fact-checkers for indexing its site, and even then only according to subject matter. Fact-checkers’ overall ratings of candidate truthfulness have received much attention in the current US presidential election cycle. Making similar ratings available in the recent referendum campaign – either for individual campaigners or for the two sides of the debate in their entirety – might have cut through to voters in a way that large numbers of earnest and laudable fact-checking articles did not.

Fact-checking must be embraced

Although the fact-checkers were unable to overcome rampant misinformation during the referendum campaign, we will never know how much worse public understanding would have been without the 454 fact-checks that did circulate. The rigorous and independent fact-checking movement is still young in this country, but it has made a positive mark on UK politics and should be welcomed into the mainstream. We hope that some of the suggestions above, and any future research in the area, may help to this end.

About the authors

Zander Goss is a Research Volunteer at the Constitution Unit.

Dr Alan Renwick is the Deputy Director of the Constitution Unit.

9 thoughts on “Fact-checking and the EU referendum”

  1. well, here is a fact for you re the £350 million per week claimed to go to EU coffers.
    Figures from the ONS recently released confirm that the much contested £350 million per week sent to the EU was an underestimate! So the Brexiteers were right after all (which we knew anyway), and of course the other massive costs of the imposts on British business by the tangled thicket of EU regulation is just about incalculable and obviously therefore cannot be included in the £350 million:

    The EU billed the UK almost £376m each week in 2015, according to the ONS

    According to the Office of National Statistics the EU billed the UK for £19.6 billion in 2015 or almost £376 million each week.

    The revelation comes after repeated claims by the Remain campaign and its supporters that Vote Leave’s battlebus gross figure of £350 million a week going to the EU was “a lie”.

    They claimed the number was much lower but instead the ONS has confirmed it was an UNDER-estimate.

    Once the UK rebate worth around £4.9 billion a year and EU payments to specific projects in Britain had been taken away then the UK handed over £10.9 billion to the EU in 2015 or £199 million each week.

    According to the Office of National Statistics the correct figure is £376m a week.

  2. The referendum was a complete disgrace. We allowed a charlatan Republic to take hold. In Portsmouth, press and politicians worked hard to regurgitate things which were simply not true. People on both sides of the argument with sensible and measured arguments, some of which were finessed over years of research, were muscled out by thugs given a sense of entitlement that this is how to win. Anti-Polish abuse was scrawled on a war memorial in my city.

    Lying to voters goes hand in hand with voter intimidation and abuse. Some academics absorbed too much for speaking up. The mentality was ‘whatever it takes’. Our politicians were culpable in putting people in harm’s way with xenophobic rhetoric causing hate crime to shoot up. They deny it, but there should be a league table compiled ahead of the September 5th debate. Not all areas were as unlucky as Portsmouth.

    The Electoral Commission should have stepped in when concerns were originally raised. How is it possible the returning officer could be neutral if the organisation he belonged to was telling people how to vote in the referendum (and that on the back of lies)? The Electoral Commission should have stripped the agents and canvassers of being allowed anywhere near the public. When these people were asked to explain their views and answer the questions, those politicians took a vow of silence. It wasn’t a referendum, it was a circus of the surreal.

  3. I thought the post by Dr Renwick prior to the referendum (see https://constitution-unit.com/2016/06/08/can-we-improve-the-quality-of-the-referendum-debate/) went further than this one in examining the possibility of some legal sanction against clear factual untruths in election or referendum campaigns. The highly reprehensible – and highly effective – campaigning methods in this recent referendum (to some extent by both sides but in terms of blatant lying clearly more by the “leave” side) make to my mind action in this direction worthy of serious consideration – as indeed I argued in the comments posted at that time. Obviously the risks to freedom of speech must be carefully examined and minimised but it is just not good enough to rely on “unofficial” fact checking sites. These can simply be countered with equally weighty-sounding sources claiming the opposite.

    Political decisions of the magnitude addressed by the EU referendum are just too important to justify the status quo in this regard.

  4. For Zander Goss and Alan Renwick –

    You may possibly be interested in my paper attached on the flawed referendum – it’s been very well received. I should be grateful for corrections.


  5. Skfarouk. “Our politicians were culpable in putting people in harms way with xenophobic rhetoric causing hate crime to shoot up”

    What on earth is that supposed to mean? There is no such thing as the worn-out concept of “hate crime”. There is hate, and there is crime, but not hate crime. It is not a crime to hate. This is another neo-Marxist attempt to curb, suppress or remove free speech, or the expression of criticism of people or policies by those with whom we disagree, by using words empty of meaning.

    I have no love of politicians but we put them in place to offer opinions, policies, and to legislate. Get used to it!

  6. I’m surprised that nobody really seems to mention the “backfire effect” when talking about fact checking services.

    Attempting to debunk a lie by a politician by repeating it in BIG BOLD LETTERS at the top and then spending several paragraphs on picking it apart can actually lead to people believing the lie more strongly.

    See “When Corrections Fail: The Persistence of Political Misperceptions”, doi: 10.1007/s11109-010-9112-2

  7. @grahamwood32

    This poster displays a common misconception about “hate crime”. This is in no way the creation of “hate” as a crime in itself. Hate crime may be defined as “Any criminal offence which is perceived, by the victim or any other person, to be motivated by a hostility or prejudice based on a person’s race or perceived race” or
    “Any criminal offence which is perceived, by the victim or any other person, to be motivated by a hostility or prejudice based on a person’s religion or perceived religion”

    It has entered the legal lexicon because the police and other criminal justice agencies consider all hate crime to be very serious, including racist and religious hate crime. When a case is prosecuted, the courts can impose a stronger sentence under powers from the Criminal Justice Act 2003. This reflects the priority placed on these crimes. However the fact that someone simply hates a person or thing does not constitute a crime.

  8. Pingback: We must be realistic about what independent regulation of referendum campaigns might be able to achieve | The Constitution Unit Blog
