The EU referendum was the most fact-checked referendum of all time, yet voters were badly misinformed on key issues. In this post Zander Goss and Alan Renwick consider the effectiveness of fact-checking during the referendum. They conclude that, although fact-checkers were unable to overcome rampant misinformation, fact-checking must be embraced. Some suggestions are offered for how fact-checkers might better cut through to voters in future.
Fact-checking was a prominent feature of the EU referendum. Indeed, this was likely the most fact-checked referendum to date not only in the UK but anywhere in the world. Nevertheless, polling evidence suggests that widespread misperception of the EU and related issues such as immigration and so-called ‘benefit tourism’ remained – a Financial Times commenter even suggested after the vote that the UK had become a ‘post-factual democracy’. This post looks at the extent and nature of fact-checking in the UK and asks whether anything could be done to increase its impact. We are not yet ready to provide answers, but we seek to identify issues that deserve further discussion.
What is fact-checking and who are the fact-checkers?
Fact-checking is a form of journalism often credited as arising from ‘ad watches’ in the early 1990s, which assessed claims in American political advertising. Fact-check teams exercise editorial judgement to select verifiable assertions made by politicians and thoroughly analyse them, thereby informing voters and helping them to hold politicians accountable. The practice has grown dramatically since the founding of pioneers such as FactCheck.org in 2004 and PolitiFact.com in 2007. Duke University Reporters’ Lab’s 2016 fact-checking census found a 50 per cent increase in fact-checking sites worldwide in the year to 15 February 2016, listing 96 active projects in 37 countries.
This post focuses on four UK fact-checking sites contained in the Reporters’ Lab’s database: BBC Reality Check, Channel 4 FactCheck, Full Fact, and The Conversation UK. We exclude the Guardian’s Reality Check site, which is also included in Duke’s database, because of the newspaper’s public support for remaining in the EU. As broadcasters, the BBC and Channel 4 are of course required to remain impartial. Full Fact is a fact-checking charity using donations from individuals and charitable trusts to support fact-checks written by researchers and journalists. The Conversation UK publishes articles written by academics and edited by journalists in an effort to bring expert knowledge more directly to the public. BBC Reality Check was introduced for the 2015 General Election and revived for the EU referendum. The other three fact-checking units had operated for some time before the referendum was called.
How much fact-checking was there?
Full Fact and BBC Reality Check were by far the most prolific fact-checkers over the 125 days between David Cameron’s announcement of the referendum on 20 February and the close of polls on 23 June. In total, the four websites produced 454 fact-checks.
Fact-checkers for the 2016 referendum on the UK’s membership of the EU

| Organisation | Number of checks | Notes |
| --- | --- | --- |
| Full Fact | 212* | Referendum fact-checks collected in a ‘Europe’ section and subdivided by topic |
| BBC Reality Check | 207* | All referendum fact-checks in a single uncategorised feed |
| The Conversation UK | 18 | Referendum fact-checks collected in a ‘Brexit’ section, but not otherwise categorised |
| Channel 4 FactCheck | 17 | Referendum fact-checks not separated from other fact-checks, but keyword search available |

\* Includes videos and live fact-checks
By contrast, during the 2014 Scottish independence referendum, Full Fact posted just under twenty fact-checks. PolitiFact California has produced only four fact-checks across two referendum questions since its launch in November 2015. PolitiFact Florida has published 17 checks on six ballot questions since March 2010, whilst the Georgia affiliate offered 14 checks on one referendum question in 2012. Having reviewed the fact-checking available in a range of recent referendums in the US and around the world, we are confident that the UK referendum on EU membership made political history by being the most fact-checked referendum to date.
Was fact-checking effective?
Bill Adair, founder of PolitiFact and Director of Duke’s Reporters’ Lab, has stated that his ‘goal is not to get politicians to stop lying … [but] to give people the information they need to make decisions’. For the EU referendum, Full Fact specifically aimed to inform voters and assist journalists, but the organisation’s long-term goals also include ‘prevent[ing] inaccurate claims from being made in the first place’. Adair and Full Fact may have slightly different priorities, but both statements reveal two distinct ways in which fact-checking can affect political discourse: correcting misperceptions and deterring misrepresentations. It is much too early to assess the impact of fact-checking during the EU referendum in detail: that will require careful analysis of a range of sources. But some preliminary observations can be offered.
On the one hand, the fact-checkers clearly had impressive reach. Full Fact has published an after-action report on its work during the campaign. Its web traffic doubled overnight when the referendum was called – and continued to grow thereafter. It provided recurring fact-check segments for Good Morning Britain (ITV), Victoria Derbyshire (BBC), Sky News, and LBC, and its fact-checks were featured in many of the main newspapers. It also had significant exposure on social media, hosted a Wikipedia ‘edit-a-thon’, and was mentioned in the Commons. The BBC’s fact-checks were used in other BBC journalism online, on radio, and on television, and @BBCRealityCheck now has over 22,000 followers on Twitter. Identifying the impact of Channel 4 FactCheck is harder, but The Conversation UK reported considerable interaction on social media and some republication in traditional media.
On the other hand, much misinformation clearly remained. Ipsos MORI polling showed that misperceptions remained rife among voters. The Leave campaign’s infamous claim that the UK sent the EU £350 million per week (or some variant on this claim) was fact-checked by The Conversation UK (28/4), received a ‘Fiction’ rating by Channel 4 (19/4), and appeared in five separate fact-checks by the BBC (7/3, 11/3, 15/4, 22/4, 15/6) and a whopping 13 posts by Full Fact between 25 February and 22 June. Initially, these fact-checks explained why the £350 million figure failed to tell the full story, but by mid-April, and as Brexiteers persisted with the claim, the tone at Full Fact and the BBC shifted to declare that Leave’s figure was simply wrong. Despite the unequivocal verdicts in these fact-checks, by mid-June 47 per cent of British adults still thought that the £350 million per week claim was true (39 per cent said it was false, and 14 per cent were unsure).
The failure of such extensive fact-checking to curb public misperception or continued misrepresentations by politicians raises the question of whether anything could have been done to increase the impact of this work.
Corrective effects: visual ratings and readership
In the US and around the world, roughly four out of five fact-checkers use rating systems. These are usually depicted as graphics illustrating how accurate the claim in question is, such as PolitiFact’s six-point Truth-O-Meter™ scale from ‘True’ to ‘Pants on Fire’ (from the playground taunt ‘Liar! Liar! Pants on fire!’) and the Washington Post Fact Checker’s Pinocchio scale (wholly accurate claims receive a ‘Geppetto Checkmark’, while anything less than completely true receives between one and four Pinocchios). In the UK, however, only Channel 4 uses a rating system, running from Fact to Fiction, with a medium rating for claims which are dubious, cannot be verified, or are a thorough mix of fact and fiction.
Of Channel 4’s 17 fact-checks touching on the referendum, only five posts included visual ratings. Channel 4 issued nine ratings in total (two posts contained multiple ratings for different claims), comprising three ‘Facts’, four ‘Fictions’, and three medium ratings. Remain tallied one ‘Fact’ and one medium, while Leave racked up two ‘Facts’, two mediums, and all four ‘Fictions’.
Whether the use of visual ratings makes a difference to the impact of fact-checking is unclear. The corrective effect of similar rating systems was tested in a 2015 online survey experiment. The researchers found that participants were more likely to assess a claim accurately if they read a fact-check and that a majority of participants preferred having visual ratings. However, fact-checks were equally effective regardless of whether they included a visual rating. Further research is needed to investigate whether the same holds true in real-world settings, especially in the UK. We might expect visual ratings to make fact-checking conclusions more memorable over time, which the experimental design could not pick up.
A companion study found that interest in fact-checking articles was significantly greater among respondents with high levels of political knowledge than those with low knowledge. The learning effects of fact-checking were likewise greater for highly politically aware respondents than for less aware respondents. If there are fears about fact-checking being a playground for the political junkie, then it is interesting to note that the BBC Reality Check site presented a chronological feed of fact-checks. Presumably, a chronological feed would be better suited to those who actively followed the news and campaigns than to the casual voter seeking information.
Deterrent effects: could the fact-checkers hold politicians accountable?
Regardless of whether visual ratings have a corrective effect on voter information, they may yet discourage politicians from making flawed claims. It is becoming clear in the US – where 81 per cent of fact-checkers use rating systems – that politicians are responding to the rise of fact-checking. Jeb Bush provided a notable example in 2014 when he hedged the statistics he relied on for fear of being ‘PolitiFacted’. Results from a field experiment in 2012 suggest that reminding politicians of the risks to their reputation if their claims are labelled false or misleading may deter them from fibbing. American state legislators who were sent letters to this effect in the months leading up to the 2012 general election were markedly less likely subsequently to receive negative assessments from fact-checkers than their peers who were sent vaguer letters or no letters at all. Further research on this point would be valuable for improving the level of debate in any future referendum, especially as there have been concerns over how to hold responsible those politicians who made misleading claims during the campaign (which may be a symptom of the increasing personalisation of politics in the UK).
It may be that UK fact-checkers could increase the deterrent effect of their posts if their verdicts came with some standard rating and were indexed more systematically. Whereas PolitiFact California’s webpage, for example, can be sorted by people, election, subject matter, or rating (showing every claim and individual that has been labelled ‘Pants on Fire’), Full Fact was alone amongst UK fact-checkers in indexing its site, and even then only by subject matter. Fact-checkers’ overall ratings of candidate truthfulness have received much attention in the current US presidential election cycle. Making similar ratings available in the recent referendum campaign – either for individual campaigners or for the two sides of the debate in their entirety – might have cut through to voters in a way that large numbers of earnest and laudable fact-checking articles did not.
Fact-checking must be embraced
Although the fact-checkers were unable to overcome rampant misinformation during the referendum campaign, we will never know how much worse public understanding would have been without the 454 fact-checks that did circulate. The rigorous and independent fact-checking movement is still young in this country, but it has made a positive mark on UK politics and should be welcomed into the mainstream. We hope that some of the suggestions above, and any future research in the area, may help to this end.
About the authors
Zander Goss is a Research Volunteer at the Constitution Unit.
Dr Alan Renwick is the Deputy Director of the Constitution Unit.