The politics of polling

Chapter 3: Media reporting of voting intention polls

96.In the previous chapter we highlighted the fact that, while polling practices might not be perfect, polling organisations have every incentive to be accurate and they are continually trying to improve their techniques in order to ensure that their methodologies provide results which are as robust as possible. However, we were concerned to hear that these efforts can sometimes be undermined by the ways in which voting intention polls are presented, interpreted and reported on in the media. While there are undoubtedly good examples of media reports about polls, there are also examples of polls being reported in a hyperbolic way, overstating the importance of small changes that are not distinguishable from sampling variability. Such reports undermine the efforts of polling organisations to reflect public opinion accurately.

97.In this Chapter we consider media reporting of voting intention polls specifically. In general, the voting intention polls to which we refer in this Chapter are produced by polling organisations which are members of the British Polling Council (BPC) (in Chapter 4 we consider policy issues polls, which are produced by a much wider range of individuals and organisations). We discuss the media as a whole, though we recognise that different challenges are faced by individual sectors of the media, and that different guidelines and regulations cover traditional print media, broadcasters and online publishers. Such regulations are covered in more detail in Chapter 5.

Concerns about media reporting of polls

98.Will Moy, Director of Full Fact, said: “Lots of claims are made about polling at election time, and we have seen a wide variety of nonsense about polls.”108 Dr Lauderdale, from the London School of Economics, also told us that: “There are particular pathologies in the way the media presents polls, which are frequent and well known: overemphasising small changes from the last poll or a poll done by a different pollster, changes that are consistent with the random variation inherent in any kind of survey.”109 In this section, we outline some of the main criticisms of media reporting on polls put to us during the inquiry.

The ‘horse race’

99.A key criticism which emerged from the evidence was that the media often reduces polling to a focus on the ‘horse race’ between the two major parties, and that discussions on policy receive less prominence as a result. Dr Nick Anstead, from the London School of Economics and Political Science, told us that “during election campaigns, the media become fixated on who is winning and losing an election, and small movements in the various parties’ level of support, to the exclusion of discussing policy and substantive political issues.” He said that there was some evidence that the pressures of 24-hour broadcast news coverage and online commentary had increased the reliance on such ‘horse race’ coverage in recent years. He added that “such coverage is popular with audiences, so this phenomenon might be demand rather than supply-led.”110

100.Focussing on the ‘horse race’ can be a problem for several reasons. First, it can crowd out discussions on policy matters. Professor Tait, Professor of Journalism at Cardiff University, told us that newspapers and polling companies were producing more and more polls and that this encouraged a focus on the ‘horse race’. In his view, broadcast editors always had to consider how much to spend “on who is winning and who is losing” compared to how much to spend analysing attitudes. He added that “the sheer weight of polls is a factor in determining where newspapers and broadcasters focus their attention.”111

101.Secondly, this focus means that other important factors highlighted in polls are missed. This point was made by polling companies who noted that undue attention could be placed on small and often statistically insignificant movements in vote shares. They suggested that: “This often manifests in a misunderstanding of elections as horse races and misses out on the other data provided by polls.”112

102.Thirdly, paying too much attention to the ‘horse race’ is problematic because it frames other discussions about political events. For example, as previously highlighted, during the 2015 General Election campaign, a lot of media coverage focussed on the possibility of a hung parliament and the various coalition deals that might emerge under that scenario. Dr Anstead put it to us that:

“Arguably, these discussions had a material effect on the election result, with the possibility of a Labour-led ‘coalition of chaos’ providing a powerful rhetorical device for the Conservatives. Different polls, showing a significant Conservative lead over Labour, for example, might have led to a rhetorically very different campaign, with Conservative plans for government facing a much higher level of scrutiny.”113

Such a media focus is particularly problematic when the discourse is based on inaccurate polls.

Lack of reference to important caveats

103.Published polls conducted by reputable polling companies are usually accompanied by information describing the methodologies used, including the sample size, the population represented, question wording, and the margin of error. While many may view these details as unnecessary small print, this information is crucial to assessing the credibility of the poll’s results. Unfortunately, however, this information is not always communicated in media reports, especially in secondary reporting.

104.Reporting the margin of error in polling was a particular cause for concern. In general, polling organisations publish estimates for vote shares with a margin of error of plus or minus three percentage points to take account of sampling variability.114 This means, for example, that if a poll estimates the Labour vote share as 40%, then the true Labour vote share in the population could be anywhere between 37% and 43%. If a poll estimates that the Labour party and the Conservative party each have a vote share of 40%, this could in fact cover any situation ranging from a 43% Labour / 37% Conservative split, to a 37% Labour / 43% Conservative split.
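
The ‘plus or minus three points’ convention follows from the standard formula for the sampling error of an estimated proportion. The sketch below is illustrative only: it assumes a simple random sample of 1,000 respondents and a 95% confidence level, whereas published polls use quota samples and weighting, which tend to make the effective margin of error somewhat larger.

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate margin of error for an estimated proportion p from a
    simple random sample of size n, at roughly 95% confidence."""
    return z * math.sqrt(p * (1 - p) / n)

# A party estimated at 40% in a poll of 1,000 respondents:
share, sample_size = 0.40, 1000
moe = margin_of_error(share, sample_size)
print(f"Margin of error: +/-{moe:.1%}")                            # about +/-3.0%
print(f"Plausible range: {share - moe:.1%} to {share + moe:.1%}")  # about 37% to 43%
```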

105.Dr Anstead said that few journalists are polling specialists and that they therefore “stress novelty and a dynamic situation.” As an example, he noted that “statistically insignificant changes in the level of support are often recorded as being meaningful. Methodological caveats are sometimes omitted or, if they are included, not made prominent enough.”115 Johnny Heald, Managing Director, ORB International, told us that when journalists review polls:

“Practically, particularly with social media and the need to push something out overnight, online and so on, there is probably not as much rigour as there should be … There is a checklist of things that the industry has, but I would argue in my experience that journalists want the story to justify the agenda or to push something out, and they are not necessarily spending enough time looking at the detail.”116

106.Sky News noted that a large majority of polls are commissioned by media organisations, largely newspapers, whose main interest during election campaigns “is to gain first access to the polling figures”. It noted that “parties that are flat-lining don’t make headlines” and that there was a bias towards reporting headline polling figures that emphasised changes in likely voting intention. Sky News highlighted the fact that such changes often fell within the margins of error, but that the headline figures were reported as increases or decreases, “with essential commentary on sampling errors etc. relegated either to a footnote or not mentioned at all”.117

107.The reporting of trends can also be problematic. Will Moy said that Full Fact had seen examples of reporting which took individual polls out of context, in order “to produce the classic, ‘It is on a knife-edge’” report, rather than looking at the full breadth of the polling evidence. He highlighted one example where a newspaper had “compared two different polls, from two different companies, using two different methods, to claim a bombshell showing ‘May plummeting by 11 points’.”118

Headlines and margins of error

108.The lack of nuance and caveats was a particular problem in relation to headlines. This is because, even if the article does contain the necessary methodological detail, the key message that most readers or listeners will draw will be based on the headline.

109.Johnny Heald explained that a poll’s margin of error was often ignored when its results were reported. He noted that there was no real difference, for example, between a poll result showing a 49%/51% split of voters and one showing a 48%/52% split. However, he added, “if it jumps from 48 for leave to 51 for leave, the headline is not, ‘It’s the same’. The whole agenda of the paper changes. The markets react, and 1.2% is lopped off the pound overnight. That happens on the basis of a question that costs £250 to put to 2,000 people.”119

110.Ben Page, Chief Executive, Ipsos MORI, felt that sub-editors had “a lot to answer for.” He gave us the following example of a misleading headline that did not take account of the margins of error:

“There was one newspaper headline in 2015 where it said, ‘Up 1%’, which is clearly absolute rubbish, but it was seized on in a poll that moved in the direction that that newspaper liked. Being able to control the subs has probably become a little harder. We are making sure that all the details of the surveys are usually in the text and are not too misleading, but there are issues about how much prominence a newspaper will give to a poll that it likes as opposed to one that does not confirm its prejudices.”120

Professor Chris Hanretty, Royal Holloway, University of London, noted that teaching people about statistics was challenging, adding that “it is difficult to give a good summary of what a margin of error is, so a sub-editor might say, ‘That is the first thing to go’.”121

111.Over the course of the 2017 election campaign, Full Fact investigated a number of political claims, including the reporting of a political opinion poll by The Mail on Sunday. On 24 April 2017 the newspaper led with a headline claiming: “Tory lead is slashed in half after tax U-turn: Bombshell Mail on Sunday poll shows May plummeting by 11 points … denting hopes of a landslide”.122

112.Full Fact suggested the paper was not reporting the results of political opinion polling accurately, stating that:

“The paper quotes a recent opinion poll conducted by market researchers Survation at the end of last week, which put the Conservatives on 40% and Labour on 29%—an 11 point lead. The headline gets this slightly muddled, describing it as Theresa May plummeting by 11 points.

It then compares it to another poll from Tuesday last week, from researchers at ICM. It put the Conservatives on 46% compared to Labour’s 25%—a 21 point lead.

That’s a big difference over four days, but not one we can draw a trend from.”123

Full Fact went on to highlight that it is difficult to compare different polls directly because of the sometimes large differences in their methodological approaches.124
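
Part of the difficulty is purely statistical: the sampling uncertainty attached to a change between two separate polls is larger than that attached to either poll on its own, and differences in the companies’ methodologies (‘house effects’) add further variation that no formula captures. The sketch below shows the sampling component alone, assuming two independent simple random samples of 1,000 respondents each (sample sizes chosen purely for illustration), and uses the Conservative shares of 46% and 40% quoted above.

```python
import math

def sampling_error(p: float, n: int) -> float:
    """Standard error of an estimated proportion from a simple random sample."""
    return math.sqrt(p * (1 - p) / n)

def moe_single(p: float, n: int, z: float = 1.96) -> float:
    """Margin of error for a single poll estimate at roughly 95% confidence."""
    return z * sampling_error(p, n)

def moe_difference(p1: float, n1: int, p2: float, n2: int, z: float = 1.96) -> float:
    """Margin of error for the change between two independent poll estimates
    (sampling error only; methodological 'house effects' are not captured)."""
    return z * math.sqrt(sampling_error(p1, n1) ** 2 + sampling_error(p2, n2) ** 2)

print(f"Single poll:              +/-{moe_single(0.46, 1000):.1%}")
print(f"Change between two polls: +/-{moe_difference(0.46, 1000, 0.40, 1000):.1%}")
```

Even before any allowance is made for differences in methodology, the sampling uncertainty attached to a change between two polls is roughly 40% larger than that attached to a single poll.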

What can polling organisations do?

113.The industry body for the major polling organisations in the UK is the British Polling Council. The BPC told us that it has two main objectives: “The first is to promote transparency in the publication of polls. The second is to promote public understanding of opinion polls.” In order to do this, its rules of disclosure place certain obligations on its members. In particular, member organisations are obliged to ensure that certain key details, including the client which commissioned the poll, the dates of fieldwork, the method of interviewing, the population represented, the percentages upon which conclusions are based, and the sample size, are included in any initial publicity surrounding the publication of a poll.125

114.In addition, the BPC rules oblige members to publish the above information on their own website, together with further details, including the full wording and ordering of the questions asked and complete data tables showing the weighted and unweighted bases for all published findings.

The BPC noted that this additional information “should normally be published within two working days of the initial release of the results, though for polls of vote intention conducted during election and referendum campaigns members have committed themselves to publishing this information within 18 hours. In practice, nowadays, most companies publish this information within a few hours, including outside an election period.”126

115.While the BPC provides guidance regarding the release of data, it is not always easy for polling organisations to ensure that this is followed by the organisations which commission their surveys. We were told that polling organisations often checked the text and graphics of media reports to ensure that there were no inaccuracies, but that there was little they could do about the prominence given to particular stories, or to the headlines attached to those articles. Ben Page told us that he was powerless if a newspaper, for example, gave undue prominence to a particular poll. He said:

“We will correct them publicly if they are wrong, as we will any client who is misleading about something, but choosing what to publish and ignoring things such as margin of error, or focusing on tiny changes that are not, under any circumstances, likely to be statistically valid, are things that go on. You can ask them not to do it again. You can complain. I suppose you could stop giving them data. But there is a tension there.”127

116.Furthermore, while polling organisations work with the media outlets which commissioned their polls to try to ensure accuracy of reporting, there is little they can do to prevent the misreporting of their polls by other media sources. Ben Page noted that it was “the secondary coverage and selective coverage by other outlets” that could be a problem.128 Nick Moon, Moonlight Research, also told us that it was the “secondary reporting particularly” which was difficult to control.129

The challenges faced by journalists

117.It is important to note that the evidence we received was not uniformly critical of the media. Anthony Wells, of YouGov and UKPollingReport, noted that newspaper reporting had improved in the last decade and he identified journalists who reported on polls with the appropriate caveats. In his view, this was often a result of “regular discussion between the polling company and the journalists responsible about what a poll means and what can be responsibly concluded from the findings.”130

118.Polling companies acknowledged that there were problems with some media reports of polls, but also recognised that:

“Generally speaking, the media report on opinion polls appropriately. There are many fine political journalists working today who properly recognise, understand and even contribute to the world of political polling. Recent rises in the popularity of data journalism add credence to this trend for responsible journalism.”131

119.Throughout the evidence, there was also a recognition of the challenges that journalists face. Members of the media are tasked with reporting on an increasingly volatile and unpredictable electorate, while also keeping up with changes in the media landscape, including the rise of digital media and a desire for constant updates as part of a 24-hour news cycle. Sue Inglish, Former Head of Political Programmes, Analysis and Research at the BBC, noted that “an election campaign is probably the hardest test of any media organisation, because you are attempting to report developments in the campaign and the issues underlying them in a very fast-paced environment, when the stakes are incredibly high. For most journalists, those are the most difficult things to do.”132

120.We are also aware of the pressures placed on some journalists by proprietorial and editorial demands, where there may be a desire to generate an exclusive story or to further the cause supported by their particular news outlet, particularly when that outlet might have commissioned the poll in the first place. As Deborah Mattinson, Co-Founder of BritainThinks, said: “Published polling is for editorial purposes—to generate a story. That is its aim. They will be looking for a sensational angle or a big headline.”133

121.Professor Tait noted that, while there was greater numeracy and literacy about polls among political correspondents and editors, there were also many more polls to contend with. He said: “There seems to be almost an arms race among newspapers and polling organisations to have lots of polls. To me, that encourages a less desirable development—a focus on the horse race in the election, or the referendum, rather than a focus on issues and analysis of policy.”134

122.Compounding these problems is the fact that there is no clear definition as to what constitutes a poll, or any benchmark by which to judge it, as we noted in paragraph 22.

123.In the light of these challenges, most witnesses felt that it was important to provide more information and support in order to improve media coverage of voting intention polling. Professor Ailsa Henderson from the University of Edinburgh pointed out some of the common errors in reporting of polls, but added: “Each of these is an issue of education rather than regulation, though.”135 Polling companies expressed a similar opinion:

“Although considerable efforts are made to ensure fair representation of data on our part (including briefing our media clients and promoting data literacy), it is up to readers of all media to decide whether and what to believe. To regulate the publication of opinion polls rather than any other type of information disseminated via newspapers is to under-estimate the ability of readers to determine such matters for themselves.

It is hard to avoid the conclusion that regulatory intervention or even a code of conduct for reporting political polling would represent an overbearing sledgehammer to crack a nut, let alone that the ramifications for democracy would be wholly negative.”136

Views on what could be done to improve media reporting of voting intention polls

Giving polls less prominence

124.Given the uncertainty surrounding the ability of polls to predict accurately the outcome of elections, there is a strong case for the media to give less prominence to voting intention polls. Some sections of the media have already started to move in this direction.

125.David Jordan, Director of Editorial Policy and Standards at the BBC, told us that the BBC’s guidelines “start from a pretty sceptical position about opinion polling”. Ric Bailey, Chief Adviser, Politics at the BBC, confirmed that the BBC’s guidelines have for some time recommended “never leading or headlining a bulletin with the reporting of a poll.” For the 2017 General Election campaign, the guidelines were strengthened further to say that a news story would not normally be based on a single opinion poll.137

126.Sky News told us that they had moved in the same direction. Jonathan Levy, Director of News Gathering and Operations at Sky News, told us that ahead of the 2017 General Election, their guidance to staff “asked them to be very clear when a poll fell within the margin of error—and to be clear about that to our viewers or readers.” They also ensured that they had a polling expert on hand so that reporters could draw on their expertise.138

127.ITV News also moved away from polls during its 2017 General Election coverage. During the campaign, ITV News did not commission a poll, but instead decided to extend its 10pm bulletin by 10 minutes to allow time for reporters in the field to talk directly to voters.139 ITV News presenter, Tom Bradby, said on Twitter: “It should be abundantly clear by now that the polls are a total waste of time. We have refused to commission any in this campaign.”140

128.We heard that newspapers might also be less inclined to commission polls. Deborah Mattinson told us that:

“… because of the experience of the last election in particular, there will probably be fewer polls. There already are, actually, because quite a lot of newspapers feel that they had their fingers burned a bit and are looking at other ways of tapping into public opinion. But it is not going to stop.”141

Training for journalists

129.We support the move to give voting intention polls less prominence in election coverage, and to focus more on policy issues. However, as the electorate in the UK is so accustomed to voting intention polling, it is unlikely that the demand for reports on such polls will reduce significantly in the near future. The key is therefore to provide journalists with appropriate training and support so that they can report on polls accurately.

130.While virtually all witnesses agreed that more support and guidance for journalists would be beneficial, it was less clear who should take the lead on this.

131.There are already various sources of guidance on the reporting of polls. The BPC has sponsored a number of events on the conduct of polls, often in collaboration with the National Centre for Research Methods at the University of Southampton, in order to enable the public, including journalists, “to come to an informed view about the conduct and effectiveness of polling as it is currently practised.”142 The BPC has also published a ‘Journalist’s Guide to Opinion Polls’ on its website143 (reproduced in Appendix 6 of this report). The Journalist’s Guide provides useful guidance on what makes a poll “scientific”, including details about the selection of respondents and the sampling methods used. The Guide also contains a list of questions which journalists should consider when deciding whether to report on a poll. It does not, however, provide a strict definition of what constitutes a poll.

132.Jane Frost CBE, Chief Executive Officer of the Market Research Society (MRS), told us that the MRS also worked with the Royal Statistical Society and other bodies to provide guidance to the media on reporting of polls.144 The Royal Statistical Society thought that there had been improvements in the reporting of polls since the 2015 General Election, though there were still some simple errors being made. The Society supported “the right training” for journalists, and suggested that: “If journalists had access to more comprehensive training, as well as better links with the statistics community, significant improvements could be made in their reporting of poll findings.”145

133.The Independent Press Standards Organisation (IPSO) does not have any immediate plans to produce specific code-based guidance on the reporting of opinion polls, but said that it would “continue to monitor complaints and may develop such information in the future.” It said that it would support the development of broader guidance by other specialist bodies, as long as this did not conflict with, or cause confusion about, the application of the Editors’ Code of Practice, and that it supported increasing the availability of training opportunities for journalists.146

134.When asked whether the Independent Monitor for the Press (IMPRESS) would welcome more training for journalists in the use of polls, Jonathan Heawood, the Chief Executive Officer, said:

“It is a very good point. It may sit within a wider issue about the reporting of statistics more generally—not just polls, but all sorts of statistics, which are notoriously difficult for people who are non-specialists to understand and communicate. We have already done a number of training modules for our members on aspects of our code of standards. This is the kind of issue we may well want to think about for the future.”147

135.We also asked the Society of Editors the same question and Ian Murray, the Executive Director, said: “If you are asking me whether I think that there should be more training at local level in particular, and whether more thought should be given to it, obviously the answer is yes.”148

136.Dr Lauderdale felt that academics should also play a role in helping the media to present information more accurately. He told us that:

“There is value in trying to help the members of the media who would like to present this information more accurately to do so by providing them models of how you would do it … As academics, we have many things to do. We have many things to do in advance of elections, as people who study elections. I would certainly encourage my colleagues who know about polling to be engaged.”149

137.Recent high-profile polling failures can be attributed to a range of methodological challenges, but this is not the whole picture. There are disturbing problems with the way in which voting intention polls are represented by the media. While British Polling Council members are now required to report whether a poll shows a statistically significant change since the previous poll, this information is not always included in media reports. Media coverage of voting intention polls is often misleading, with a particular tendency to over-emphasise small changes in party fortunes that are indistinguishable from sampling variability. This practice remains largely unchecked.

138.Although the British Polling Council rules require that details of methodological approaches are published, this is insufficient to combat poor reporting practice. This is particularly true of election coverage, where dramatic headlines may not represent the full results of the poll, or may represent only the narrative preferred by a particular editor, and may therefore mislead readers.

139.We welcome the efforts which the British Polling Council currently makes to inform journalists and others about polls, including its ‘Journalist’s Guide to Opinion Polls’ published on its website. We recommend that the Guide should be developed to include an authoritative definition of what constitutes a properly conducted poll (as opposed to a small unrepresentative survey), and a list of criteria which must be met for a survey to be recognised as a poll. We recognise that arriving at such a definition will be difficult, but believe that it is essential in order to deliver clarity to members of the public, journalists and others. Once developed, we hope that journalists will be able to use the definition when reporting on polls, and include in their reports a statement as to whether the particular survey met the BPC’s definition of a poll.

140.We also recommend that the British Polling Council should develop its ‘Journalist’s Guide to Opinion Polls’ to include guidance on the types of information that should be included within articles that report on polls. This might include guidance on how to frame headlines so that they reflect poll results accurately, how to explain the margin of error, and possibly a health warning to remind readers that polls simply represent a snapshot in time, rather than necessarily being predictions of the future. When reporting on particular polls, journalists should be expected to note in their reports whether or not the organisation which conducted the poll is a member of the British Polling Council. To support transparency, journalists should also include in their articles a reference to the published poll.

141.Where relevant, the British Polling Council should make public any examples it finds of particularly poor practice in media reporting on polls. The polling companies themselves should also be encouraged to state publicly where they think their polls have been misused or misreported.

142.The British Polling Council should also develop a programme of training opportunities for journalists on how to read, interpret and report on polling data. It would be helpful if this guidance could be produced as part of a collaborative approach in conjunction with the Market Research Society, IPSO, IMPRESS, the Society of Editors, Ofcom, the Royal Statistical Society and academics.


108 Q 47 (Will Moy)

109 Q 19 (Dr Benjamin Lauderdale)

110 Written evidence from Dr Nick Anstead (PPD0018)

111 Q 71 (Professor Richard Tait CBE)

112 Written evidence from ComRes, Opinium, Ipsos MORI, Panelbase, LucidTalk, ORB International, BMG Research and Survation (PPD0014)

113 Written evidence from Dr Nick Anstead (PPD0018)

114 YouGov, ‘Understanding margin of error’ (21 November 2011): https://yougov.co.uk/news/2011/11/21/understanding-margin-error/ [accessed 20 March 2018]

115 Written evidence from Dr Nick Anstead (PPD0018)

116 Q 154 (Johnny Heald)

117 Written evidence from Sky News (PPD0005)

118 Q 47 (Will Moy)

119 Q 151 (Johnny Heald)

120 Q 154 (Ben Page)

121 Q 37 (Professor Chris Hanretty)

122 Simon Walters, ‘Tory lead is slashed in half after tax U-turn: Bombshell Mail on Sunday poll shows May plummeting by 11 points ... denting hopes of a landslide’, The Daily Mail (24 April 2017): http://www.dailymail.co.uk/news/article-4436044/Tory-lead-slashed-half-tax-U-turn.html [accessed 20 March 2018]

123 Full Fact, ‘Little evidence the Conservatives’ poll lead is narrowing’ (24 April 2017): https://fullfact.org/news/little-evidence-conservatives-poll-lead-narrowing/ [accessed 20 March 2018]

124 Ibid.

125 Written evidence from the British Polling Council (PPD0007)

126 Ibid.

127 Q 154 (Ben Page)

128 Ibid.

129 Q 9 (Nick Moon)

130 Written evidence from Anthony Wells (PPD0015)

131 Written evidence from ComRes, BMG Research, Ipsos MORI, LucidTalk, ORB International, Opinium, Panelbase and Survation (PPD0014)

132 Q 71 (Sue Inglish)

133 Q 46 (Deborah Mattinson)

134 Q 71 (Professor Richard Tait CBE)

135 Written evidence from Professor Ailsa Henderson (PPD0012)

136 Written evidence from ComRes, Opinium, Ipsos MORI, Panelbase, LucidTalk, ORB International, BMG Research and Survation (PPD0014)

137 Q 89 (David Jordan, Ric Bailey)

138 Q 83 (Jonathan Levy)

139 Jake Kanter, ‘TV news broadcasters have ditched general election polls as a “total waste of time”’, Business Insider (2 June 2017): http://uk.businessinsider.com/bbc-itv-channel-4-channel-5-and-sky-news-shun-election-polls-2017-6 [accessed 20 March 2018]

140 Tom Bradby (@tombradby), Tweet on 30 May 2017: https://twitter.com/tombradby/status/869780244403761152 [accessed 20 March 2018]

141 Q 43 (Deborah Mattinson)

142 Written evidence from the British Polling Council (PPD0007)

143 British Polling Council, ‘A Journalist’s Guide to Opinion Polls’: http://www.britishpollingcouncil.org/a-journalists-guide-to-opinion-polls/#q13 [accessed 20 March 2018]

144 Q 160 (Jane Frost CBE)

145 Written evidence from the Royal Statistical Society (PPD0022)

146 Written evidence from the Independent Press Standards Organisation (PPD0021)

147 Q 114 (Jonathan Heawood)

148 Q 97 (Ian Murray)

149 Q 19 (Dr Benjamin Lauderdale)




© Parliamentary copyright 2018