143.Although the core focus of our inquiry was on voting intention polls and their impact on the democratic process, we also encountered a further, important set of problems associated more closely with policy issues polls. While most professional polling organisations carry out polls using robust methodologies, many other polls are carried out on poor-quality samples, using leading questions. From what we have seen, the most egregious examples of such practices do not attempt to estimate voting intention. They instead aim to measure opinion on political or social issues, with the intention of influencing political discourse. These policy issues polls can cover a wide variety of subjects, ranging from assisted dying to views on the NHS or fox hunting.
144.Such polls are potentially of value in shaping the national debate on important issues. What people think on issues such as fox hunting is something which is of interest to policy-makers and politicians, although that is not to say that they necessarily have to be slaves to public opinion on such matters.
145.However, we have concerns about this type of polling, particularly polls which have been commissioned by campaigning institutions who have an interest in demonstrating that they have public opinion on their side. In such circumstances, there will always be a temptation to devise questions that are more likely to elicit favourable answers. We have therefore considered policy issues polling of this nature as part of our inquiry.
146.A variety of people and organisations commission polls and each will have their own aims and objectives. When the commissioner is a campaign group that advocates a certain policy position, inevitably they will be hoping that the results of that poll will show public support for their position. There is therefore a temptation for such commissioners to encourage polling companies to use leading questions that are designed to push poll respondents in a particular direction. This is understandable and it would be naïve to believe that this pressure does not exist.
147.However, it is the duty of responsible polling organisations to resist this pressure. Johnny Heald, Managing Director of ORB, noted: “If you are working for a certain campaign group that wants to promote a particular issue, it will want to ask the question in a particular way. At that point, any upstanding pollster will say, ‘You cannot ask that in that way’.”
148.Damian Lyons Lowe, Chief Executive of Survation, noted that the BPC’s rules on transparency meant that other polling organisations and experts could quickly check the ways in which a poll had been conducted and then “shoot it down” if it had been conducted inappropriately. He told us that it was therefore “a matter of professional reputation” to make sure that questions were framed neutrally. In his view: “Professional reputation is all that a company such as mine has. Where we make mistakes, or where a question has been misframed, it is easy for that to be subject to scrutiny. It is not good for business to be seen as a company that gets the campaign and exactly the result that it wants.” In his experience, he had found that when his company had had to “push back” on question wording, it was “unusual for that not to be taken on board.”
149.Nonetheless, we were concerned that there were some issues with the methodological approaches used in policy issues polling.
150.One of the areas of concern about policy issues polling is the representativeness of the samples used for such polls. In particular, we were concerned that polling on social issues can sometimes rely on small samples which are then claimed, dubiously, to be representative of a wider group.
151.A particular problem is where the overall sample is adequate, but the sizes of particular sub-sample groups are insufficient to allow any firm statistical conclusion to be drawn. For example, in 2005, a Faith Survey conducted by ICM Research for BBC News had 1,019 respondents. To report the findings of the survey, BBC News published an article with the headline “Britons ‘back Christian society’”. The article claimed that “31% of Jews said they knew nothing about their own faith,” and also that “Jews were the least likely to attend services—just over half said they never went to a synagogue.” These conclusions correspond to the data values in the survey, but only five of the 1,019 respondents were Jewish, far too small a sub-sample to support reliable conclusions about Jewish beliefs and practice in society. Anthony Wells, of YouGov and UKPollingReport, highlighted this particular survey in an article for Full Fact where he warned about the dangers of misreporting. He said:
“Pay particular caution to national polls that claim to say something about the views of ethnic or religious minorities. In a standard GB poll the number of ethnic minority respondents are too small to provide any meaningful findings. It is possible that they have deliberately oversampled these groups to get meaningful findings, but there have been several instances where news articles have been based on the extremely small religious or ethnic subsamples in normal polls.”
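The statistical weakness of such small sub-samples can be illustrated with a back-of-the-envelope margin-of-error calculation. This is a simplified sketch using the normal approximation for a simple random sample; real polls also involve weighting and design effects, and the figures are illustrative only:

```python
import math

def margin_of_error(p: float, n: int, z: float = 1.96) -> float:
    """Approximate 95% margin of error for a proportion p estimated
    from a simple random sample of size n (normal approximation)."""
    return z * math.sqrt(p * (1 - p) / n)

# Full sample: a reported proportion of 31% among 1,019 respondents.
full = margin_of_error(0.31, 1019)   # roughly +/- 3 percentage points

# Sub-sample: the five Jewish respondents in the same survey. The
# normal approximation itself breaks down at n = 5, which only
# underlines that no firm conclusion can be drawn from such a group.
sub = margin_of_error(0.31, 5)       # roughly +/- 40 percentage points

print(f"n = 1019: +/-{full:.1%}")
print(f"n = 5:    +/-{sub:.1%}")
```

In other words, a headline figure drawn from a sub-sample of five is consistent with almost any underlying value, which is why a reported sub-group finding should always be read alongside the sub-sample size.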
152.Differential response rates are also a serious problem. Some polls—for example, those which require people to pay in order to take part—virtually guarantee differential response rates, and therefore bias in the results. Nonetheless, such polls are sometimes misused to make claims about the views of the population as a whole. Other, more subtle, biases can also creep in. For example, if a survey is sent to a group of people asking about a particular problem, those who believe that the problem exists will understandably be more likely to respond than those who do not.
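The mechanism can be sketched with a toy calculation (all numbers hypothetical): even when only a minority of a population believes a problem exists, a survey that believers are more likely to answer will report a majority.

```python
# Toy model of differential response bias (illustrative numbers only).
true_share = 0.30            # true share who believe the problem exists

# Believers are assumed three times as likely to return the survey:
p_respond_believer = 0.60
p_respond_other = 0.20

# Expected composition of those who actually respond:
believers = true_share * p_respond_believer    # 0.18 of the population
others = (1 - true_share) * p_respond_other    # 0.14 of the population
observed_share = believers / (believers + others)

print(f"True share:     {true_share:.0%}")       # 30%
print(f"Observed share: {observed_share:.0%}")   # about 56%
```

No amount of extra respondents corrects this: the distortion comes from who chooses to answer, not from the sample size.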
153.We were concerned about the way in which small, informal polls with unrepresentative samples could be reported with the same weight as more representative polls.
154.David Jordan said that part of his job as Director of Editorial Policy and Standards at the BBC was to tell people that they could not make general statements about wider populations on the basis of smaller samples. He noted that:
“… all kinds of people have realised that they can try to generate headlines by carrying out surveys with self-selecting samples, particularly online. Essentially, they put up a question online and say, ‘Please respond’. They then publish the results as if it is a bona fide piece of polling.”
155.Ric Bailey, Chief Adviser, Politics at the BBC, made the point that it was not always inappropriate to report on small surveys, as long as they were put into the appropriate context. He told us:
“Surveys of MPs or chief constables are clearly a different animal from polls, but they are something that we also police pretty strictly, particularly if we are going to commission them ourselves. If it is a BBC survey, we have pretty high criteria for what it needs to achieve … Obviously, when reporting other surveys, done by other people, we need to be really careful around the language, to make sure that we are not implying that something is more scientific than it is, and that we put due scepticism into the reporting of surveys of that sort.”
He also suggested that it was important to cover such surveys if they were informing the political debate:
“I would not say that you should not report that sort of thing at all, as it may well be part of a campaign. As long as you put it in the appropriate context and are clear about who has commissioned it and what its basis is, either online or by providing links, that is better than having some sort of prohibition that says, ‘We would never report that sort of survey’.”
156.The problems come, however, when such informal polls are not put into the appropriate context. In the case studies below we highlight examples of some of the methodological problems explained above. In each case, we have used the terminology quoted in the sources, but note that their uses of the terms ‘poll’ or ‘survey’ may not necessarily correspond to the working definitions we set out in Chapter 2 of this report.
157.On 26 July 2016, the Daily Express published an article headlined “98% say no to EU deal” in its print edition and “98 per cent say NO to EU deal: Forget talks with Brussels and quit NOW, urges new poll” in the online version. The print article reported that 98% of people who took part in a phone survey said that the decision to leave the EU should be enacted now, rather than after talks with Brussels. The online article referred to the poll as an “online poll” but it was otherwise substantively similar to the article that appeared in print.
158.IPSO received a complaint from Tony McDonald who argued that the headline was misleading because it did not make clear that the 98% figure had come from a phone survey of Daily Express readers, rather than representing the view of the public at large. He also asserted that the sample in the survey must have been screened or tested in advance, and that a responsible poll would have ensured a representative sample.
159.The Daily Express denied that the article was misleading and said that the headline needed to be read with the text of the article, which made it clear that the result came from a phone survey. The survey question was “Should UK end all talk of deals and quit the EU now?” and had been printed in the previous day’s edition of the newspaper. The newspaper explained that readers had to pay to register their response to the question. The online version of the article originally stated that the results came from an “online poll”, but this was later corrected.
160.Mr McDonald argued that the article was misleading because it did not say that participants in the survey had to pay to register their response and that, because of this, it was likely that only people with strong views would have responded. He also added that, in any event, the poll could not even claim to be representative of the newspaper’s readers, as only approximately 1% of its readership had participated.
161.IPSO’s Complaints Committee concluded that “the article gave the impression that it was reporting the significant results of a representative poll carried out by a third-party for the publication. In fact, the poll was conducted through a premium rate phoneline, which allowed a self-selecting sample of the newspaper’s readers to express their views.” The Committee found that the newspaper had breached Clause 1 (Accuracy) of the Editors’ Code of Practice and ruled that the newspaper should publish an upheld adjudication.
162.On 26 February 2012, The Observer published an article headlined “Nine out of 10 members of Royal College of Physicians oppose NHS Bill”. The article was based on a poll which asked about the proposals contained within the Health and Social Care Bill (now the Health and Social Care Act 2012), introduced by the then Secretary of State for Health, the Rt Hon Andrew Lansley MP. The poll had apparently canvassed the views of members of the Royal College of Physicians (RCP), and the article stated that: “The findings, showing that 92.5% of RCP members want the health and social care bill withdrawn, have been passed to the Observer as the college prepares for an extraordinary general meeting on the reforms on Monday.”
163.However, further down the article, it became clear that the poll had not been conducted by a reputable polling company. Instead, the findings had come from an open-access survey conducted by callonyourcollege.blogspot.com, which the article described as “a website co-ordinating moves by anti-bill medics to persuade the royal colleges … to reject Lansley’s plans”.
164.Anthony Wells blogs about surveys which he describes as ‘voodoo polling’, and he placed this particular poll in that category. He said:
“The survey was open access, so there could have been no attempt at proper sampling and contained no demographic information that could have been used to weight it. It should go without saying that a survey from a website campaigning against the NHS reforms and co-ordinating opposition to it amongst the Medical Royal Colleges is more likely to be found and completed by [those] opposed to the bill …
Any poll actually measuring the opinion of members of the RCP would have needed to randomly sample members, or at least contact members in a way that would not have introduced any skew in those likely to reply. For all we know this may have also shown overwhelming opposition—but we cannot judge that from an open-access survey liable to have obtained an extremely biased sample.”
165.In this instance, the newspaper’s own Readers’ Editor acknowledged that the survey should not have been given such prominence. He said: “I’m not suggesting that the survey is invalid; we know opposition among hospital doctors is extremely high, but readers have a right to expect that things that we proclaim to be polls are properly conducted, using scientifically weighted samples of a population or group.” He added: “In this case, the poll was not conducted by a polling company, but by a group lobbying against a bill … this should have sounded the first alarm bell.”
166.Even where polls are conducted by reputable polling organisations and they have resisted pressure from poll commissioners, there is a potential for the results to be misrepresented accidentally, or manipulated to fit a predetermined agenda. Despite standards set by the BPC and MRS on how poll findings should be used in communications, and standards set by media regulators, we are aware that poll findings can be poorly communicated.
167.Anthony Wells noted that poor reporting of polls tended to be more common where journalists with little experience of polling wrote the stories. He also thought that media coverage tended to be poorer when “covering policy and political issues, particularly those commissioned by advocacy groups pushing a particular angle. Some newspapers will report such polls with findings that coincide with their own political viewpoint in a very uncritical manner.”
Two examples of inaccurate reporting of polls are outlined below.
168.On 23 November 2015, The Sun published an article headlined “1 in 5 Brit Muslims’ sympathy for jihadis”. The article featured on the front page of the printed edition, with further coverage inside the newspaper, and was also published online. The article was based on the results of a poll commissioned by the newspaper from Survation, a member of the BPC. The article reported that “nearly one in five British Muslims has some sympathy with those who had fled the UK to fight for IS in Syria.” A bar chart printed inside the paper showed that respondents to the poll were asked “which of the following statements is closest to your view”, and the results were that 5% of those surveyed had a lot of sympathy, 14% had some sympathy and 71% had no sympathy with “young Muslims who leave the UK to join fighters in Syria”.
169.IPSO received a large number of complaints about the coverage and formally accepted a complaint from Muslim Engagement and Development (MEND) as the lead complainant. MEND argued that the newspaper’s presentation of the poll was misleading. The complainant noted that the question about sympathy had referenced those “who leave the UK to join fighters in Syria” but that the possible answers did not mention IS. The complainant’s argument was that people responding to the question might not have intended their answers to be understood as relating to those joining IS, or as demonstrating sympathy for jihadis. Furthermore, the question had asked about sympathy “with” those leaving the UK, not sympathy “for” them and their ideals.
170.The Sun denied breaching the Editors’ Code of Practice. It emphasised that it had not tried to sensationalise the information which it had obtained, and stressed that the coverage had included the wording of the questions in full. The newspaper argued that the meaning of the question was not ambiguous, and that it had been asked as part of a longer telephone survey which had taken the form of a discussion, and that a number of previous questions had made explicit reference to IS. The newspaper therefore considered that respondents would not have been in doubt about the question’s meaning.
171.Furthermore, The Sun argued that the question would have been understood by respondents as referring to IS because the overwhelming majority of those who left the UK to join fighters did join IS. It also said that the media narrative around such people had focussed on those joining IS. The newspaper argued that the term “jihadis” was commonly accepted to mean those pursuing their religious beliefs via a violent struggle, so it did not consider this to be an inaccurate description of young Muslims fighting in Syria in a conflict inspired by religion. Furthermore, the newspaper suggested that the sentiment of “sympathy” in the sense of sorrow or regret was still sympathy and that it considered sympathy with those who had elected to join an organisation such as IS was improper, regardless of the motivation.
172.IPSO’s Complaints Committee concluded that the newspaper article breached Clause 1 (Accuracy) of the Editors’ Code of Practice, which states that:
“i) The Press must take care not to publish inaccurate, misleading or distorted information, including pictures.
ii) A significant inaccuracy, misleading statement or distortion once recognised must be corrected, promptly and with due prominence, and—where appropriate—an apology published.
iii) The Press, whilst free to be partisan, must distinguish clearly between comment, conjecture and fact.”
173.In its Decision of 17 February 2016, IPSO’s Complaints Committee said:
“While the newspaper was entitled to interpret the poll’s findings, taken in its entirety, the coverage presented as a fact that the poll showed that 1 in 5 British Muslims had sympathy for those who left to join ISIS and for ISIS itself. In fact, neither the question nor the answers which referred to “sympathy” made reference to IS. The newspaper had failed to take appropriate care in its presentation of the poll results, and as a result the coverage was significantly misleading, in breach of Clause 1.”
Having upheld the complaint, IPSO required The Sun to publish an upheld adjudication.
174.IPSO was not the only body concerned about this poll. Jane Frost CBE, Chief Executive Officer of the MRS, told us that the MRS had also looked into this particular case. She told us that:
“We looked into the rather notorious Sun poll on Muslims, which was entirely inappropriate. We initiated disciplinary action against the member of MRS that was involved. During the inquiry, that member not only left the business but left the sector entirely. The issue was raised and dealt with in other matters.
In general, we get very good traction if we raise the handling and the reputation of research. We need to be vigilant and to ensure that editors and policymakers know that we are putting consistent attention on them. If we went away, it is very likely that matters would not come to a head.”
175.The polling company, Survation, defended its methodology but noted that it was not responsible for the way in which the poll’s findings had been interpreted. On its website, it said:
“Survation do not support or endorse the way in which this poll’s findings have been interpreted.
Neither the headline nor the body text of articles published were discussed with or approved by Survation prior to publication …
Furthermore, Survation categorically objects to the use of any of our findings by any group, as has happened elsewhere on social networks, to incite racial or religious tensions.”
176.On 20 July 2012, the Daily Mirror published an article which said that one in four young drivers had a crash in the first six months after passing their driving test. This article was based on a survey conducted by the AA in conjunction with Populus, a member of the British Polling Council.
177.Full Fact complained to the Press Complaints Commission (the predecessor to IPSO) to say that the newspaper had not reported on the survey accurately. Full Fact argued that the survey had in fact questioned drivers who had been involved in a crash, and that of the 18–24 year-olds polled, 23% had been involved in a crash within six months of passing their test. Given the discrepancy between the results of the survey and the message given in the article, Full Fact argued that the newspaper had breached Clause 1 (Accuracy) of the Editors’ Code of Practice.
178.The complaint was resolved when the Press Complaints Commission negotiated the removal of the online article, and the publication of the following correction in print and online:
“In an article reporting on a survey by AA/Populus on page 37 on 20 July we said that 1 in 4 young drivers have a crash in the first six months after passing their driving test. We should have made it clear that the survey was of drivers who had been involved in a crash, and that 23% of the 18 to 24 year-olds polled had been involved in their first crash within 6 months of passing their test.”
179.While the case studies outlined above highlight articles for which the relevant newspapers or polling companies have been criticised or sanctioned, we believe that there are numerous other examples of such misuse or misreporting of policy issues polls, which are never complained about. We expect that many members of the public simply accept such reports as being true without questioning the methodology which lies behind them. Given that all of the possible ‘regulators’ involved in this area operate through a reactive, complaint-driven process, it is not surprising that such stories go unchallenged. This is an issue that we explore in more detail in Chapter 5.
180.We also acknowledge that it is not just the media that are guilty of misreporting, and that it is not a practice which is limited purely to polls. In particular, we note that the use of surveys and statistics by Government departments and political parties should not be immune from criticism. We did not seek specific examples of this from our witnesses, but evidence from the British Social Attitudes survey suggests that there are low levels of public confidence in the way the Government present official statistics, such as unemployment rates and crime levels. Its findings suggested that although four in five (78%) people with an opinion on official statistics believe that they are accurate, only 26% of those who gave an opinion trust the Government to present official statistics honestly when talking about its policies.
181.Numerous polls are conducted every week which affect political discourse in the UK. In some cases, those who publicise such polls fail to communicate all of the relevant details about how they were conducted; in others, questions appear to have been selected and framed to obtain a desired answer. We believe that some of these examples are deliberate attempts to manipulate polling findings, in order to distort evidence around public policy issues. We conclude that there is a case for the British Polling Council to play a greater role in proactively overseeing the conduct and reporting of polls.
182.In the next Chapter, we outline the ways in which the British Polling Council might do this.
150 (Johnny Heald)
151 (Damian Lyons Lowe)
152 ICM, ‘Faith Survey Fieldwork’ (November 2005): [accessed 20 March 2018]
153 ‘Britons ‘back Christian society’’, BBC News (14 November 2005): [accessed 20 March 2018]
154 Full Fact, ‘How not to report opinion polls: A guide from YouGov’s Anthony Wells’ (4 July 2012): [accessed 20 March 2018]
155 (David Jordan)
156 (Ric Bailey)
158 The information contained in paragraphs 157 to 161 is taken from IPSO’s Decision, 07016–16 McDonald v Daily Express (January 2017): [accessed 20 March 2018]
159 Denis Campbell and Toby Helm, ‘Nine out of 10 members of Royal College of Physicians oppose NHS Bill’, The Observer (26 February 2012): [accessed 20 March 2018]
161 ‘The Observer and Voodoo polling’, UKPollingReport (27 February 2012): [accessed 20 March 2018]
162 Stephen Pritchard, ‘The readers’ editor on ... when is a poll not a poll?’, The Observer (18 March 2012): [accessed 20 March 2018]
163 Written evidence from Anthony Wells ()
164 The information contained in paragraphs 168 to 174 is taken from IPSO’s Decision, 09324–15 Muslim Engagement and Development (MEND) v The Sun, 17 February 2016: [accessed 20 March 2018]
165 ‘Islamic Identity & Community Relations Survey’, Survation, 20 November 2015: [accessed 20 March 2018]
166 (Jane Frost CBE)
167 Patrick Brione and Damian Lyons Lowe, ‘Statement on Survation’s Poll of Muslims for The Sun’, Survation: [accessed 20 March 2018]
168 The information contained in paragraphs 176 to 178 is taken from the Press Complaints Commission’s list of resolved cases. Complainant name ‘Full Fact’ / Publication ‘Daily Mirror’ (October 2012): [accessed 20 March 2018]