Session 2010-11
Peer review

Written evidence submitted by John Gorman (PR 30)

1 Peer review started with the first technical journal, the Philosophical Transactions of the Royal Society, in the 17th century. The first editor, Henry Oldenburg, sent papers to his scientific friends for comment when he felt unable to make a proper evaluation himself. The system has stood the test of time and is used by thousands of technical journals. It does, however, tend to develop a "groupthink" within a particular scientific community. There are various reasons for this:

-- the whole funding and career system, including peer review, tends to keep a researcher on one line of thought.
-- citations by other researchers are important for funding and career advancement. This tends to promote agreement between researchers.
-- more recently, Google and other search engines use numbers of citations to grade search results.
-- the current trend for many papers to be meta-analyses of other research must predispose towards the current paradigm rather than original thinking.

2 All of this tends to support the current paradigm in an area of research. Probably this is necessary when modern science delves so deeply into areas like DNA, the cell, the nature of matter and thousands of other areas of modern research. Sir Martin Rees, the previous president of the Royal Society, gave a telling example in last year's Reith Lectures. When he was an inquisitive teenager, anyone with a good technical mind could take any product to bits and understand how it worked. Now, even an expert in mobile phone displays would be unable to understand the rest of his mobile phone, and even an expert in electronics can do nothing when the electronics fail in his car.

3 There are, however, cases where this support of the current paradigm can be inhibiting or even dangerous.

4 The first I wish to examine is where an area of research is very much in need of some new thinking. The peer review system tends to inhibit this. As Tam Dalyell wrote in the New Scientist while he was still an MP: "...young researchers are now required to produce cut and dried projects with end results all neatly packaged. It is a sure way to produce research... of limited practical value. Real research requires speculation..." Prince Charles highlighted the particular problem that I am referring to: "There are so many cases, in science and technology for instance, of people propounding theories and ideas, which at the time are completely rejected as nonsense by the establishment, which later on everyone claims they thought of at the time." This is the risk in the present system. It is not a new problem. Charles Darwin spent the last part of his life trying to understand the mechanism of heredity. Gregor Mendel had already produced the thesis that justifies his being referred to as "the father of genetics". He submitted it as his first-degree thesis and failed. Darwin died without ever hearing of the work.

5 Modern examples are legion, particularly in the field of medicine. Dr Barry Marshall received the Nobel Prize for Medicine in 2005 for his work on stomach ulcers (with Dr Robin Warren). Their proposal, 15 years previously, of a bacterial cause did not correspond to the then current stress/acid paradigm and was ridiculed. He even had to "do a Pasteur" and drink a glass of Helicobacter to prove his point. If the researcher is not part of the relevant academic community then rejection is even more likely. An example is the peer review of my paper "The Obstetric Reason for Lordosis and the Implications for Lifting and Low Back Pain".(5) Low back pain research is an example of a field very much in need of new ideas.

6 The cases where the peer review system can be dangerous are those where science must predict the future and public policy decisions must be based on those predictions. There will always be uncertainty in prediction, yet the peer review system demands certainty for publication. Only things which we know will happen are included. This will be the "best case" scenario. The "most probable" or "worst case" scenarios are excluded because "we don't know that those things are going to happen".(1)

7 This, I suggest, is exactly what is happening in the case of global warming. Public policy decisions are being made on a best-case scenario in cases where the dangers are so large that we really ought to be making decisions on worst-case scenarios. To that extent the system used by the climate academic community is failing in its responsibility to society. The mentality of, and peer pressure on, a climate academic makes him careful not to be seen as alarmist if he cannot prove that something will happen.

8 Last year's Royal Society seminar "Uncertainty in Science" addressed just this problem. Mervyn King gave a presentation showing how graphs had evolved to show the range of uncertainty in economic forecasts: line graphs of recent history expand into shaded areas in the future, with colours used to indicate probability (a sketch of this style of chart follows below). Professor Sally Davies, Chief Scientific Advisor to the Department of Health, used the swine flu epidemic as a case study. It was already clear that the epidemic would be less severe than predicted, but decisions had been taken when necessary, plans made, and vaccine and antiviral drugs purchased. A "precautionary" principle had been applied, assuming somewhere between most probable and worst case. The fact that the "actual" was near "best case" did not make the decisions wrong or the purchases a waste of money.
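Purely as an illustration of that style of chart (a "fan chart"), the following short Python sketch shows a history line opening into nested probability bands. All numbers, dates and growth figures in it are invented for illustration; none are taken from any Bank of England forecast.

    import numpy as np
    import matplotlib.pyplot as plt

    # Fan chart sketch: a solid line for recent history opening into
    # shaded probability bands for the forecast. Invented numbers only.
    rng = np.random.default_rng(0)
    hist_t = np.arange(2000, 2011)                   # historical years
    hist_y = 2.0 + np.cumsum(rng.normal(0.5, 0.3, hist_t.size))

    fut_t = np.arange(2010, 2021)                    # forecast years
    central = hist_y[-1] + 0.5 * (fut_t - fut_t[0])  # central projection
    spread = 0.4 * np.sqrt(fut_t - fut_t[0])         # uncertainty grows with horizon

    fig, ax = plt.subplots()
    ax.plot(hist_t, hist_y, color="black", label="recent history")
    # Nested bands, roughly 50%, 80% and 95% intervals; darker = more probable.
    for k, alpha in [(0.67, 0.45), (1.28, 0.30), (1.96, 0.15)]:
        ax.fill_between(fut_t, central - k * spread, central + k * spread,
                        color="red", alpha=alpha, linewidth=0)
    ax.plot(fut_t, central, "r--", label="central projection")
    ax.set_xlabel("year")
    ax.set_ylabel("forecast quantity")
    ax.legend()
    plt.show()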
9 The IPCC report in 2007 did use graphs similar to Mervyn King's to convey uncertainty in temperature predictions. Unfortunately, many of these graphs and predictions assumed various scenarios of emissions reduction, all of which looked extremely optimistic in 2007. Copenhagen 2009 and Cancun 2010 have shown any hope of such early reductions in emissions to be totally unrealistic.

10 Furthermore, these graphs and calculations predict temperature. Figures like 0.7°C (now) and 2°C (proposed limit) do not sound particularly serious to most people. Everyone knows that this month, or this season, can be several degrees warmer or colder than last year's average. What we need to know is the implications for important effects like sea level rise. It is here that the peer review mentality took over. The IPCC 2007 headline prediction for sea level rise by 2100 was 40 centimetres.(2) This was in the Summary for Policy Makers and press releases and was picked up by newspapers and other news media, as was the point that this prediction was a reduction from the IPCC's prediction five years earlier.

11 This figure of 40 centimetres allowed only for the expansion of sea water due to the predicted temperature rise, a fairly trivial calculation (of the kind sketched below). Any allowance for the melting of glaciers or ice sheets was excluded because it could not be accurately predicted. This meant that the headline figure was neither worst case nor best case. It was simply an unrealistic figure, which was challenged almost immediately, even by some climate scientists. Most predictions now are in the range 1.5 to 2 metres. Sea level during the last interglacial, 125,000 years ago, was about six metres above present. The worst-case prediction must, therefore, be somewhere in that range.
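To illustrate the kind of calculation involved (the expansion coefficient, warming and warmed depth below are assumed round numbers chosen for illustration, not figures taken from the IPCC report), the thermal-expansion contribution to sea level rise is approximately

\[ \Delta h \;\approx\; \alpha \,\Delta T\, H \;\approx\; (2\times10^{-4}\ \mathrm{K}^{-1}) \times (2\ \mathrm{K}) \times (1000\ \mathrm{m}) \;=\; 0.4\ \mathrm{m} \;=\; 40\ \mathrm{cm}, \]

where \( \alpha \) is the volumetric thermal expansion coefficient of seawater, \( \Delta T \) the assumed average warming and \( H \) the depth of the warmed layer. Melting ice adds to this directly, and it is exactly that addition which the headline figure omitted.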
12 A sea level rise of several metres would, of course, imply substantial melting of Arctic and Antarctic ice sheets. Although melting of Arctic sea ice does not directly result in sea level rise, such loss will inevitably accelerate melting of the Greenland ice sheet (which would, if total, lead to a seven-metre sea level rise). IPCC 2007 predicted only a 20% loss in summer sea ice coverage by 2100. This figure was actually exceeded within months of publication, in the summer of 2007. Current estimates for a totally ice-free Arctic in summer vary greatly, but most predictions do not see any summer sea ice remaining beyond about 2030 or 2040.

13 This melting of the Arctic, including the permafrost areas of Alaska, Canada, Greenland, Norway, northern Russia and Siberia, raises the possibility of a truly catastrophic amplification of global warming. Vast quantities of methane are trapped in the sea floor as clathrates and will be released by warming seas. Accelerated decay in melting permafrost releases methane (and CO2). Methane is, on short timescales, around 100 times more powerful a greenhouse gas than CO2. There have been many research papers on this recently. It is not possible for these to predict timescales to a peer-reviewable level of certainty, but the probability of some worst-case event within about 25 years must be significant. This is surely the future possibility on which public policy decisions should be made, rather than only what we can be certain will happen.

14 Another area of global concern is the Amazon. 2010 saw another "one-in-100-year" drought, the second in five years, with some major rivers almost dry. A multi-year repeat would leave very large areas vulnerable to a catastrophic forest fire. This would increase the global CO2 level instantaneously, as well as removing a portion of the world's ability to absorb carbon from the atmosphere. (This is of course on top of, and separate from, deforestation.) Again, it cannot be proved that these droughts are a direct result of global warming, just as it cannot be proved that any of the following are:

-- the central Russian heatwave last year;
-- floods, cyclone and fires in Australia;
-- UK record cold temperatures and snow (100-year low, December 2010);
-- record snowfalls in the eastern USA, down as far as Kentucky (unprecedented);
-- record high temperatures in Greenland;
-- reduction in the Indian monsoon on a decadal timescale.

15 There are, however, scientific papers linking each of these, and the Amazon droughts, to the effects of global warming.(3) No one paper can be proof but, at say a 10% level, each of these might be caused by global warming. If all are taken together, then it is more likely than not that global warming is the cause (a rough combination of the probabilities follows below), and these symptoms will get very much worse as overall world warming continues from the present 0.7°C towards 2°C and beyond. Surely, again, it is on this basis that public policy decisions should be made, and not on the basis that we cannot prove that these events are caused by global warming.
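One way to make that combination concrete (assuming, purely for the sake of argument, that the seven events above, including the Amazon droughts, are independent and that each has a 10% probability of being caused by global warming) is to ask how likely it is that at least one of them is so caused:

\[ 1 - (1-0.1)^{7} \;=\; 1 - 0.9^{7} \;\approx\; 0.52, \]

which is just over one half: "more likely than not". The independence assumption is of course a simplification; correlated events would change the figure, but the point stands that weak individual evidence can combine into a substantial overall probability.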
16 The IPCC has been heavily criticised for exaggeration in some cases. Most of these, like the Himalayan glaciers, were clerical errors rather than exaggerations. The danger from this (and from the theft of private e-mails written in conversational language) is that the climate academic community will retreat even further into a "peer review mentality". This would be a failure in their responsibility to society.

17 Maybe this parliamentary committee should consider setting up a small scientific committee to ensure that it gets a "reasonable worst-case" prediction of the effects of global warming. This is surely the basis on which public policy decisions should be made in this most vital of subjects. Such a committee should be instructed to work on a "business as usual" prediction of emissions and to avoid the errors of IPCC Working Group III.(4) It is extremely dangerous for the world if scientific predictions are made on the basis of totally unrealistic political negotiations about emissions reductions.

John Gorman

7 March 2011

Declaration of Interests

I am not, and have never been, part of a university or private research organisation. I have had commercial interests (including patents) in office seating and car seats; none are current. I am a semi-retired engineer and chiropractor. I have a geoengineering SRM (solar radiation management) proposal for which I seek funding: http://www.naturaljointmobility.info/grantproposal09.htm

References in text:

(1) Discussion outside the parliamentary committee room following the hearing by this committee on geoengineering. This was the reason given by a witness (a very senior climate scientist) when asked why dangers, such as those mentioned here, had not been put forward as arguments in favour of geoengineering during the hearing.

(2) The IPCC 2007 Summary for Policy Makers contained a list of six scenarios with associated predictions for sea level rise. The largest rise was in the A1FI scenario and was 26 to 59 cm. The average of these figures is 42.5 cm, giving the usually quoted (at the time) figure of 40 cm.

(3) Just one example from the multitude of papers associating these events with global warming. This one, in the Philosophical Transactions of the Royal Society, associates high Atlantic temperatures with the 2005 Amazon drought: Marengo, J.A. et al (2008). Phil. Trans. R. Soc. B, DOI: 10.1098/rstb.2007.0026.

(4) Chris Green, Professor of Economics in the Global Environment and head of the Climate Change Centre at McGill University, wrote: "WG III has repeatedly stated that we have the technologies to stabilize atmospheric concentration at almost any desired level and at modest, or even very low, cost. What is lacking according to IPCC WG III is 'political will'. The WG III statements re available technologies are unsupported by the evidence (see papers by Hoffert et al, Nature 1998 and Science 2002, papers for which Tom Wigley was a co-author). These papers were wholly (Nature 1998) or largely (Science 2002) ignored by IPCC WG III. ...if you are looking for unsound analysis, the main place you will find it is in IPCC WG III... an analysis that arguably has had the biggest impact on climate policy and advocacy of any of the IPCC WGs. ...the flawed analysis of IPCC WG III... makes climate stabilization technologically much easier and economically less costly than it will be."