August 19th, 2014

John McLean

Parsing the IPCC’s Piffle

Warmist politicians and professional alarmists who tell us man-made global warming is an increasing problem are forever repeating the mantra that "the science is settled". Given the acknowledged flaws and discordant findings of their cited models, that confidence reflects either ignorance or deceit. There is no other explanation.

Politicians and various activist organisations are fond of telling us that there is no doubt that man-made global warming is an ever-increasing problem. If you ask them for the source of their information they will almost certainly direct you to the assessment reports of the UN’s Intergovernmental Panel on Climate Change (IPCC). Maybe you should ask them if they have read the latest IPCC report, because it tells a very different story.

All politicians should at least have read the Summary for Policymakers (SPM), the document specifically tailored for them by scientists and then revised by government representatives. Being ignorant of the contents of the SPM would be a major failing in any politician. IPCC reports use language that the lay person might find difficult to understand, and in fact some governments complained that the wording of the SPM was too complex, so where necessary I’ll try to explain what the statements mean.

Let’s start with an important passage:

“… the rate of warming over the past 15 years (1998–2012; 0.05 [–0.05 to 0.15] °C per decade) … is smaller than the rate calculated since 1951 (1951–2012; 0.12 [0.08 to 0.14] °C per decade).” [SPM, page 3, section B.1]

While calculating a warming trend might seem straightforward, no one can be certain the data truly represents the global averages; the more data that’s available, the greater the confidence.  To deal with this uncertainty, statisticians apply an accepted formula and calculate the upper and lower limits of the range of values in which, according to conventional statistics, we can be 95% certain the true figure lies.

In the extract above, the range is shown in square brackets following the calculated trend of 0.05°C/decade (oddly, Chapter 9 gives the figure as 0.04°C/decade). For the period 1998 to 2012 the range runs from a slight cooling (–0.05°C/decade) to a modest warming (0.15°C/decade). What this range means is that no one can be certain whether temperatures rose or fell over the 15-year period.
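To make the statistics concrete, here is a minimal sketch of how a decadal trend and its approximate 95% confidence interval are computed. The data below are invented toy values, not HadCRUT4 figures, and the sketch uses plain ordinary least squares with a normal (z = 1.96) approximation; the IPCC's own uncertainty estimates also account for autocorrelation in the series, which this version ignores.

```python
import math

def decadal_trend_ci(years, temps, z=1.96):
    """OLS trend in degrees per decade, with an approximate 95% confidence
    interval (normal z = 1.96 approximation; real analyses use more care)."""
    n = len(years)
    mx = sum(years) / n
    my = sum(temps) / n
    sxx = sum((x - mx) ** 2 for x in years)
    sxy = sum((x - mx) * (y - my) for x, y in zip(years, temps))
    slope = sxy / sxx                                   # degrees per year
    resid = [y - (my + slope * (x - mx)) for x, y in zip(years, temps)]
    se = math.sqrt(sum(r * r for r in resid) / ((n - 2) * sxx))
    return 10 * slope, 10 * (slope - z * se), 10 * (slope + z * se)

# Toy series: noisy and essentially flat, so the interval straddles zero --
# the SPM's point that the sign of the 1998-2012 trend is uncertain.
trend, lo, hi = decadal_trend_ci([0, 1, 2, 3], [0.0, 0.1, 0.0, 0.1])
print(lo <= 0.0 <= hi)   # True: cannot distinguish warming from cooling here
```

The key point is the last line: when the interval contains zero, the data cannot tell you whether the underlying trend is up or down.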

On the subject of climate models the SPM refers readers to a text box in Chapter 9, where we find:

“… an analysis of the full suite of CMIP5 historical simulations (…) reveals that 111 out of 114 realisations show a GMST trend over 1998–2012 that is higher than the entire HadCRUT4 trend ensemble (… CMIP5 ensemble-mean trend is 0.21°C per decade).” [chapter 9, text box 9.2, page 769]

In simple terms, this passage means that 111 (or 97%) of 114 executions of climate models (some models being run more than once) predicted warming greater than that calculated from temperature measurements. Compare the trends in the two extracts above and the average model prediction, at 0.21°C/decade, is at least four times higher than the trend calculated from temperature observations.
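The arithmetic behind those two comparisons is simple enough to check directly; the figures come straight from the quoted IPCC text:

```python
# Share of CMIP5 runs whose 1998-2012 trend exceeded the observed trend.
above_observations, total_runs = 111, 114
print(f"{above_observations / total_runs:.0%}")   # 97%

# Ratio of the ensemble-mean model trend to the SPM's observed trend.
model_mean, observed = 0.21, 0.05                 # degrees C per decade
print(round(model_mean / observed, 1))            # 4.2
```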

An attempt to explain the failure of the climate models is also given:

“There may also be a contribution from forcing inadequacies and, in some models, an overestimate of the response to increasing greenhouse gas and other anthropogenic forcing (dominated by the effects of aerosols).” [SPM, section D.1, page 13]

Given the previous quote, it seems reasonable to wonder whether the word “some” would be better expressed as “the vast majority”, but it is at least an acknowledgement as it stands that models exaggerate the influence of carbon dioxide and other greenhouse gases. A passage from Chapter 10 provides further information:

“RF [Radiative forcing] in the simulations including anthropogenic and natural forcings differs considerably among models (…), and forcing differences explain much of the differences in temperature response between models over the historical period (…).” [chapter 10, section, page 880]

It seems that while models differ considerably in how they mimic natural and man-made forces, the overwhelming majority over-estimate the influence of these anthropogenic forces.  Bear in mind that the output of such models forms not just one key plank of the IPCC’s claims but two, as the IPCC says in Chapter 9:

“Climate models are the primary tools available for investigating the response of the climate system to various forcings, for making climate predictions on seasonal to decadal time scales and for making projections of future climate over the coming century and beyond.” [chapter 9, section 9.1.1, page 746]

The last part of that sentence certainly refers to making predictions, but the earlier portion is just as important.  The expression “investigating the response of the climate system to various forcings” really means running the models with and without greenhouse gases and attributing the differing results to human activity. It’s an approach that would be plausible if we could be confident that climate models were accurate. Except that, as shown above, even the IPCC admits they are not.

A serious problem that the latest IPCC update doesn’t admit is that its 2007 report showed that climate models had produced a reasonable estimate of average global temperature anomalies from 1951 to at least 1997.  Given that climate models have been found to over-estimate the influence of greenhouse gases, it follows that previous, reasonably correct outputs from those models must have been under-estimating the influence of other climate forces.
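The logic of that inference can be illustrated with a toy decomposition. All the numbers below are invented for illustration and are not IPCC figures; the point is only that if the modelled total matched observations while the greenhouse term was overstated, the remaining terms must have been understated to compensate.

```python
# Toy attribution sum: modelled trend = greenhouse term + everything else.
observed_total = 0.12     # a trend the models reproduced (illustrative value)
greenhouse_term = 0.20    # hypothetical over-stated greenhouse contribution
everything_else = observed_total - greenhouse_term
print(round(everything_else, 2))   # -0.08: the other forcings pushed down
```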

Let me summarise all this.  The first quote shows that temperatures were basically flat for the 15 years from 1998 to 2012, when the IPCC report was written.  The second quote shows that 97% of model executions wrongly predicted greater warming than occurred over that period.  The third quote admits that some(!) climate models exaggerate the influence of greenhouse gases on temperature. The fourth concedes that climate models differ in how they mimic both natural and man-made climate forces. The final quote tells us that climate models are not only used for predictions, but also for estimating the magnitude of human influences on temperature (which must mean that failures in prediction are accompanied by failures in estimating that influence).

If you can find any grounds for a supposed scientific consensus in those statements you’re doing better than me.  There’s not even a consensus on how to factor in the influence of greenhouse gases in climate models. The reality is that the IPCC has no clear idea of the magnitude of any human influence on temperature.  Its old estimates were based on the output of climate models, but now we see the IPCC admitting that climate models are seriously flawed and over-estimate the influence of greenhouse gases.

Put that with the IPCC’s acknowledgement that there’s been no warming for about 15 years and it’s clear that, after 25 years of operation and five assessment reports, the IPCC hasn’t achieved much at all, apart from claiming ever-increasing certainty that humans are responsible for most of the warming since 1950, despite the absence of any credible evidence to support that position.

Politicians and others who try to tell us that man-made global warming is an increasing problem, endlessly repeating the mantra that “the science is settled”, are either ignorant or deceitful. Simply put, there are no other explanations.

John McLean was co-author with Chris de Freitas and Bob Carter of a paper that became the centre of controversy when submitted to the Journal of Geophysical Research. Their experience with the censors of science can be read here