5.  RELIABILITY OF MODEL-BASED WARMING PREDICTIONS
 5.1  Problems with IPCC’s credibility
  5.1.1  Credibility and lowered forecasts
The numerical models used to predict future global warming (see Footnote 14) have proven unreliable in the past, and their predictions have required substantial reduction over time.  At the 1988 Toronto Conference a warming of 0.8 degrees C/decade was invoked, reducing to 0.3 degrees C/decade for Rio in 1992, and to 0.2 degrees C/decade for Kyoto in 1997.

While reducing the predicted sensitivity of the globe to human-caused greenhouse emissions sounds like good news, it brings with it an obvious problem of credibility.  There is also a second, less obvious, credibility problem: that of a consequently lowered sensitivity to emission reductions.  The less warming GHG emissions will cause, the less cooling their mitigation will engender.

Thus Wigley (1998) estimates that, if fully implemented and thereafter maintained by all Annex B nations (ie those with a commitment to reduce their emissions), the Kyoto Protocol would reduce the projected warming by 0.07 degrees C by 2050 and by 0.2 degrees C by 2100.  Reductions of this magnitude would be lost in the ‘noise’ of natural variability.

Quick!  Get the forecast up!

  5.1.2  Models as a psychological weapon
To those of us who are not privy to IPCC’s work, the models on which it bases its predictions of future warming are ‘black boxes’.  Scary numbers are broadcast, and we get scared.  Is this science in action?  Or is it just science-politics?

A recent example is the news-in-brief item which appeared in Nature of 2 November 2000 (v 408 p 10; see also Section 1.3.1 above) under the title “Global warming happening faster than predicted”:
 Global warming could be happening more rapidly than previously estimated, leading to an average temperature increase of as much as 6 degrees C over the next century, according to the latest assessment by the Intergovernmental Panel on Climate Change (IPCC).  The Report, which was leaked in advance of its expected completion in January next year, predicts that global warming will be greater than the IPCC’s earlier assessments in 1990 and 1995.  The panel had previously estimated the maximum likely temperature increase at around 3 degrees C.

Nemesis, in its familiar form - World Climate Report - has this to say (v 6 no 4) under the title “The IPCC does it again”:

______________________________________________________________________________________
14.  Because radiative forcing varies logarithmically with greenhouse gas concentration, numerical climate models convert the assumed exponential increase of anthropogenic greenhouse gas emissions into a prediction of a straight-line increase in global temperature.  Hence the common use of degrees (or parts thereof) per decade when talking of global warming predictions.
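The footnote's logarithmic argument can be sketched numerically.  The forcing expression below (ΔF ≈ 5.35 ln(C/C0) W/m², with an assumed 1% per year concentration growth) is a standard approximation supplied here for illustration, not a figure from the original text:

```python
import math

# Illustration only (assumptions not in the original text): the widely
# used logarithmic approximation for CO2 radiative forcing,
#   dF = 5.35 * ln(C / C0)  [W/m^2],
# applied to an exponentially growing concentration.  The log of an
# exponential is a straight line, so the forcing (and, to first order,
# the temperature response) grows linearly in time.

C0 = 280.0      # assumed pre-industrial CO2 concentration, ppm
GROWTH = 0.01   # assumed 1% per year exponential concentration growth

def forcing(year):
    c = C0 * math.exp(GROWTH * year)   # exponential concentration
    return 5.35 * math.log(c / C0)     # logarithmic forcing

# Forcing increments per decade come out constant:
steps = [forcing((d + 1) * 10) - forcing(d * 10) for d in range(5)]
print(steps)   # each decadal increment equals 5.35 * 0.01 * 10 = 0.535
```

This constant per-decade increment is why warming predictions are customarily quoted in degrees (or parts thereof) per decade.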

 But the document the IPCC sent out for scientific peer review contained no such number.  Indeed, after the scientists reviewed it, the maximum value was 4.8 degrees C.  What gives?  In a sad repetition of a 1995 fiasco in which the key phrase “the balance of evidence suggests a discernible human influence on global climate” was inserted after the document had circulated among scientific reviewers, the IPCC has changed its report’s most crucial conclusion at the 11th hour, after the scientific peer review process had concluded.
and
 After the scientific review process, the IPCC document undergoes a “Government  Review.”  It was at this stage that the 6 degrees C figure was inserted.

We now see IPCC’s much-publicised ‘as much as 6 degrees C over the next century’ initiative for what it is.  But the general question remains:
 Can the numerical models on which IPCC relies foretell the future?

 5.2  The atmosphere fails a 20-year reconciliation
Michaels and Knappenberger (2000) conclude their paper (see the discussion in Section 4.2.3, above) by saying:
 That current generation GCMs do not accurately reproduce the observed temperature history of the lower troposphere during the MSU era remains unchallenged. ..... Until GCMs can produce accurate representations of the known three-dimensional climate history, they cannot be relied upon to produce future scenarios that are accurate enough to serve as the basis for climate impact assessments.

 5.3  IPCC’s hindcast against the known surface record
  5.3.1  A spurious validation
If IPCC’s projections of future climate change are to be believed, the successful hindcasting of their models against the known record must be demonstrated.  Credibility demands validation of their models against past climate change.
However, the IPCC continues to suffer from the same old problem: over-predicting models and under-warming world (see Figure 13).  The IPCC Report says (p 33):
 Currently available model simulations of global mean surface temperature trend  over the past half century show closer agreement with observations when the  simulations include the likely effect of aerosol in addition to greenhouse gases.

And misleadingly, the Report also says (p 34) of its models:
 They have been tested with a good degree of success against known climate  variations ..... This provides some confidence in their use for future climate  perturbations caused by human activities.

If we didn’t know better, we might have accepted this statement.

  5.3.2  Sulphate aerosols: another credibility problem
As can be seen in Figure 13, the use of GHGs alone cannot validate the IPCC models.

What then of the “closer agreement” achieved by invoking aerosols?  The inclusion of both greenhouse warming and aerosol cooling certainly turns a mismatch into a match - on a globally-averaged basis.  However, there is more (or is it less?) to aerosols than meets the eye; and it is not apparent until we look beyond IPCC’s ‘globally-averaged basis’.

The Box in the Report “What drives changes in climate?” (p 14) says of this topic:
 Anthropogenic aerosols (small particles) in the troposphere, derived mainly from  the emission of sulphur dioxide from fossil fuel burning, and derived from other  sources such as biomass burning, can absorb and reflect solar radiation.  In  addition, changes in aerosol concentrations can alter cloud amount and cloud  reflectivity through their effect on cloud properties.  In most cases tropospheric  aerosols tend to produce a negative radiative forcing and cool climate.  They  have a much shorter lifetime (days to weeks) than most greenhouse gases  (decades to centuries) so their concentrations respond much more quickly to  changes in emissions.

So far so good.  But let’s look at the distribution of aerosols.  Figure 14 is reproduced in colour and bound in the front of this paper.  It shows the total surface concentrations of sulphate emissions for July - including ship, land-based anthropogenic, and biogenic.

Most of these short-lived emissions are in the Northern Hemisphere.  In fact, the mismatch between hemispheres would be still greater in the northern winter because of coal-fired home heating, particularly in northern and inland China.

Reality is the problem here.

IPCC arrogates to its reconciliation all the observed warming of the past half-century.  (We know this isn’t justified, because of the pronounced lack of concomitant warming in the lower atmosphere, as illustrated in Figures 9 and 10 - but let’s ignore this for the present.)  Obviously, if this surface warming were caused by (well-mixed) greenhouse gases, it would be hemispherically symmetric.

Equally obviously, if the aerosols invoked by IPCC are indeed coolants, then the Southern Hemisphere will warm relative to the Northern Hemisphere - because the latter hemisphere is where the countervailing aerosols are, as shown in Figure 14.  But an examination of hemispheric surface temperatures (Figure 15, and in more detail in Figure 6) shows the opposite to be the case.  The same applies in the atmosphere, as shown in Figure 8(b).  With aerosols, theory and practice are in opposition.

Popper’s Law of Empirical Disproof has done its job.  Aerosol cooling is not in evidence; and, as a consequence, IPCC’s supposed model validation is worthless.
 
 5.4  The ‘great triumph’ which never was
Inauspiciously, the recent explanatory news and views (ie invited) article in Nature of 26 October (Dunbar 2000) begins with a paean of misplaced praise for predictive models:
 In the 34 years since Jacob Bjerknes first proposed a mechanistic link between  ocean temperatures in the equatorial Pacific and much larger-scale patterns of  atmospheric circulation, one of the great triumphs of research into climate  dynamics has been the formulation of predictive models of the ENSO system.

The dynamical general circulation models on which IPCC relies for its alarming projections of future climate change are very expensive to build and run.  In my opinion, they don’t and can’t provide information about the future which is useful - even on a global, let alone regional, scale.  The money would be better spent elsewhere.

However, there are those (see above) who point to the ‘great triumph’ of dynamical models of the ENSO system, where the models predict the timing and magnitude of El Niño events up to a year or so ahead.  Supposedly, this success is evidence that similar models can also produce worth-while projections of global climate, say, 50 years ahead.

A recent paper by Landsea and Knaff (2000) gives the lie to such claims.  (I have not yet seen the original, and here rely on the secondary sources listed in Footnote 15.)

For instance, Kerr tells us that these authors (from NOAA) give the performance of complex models a ‘failing grade’, and compare them with their own simple ENSO-CLIPER model, “which runs on a workstation in about a microsecond”.

Kerr continues:
 The NOAA researchers also found that, contrary to initial impressions, the  complex computer models, which are similar to ones developed to predict  greenhouse warming, fared no better in the longer run than much simpler - and  much cheaper - models that are based on statistics.
and
 The use of more complex, physically realistic dynamical models does not  automatically provide more reliable forecasts.

World Climate Report - IPCC’s long-time adversary - is pleased to quote extensively from Landsea and Knaff, saying:
 Towards the paper’s conclusion, the authors offer a remarkably candid (for  science) perspective:
 

___________________________________________________________________________________
15.  One is a What’s Hot story (presumably by editor Patrick J. Michaels) on 9 October 2000: “Forecasters needed.  No skill required”, World Climate Report v 6 no 2.
The other is a News Focus story by staffer Richard A. Kerr on 13 October: “Second thoughts on skill of El Niño predictions”, Science v 290 pp 257-8.


  [It is] disturbing that others are using the supposed success in dynamical El Niño forecasting to support other agendas.  As an example, an overview paper by Ledley et al (1999) to support the American Geophysical Union’s “Position Statement on Climate Change and Greenhouse Gases” said the following: “Confidence in [comprehensive coupled] models [for anthropogenic global warming scenarios] is also gained from their emerging predictive capability.
and
  An example of this capability is the development of a hierarchy of models to study the El Niño-Southern Oscillation phenomena .... These models can predict the lower frequency responses of the climate system, such as anomalies in monthly and seasonal averages of the sea-surface temperatures in the tropical Pacific.”

World Climate Report then continues:
 “On the contrary,” the authors state, their own results suggest we should have  “less confidence in anthropogenic global warming .... The bottom line is that the  successes in ENSO forecasting have been overstated (sometimes drastically) and  misapplied in other arenas.”

 5.5  Nature denies Geoscience!
Denial appears to be the chosen way forward.  Just behind the contents pages of each issue of Nature is In this issue, which flags important research papers.  I repeat below the first flag for 5 October 2000:
 Attempts to forecast the response of the climate system to anthropogenic change have been plagued by the difficulty of estimating the uncertainty involved.  This weakness has often given the ‘anti-greenhouse warming lobby’ ammunition with which to cast doubt on the generally accepted view of global warming.  Allen et al. now present an objective analysis of the uncertainty in predictions based on general circulation models (GCMs).  Expect global mean temperatures in the decade 2036-2046 to be 1-2.5 K warmer than in pre-industrial times if the ‘business as usual’ scenario is followed.

While I note that those of us whose views lie outside the dominant paradigm are a ‘lobby’, I have more substantial criticisms:

First is the assumption that there is some particular level of global mean temperature which applied to ‘pre-industrial times’.

Second is the implication that, if it were not for ‘anthropogenic changes’ in the interim, that level would still apply in 2036-46.

These mistaken concepts deny the corpus of knowledge which we call ‘geoscience’.

 5.6  Nonlinear transitions denied by Weaver and Zwiers
Another technique adopted by Nature to enhance and emphasise important contributions to its pages is to provide a smaller, and less technical, commentary article earlier in the same issue.  Choice of authors for this invited piece gives the (presumably unsolicited) original research article a context which is within the journal’s control.

Such a piece leads the news and views section on 5 October as introduction to a research paper by Allen et al (2000); and it says at its beginning:
 Governments around the world are investing heavily in coupled-climate models  to project future climate change.  Such models have interacting atmosphere,  ocean, land and sea-ice components, and serve as laboratories for studying the  effects of natural and human influences on the climate system.

This paper by Weaver and Zwiers is largely explanatory, and it continues:
 An atmosphere that is enriched in greenhouse gases may lose less heat to space  and consequently become warmer. ..... Sulphate aerosols have a direct cooling  effect by scattering incoming sunlight back to space.  They may also have an  indirect cooling effect by influencing the lifetime and reflectivity of clouds.   Natural forcing factors, such as volcanic eruptions and variations in solar output,  may influence climate.

The crucial point here, not mentioned by Weaver and Zwiers, is that the atmosphere appears to have not ‘consequently become warmer’ - denial again.  The authors continue:
 Early model simulations driven exclusively by greenhouse gases tended to over- estimate the warming in the twentieth-century record.  In 1995, two studies  appeared which showed better agreement when the direct effects of sulphate  aerosols were also included.  Since then some modelling groups have also  included the effects of sulphate aerosols on cloud properties .....

I have exposed above the false refuge offered to over-predicting climate models by spurious aerosol cooling.  More denial; but Weaver and Zwiers continue undaunted:
 Modellers are now also taking into account the historical variation in natural  influences on climate.  For instance, some studies suggest that solar irradiance  may have been a factor in the warming in the early part of the century.

Happily, this statement is not further denial.  However, it does damn the Sun with faint praise (see Section 3.2.1, above).

Weaver and Zwiers put the main point of their review in the following terms:
 ..... Allen and colleagues’ projected temperature change for the middle of the  twenty-first century appears to be insensitive to whether or not a forcing or  process is missing (such as the effect of sulphate aerosols on clouds, or of  variations in solar or volcanic influences).
and
 At the least, they argue, this should be the case provided that the climate  feedbacks of the missing process or forcing are close to linear for small  departures from the present climate.

But they follow with a crucial caveat:
 Of course, all these conclusions fall apart if a sudden, yet highly improbable,  nonlinear transition between climatic regimes occurs.  Such a transition might  occur, for instance, if the North Atlantic ‘conveyor’ were to shut down.  The  conveyor transports warm surface water northwards and cold deep water  southwards, and has a large influence on climate.

Even more denial!

The title of my submission could well have been “Sudden nonlinear transitions between climatic regimes”.  It is my intention herein to show that indeed “these conclusions fall apart”, and therefore the general circulation models on which IPCC relies cannot predict future climate.

 5.7  Bad news from the Moon
Until recently, the Moon had received little attention in relation to oceanic circulation - and hence, to the nonlinear transitions within the oceans which are involved in climate change.  Therefore, a recent Nature paper by Egbert and Ray (2000) is a welcome step forward.

I here quote selectively from their abstract:
 Historically, the principal sink of tidal energy has been thought to be bottom  friction in shallow seas.  .....  Here we use satellite altimeter data ..... to map  empirically the tidal energy dissipation.  We show that approximately ..... 25-30%  of the total dissipation occurs in the deep ocean, generally near areas of rough  topography.  Of the estimated 2 TW of mixing energy required to maintain the  large-scale thermohaline circulation of the ocean, one half could therefore be  provided by the tides, with the other half coming from action (ie by wind) on the  surface of the ocean.
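The abstract's "one half" follows from simple arithmetic, sketched below.  The total global tidal dissipation figure (~3.7 TW, lunar plus solar) is an assumption supplied here for illustration; only the 25-30% deep-ocean fraction and the 2 TW mixing requirement come from the quoted abstract:

```python
# Hedged arithmetic sketch of the Egbert & Ray (2000) energy budget.
# The ~3.7 TW total is an assumed figure for illustration; the 25-30%
# fraction and 2 TW mixing requirement are quoted in the abstract.
total_tidal_dissipation_tw = 3.7          # assumed global total, TW
deep_ocean_fraction = (0.25, 0.30)        # quoted: 25-30% in deep ocean
mixing_required_tw = 2.0                  # quoted: needed to sustain the
                                          # thermohaline circulation

deep = [f * total_tidal_dissipation_tw for f in deep_ocean_fraction]
share = [d / mixing_required_tw for d in deep]
print(deep)    # roughly 0.9-1.1 TW dissipated in the deep ocean
print(share)   # i.e. close to half of the 2 TW mixing requirement
```

On these assumed numbers, tidal dissipation in the deep ocean supplies roughly one terawatt - about half the mixing energy required - with wind action on the surface presumably providing the remainder.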

This research paper was supported by a news and views commentary in the same issue of Nature which puts the conclusions of Egbert and Ray in a broader context.  In the opinion of the commentator (Wunsch, 2000) their paper raises serious doubts about the veracity of today’s climate models.

This is bad news indeed for IPCC; and one wonders what its new Third Assessment Report will say about the Moon.

In particular, Wunsch finds as follows:
 Under the simple assumption that a uniform upwelling of cold water is balanced by a uniform downward mixing of warmer water throughout the water column, a steady state is achieved.  Almost all numerical models of the ocean and climate systems represent this process through spatially constant vertical ‘eddy’-mixing coefficients, as do the textbook theories.

 Such a fluid system is stable, and in a steady state it cannot produce the vigorous  flow that we observe in the deep oceans.  There cannot be a primarily  convectively driven circulation of any significance.  .....  The reliance of almost  all numerical circulation models on uniform interior-ocean mixing calls into  question inferences about the physics of the circulation based on them.

Wunsch continues:
 But ..... about half of the power required to return the deep waters to the surface  was coming from mixing driven primarily by dissipation of tidal energy -  principally lunar; but with a minor solar component - in the deep ocean.

 ..... there are several implications.  One is that it brings into question the extent to  which uniform-mixing models of the ocean circulation could either reproduce the  present-day circulation or predict responses to external changes.

He concludes:
 It appears that the tides are, surprisingly, an intricate part of the story of climate change, as is the history of the lunar orbit.

The next hypothesis of global climate change may well have to recognise lunar influences.

This might be the time to remind readers that in his National Press Club Address (see Footnote 6) Pearman was reported as saying that:
 ..... complex models showed that by the end of the 21st century there will be  between 2 and 5 degrees of global warming.
and
 ..... scientists were confident in the findings of the models they used because they  were created using equations that represent the physical processes of the climate  system rather than being based on observations of the weather.

You read it first here.

© 2001  Bob Foster.  Posted 9 April 2001
www.globalwarming-news.com
Back to "Duel of the Hypotheses"  contents page