Illusions of Climate Science
How have we come to a situation where, as some polls suggest, most Australians are so concerned about dangerous climate change that they will put aside the very tools and technologies that have sustained clean air, clean water, nutritious food and long life? More importantly, is the perceived danger real, and will reducing carbon dioxide emissions avert it? Although there are many uncertainties to be resolved, it is clear that the community has been subjected to more than two decades of heavily biased propaganda.
In spite of claims to the contrary, there is no consensus of scientists supporting the findings and recommendations of the Intergovernmental Panel on Climate Change. There exists a large and vocal group of highly qualified dissenters (often denigrated as sceptics, deniers or worse). Published letters and opinions in the press suggest that the scientific community is still divided and that the broader community has not succumbed to the propaganda of human-caused global warming. Many in the community, with every justification, are awaiting more information about the costs and the economic and social impacts before lining up to march behind the government’s carbon dioxide reduction banner.
A widely accepted conviction that dangerous climate change is actually pending will be required before the community will support the government’s strategy to shut down fossil-fuel-dependent industries and willingly abandon the energy-dependent and satisfying lifestyle activities they enjoy. After all, in the cause of saving the planet we will all be required to give up a wide range of personal freedoms that we currently take for granted. We will want to be in full agreement that the alleged dangers are real and present, and that the course of government-imposed actions really will avert them.
Are the Dangers of Human-Caused Climate Change Real and Present?
The notion of human-caused global warming has its origins in late-nineteenth-century speculation about the causes of past climate shifts, especially the ice ages when large parts of North America and Europe were under kilometres of ice. Svante Arrhenius of Sweden argued that intermittent volcanic activity, and the injection of huge amounts of carbon dioxide into the atmosphere, had regulated the retreats and advances of the glaciations, but this theory has now been discarded. The early-twentieth-century calculations of the Serbian Milutin Milankovitch, linking the glaciations to changing characteristics of the earth’s orbit around the sun, are now in favour. Nevertheless, speculation linking potential global warming to the burning of fossil fuels, based on Arrhenius’ theory, continued through the middle twentieth century.
During the 1960s and 1970s computer modelling was being developed to advance weather prediction. As they advanced, weather prediction models were adapted to crudely simulate climate, and a number of simple “what if?” experiments were carried out. For example, what would happen to Earth’s temperature if the concentration of carbon dioxide in the atmosphere was doubled, or trebled? Some of these crude experiments suggested that increased carbon dioxide might significantly raise the temperature of the earth.
As a consequence of the early modelling experiments, the issue of dangerous human-caused global warming was a consistent underlying theme of a series of international and intergovernmental environmental conferences that preceded the formation of the United Nations Environment Program (UNEP) in the early 1970s.
The 1979 First World Climate Conference held in Geneva played a crucial role in alerting the world community to the need for a better understanding of climate systems, climate change and mitigation of its harmful effects. Global cooling and the possibility of earth slipping into the next ice age had been dominant themes in the years leading up to the conference. However, the possibility of human-caused global warming was recognised and received attention. The Conference Declaration noted: “it appears plausible that an increased amount of carbon dioxide in the atmosphere can contribute to gradual warming of the lower atmosphere, especially at high latitudes”.
In 1985, mainly at the instigation of UNEP, but co-sponsored by the World Meteorological Organization (WMO) and the International Council of Scientific Unions (ICSU), a conference was held in Villach, Austria, to review the impact of human-caused carbon dioxide emissions on climate. Following the presentation of invited papers the Conference Statement was forthright: “As a result of the increasing concentrations of greenhouse gases, it is now believed that in the first half of the next century a rise of global mean temperature could occur which is greater than any in man’s history.” It was asserted that planning for the future should not be based on historical data, because human-caused emissions of carbon dioxide were contributing to global warming and climate change. Based on then available computer models, estimates were given of a 1.5°C to 4.5°C temperature rise and a sea level rise of between 20 and 140 centimetres from a doubling of carbon dioxide concentration in the atmosphere.
The Villach Statement was the basis for instigating a series of national and international conferences. The essential purpose of the conferences was to raise community awareness of the danger from burning fossil fuels and raising atmospheric carbon dioxide concentrations. In Australia, the then Commission for the Future and CSIRO were leading players in the local promotion, including sponsoring of the December 1987 conference of invited scientists, “Greenhouse: Planning for Climate Change”. A 1988 international conference in Toronto, of invited environmental bureaucrats and scientists, was the first to specifically call for a 20 per cent reduction of carbon dioxide emissions in order to prevent future dangerous climate change.
The very strong and active role being played by UNEP and the environment movement generally in the promotion of human-caused global warming became of concern to the more conservative science-orientated WMO. The concern was twofold: first that the policy proposals were running far ahead of perceived scientific understanding; and second, the lead in climate matters was being usurped by UNEP. WMO and UNEP agreed that a thorough review of the science associated with carbon dioxide and its impacts on climate should be carried out. The two agencies co-sponsored the formation, in 1988, of the Intergovernmental Panel on Climate Change (IPCC), under UN auspices, as an authoritative source of advice to governments.
The IPCC presented its first assessment report at the 1990 Second World Climate Conference. In essence, the findings confirmed that there is a greenhouse effect and that increasing atmospheric concentrations of carbon dioxide resulting from human activity will enhance the greenhouse effect. However, the report highlighted the many scientific uncertainties and noted it was not possible to predict the timing, magnitude or regional impacts of the enhanced greenhouse effect. Nevertheless, in spite of the uncertainties, the IPCC provided an estimate of between 0.2°C and 0.5°C temperature rise per decade and 6 centimetres per decade sea level rise over the coming century, based on computer models.
The IPCC First Assessment Report was endorsed by the 1990 Second World Climate Conference, and an associated ministerial meeting issued a declaration calling for the negotiation of a treaty to restrict carbon dioxide emissions and prevent dangerous climate change. The UN established an Intergovernmental Negotiating Committee that met six times between February 1991 and May 1992 and presented its Framework Convention on Climate Change for endorsement by governments at the Earth Summit in Rio de Janeiro in June 1992. From the Convention:
“The ultimate objective of this Convention and any related legal instruments that the Conference of the Parties may adopt is to achieve, in accordance with the relevant provisions of the Convention, stabilization of greenhouse gas concentrations in the atmosphere at a level that would prevent dangerous anthropogenic interference with the climate system. Such a level should be achieved within a time frame sufficient to allow ecosystems to adapt naturally to climate change, to ensure that food production is not threatened and to enable economic development to proceed in a sustainable manner.”
What is not generally understood is that the objective of the Convention is not to prevent climate change, dangerous or otherwise, but to prevent dangerous climate change caused by human activity. This is underscored in the Definitions to the Convention: “‘Climate change’ means a change of climate which is attributed directly or indirectly to human activity that alters the composition of the global atmosphere and which is in addition to natural climate variability observed over comparable time periods.”
The negotiations for the UN Framework Convention were carried out against a background of politics rather than science and logic. Many undercurrents could be clearly discerned that reflected the various vested national and regional interests.
Developing countries saw the issue starkly: whether the problem was real or not it was created by industrialised countries; it was for industrialised countries to fix and there should be money for compensation to developing countries that would be affected; and there should be technology transfer to ensure developing countries did not make the same “mistakes”. The association of small island states was most vocal in this respect—their very existence was claimed to be threatened by rising sea levels and they should receive refuge and compensation.
The oil exporting countries of the Middle East were concerned at where these negotiations were heading, seeing a potential drying up of revenues. At every point there was denial of a problem, emphasis on the dangers of nuclear energy as a “clean” alternative, and attempts to water down any agreements for action.
The newly independent countries of the former Soviet Union sought special recognition because economic downturn was leading to the closure of older inefficient and “dirty” factories. Credits for newer technology with fewer emissions were expected to lead to investment through a clean development mechanism.
Most intriguing was the obvious tension between the European Union and the United States. The former had a high investment in nuclear energy and was rapidly converting from coal to offshore natural gas; it could also see offset benefits from the modernisation of industry in the former East Germany. The EU was in a strong position to comply with a low-end requirement for 20 per cent reduction in emissions from the propitious 1990 baseline, especially with the potential of expanding nuclear energy—despite the experience of Chernobyl. In contrast, the much publicised nuclear accident of Three Mile Island had raised very strong public opposition and the USA had little prospect of expanding nuclear energy. Any move to impose energy constraints through reduction in carbon dioxide emissions would hit the US industry base with its reliance on fossil fuels, especially coal. There was clearly potential to shift the comparative advantage for producing goods and services from the USA to the EU, with advantages for European economies.
Now that the first Kyoto period is drawing to a close there are added political complications through the rapid industrialisation of developing countries, especially the economies of China, India and Brazil. Whether or not there is consensus about the underlying scientific assertions, the next round of negotiations for post-2012 commitments is going to be difficult, given that the bulk of new emissions will come from developing countries and that they have emphatically and repeatedly refused to countenance any limitations on their use of carbon-based fuels.
Are the Alleged Impacts of Climate Change Exaggerated?
In Australia the assertions of dangerous human-caused climate change, and even runaway global warming, are being fanned by various interests who should know better, or who, at least, should check the facts. Drought and problems of the Great Barrier Reef are being promoted as diabolical issues that can only be addressed by emissions reduction, despite the overwhelming evidence that the current climate experience is but a repeat of past droughts, which struck when we were far less able to offset the economic losses with income from other industries. The argument put by many, that the drought conditions of the past decade are so severe that they can only be the cumulative outcome of environmentally-unfriendly human activity, cannot be sustained. It ignores a large body of accumulated scientific and historical knowledge to the contrary, and is a resort to illusion.
Australia’s documented history, incorporating its cultural and economic development within the constraints of its relatively harsh climate, is relatively short. Drought and other vicissitudes of climate have been ever-present dangers and have been dominant in shaping the pattern of settlement and development. The twentieth century opened with much of the country in the grip of the “Federation Drought” that commenced in the middle 1890s and continued in parts until about 1905. The decade around the First World War, the late 1930s and through to the early 1940s, and the middle 1960s were all prolonged periods of generally low rainfall, especially in parts of eastern Australia. Although there were short but intense drought periods, such as 1982–83, the second half of the century was generally wetter than the first, until the early 1990s and the beginning of the current dry period.
There is currently a focus on the state of the Murray-Darling Basin and the condition of the lower Murray River, as if the current low river flows had not happened before. However, during the Federation Drought the basin suffered significant rainfall deficiency and by late 1902 the Darling and Murray Rivers had virtually run dry. In 1914 the Murray downstream of Swan Hill was reduced to a series of stagnant pools. Prolonged low rainfall during the 1940s again resulted in stress on the rivers of the Basin and the Murray River ceased to flow at Echuca in April 1945.
One of the great myths that gained currency during the recent debate on human-caused global warming is that higher temperatures will cause more droughts. However, continental rainfall largely has its origins in evaporation from the surrounding oceans. The fact is that evaporation increases by more than 6 per cent with each degree Celsius of sea surface temperature rise and, as a consequence, warmer temperatures will generate more rainfall. The great wind-blown sand dunes of Central Australia did not form during the warmer Holocene Optimum between 5000 and 10,000 years ago but during the colder, drier glacial period more than 20,000 years ago. During the Holocene Optimum the subtropical deserts of the world were blooming and teeming with life. In the past, a warmer world with higher sea surface temperatures has been beneficial and enhanced Australia’s summer monsoon and winter rainfall.
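The figure of roughly 6 per cent per degree is not arbitrary: it follows from the Clausius–Clapeyron relation governing saturation vapour pressure. As a rough check, taking standard values for the latent heat of vaporisation and the gas constant of water vapour at a typical sea surface temperature of about 288 K:

\[
\frac{1}{e_s}\frac{de_s}{dT} \;=\; \frac{L_v}{R_v T^2} \;\approx\; \frac{2.5\times10^{6}\ \mathrm{J\,kg^{-1}}}{461\ \mathrm{J\,kg^{-1}\,K^{-1}} \times (288\ \mathrm{K})^{2}} \;\approx\; 0.065\ \text{per K},
\]

that is, about 6 to 7 per cent for each degree of warming, on the (standard, but here assumed) basis that evaporation from the ocean surface scales with the saturation vapour pressure.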
What we find is that maximum temperatures are lower during wetter years, when there is more cloud and wetter soil from which evaporation keeps the surface cooler. During very dry years there is less cloud and very little cooling evaporation; maximum temperatures are consequently higher. It is the clouds, rainfall and soil moisture that regulate temperature. It should be noted that the all-time daily maximum temperatures of Adelaide, Melbourne and Sydney occurred during January 1939, as a heatwave progressed across southern Australia at the culmination of two years of regional drought. The high temperatures followed the drought—the drought did not follow the high temperatures.
Many other mythologies that link possible dire outcomes with global warming have their origins in studies following the major El Niño event of 1997–98. The shift in seasonal weather patterns during the event caused many tropical and middle latitude regions to experience drought, while other regions experienced excessive rainfall and flooding. The uncommon weather sequences and seasonal rainfall patterns resulted in a range of ecological responses, most of which were deemed undesirable because they were outside the boundaries of usual experience. Land and water management systems became stressed, much community infrastructure was damaged or destroyed, and ecological changes promoted the spread of a range of diseases.
Drought in many equatorial and tropical forests and grasslands made these lands susceptible to fire; outbreaks that occurred were generally unmanageable because of inadequate planning and response infrastructure. The accompanying smoke promoted respiratory and eye infections; the stagnation of streams and waterways led to pollution accumulation and to the outbreak of a range of pollutant-related diseases.
Elsewhere, excessive rainfall caused waterlogging of fields, flooding, and destruction of private and community infrastructure. Expansion of insect populations, especially mosquitoes, meant that the carriers could spread disease more readily. Higher incidences of encephalitis, dengue fever and malaria could be linked directly to the changed environmental conditions. As a general rule, the increased incidence of disease occurred in countries that did not have the resources on hand for rapid deployment to control the outbreaks, either through insect control or for medication.
According to a US assessment, the global impact from the 1997–98 El Niño event included 24,000 deaths, 533,000 people suffering illness, 6 million persons displaced, 111 million persons adversely affected, and a direct loss of US$34 billion.
The 1997–98 El Niño event, and information from earlier events, provided a valuable database for linking changed climatological conditions to environmental, community and industrial impacts. The real value of this database is in the formulation of response strategies so that, in the future, resources are marshalled and potential impacts can be mitigated. For example, early response will prevent the build-up of insect populations and reduce the spread of disease; medications will be available for treatment of eye and respiratory diseases where smoke is a problem; and water purifiers can be mobilised to ensure clean water.
Unfortunately, the raw data relating unmanaged ecosystem responses and community impacts from the local and limited-duration seasonal climate anomalies of the El Niño events have been extrapolated to give potential impacts of human-caused climate change. However, directly extrapolating future impacts from past experience can be misleading. For example, projecting the observed expansion of mosquito populations and increased disease incidence onto a hypothetical future climate gives a very scary but exaggerated scenario. This, of course, is the intention.
There are two essential pieces of information that the proponents of the theory of dangerous human-caused climate change do not discuss. First, the impact statistics relate to largely unmanaged systems. For example, malaria was endemic throughout northern Europe before the draining of marsh lands and the imposition of good public health regimes. Would it not be important to implement appropriate public health measures in the countries still subject to these diseases in order to reduce their incidence, whether or not the world is getting warmer or cooler? Second, it is legitimate to ask whether reduction in carbon dioxide emissions is the most sensible and cost-effective approach to controlling a range of endemic diseases.
Is Dangerous Human-Caused Global Warming a Reality or an Illusion?
The case for dangerous human-caused global warming rests solely on the projections of computer models. Without such projections, which have consistently been in the range of a 1.5°C to 4.0°C global temperature rise from a doubling of carbon dioxide concentration, there would be no basis for alarm. The low-end computer projections exceed the range of temperature variation of the past 10,000 years—the experience that ranged between the Holocene Optimum and the Little Ice Age. The high-end projections approach the temperature and climate range between the major ice ages, including advance and retreat of continental ice sheets and sea level variation of more than 130 metres. Sediment analysis from ocean cores suggests that the range of tropical temperature variation across the glacial cycles was only about 3°C, although the range was much larger over the polar regions.
There is, however, much observational and theoretical evidence to suggest that the computer projections are fanciful. Even the evolution of computer modelling of climate suggests that the projections should be treated with extreme caution. Importantly, the oceans and their circulations are the thermal and inertial flywheels of the climate system; as the ocean circulation changes, the atmosphere and its climate respond. Our knowledge of subsurface ocean circulations and their variability is limited. Without this vital input, projections of future climate are tenuous at best.
The computer models used as the basis of projections at the 1985 Villach Conference and later for the 1990 IPCC first assessment had no dynamic ocean circulation. The ocean was represented by a water slab with prescribed energy transfers. It was assumed that the ocean surface temperature variations would respond to changing atmospheric temperatures as forced by carbon dioxide increases. To determine the effect of carbon dioxide, a model would be brought to equilibrium to generate the “control” climate; the atmospheric carbon dioxide concentration would then be doubled and the model run again to equilibrium, giving the “response” climate. The difference between the “response” and the “control” was deemed to be the impact of carbon dioxide.
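The logic of such an experiment can be illustrated with a toy, zero-dimensional energy balance; the sketch below (in Python, with an effective emissivity, heat capacity and forcing fit assumed purely for illustration) reproduces only the control-and-response protocol, not the behaviour of any actual climate model.

```python
import math

SIGMA = 5.67e-8          # Stefan-Boltzmann constant (W m^-2 K^-4)
ABSORBED_SOLAR = 240.0   # globally averaged absorbed solar radiation (W m^-2), assumed
EMISSIVITY = 0.615       # effective emissivity, tuned so the control settles near 288 K (assumed)

def co2_forcing(co2_ppm, reference_ppm=280.0):
    """Approximate extra down-welling forcing from a CO2 change (standard logarithmic fit)."""
    return 5.35 * math.log(co2_ppm / reference_ppm)

def run_to_equilibrium(co2_ppm, start_temp=288.0, heat_capacity=1.0e8, dt=3.0e5, steps=20000):
    """Step a zero-dimensional energy balance forward until the temperature settles."""
    temp = start_temp
    for _ in range(steps):
        net_flux = ABSORBED_SOLAR + co2_forcing(co2_ppm) - EMISSIVITY * SIGMA * temp ** 4
        temp += net_flux * dt / heat_capacity   # warm or cool toward balance
    return temp

control = run_to_equilibrium(280.0)    # "control" climate at pre-industrial CO2
response = run_to_equilibrium(560.0)   # "response" climate at doubled CO2
print(f"control {control:.2f} K, response {response:.2f} K, "
      f"difference attributed to CO2: {response - control:.2f} K")
```

Run as written, the control settles near 288 K and the doubled-carbon-dioxide run a little over a degree warmer; the point is the protocol (the difference of two equilibria), not the number, which in this toy reflects nothing more than the assumed long-wave response.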
The IPCC initiative and the negotiations for the UN Framework Convention on Climate Change gave worldwide impetus and increased government funding for the rapid development of climate modelling. The 1995 IPCC second report was able to draw on results from climate models that by then included dynamic ocean circulations. A weakness of these early coupled ocean–atmosphere models was a tendency to warm, even without carbon dioxide forcing. Many of the projections of carbon dioxide forcing came from models in which the difference was taken between a control and a response that were both warming. In other models the ocean–atmosphere energy exchange was constrained and adjusted to maintain a steady global temperature in the control. The same artificial constraints were applied to the response.
The issues surrounding the natural tendency of the coupled models to warm were claimed to have been overcome by the time of the 2001 IPCC third report. Unforced computer simulations extending over a period of 1000 years showed no long-term global temperature trend and no significant periodic oscillations. On this basis, the IPCC asserted, “The warming over the past 100 years is very unlikely to be due to internal variability alone, as estimated by current models.” This was also the report that promoted the infamous “hockey stick” representation of global temperature over the past 1000 years—essentially the straight handle of constant temperature for 900 years followed by the blade of rising temperatures of the past century.
The combination of unvarying computer simulation and apparently steady temperatures before the rapid industrialisation and temperature rise of the twentieth century was powerful imagery to support the propaganda that the warming of the twentieth century was human-caused. Unfortunately the “evidence” was all a mirage. The statistical analysis underlying the “hockey stick” has been shown to be fatally flawed; a wide range of compelling historical, cultural, archaeological and palaeoclimatic data support the Medieval Warm Period–Little Ice Age–modern warm period climate cycle. Moreover, there is no strong evidence that current temperatures are warmer than those of the Medieval period from the ninth to the thirteenth centuries when, for example, there were thriving settlements on Greenland.
Although the IPCC case for an unvarying climate, unless externally forced, rests on the performance of computer models, the proposition does not accord with either evidence or logic. The oceans are a relatively large mass of cold dense fluid that is constantly in motion, largely driven by surface wind stress, although tempered by topography, tropical surface heating and salinity variations. The atmospheric circulation responds to tropical heat and moisture exchange from the underlying oceans and the atmosphere transports heat to polar regions. It would be truly remarkable for two interacting fluid layers on a rotating spherical surface not to produce significant periodic variations on a range of time-scales. The observed inter-annual and decadal variations of climate and the multi-centennial time-scale of some overturning circulations contradict the assertion of the IPCC that the climate system has only limited internal variability. The evidence provides support for the view that there is significant internal variability of the climate system that gives rise to variations on a range of time-scales.
The 2007 IPCC fourth report has claimed that, based on contemporary computer model simulations, the temperature rise of the last half-century is very likely caused by human activity, particularly carbon dioxide emissions. This claim is made even though there is only limited correlation between fossil-fuel-based carbon dioxide emissions and twentieth-century global temperature rise. The economic stagnation and limited growth of carbon dioxide emissions in the decades between the world wars were accompanied by a significant rise in global temperature; the rapid increase in fossil fuel usage in the decades following the Second World War coincided with declining global temperatures. It was only after the middle 1970s that temperatures again increased, and mainly over the continental areas of the northern hemisphere, but the temperature trend has again plateaued during the last decade.
The IPCC rationale is that emissions of aerosols during the early years of the postwar industrial boom reflected more solar radiation back to space and therefore constrained global temperatures against the enhanced greenhouse effect of increasing carbon dioxide concentrations. According to the rationale, following the implementation of national Clean Air Acts the aerosol emissions were eliminated, allowing the enhanced greenhouse effect to emerge and force up temperatures in the 1980s and 1990s. Models with carbon dioxide and aerosol forcing were able to reproduce the twentieth-century global temperature record with a degree of fidelity.
Credulous supporters have accepted the seemingly plausible IPCC rationale and its associated computer simulations without questioning the underlying foundations. First, as the IPCC report notes, there is a very low level of understanding about the interactions between radiation and atmospheric aerosols. Second, there are no observations for the magnitude and distribution of atmospheric aerosols—the aerosol forcing of the computer models is without validation. Third, there is no explanation as to why the late-twentieth-century warming was mainly over the northern hemisphere land areas despite the carbon dioxide increase being well mixed across both hemispheres. Fourth, there is no reason given for the recent hiatus of temperature increase. The “evidence” is no more than model tuning with plausible parameters. The faith in the computer models, on which the IPCC’s climate predictions are based, is misplaced.
The current global temperatures are relatively warm but not too dissimilar from those before Earth entered the current glacial phase about 5 million years ago; the current global temperature is only marginally cooler than the temperature peaks achieved during each of the interglacials of the last half-million years as earth recovered from successive ice age periods. A casual observer of the record might readily conclude (and not be far wrong) that there is a natural upper limit that earth’s temperature asymptotes towards.
Although not in the context of global warming, in 1966 C.H.B. Priestley (then Chief of CSIRO’s Division of Meteorological Physics) wrote of the limitation of temperature by evaporation in hot climates. High daytime maximum temperatures are reached over arid lands of the tropics where only radiation loss and conduction are available to rid the surface of energy absorbed from the sun. Where the land surface is wet or covered in vegetation the temperature is considerably lower because the latent heat carried away by evaporation has a powerful cooling effect; evaporation (and the associated latent heat exchange to the atmosphere) increases almost exponentially as temperature rises. The combined radiation, conduction and evaporation losses from the oceans and wet or vegetated surfaces can offset the absorption of solar energy at a lower temperature than when evaporation is absent.
The principle is identical for carbon dioxide forcing and its enhancement of the greenhouse effect. The magnitude of the down-welling long-wave radiation at the surface increases as the concentration of carbon dioxide increases. There is a corresponding rise in surface temperature that is constrained by the increase in surface energy losses (radiation emission, conduction and latent heat carried away by evaporation). The earth’s surface is predominantly water or well-vegetated land; increasing latent heat loss through evaporation is a dominant factor in the additional energy loss under carbon dioxide forcing. It is this additional latent heat loss that will constrain the surface temperature response to human-caused carbon dioxide emissions.
The mathematical formulation of surface temperature response to carbon dioxide forcing is straightforward, even considering water vapour feedback. For a doubling of carbon dioxide concentration the global average surface temperature increase from 280 ppm (before industrialisation) to 560 ppm (towards the end of the twenty-first century) will only be about 0.5°C.
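The arithmetic can be indicated in outline. Take the conventional figure of about 3.7 watts per square metre of additional forcing for a doubling of carbon dioxide and suppose, as argued above, that the combined surface losses (long-wave emission, conduction and evaporation responding at close to the Clausius–Clapeyron rate) grow by roughly 7 to 8 watts per square metre for each degree of surface warming; that growth rate is assumed here purely for illustration. Then

\[
\Delta T \;\approx\; \frac{\Delta F_{2\times \mathrm{CO}_2}}{\lambda_{\mathrm{surface}}} \;\approx\; \frac{3.7\ \mathrm{W\,m^{-2}}}{7.5\ \mathrm{W\,m^{-2}\,K^{-1}}} \;\approx\; 0.5\ \mathrm{K},
\]

which is the order of response described above.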
Why then, it might be asked, do computer models give projected temperature increases that are nearly an order of magnitude larger; and why are there claims of dangerous “tipping points” and potentially runaway global warming?
The likely answer to these questions is the recent revelation that computer models grossly under-specify the rate of increase of evaporation with temperature, the very factor that constrains surface temperature increase. In 2006, US researchers Isaac Held and Brian Soden reported that, on average, the rate of increase of evaporation with temperature in the computer models used for the IPCC fourth assessment is only about one third of the expected value. In 2007, Frank Wentz and his US colleagues replicated the earlier finding and, on the basis of satellite analysis of rainfall, confirmed the expected rate of increase of evaporation with temperature as the appropriate value.
The significance of the computer model shortcomings identified by the US researchers can be appreciated from the mathematical formulation of feedback amplification. As surface temperature rises under carbon dioxide forcing, it warms the overlying atmosphere and further enhances the long-wave radiation back to the surface from the atmosphere. The feedback amplification has a term of the form [1 / (1 – r)], where r is less than unity. As r increases the amplification will also increase. The term r is linked to evaporation such that any underestimation in the specification for the evaporation term causes r and the feedback amplification to be anomalously large. As the rate of increase of evaporation approaches zero then r approaches unity and the projected amplification is very large. The gross under-specification of evaporation in computer models gives inflated values of r and exaggerated amplification of global surface temperature to carbon dioxide forcing.
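The sensitivity of the amplification to r is easily illustrated with the form given above:

\[
A \;=\; \frac{1}{1-r}: \qquad r=0.5 \Rightarrow A=2; \qquad r=0.75 \Rightarrow A=4; \qquad r=0.9 \Rightarrow A=10.
\]

A modest overstatement of r, such as would follow from understating the evaporation response, therefore produces a disproportionately large projected warming.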
In the more extreme computer models, the erroneous specification of evaporation response means that the models are approaching computational instability and the global temperature projections give the appearance of “runaway global warming”. Of course, the projections are erroneous. The exaggerated surface temperature increase associated with the computer model projections is a direct consequence of the failure of the model specification and does not represent the true sensitivity of the earth’s temperature to carbon dioxide forcing. In reality, surface evaporation from the oceans and vegetated land areas will constrain surface temperature increase to about 0.5°C for a doubling of carbon dioxide concentration, which cannot be considered as dangerous.
Unfortunately, those who are entrusted with building and validating the computer models seem to be blind to the inherent failings—the models have cost so much to build, yet presenting them as accurate and useful is fallacious. In many countries, such as Australia and the UK, research funds have been specifically allocated by Environment ministries to generate climate projections in order to underpin and give credence to environmental impositions, including indirect taxation and restriction of a range of personal freedoms targeted solely at reducing carbon dioxide emissions.
The CSIRO has taken state funding for the purpose of generating specific predictions for land-use planning and water resource management at the regional level. But it should be noted that the CSIRO attaches legal disclaimers denying responsibility for the truth or accuracy of these predictions should they turn out to be incorrect or misleading. The CSIRO apparently has no confidence in its computer predictions: all the risk is with the user.
Notwithstanding that computer models exaggerate the magnitude of warming, there continue to be NGO commentators and advocates who claim the danger is even more horrific than the IPCC suggests. The claim is that it is the higher temperature projections that are the more realistic and that as the temperature passes a tipping point then irreversible runaway global warming will take off. Such claims are without scientific foundation because of the constraining effect of surface evaporation that is grossly under-specified in computer models.
Is There a Sound Case for Carbon Emissions Reduction?
Australians are now being bombarded by an intense government-funded propaganda campaign to encourage people to accept the reality of dangerous human-caused climate change and support early action for “carbon pollution” reduction. The scaremongering about dangerous climate change is based on the erroneous computer model projections and the unsubstantiated extrapolations of a range of climate impacts that are only realistic if no adaptive or mitigating measures are taken.
In the absence of computer models there would be little credence given to the view that the relatively small warming of the second half of the twentieth century was due to carbon dioxide emissions; there would certainly be no credence given to the possibility of irreversible runaway global warming over the coming century. Cool heads would note that most of the earth’s surface is either ocean or freely transpiring vegetation and that surface evaporation will continue to constrain surface temperature rise, as it always has done.
The likely magnitude of human-caused global warming is so low that it will not be discernible against the background of natural variability in the climate record. Thus national or internationally co-ordinated efforts to impose carbon dioxide emission reduction for the purpose of preventing climate change will be a tremendous waste of resources. The real danger is that government-instigated measures to drastically downsize a wide range of fossil-fuel-dependent industries in order to achieve emission reduction targets will actually be effective. Such success will destroy jobs and will limit future development opportunities, with no discernible impact on climate. Then the government will realise that it is much easier to change the economy than to change the climate, and it will also find that the direction and impacts of change will be equally unpredictable.
William Kininmonth is the former head of Australia’s National Climate Centre. He was an Australian representative and consultant to the World Meteorological Organization on climate issues and is the author of Climate Change: A Natural Hazard (Multi-science Publishing Co., 2004). He will be among the speakers at the Australian Environment Foundation’s annual conference, “A Climate for Change”, in Canberra this month.