The financial meltdown, partly driven by risk analysis of the unknown, should serve as a warning beacon to those who believe that modelling the earth’s atmosphere over one hundred years is a believable and useful enterprise.
Let us begin by examining the financial world. An interesting speculation, if that word can still be used in polite company, is that the trouble on Wall Street was due to the “Quants” who calculated risks and estimated volatility and a host of other quantities from alpha to omega that showed how stocks and bonds moved.
The first danger signs of this approach were detected in the 1980s. In Barbarians at the Gate: The Fall of RJR Nabisco there are accounts of running the numbers through the computer, adjusting terms, enhancing the deal and then proceeding.
The seeds of this were sown in the minds of the best and brightest of the business schools by the appearance in 1979 of VisiCalc, the first computer spreadsheet that could be run on an Apple computer. The MBA students adopted this tool as a masterful way of preparing and presenting cashflow and operating statements and balance sheets. It was so easy to use and to see the results of changing operating margins, altering depreciation schedules and trimming general expenses. Of course looking at merging entities was easily accomplished by merging cells of spreadsheets.
This tool developed into the Lotus 1-2-3 spreadsheet for the PC. For the young MBA this was the equal of giving an urban guerrilla a Kalashnikov. You could do a great deal of damage with it while your elders and betters marvelled at the crisp delivery. And this happened in the mid-1980s. A wave of mergers and acquisitions took place. Some succeeded but many came undone. In retrospect, Andy Grove, then CEO of Intel, wondered if the invention of the microprocessor by Intel had contributed to the exuberance. The other contributing factor may have been the concentration on numbers and not on how the businesses actually operated. This is a common occurrence for those brought up to use a computer as their primary investigative tool.
Meanwhile, in another part of the forest, a powerful tool had been created. In the early 1970s Robert C. Merton, Myron S. Scholes and Fischer Black developed a pioneering approach for the valuation of stock options. In 1973, the Black–Scholes Option Pricing formula appeared. Thousands of traders and investors now use this formula every day to value stock options in markets throughout the world. Robert Merton devised another method to derive the formula that turned out to have very wide applicability.
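The formula itself is compact enough to state in a few lines of code. The sketch below is the textbook Black–Scholes price of a European call; the function and parameter names, and the sample values in the comment, are illustrative only:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal cumulative distribution, via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call option.

    S: spot price, K: strike price, T: years to expiry,
    r: risk-free interest rate, sigma: annualised volatility.
    """
    d1 = (log(S / K) + (r + sigma ** 2 / 2) * T) / (sigma * sqrt(T))
    d2 = d1 - sigma * sqrt(T)
    return S * norm_cdf(d1) - K * exp(-r * T) * norm_cdf(d2)

# e.g. spot 100, strike 100, one year, 5% rate, 20% volatility
price = black_scholes_call(100, 100, 1.0, 0.05, 0.20)  # about 10.45
```

Note what the inputs are: a handful of clean parameters, including a constant volatility. Everything that follows in this story turns on what happens when reality refuses to fit those assumptions.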
Merton and Scholes were awarded the 1997 Nobel Prize for Economics (Black had died in 1995, and the prize is not awarded posthumously). In a press release announcing the award the Nobel Foundation stated, “Their methodology has paved the way for economic valuations in many areas. It has also generated new types of financial instruments and facilitated more efficient risk management in society.” Note and remember that last sentence!
Merton put some of his winnings where his mouth, or at least the option pricing model, had been. Together with Scholes, he sat on the board of directors of Long-Term Capital Management (LTCM), a hedge fund that failed spectacularly in 1998 after losing $4.6 billion in less than four months. The Federal Reserve was so concerned about the potential impact of LTCM’s failure on the financial system that it arranged for a group of major banks and other firms to provide sufficient liquidity for the banking system to survive.
Undeterred, the Quants continued. When it became possible to slice and dice mortgages, they did. For example, a mortgage could be sliced into top, middle and bottom “drawers”. The bottom drawer of a $90,000 mortgage would be the first, say, $30,000 of the loan: a sure thing to be recovered even in a forced sale, and therefore low risk, AAA quality, with a low interest coupon. The next $30,000 would be riskier and so carry a higher coupon, and so on up to the top drawer! The bottom drawers of many mortgages were then collected, bundled and sold to investors, and similarly for middle and top drawers with the risk and reward matched to the investors’ appetites. The Quants would model the risks, the probability of default and other known risks that could be incorporated in a financial model.
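The drawer arrangement is just a loss waterfall, and can be sketched in a few lines. This is a minimal illustration; the drawer sizes and the function name are invented for the example:

```python
def drawer_losses(loss, drawers):
    """Allocate a loss on a mortgage across stacked 'drawers'.

    drawers lists slice sizes from the riskiest (the top drawer,
    which absorbs losses first) down to the safest (the bottom
    drawer, touched only once everything above it is wiped out).
    """
    losses = []
    for size in drawers:
        hit = min(loss, size)   # this drawer absorbs what it can
        losses.append(hit)
        loss -= hit             # the remainder falls to the next drawer
    return losses

# A $40,000 loss on the $90,000 mortgage of the example: the top
# drawer is wiped out, the middle drawer loses $10,000, and the
# AAA bottom drawer is untouched.
drawer_losses(40_000, [30_000, 30_000, 30_000])  # [30000, 10000, 0]
```

So long as losses stay modest, the bottom drawer really does look like a sure thing, which is what the AAA rating and the low coupon were pricing.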
This time when things went wrong it was LTCM multiplied by a factor of one thousand. Down in the foundations of this enormous financial structure was the unfortunate link that the top, middle and bottom drawer of the mortgage were all in the same piece of furniture. They were in fact correlated one with another so you could not assemble all the bottom drawers in one spot, the middle and top drawers somewhere else and claim their independence! When the housing bubble burst all the values collapsed together—not the sort of event built into the risk analysis.
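A toy simulation makes the correlation point concrete. The default probability, pool size and names below are invented purely for illustration: it contrasts a pool of genuinely independent loans with a pool whose loans all rise and fall with a single housing-market draw, as the bundled bottom drawers did:

```python
import random

def pool_loss_fraction(n_loans, p_default, correlated, rng):
    """Fraction of a pool of bottom-drawer slices lost to default.

    If correlated, one housing-market draw decides every loan at
    once (all the drawers sit in the same piece of furniture);
    otherwise each loan defaults independently.
    """
    if correlated:
        return 1.0 if rng.random() < p_default else 0.0
    defaults = sum(rng.random() < p_default for _ in range(n_loans))
    return defaults / n_loans

rng = random.Random(42)
independent = [pool_loss_fraction(1000, 0.05, False, rng) for _ in range(2000)]
correlated  = [pool_loss_fraction(1000, 0.05, True,  rng) for _ in range(2000)]

# Independent pools hug the 5 per cent average loss; correlated
# pools are usually untouched but occasionally lose everything.
worst_independent = max(independent)
worst_correlated  = max(correlated)   # 1.0
```

The average loss is the same in both cases, so a model calibrated to averages sees no difference; it is the tail, the total collapse in a bad year, that independence quietly assumed away.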
Then, there were the credit default swaps, bond insurance by any other name, and a host of financial devices created without regulation that were no doubt evaluated with risk models. A final layer of icing on the already top-heavy cake was the “mark to market” valuation of assets. The collapse of bond prices with distressed selling then got reflected into balance sheets with, as they say of Oscar the Grouch on Sesame Street, the intelligence of a speeding bullet.
This has all been held up as an example of greed, a normal trait in biological organisms, some of whom work on Wall Street. But no doubt they will be subject to law-making and further regulation. The proper reaction to this urge is best thought of as the problem of dealing with James Thurber’s iron duke who slaughtered time in The Thirteen Clocks. He remarked to a courtier as they descended the castle staircase, “We all have our faults and mine is wickedness.” It is always difficult to legislate about the general and better to concentrate on the particular.
So the enabling technology for this disaster has been the computer, the spreadsheet and risk modelling—all possibly mixed in with Quants with little experience of the actual assets involved and a belief in computer-generated models of risk. Risk analysis requires knowledge of all possibilities and their frequency of occurrence. The mighty Black–Scholes Option Pricing Model only works for a restricted range of assumptions but its success inspired others to build upon it. Through inexperience or brilliance it was applied to situations for which it was not suited. Black or Scholes is supposed to have replied to the abuse from a Wall Street trader who had applied their option pricing model that “you might just as well think once you have mastered Euclidean geometry that you will be a good billiard player”.
Now are there lessons to be learned in other parts of the forest? An interesting starting point is the collapse of Lehman Brothers. Lehman were setting themselves up to be leaders in carbon emissions trading when their world suddenly entered an ice age. They were close to a Nobel Prize winner, Al Gore. Is this a repeat of LTCM, Merton and Scholes?
For climate change many would argue that the only “evidence” for the dangers we may face comes from computer models. It is an example of reality outstripping the imagination even of that most inventive of novelists, Douglas Adams. In his novel The Hitchhiker’s Guide to the Galaxy, the computer Deep Thought, after millions of years of calculation, delivers the Answer to the Ultimate Question of Life, the Universe and Everything: the number forty-two. Our present ultimate question is whether climate change is manmade. And, as a surprise to followers of Little Britain, we have turned to the computer and it has apparently said “yes”!
Here we assess risk helped by the IPCC with statements such as “it is very likely” that the surface temperature will warm by three degrees Celsius in the next one hundred years. This is their characterisation of a 90 per cent probability of that occurring. This is the best they can do after spending tens of billions of dollars. A pharmaceutical company would not go beyond a Phase II trial with that sort of result. In fact it is worse than that. The probability limit for the temperature is a fudge. It is part measuring errors and part expert assessment!
There is a substantial literature on the success rate for experts’ predictions. This is a fertile field for hedgehogs and foxes. The IPCC is an array (some would say a prickle) of hedgehogs with one big idea: that “humans cause global warming”. This colours all their thinking, while the foxes, perhaps the rest of us, pick and choose our ideas. Philip Tetlock at Berkeley studied expert predictions over twenty years and concluded that hedgehog experts performed significantly worse than fox experts.
The belief in climate modelling parallels the belief in financial risk modelling. The practitioners have been brought up to use computers as a primary tool for investigation. It has become so extreme that one lead IPCC author described the difference between model calculations and an actual measurement where the chance of the two overlapping was less than 5 per cent as “[measurements] consistent with model simulations, just larger than model average”! So now calculations verify measurements, where in classic science it used to be the other way round.
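The statistical inversion being described can be sketched numerically. The figures below are hypothetical, chosen only to illustrate what a less-than-5-per-cent overlap between a model distribution and a measurement means:

```python
from math import erf, sqrt

def two_sided_p(measurement, model_mean, model_sd):
    """Two-sided probability that a model with the given mean and
    spread produces a value at least as far out as the measurement,
    assuming the model spread is roughly normal."""
    z = abs(measurement - model_mean) / model_sd
    return 1.0 - erf(z / sqrt(2.0))

# A measurement 2.1 model standard deviations from the model mean:
# under the model, a value this extreme arises less than 5 per cent
# of the time, yet could still be declared "consistent with model
# simulations, just larger than model average".
p = two_sided_p(2.1, 0.0, 1.0)
```

In classic science a result this improbable under the model would count against the model; here it is recruited as agreement.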
The financial disaster and its enabling by risk analysis should be a wake-up call to our policy makers. Even within the IPCC there are hints. Dr Kevin Trenberth, another lead IPCC author stated: “None of the models used by the IPCC are initialised to the observed state, and none of the climate states in the models correspond even remotely to the current observed climate.”
Reread the last sentence of the Nobel Prize press release for Merton and Scholes and look at what followed. Then think about Al Gore, his Nobel Prize and what may follow. What more need one say!