A new report, The importance of advanced physical and mathematical sciences to the Australian economy, has just been published by Chief Scientist Ian Chubb and the Australian Academy of Science. It points to an annual $145 billion value-adding contribution to our economy. The report demonstrates two important things. First, that we are in the midst of a new technology revolution, one that probably started with the invention of the transistor. Second, that, just as in the first Industrial Revolution, there are those who do not understand how innovations occur.
The report’s estimate of these sciences’ contribution may be uncertain, but there is no doubt that productivity has been increasing, as it did in the Industrial Revolution, across many segments of a modern economy. A study of productivity growth in the United States showed that wholesaling, retailing, finance and insurance, administrative services, and computer and electronic products accounted for most of the productivity growth in the seven years to 2007. The first two on the list represent two-thirds of the three key sectors of the late twentieth century. There is only one technology-intensive activity in this list; the others are not traditionally identified as R & D-intensive.
Some might question the inclusion of the advanced mathematical sciences in the report, as there is evidence they may have done more harm than good. The use of mathematical models to estimate risk played a part in the sub-prime mortgage and financial derivatives disaster of 2008. Indeed, there was an early warning of this in the 1998 collapse of the New York-based hedge fund Long Term Capital Management. The firm had been built on the use of complex mathematical models and had on its board two Nobel Prize-winning economists, Myron Scholes and Robert Merton, who had developed approaches to pricing options on bonds and stocks. Merrill Lynch observed in an annual report at the time that mathematical risk models “may provide a greater sense of security than warranted; therefore, reliance on these models should be limited.” This caution might well apply to other areas of science.
Innovation is the introduction of new products or processes that meet new or unarticulated needs, or existing market needs. The depth of understanding in the report is captured in the summary with this explanation of the mechanism:
“Advanced scientific knowledge is discovered by research scientists, who are often motivated by their own curiosity. It is translated into ‘useful’ knowledge and then applied to economic inputs. This is done by research scientists and science-trained professionals, who are responding to signals from business. Business managers then use the economic inputs to produce output, in response to signals from consumers.”
This is not what the record shows. Australian statistics on this very question show that the key sources of ideas are within the business itself, customers, suppliers, meetings and competitors. Universities score less than 10 per cent, and R & D enterprises even less!
Occasionally, the knowledge flow described above runs in reverse. The most famous example is from Bell Labs, where two scientists were trying to remove background noise from their satellite microwave receiver. That noise turned out to be the cosmic microwave background radiation, an echo of the Big Bang, and they were awarded the Nobel Prize for its discovery.
The report appears to have no appreciation of how the world goes about its business. It is as though science is the “light on the hill” from which all advances come, whereas it may actually be a “trough on the hill”. This apparent centralisation may be an echo of the Marquis of Pombal, Secretary of State in the Kingdom of Portugal. Pombal misunderstood the Industrial Revolution, reforming universities with technical programmes while setting up state-controlled enterprises. Claudio Veliz has explored the consequences of these initiatives.
The report gives some examples for the mechanistic explanation — but the devil is in the detail, and this is not explored.
For instance, WiFi, or WLAN, was created by CSIRO radio astronomers who had a problem to solve for the radio telescopes they were building. This was not research arising in response to a defined “national need”; rather, it was research supported on the basis of the excellence of the astronomers! Although the CSIRO patented the invention, it was a non-exclusive licence to two Macquarie University professors and their company, Radiata, that gave rise to the first WiFi chips and to the sale of Radiata to Cisco Systems for around $600 million in 2000. That single sale brought in more revenue than the roughly $400 million the CSIRO earned from all its subsequent licensing and legal successes in enforcing the patents. There is supposed to have been some resistance to even patenting the invention, and this is understandable, as establishing and preserving patent rights over twenty years involves a $200,000 to $300,000 outlay.
The other very important point is that it is often the unexpected consequences of leading-edge research that give rise to innovation.
The internet began in 1969 as the ARPANET, funded by the US Department of Defense. It was used by scientists in the 1970s to gain access to remote but powerful computers. It was not until 1989 that Tim Berners-Lee at CERN, the particle physics laboratory in Switzerland, proposed the hypertext transfer protocol (http), and this gave birth to the World Wide Web. This was not a discovery, but rather the development of a tool that enabled research. This advance alone probably justifies the past spending of some $1 billion a year for twenty years at CERN.
Many successful innovations arise from the development of processes, techniques and instruments that facilitate research. But it is near impossible to predict what might ultimately lead to an innovation. Our mobile phone and car GPS systems derive their accuracy from corrections required by Einstein’s special and general theories of relativity, developed some hundred years ago.
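The size of these relativistic corrections can be sketched with the standard approximate figures (the numbers below are the commonly quoted values for GPS satellites, not drawn from the report):

```latex
% Special relativity: the satellite's orbital speed (about 3.9 km/s)
% makes its clock run slow relative to a ground clock by roughly
\Delta t_{\mathrm{SR}} \approx -7\ \mu\text{s per day}

% General relativity: the weaker gravitational field at orbital
% altitude (about 20\,000 km) makes the clock run fast by roughly
\Delta t_{\mathrm{GR}} \approx +45\ \mu\text{s per day}

% Net drift if left uncorrected:
\Delta t \approx +38\ \mu\text{s per day}

% Since positions are computed from signal travel times at the speed
% of light, the accumulated position error per day would be about
c\,\Delta t \approx (3\times 10^{8}\ \text{m/s})(38\times 10^{-6}\ \text{s})
\approx 11\ \text{km}
```

Without corrections derived from century-old theoretical physics, satellite navigation would drift into uselessness within hours.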
Even so, innovative progress is never easy. The most famous case was at the foundation of Intel, when the opinion of the Chief Scientist of the US Department of Defense was sought. He told the founders of Intel that there would be no market for a chip with more than one transistor on it; after all, no one had bothered to put multiple thermionic valves under one vacuum jacket! Later, as Intel grew, two bright engineers designing a chip for a Japanese calculator company concluded that the best approach was to build a microprocessor that could then be programmed for the task. It occurred to the company that there might be other uses for a microprocessor, and the deal with the Japanese company was renegotiated.
There are important roles for universities and public institutions in education and research. They make key contributions in producing graduates who will invent, implement and deploy innovations for the benefit of their companies. Published research spreads information that is a public good. But, more importantly, the direct network of contacts between universities and companies, at many different levels, provides what may well be the key link.
There is no guide to how much should be spent on the “advanced sciences”. First-class universities with first-class research are great assets for attracting foreign students, and this makes an important cultural contribution to our region. (It is also a great source of foreign exchange!)
But when and where the benefits of “advanced sciences” might appear simply cannot be predicted.
This report recognises the importance of science but misunderstands the very nature of innovation and, therefore, makes no case for how much we should spend on supporting science. The lesson to be drawn from the history of innovation is that it is a random walk. The “national needs” approach offers no solution. The best we can do is support leading-edge research that pushes present technology to its limits and, most importantly, beyond those limits.
Claudio Veliz, The Centralist Tradition of Latin America, Princeton University Press, 1980.