It’s said that every generation has misgivings about the next. I know I do. It’s probably a natural part of growing older. Not only do the young seem younger than ever before, but those in charge, those now in their forties and fifties, seem, in ways both mysterious and self-evident, to be not quite up to the job.
Logically speaking, of course, this must be an illusion. If it were true, the world would long ago have collapsed in a demoralised heap. Each generation intrigues and irritates the one before it, but (common sense tells us) will be at least its equal in terms of intelligence and skill.
Advancing age does, though, bring a tendency to invest the past with a kind of allure that is impossible to explain to the young. I am convinced the music was better, but maybe that was because I was much younger when I first heard it. Does a lifetime give one any kind of worthwhile perspective on the phenomenon of change?
When I was a kid, change was called “progress” and was an article of faith. We talked about “Australia unlimited”. The Snowy Mountains Hydro-Electric Scheme was the apotheosis of all that was good. As the population grew and the cities and suburbs expanded around us, most adults seemed pleased with the result—“That’s progress”, they would say. There was an assumption that change would always be for the better. Now, opinion is likely to be more nuanced. Most of us would acknowledge that some things get better, others worse, while others—in particular, humanity’s propensity to keep doing stupid things—stay irritatingly the same.
Inevitably, there is a sense of loss. In my lifetime, I have seen the rise of the internet and the precipitous decline of conventional publishing, particularly newspapers. I have lived through the virtual collapse of traditional organised religion in Australia, the rise of political correctness and the decline of the ABC. And I have witnessed, and been part of, the massification of the universities.
It is easy to feel grumpy amidst these changes, especially when we consider those things that have not changed. Despite the dazzling technologies we have at our fingertips, our fundamental nature as humans remains unaltered. We continue to crave love and attention. Most of us find it hard to do the right thing when to do so is inconvenient to our interests. Whistle-blowing remains a hazardous activity.
Corruption, mostly venial, but sometimes more serious, is never far beneath the surface of organisational life. Incompetence is its close companion. By the time either or both are discovered and the wider public alerted, we can almost guarantee that the worst perpetrators will long since have fled the scene. The calls for tougher regulation, stronger penalties and the placing of heads on pikes continue to arrive too late. When they are acted upon, the remedies do more harm than good or, as in the case of the churches, the institution finds the broader society has already largely given up on it.
With these factors in mind, I am always a little suspicious of diagnoses of increasing moral turpitude. “Look at the banks,” people say, in the turbulent wake of the recent Royal Commission. “They never behaved in this way thirty years ago.” And yes, the day of the much-feared local bank manager has long passed. I am not sure that there are even many of these worthy people left, the banks (at least before the Royal Commission) preferring to work through mortgage brokers.
But the good old days were always a bit of an illusion. Back in the day, in the 1960s, the bank manager would grant you a loan only if you didn’t really need one. I remember, back then, one of these panjandrums refused my father, a man who worked for the same company all his life, a loan to extend our family home. He ended up doing the work himself, at weekends, when he should have been resting. Now, of course, the banks have swung to the opposite extreme, lending money to people without accurately assessing their ability to repay it.
If the banks lend to people they should not, whose fault, exactly, is that? We demand someone to blame. Yet the fault may not be in our institutions, but in ourselves. The Royal Commission papers show that the regulator, APRA, was unhappy about what was going on and let senior managers know about it. We know that the banks took no notice. Why not? Was it because they were run by unethical people? The simple answer would seem to be “yes”. But simple answers are rarely right, and still more rarely helpful.
The banking sector was de-regulated by the sainted governments of Hawke and Keating. As the political class knows, or should know, only too well, people respond to the incentives—and disincentives—set before them. If you allow people to do things that previously they could not, it should surprise no one that they over-step the mark, if indeed there is still a mark there.
But surely giving the banks more freedom to compete with each other was not a licence to behave unethically? Where are the morals of these people? There does seem to have been an enhanced propensity to appoint people to senior positions who allowed nothing to stand in the way of profit, and who were paid—in effect, paid themselves—massive bonuses.
But there is a lot more—or a lot less—going on here than meets the eye. For accountability to be subverted in this way, quite a few people must fail to do their jobs properly. Chief executives report to boards, which are meant to hold the executives to account when they step out of line. Boards, in turn, are supposed to ensure that the interests of shareholders are protected. The principal shareholders, of course, are represented on the board, and many of these directors, who themselves represent the interests of their own investors, focus on short-term profits, rather than long-term sustainability. But the boards of public companies also include independent directors who are required to take a more objective view of the activities carried on in their name.
To understand why bank boards, in particular, seem not to have intervened very much in the activities of their chief executives, we need to factor in several decades of management-speak about what boards should and should not do. From the mightiest public company to the humblest not-for-profit, people on boards have been told they should concentrate on the big picture—on strategy—rather than paying too much attention to what executives acting in their name were actually doing. To ask awkward questions at board meetings, particularly as an “ordinary” board member, was to make oneself unpopular. And few of us can withstand that sort of opprobrium.
Social psychologists tell us that most of us internalise, from approved social norms, who we think we should be. We mimic the role that is expected of us. Approved roles are reinforced by our peers, by human resource departments, by commentary in the press and on social media. In every organisation, at every level, we will pay a price if we do not conform. If approved behaviour seems to have declined or diminished in moral gravitas, as it may well have done, at least some of the blame must reside with those who, in the name of competition, created an environment where cutting corners and not worrying too much about the customers became the norm.
It is easy to be pessimistic. We have more and more intelligence in our gadgets, but less and less in our political institutions. Governments seem unable to get anywhere near to the problems that confront us. Policies are a grab-bag of electoral candy. As for the current state of politics, politicians and voters seem to be in a hall of mirrors in which each looks, despairingly, for cues from the other. The sage and wise leaders of past years have given way to disastrous parvenus. Where are the Menzieses or even the Howards of the current era? The Hawkes or the Keatings? Decision-making is in disarray.
Or is it? I am reminded, in the current building boom that threatens to destroy trees and gardens in so much of our capital cities, of the period in the 1960s, when immigration was booming and a rash of poorly constructed houses and nondescript units sprang up throughout the inner west in Sydney and intruded into every patch of bushland that developers could find on the outskirts of the city. I know that people at the time fought to keep the urban bush, and they have been fighting ever since.
While the issues may have changed, in some ways, the shape of politics has not. In the 1960s, much as now, there were left-wing people and there were right-wing people and the far more numerous people in between, or nowhere at all. If you were young, you were preoccupied by the Vietnam War. There were demonstrations: each side knew that its opponents must be morally bad. Those on the Left were anti-American, and appalled by big business. Those on the Right hated communism and would do anything to stop it. We harangued each other. In those distant times before political correctness, though, we did not seek to invalidate each other. We might argue that our opponents were deluded, but not that they did not have a right to exist. There was a directness, then, that seems to have vanished.
And here, I think, is the crux of the matter. Critique, dissent, should be the engines of our society. They cannot do their work in a mimetic shadowland. Postmodern analysis showed us the power structures that underlie all our identities. But these structures do not cease to exist because we have constructed newer and more fluid ways of being human. The tendency to marginalise others has not been overcome, it has simply taken new forms.
In part, this need to push imagined boundaries even further reflects the need always to be new. But there is a darker side to boundary-pushing. Once we lose the intellectual crunch of language, we lose the ability of thought to lay bare, not just the shortcomings of our opponents, but our own as well. Political correctness purports to be free-thinking, when in fact it is simply a reflection of the extent to which we have reached the apotheosis of mimetic society.
So we fail to understand oppression when we see it. It is fashionable to decry successful international businesses, like McDonald’s, Google and Facebook. But these companies are far from monolithic and at least in the case of Google, the founders’ motto, “Don’t be evil”, showed that they were not unaware of the need for ethical behaviour. Even Facebook’s onselling of its users’ information owed more to commercial convenience than malevolence. It is hard to find any evidence that these companies, despite their market dominance, have caused anything like the trauma of the aggressive European and American imperialist expansion of the nineteenth and early twentieth centuries, when commercial interests and states were often indistinguishable.
No one is forced to use Google, or Facebook, or for that matter to drink Coca-Cola. Their crime seems to be providing what people want, and making money in the process. Those whose opinions derive from those of others, rather than their own observation and experience, join in the chorus. McDonald’s is, absurdly, blamed for widespread obesity. To demonstrate this culpability, activist and film-maker Morgan Spurlock ate only at McDonald’s for a month, and took the “super-size” option whenever it was offered. It seems that a diet consisting solely of vast quantities of hamburgers and chips makes one obese. Who knew? Fast food is ubiquitous and cheap, but the sedentary character of our lives is also to blame. At least the fast-food companies serve food that is consistent and hygienic. You know exactly what you are going to get, and sometimes that is absolutely fine.
Google’s amazing search engine has enriched the lives of many. It was apparent from the outset that it delivered results far more consistently than those of its competitors. It’s true that advertising now dominates the sites that come up, because that is how Google makes its money. The result is that even if you Google to the edge of blindness, it gets harder and harder to find anything interesting, an effect that no doubt compounds itself over time. On the other hand, its Gmail is free and far safer from spam and trolls than any other email service I know of.
It’s true that Google’s research workforce is male-dominated, and the company can’t find enough women computer scientists and software engineers to even up the numbers. It is only a matter of time, though, before things will start to change. Once we women decide we want to get into the information sciences, and turn away from the humanities and social sciences (both of which have been so trashed by silly people they lack interest anyway), there will be no stopping us. Ladies, do the hardest degree you are capable of, preferably one where the personal beliefs of those teaching you intrude as little as possible on the way in which you are assessed, and just get on with it.
If the good old days were handed back to us, we probably wouldn’t recognise them. There is, however, one change that modern media and educational fashion have brought about that we do need to acknowledge, and that is the extent to which rationality appears to have been downgraded in our daily discourse. As a species, we stand or fall on the extent to which we understand what is going on around us (increasingly the result of our own activity) and make prudent decisions accordingly. We know that emotion drives our thought processes, but the degree to which a view is passionately held does not make it truer, or even more helpful.
One puzzling feature of the current zeitgeist is the propensity to demonise risk. We see this tendency most clearly in the climate-change debate. Coal is not evil. Nor is nuclear energy. Neither are sun and wind intrinsically virtuous. Each energy form represents a means to power our lives and has its own costs and benefits. It is the balance between the two that matters.
Politics is not about virtue, but practicality. Few of us would want to do without refrigerators, air-conditioners, washing machines, dishwashers or motor vehicles. It is the size, not the composition, of our economies that is the root cause of environmental problems. Rather than thinking that everything can be painlessly and costlessly powered by renewables, we should be thinking much more holistically about the future. We should be aiming gradually to stabilise GDP, not engaging in never-ending expansion.
Until we understand more about how we perceive change, we will not understand with sufficient clarity what is happening around us. Climate change is very complex, yet because we all experience the weather, we think we understand it. On the other hand, the impacts of relentless over-development on near-at-hand environments tend to be overlooked. We remark the facts of constant demolition and change, but then forget what the affected neighbourhoods used to be like. Something similar seems to happen in relation to ecosystems. What was unthinkable becomes the new normal. It is also a fact that we notice only the small changes, but even then, not for long. Over time, these decrements add up. We may then realise, with a start, how much has been lost.
The first information revolution, the invention of printing, undoubtedly contributed more than any other single development to the rediscovery of the classics, the construction of the public realm and the arrival of the modern world. The internet has made available more “stuff” to more people—whether it is making us smarter, or dumber, is hard to say. The test will be whether it has improved the quality of our common conversation. At the moment, fewer and fewer of us recall what that phrase actually means. Ironically, the really big changes alter our ways of thinking so much that we no longer know how to notice what has happened to us.
Dr Jenny Stewart is Honorary Professor of Public Policy at UNSW Canberra.