The Wise Delinquency of Decision-Makers

Tim van Gelder

Mar 01 2010

A few years ago I asked a group of fifty or so military officers to reflect on the weightiest decision they had made, in their professional capacity, in the past year.

Military officers are highly trained professionals. They have, in particular, been thoroughly trained in decision-making. Army officers, for example, are well versed in the Military Appreciation Process, the official Army doctrine and procedure for making operational decisions.

“Did you, in making that decision,” I asked, “use any official, prescribed, formal or textbook methodology or tool to help you make it?”

Over half indicated that they had not done so.

“How then did you go about it?” I asked. Subsequent discussion revealed that, more often than not, officers went through an informal, somewhat haphazard process of pondering options and exploring arguments. They would consult with trusted colleagues, “sleep on it”, and “weigh things up”.

The behaviour of these officers, in side-stepping the official decision methods in which they had been trained, is consistent with what happens more widely, particularly in business. There are standard, widely accepted “normative” decision-making methods. These methods are found in textbooks, are taught at business schools, are promulgated by consulting firms, and can be found online in just a few minutes (see, for example, “Decision analysis” on Wikipedia).

However, for the most part, when people make decisions they don’t use these methods, even when the decision may be quite important for themselves, their organisations or even for their countries.

How can this be? After all, decision-making is one of the most critical of all human activities. Our health, wealth and happiness, individually and collectively, depend on making good choices. Why then do we so routinely ignore the officially prescribed methods for doing this?

Many people, it is true, are simply unaware of good decision methods. Having never been educated in formal decision procedures, they “muddle through” as best they can, doing what seems natural and reasonable in their benighted state.

But this “ignorance” explanation is insufficient, because as we saw with the military officers, official methods are often avoided even by those who have had good training. Plausibly, the ignorant masses generally wouldn’t use good decision methods even if they had been introduced to them.

Perhaps, then, it is just laziness? Certainly, elaborate, rigorous decision methods can involve a lot of tedious work. But while people are often lazy, they often are not; and given what may be at stake, you’d think they would be highly motivated to put in the extra effort.

No, there’s something else going on here—something much more interesting. The problem is not with the people making decisions; it is with the official methods. There is a deep mismatch between the gospel of the decision experts and the decision challenges usually faced by the congregation.

To see this, we need to look more closely at the official methods. The canonical textbook decision method, at its simplest, says that we must canvass all options, specify the criteria for choosing among options, rate the options against the criteria, and add up the results. To make the decision, just take the option with the best score.

The basic idea is straightforward, and very plausible. Of course there are numerous variations on the basic model, some of them quite advanced and requiring dedicated software tools. But all such methods share a common approach: reduce the decision problem to a systematic set of assignments, usually numbers; run some kind of calculation or algorithm over those assignments; and read off the answer.
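To make the recipe concrete, here is a minimal sketch, in Python, of the simplest weighted-sum version of such a method. All of the options, criteria, weights and ratings below are invented purely for illustration; real decision-analysis tools elaborate on this basic calculation in many ways.

```python
# Minimal sketch of the canonical weighted-sum decision method:
# rate each option against each criterion, weight and sum the ratings,
# then choose the option with the highest total score.
# All options, criteria, weights and ratings are invented for illustration.

weights = {"cost": 0.4, "speed": 0.3, "risk": 0.3}   # criteria and their weights

ratings = {                                          # ratings on a 1-10 scale
    "Option A": {"cost": 7, "speed": 5, "risk": 8},
    "Option B": {"cost": 4, "speed": 9, "risk": 6},
    "Option C": {"cost": 6, "speed": 6, "risk": 7},
}

def weighted_score(option_ratings, weights):
    """Sum of rating times weight across all criteria."""
    return sum(option_ratings[criterion] * weight
               for criterion, weight in weights.items())

scores = {name: weighted_score(r, weights) for name, r in ratings.items()}
best = max(scores, key=scores.get)

for name, score in sorted(scores.items(), key=lambda item: -item[1]):
    print(f"{name}: {score:.2f}")
print(f"Chosen option: {best}")
```

The point of the sketch is simply that, once the ratings and weights have been assigned, the “decision” reduces to arithmetic; everything contentious has already happened in choosing the numbers.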

There are certainly times when these sorts of analytical techniques are just what is needed. They wouldn’t have reached their canonical status if they weren’t at least sometimes useful. But they aren’t the “be all and end all” for decision-making, as their promoters so often suggest. In fact they are at best only part of the story.

So what’s wrong? Why do these methods not “fit” so many decision challenges?

First and perhaps most obviously, in interesting “real world” problems, not every consideration can be reduced to a number or rating. Suppose, like the young man in Sartre’s “Existentialism is a Humanism”, I am wondering whether to go and fight against the Nazis with the Free French Forces, or stay to care for my mother who would otherwise be wholly alone. To neglect my mother is obviously wrong, but how wrong? What score does it get on the “filial duty” criterion? At some fundamental level, breaking the problem down like this seems obtuse, perhaps even obscene.

Second, even in cases where it might make sense to assign ratings, the ratings can be matters for debate. A massive fiscal stimulus should, among other things, promote employment. A particular stimulus package is claimed to be very good on this score. But is it really? Experts could reasonably disagree, and extended disputation might ensue. Often most of the work involved in decision-making is at this level of subordinate argumentation. However, the canonical decision frameworks are of no help at all here.

Third, the problem may be too “wicked” to be shoehorned into any pre-structured decision framework. So-called wicked problems—think Middle East peace, or a divorce settlement—have a cluster of properties making them especially hard to tame through any given structure, set of rules, or complex calculation.

The inapplicability of the textbook techniques is well illustrated by a famous anecdote from decision theorist Persi Diaconis:

‘Some years ago I was trying to decide whether or not to move to Harvard from Stanford. I had bored my friends silly with endless discussion. Finally, one of them said, “You’re one of our leading decision theorists. Maybe you should make a list of the costs and benefits and try to roughly calculate your expected utility.” Without thinking, I blurted out, “Come on, Sandy, this is serious.” ’

Like the military officers described at the outset, people grappling with these decision challenges typically fall back on general-purpose deliberation.

Deliberation is the careful consideration of the options and related issues in terms of the relevant arguments and evidence. When we deliberate, we don’t assign numbers and run algorithms. Rather, we canvass qualitative arguments which point in one direction or another. We clarify issues. We gather detailed information. We countenance potential objections, and perhaps rebut them. We generally do all this in conversation with others, though we can also do it on our own, imagining how the arguments play out.

When people making difficult decisions rely on deliberation, they are not necessarily being delinquent. In fact they will often be wisely, or at least rightly, using a method that is far more appropriate than the ones recommended by the high priests of decision theory.

The trouble is that deliberative decision-making is still a very problematic business. Decisions go wrong all the time. Textbook decision methods were developed, in part, because it was widely recognised that our default or habitual methods are very unreliable. The textbook methods may not be the answer, but there is still an urgent need to improve decision-making.

To improve deliberative decision-making, we have to know why it goes wrong. Fundamentally there are four kinds of problems.

First, there are the intrinsic biases and limitations built into each of our minds by virtue of the fact that our thinking equipment is the result of a long, haphazard and incomplete evolutionary process. Just as everyone, without exception, has a blind spot in each eye’s visual field, so all of us, having brains of the same general design, are prone to make certain errors of inference and judgment. Similarly, having brains of a strictly finite capacity and idiosyncratic design, we are easily overwhelmed by the demands of our contemporary cognitive environments, much as a ten-year-old computer grinds to a halt under the latest operating systems.

Second, there are the ill effects of the passions—temper, envy, pride, fear. Emotions are an unavoidable and often helpful aspect of human decision-making, but they can also be the enemy of wisdom. We have in fact two quite distinct brain systems for decision-making. One is primitive, intuitive, fast and emotional; the other is relatively recent in evolutionary terms, slow, conscious and deliberate. Complex judgments always involve both brain systems; the problem is when the primitive system exerts excessive or unregulated influence outside its proper domain.

Third, there are problems which arise because important deliberative decisions are often made not by individuals but by small groups such as boards or committees. The ways groups interact as they work through a decision can introduce a whole new range of errors, even when the group members are as smart, disciplined and dispassionate as we could reasonably hope. The well-known phenomenon of “groupthink”, as characterised by Irving Janis, is one manifestation of this, though only one of many traps inherent in the dynamics of group deliberation.

Finally, compounding the above is the untutored manner in which people engage in deliberation. The regrettable fact is that few people have ever had any training in the art of deliberative decision-making. Most people deliberate in a manner akin to how they would swim if they had never had any coaching and had to figure it out for themselves, learning by instinct, trial and error, and imitation. Some people do manage to become highly skilled, but nobody develops the full range of skills of which they are capable without extensive coaching and practice. This is true in all domains in which high-level expertise has been studied, such as chess, sports and music, and there is no reason to think that deliberative decision-making is any different.

To improve deliberative decision-making, then, we must address these causes. Ideally our decision processes would acknowledge and compensate for our intrinsic biases and limitations; moderate, deflect or channel our emotional impulses; and eliminate or avoid the traps of group interaction. Ideally we would all be schooled and skilled in such processes, and those making the most consequential decisions would have benefited from the most elite training.

There may be a world, or a future, in which these ideals are approximated. It would be a part-realisation of the Enlightenment vision of human well-being resulting from the application of Reason. Unlike the Enlightenment philosophers, however, we know better than to hope for some dramatic transformation. Progress will be incremental, partial and slow.

A recent proposal for modest modification of our deliberative decision practices is argument-mapping. To “map” a deliberative decision is to lay out the options and the arguments in a special kind of diagram, similar to a mind map or a flowchart.

Argument-mapping has two major virtues as an aid to deliberation. First, in making the thinking behind and around a decision visible, it allows us to better exploit the massive processing power of our brain’s visual systems. We know, in general, that people can handle complex information structures more effectively if those structures are laid out visually. Argument-mapping just applies this insight to decision-making.

Second, argument-mapping gently imposes some discipline on the decision-making process. There are rules and guidelines for mapping. There are right and wrong, or at least better and worse, ways of mapping an issue. Argument-mapping is “semi-formal”, tightening up the thinking without trying to wrestle it into the straitjacket of some particular formal method.
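As a rough illustration of the kind of structure an argument map makes explicit, here is a small, hypothetical sketch, in Python, of a map represented as a tree: a contention with supporting reasons and objections, each of which may carry its own supports and rebuttals. The node roles and the example content are invented for illustration; actual mapping software adds visual layout, colour and much richer conventions.

```python
# Hypothetical sketch of an argument map as a tree: a contention (e.g. a
# recommended option) with supporting reasons and objections, each of which
# can carry its own supports and rebuttals. Content is invented.

from dataclasses import dataclass, field

@dataclass
class Node:
    text: str
    role: str = "contention"              # "contention", "reason" or "objection"
    children: list = field(default_factory=list)

def render(node, depth=0):
    """Print the map as an indented outline, the textual analogue of the diagram."""
    marker = {"contention": "=", "reason": "+", "objection": "-"}[node.role]
    print("  " * depth + marker + " " + node.text)
    for child in node.children:
        render(child, depth + 1)

# Illustrative content only.
decision = Node("The board should approve the acquisition", children=[
    Node("It secures a key distribution channel", role="reason", children=[
        Node("The channel could be replicated more cheaply in-house", role="objection"),
    ]),
    Node("The purchase price exceeds that of comparable recent deals", role="objection"),
])

render(decision)
```

Even in this toy form, the structure makes it easy to see which claims support the recommendation, which count against it, and where an objection has been answered.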

How might argument-mapping be used in a real context? Corporate boards are one place where important decisions must be made regularly. These decisions are generally made deliberatively, by canvassing and weighing up the arguments. Further, these decisions are all too often made badly—not just in the sense that the wrong choice is made, but also in the sense that the board deliberations were manifestly sloppy, incomplete or biased.

A fairly straightforward change to board practice can improve the quality of board deliberations and thereby the board’s decision “hit rate”. That change is simply to require that management present to the board, alongside other materials such as traditional board reports or PowerPoint presentations, an argument map displaying the major alternatives and the lines of argument and evidence supporting one path over the others.

Board members will then be able to see exactly what the case is, and will be better positioned to determine its strengths and weaknesses. When the decision is made, they can have more confidence in the thinking behind it. And an argument map constitutes an archivable record, cleanly and concisely recording the logic behind the decision.

Much of the benefit will be obtained even before the decision goes before the board. In preparing the argument map the management team will be forced to articulate the logic of their case with an unusual degree of clarity and rigour. This will allow the management team to identify and correct problems early on, thereby avoiding potential criticism of their work or even rejection of their recommendation.

Argument-mapping is not the “one true way” for decision-making. It is one technique among others. There are times when official decision-analytic techniques are just the right thing. The wise decision-maker is familiar with a range of techniques and chooses the most appropriate one at the time.

Argument-mapping has been around since at least the early nineteenth century, but only in the past few decades has it really taken off. This is largely because we now have the basic computer tools needed to create high-quality maps quickly and easily: personal computers, colour printers and specially designed software.

We have those tools in part due to the efforts and creative genius of one of the pioneers of the information revolution, engineer Douglas Engelbart. Back in the late 1950s, Engelbart wondered how in his career he might best contribute to human progress and quality of life. He decided that the most critical problem to solve was the fact that the intellectual challenges we face increasingly exceed our brain’s cognitive capacities. The solution, Engelbart thought, was to develop tools to complement or enhance our biologically-endowed intelligence. Engelbart went on to develop a range of technologies, such as the computer mouse and hyperlinking, which are so pervasive in the information processing tools we all use today.

Way back in 1962, Engelbart wrote a landmark manifesto entitled “Augmenting Human Intellect”. He there proposed, among other things, a computer system in which ideas could be displayed on screen and linked according to their logical relationships. Creating and interacting with these diagrams would assist knowledge workers to think far more effectively about complex problems than if they were using raw brainpower, or traditional tools such as books and writing.

Contemporary decision- and argument-mapping systems are Engelbart’s vision being realised on a technological foundation he himself partly created. We are finally at the point where such tools are available to help us deal more effectively with our thinking challenges, ranging from the hyper-complex such as what to do about climate change or the global financial crisis, to the merely difficult such as critical decisions in our businesses or personal lives.

It’s fortunate that these tools have arrived, because we’re going to need all the help we can get.

Tim van Gelder is Principal Fellow in the School of Philosophy, Anthropology and Social Inquiry at the University of Melbourne, and managing director and principal consultant, Austhink Consulting.
