Big decisions on decision making
Key takeaways:
- Don’t leave all decisions to algorithms—you may lose key skills, and systems can fail.
- Decision-making is a muscle: exercise it with maths problems, but don’t wear it out.
- Algorithms can be as biased as those who program them—and we may not know how.
‘What is the secret of success? Right decisions. How do you make right decisions? Experience. How do you get experience? Wrong decisions.’ These words of former Indian President A. P. J. Abdul Kalam serve as a warning to corporate leaders who rely on possibly faulty algorithms to make decisions.
Decision-making ‘is a muscle that needs to be exercised,’ says Susan Newell, professor of information systems and management at the University of Sussex. ‘Knowledge comes from practice. You can’t get knowledge just from reading a book. Would you let a person who has read everything but has never done surgery operate on a member of your family?’
Yet all around us, the responsibility for making decisions is being removed from us. A good example is the increasing automation in cars, with completely self-driving cars not far down the road. (Many drivers have already lost their map-reading skills.) Humans may be expected to take over if systems fail. But as Professor Newell points out, ‘If we give up everything to the automatic driver, then how does a person learn to make the decisions? And when the automatic system stops working, how can the driver take over? It’s hard to know how much to brake or turn the wheel. You learn by doing,’ she says.
Familiar patterns
Effective decision making depends on one’s ability to recognize patterns instantly and not be overwhelmed, even amid a flood of choices. In Aviation Psychology in Practice, George L. Kaempf and Gary Klein describe how experienced decision makers are able to see meaning in constellations of cues and in changes in clusters of cues (known as cue learning). Novices, unsurprisingly, aren’t able to see such patterns or nuances. Although aviation is highly automated—and the worse conditions become, the more pilots depend on the automatic pilot—pilots must nevertheless spend many hours in simulators to ensure they can take control if systems fail.
Medicine is another increasingly automated field, especially in the reading of X-rays and CT scans. Artificial intelligence can outperform human experts in spotting brain cancer, breast cancer, melanoma, and more. ‘Computers can learn,’ Newell says, ‘but only humans can find new ideas about disease. You get that from looking at thousands of MRIs. If we totally remove humans from relatively mundane tasks, then we lose the creativity and innovation and intuition that comes from looking at all those records.’
None of this is to say that all decisions must be made by humans. Indeed, over-exercising the decision ‘muscle’ leads to decision fatigue. A study of parole hearings found that judges ruled more favourably at the start of the day and just after breaks, irrespective of the facts of the cases. In business, as in law and other fields, decision fatigue saps our ability to self-regulate.
Fortunately, there are ways to lighten the load: by reducing the number of choices; accepting ‘good enough’ outcomes rather than aiming for perfection; drawing upon expert opinions; and setting deadlines. And you can turn to algorithms to make at least some decisions, saving yourself for the big choices.
Wrong by choice
But beware—algorithms aren’t perfect. ‘Data will often contain hidden biases or features you don’t understand,’ says James Davenport, professor of computer science and mathematical sciences at the University of Bath.
Machine learning can create a self-reinforcing model when the cost of making a wrong positive decision is higher than the cost of making a wrong negative decision. ‘If you turn away someone who would have been a good customer, you’ve lost business. But if you take on a customer and they default on a loan, you’ve lost a lot of money. You want to know not just whether they will pay back the loan, but also the uncertainty of whether they will pay it back,’ he says.
If a bank’s loan applicants are 90% male and 10% female, then even if the two groups behave identically, the bank will be less certain about the women because of the smaller sample, and so may lend less to them. That in turn shrinks the sample of women over time, which raises the uncertainty about their behaviour still further, Professor Davenport explains.
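To make the mechanism concrete, here is a minimal sketch, in Python, of the feedback loop Davenport describes. The group sizes, repayment rate, confidence interval and approval rule are illustrative assumptions, not details from his work: both groups repay at the same true rate, but the smaller group’s estimate carries a wider confidence interval, so a bank that lends only when the pessimistic estimate clears a threshold never collects the data that would close the gap.

```python
# Illustrative simulation of a self-reinforcing lending model (assumed numbers).
import math
import random

random.seed(0)

TRUE_REPAY_RATE = 0.9        # both groups repay at the same true rate
APPROVAL_THRESHOLD = 0.80    # lend only if the pessimistic estimate clears this

def lower_confidence_bound(repaid, loans, z=1.96):
    """Lower end of a normal-approximation confidence interval for the
    repayment rate: with few observations the interval is wide, so the
    bound is low even when behaviour is identical."""
    if loans == 0:
        return 0.0
    p = repaid / loans
    half_width = z * math.sqrt(p * (1 - p) / loans)
    return p - half_width

# Observed history per group: [loans repaid, loans made].
# Ten times more data on group A than on group B, as in the 90/10 example.
history = {"group_A": [90, 100], "group_B": [9, 10]}
applicants_per_year = {"group_A": 100, "group_B": 10}

for year in range(5):
    for group in history:
        repaid, loans = history[group]
        if lower_confidence_bound(repaid, loans) >= APPROVAL_THRESHOLD:
            # Lend to this year's applicants; outcomes are drawn from the
            # same true repayment rate for both groups.
            n = applicants_per_year[group]
            new_repaid = sum(random.random() < TRUE_REPAY_RATE for _ in range(n))
            history[group] = [repaid + new_repaid, loans + n]
        # else: no new loans, so no new data, and the uncertainty never shrinks.
    print(year, {g: (h[1], round(lower_confidence_bound(*h), 3))
                 for g, h in history.items()})
```

Run it and group A keeps accumulating loans while group B stays frozen at its original ten data points, even though the underlying behaviour is identical.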
Even if the data that feeds an algorithm is stripped of typical bias markers such as race and sex, hidden factors linked to history and society can reintroduce bias. In creating an algorithm, you must specify which attributes you do not want to be biased against and collect data about them, even if that monitoring data is kept separate from, say, the recruiting data. ‘If you want to be completely unbiased, then toss a coin,’ he says.
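A tiny illustration of the proxy problem, using assumed data (the ‘career_break’ feature and its correlation with sex are hypothetical, not taken from the article): a scoring rule that never sees the sex column can still produce different outcomes for the two groups, and you can only detect that if sex is recorded somewhere for monitoring.

```python
# Illustrative sketch: bias reintroduced through a correlated proxy feature.
import random

random.seed(1)

def make_applicant():
    """Synthetic applicant: sex plus one proxy feature that correlates with it."""
    sex = random.choice(["F", "M"])
    # Assumption for illustration only: career breaks are more common among women.
    career_break = random.random() < (0.6 if sex == "F" else 0.1)
    return {"sex": sex, "career_break": career_break}

applicants = [make_applicant() for _ in range(10_000)]

def score(applicant):
    """A 'blind' scoring rule that never looks at sex, only at the proxy."""
    return 0.5 if applicant["career_break"] else 0.9

for sex in ("F", "M"):
    scores = [score(a) for a in applicants if a["sex"] == sex]
    print(sex, round(sum(scores) / len(scores), 3))

# Average scores still differ by sex, because the proxy carries the information
# the removed column would have carried. Spotting this requires keeping the
# protected attribute in a separate monitoring dataset.
```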
In fact, getting computers to make decisions for us requires a lot of judgment calls. ‘It’s a decision of society, not computers, what you’re allowed to account for—male vs. female drivers, for example,’ Davenport says. ‘But if you take nothing into account, you just make random decisions. What you take into account is not a decision a computer can make.’
Advice for decision makers:
- Sharpen your general decision-making skills. Studies show that ‘statistical numeracy tests tend to be the strongest single predictors of general decision-making skill across wide-ranging numeric and non-numeric judgments and decisions.’
- Keep making some decisions. Algorithms should reduce, rather than eliminate, the number of decisions you make. That way you will retain essential skills in case the automated systems prove unreliable. And you can test yourself against the machine.
- But don’t try to compete with AI. You are not a robot: you will exhaust your ability to make decisions if you make too many. Save yourself for the big choices.
- Choose your moment. Don’t schedule big decisions for the end of the morning or the end of the day, when your judgment may be less sharp.