by Judith Curry
As you do not fight fire with fire, you do not fight complexity with complexity. Because complexity generates uncertainty, not risk, it requires a regulatory response grounded in simplicity, not complexity.
To ask today’s regulators to save us from tomorrow’s crisis using yesterday’s toolbox is to ask a border collie to catch a frisbee by first applying Newton’s Law of Gravity. – Haldane and Madouros
The basic challenge is described in the following way:
Take decision-making in a complex environment. With risk and rational expectations, the optimal response to complexity is typically a fully state-contingent rule. Under risk, policy should respond to every raindrop; it is fine-tuned. Under uncertainty, that logic is reversed. Complex environments often instead call for simple decision rules. That is because these rules are more robust to ignorance.
From Roger Pielke Jr.’s summary of the recommended “Five Commandments” of decision making under uncertainty:
1. “Complex environments often instead call for simple decision rules”
The simplest explanation is that collecting and processing the information necessary for complex decision-making is costly, perhaps punitively so. Fully defining future states of the world, and probability-weighting them, is beyond anyone’s cognitive limits. Even in relatively simple games, such as chess, cognitive limits are quickly breached. Chess grandmasters are unable to evaluate fully more than 5 chess moves ahead. The largest super-computers cannot fully compute much beyond 10 moves ahead.
Most real-world decision-making is far more complex than chess – more moving pieces with larger numbers of opponents evaluated many more moves ahead. Herbert Simon coined the terms “bounded rationality” and “satisficing” to explain cost-induced deviations from rational decision-making (Simon (1956)). A generation on, these are the self-same justifications being used by behavioural economists today. For both, less may be more because more information comes at too high a price.
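The arithmetic behind the chess example is worth making explicit. A minimal sketch (mine, not from the paper), assuming the standard rough figure of about 35 legal moves per chess position:

```python
BRANCHING_FACTOR = 35  # assumed average number of legal moves per position

def positions_to_evaluate(plies: int) -> int:
    """Leaf positions in a full-width search looking `plies` moves ahead."""
    return BRANCHING_FACTOR ** plies

for depth in (5, 10):
    # 5 plies is already ~5e7 positions; 10 plies is ~3e15 --
    # each extra move ahead multiplies the work by ~35.
    print(f"{depth} moves ahead: ~{positions_to_evaluate(depth):.2e} positions")
```

The exponential growth, not any single evaluation, is what breaches cognitive (and computational) limits so quickly.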
2. “Ignorance can be bliss”
Too great a focus on information gathered from the past may retard effective decision-making about the future. Knowing too much can clog up the cognitive inbox, overload the neurological hard disk. One of the main purposes of sleep – doing less – is to unclog the cognitive inbox. That is why, when making a big decision, we often “sleep on it”.
“Sleeping on it” has a direct parallel in statistical theory. In econometrics, a model seeking to infer behaviour from the past, based on too short a sample, may lead to “over-fitting”. Noise is then mistaken for signal, blips parameterised as trends. A model which is “over-fitted” sways with the smallest statistical breeze. For that reason, it may yield rather fragile predictions about the future.
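A toy illustration of this over-fitting point (my construction, with made-up numbers, not from the paper): draw a short, noisy sample from a plain linear trend, then compare a simple linear fit with a flexible model that fits every data point exactly, i.e. parameterises the blips as trends.

```python
import random

random.seed(42)
noise = lambda: random.gauss(0, 1.0)

# Short training sample from the true process y = 2 + 0.5*x + noise
xs = list(range(6))
ys = [2 + 0.5 * x + noise() for x in xs]

def linear_fit(xs, ys):
    """Ordinary least squares for y = a + b*x (the simple rule)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda x: a + b * x

def interpolating_fit(xs, ys):
    """Degree n-1 Lagrange polynomial: zero in-sample error, pure over-fit."""
    def predict(x):
        total = 0.0
        for i, (xi, yi) in enumerate(zip(xs, ys)):
            term = yi
            for j, xj in enumerate(xs):
                if j != i:
                    term *= (x - xj) / (xi - xj)
            total += term
        return total
    return predict

simple = linear_fit(xs, ys)
overfit = interpolating_fit(xs, ys)

# Fresh draws from the same process: the "future"
test_xs = list(range(6, 12))
test_ys = [2 + 0.5 * x + noise() for x in test_xs]
mse = lambda f: sum((f(x) - y) ** 2 for x, y in zip(test_xs, test_ys)) / len(test_xs)

print(f"simple linear fit, out-of-sample MSE : {mse(simple):.2f}")
print(f"over-fitted model, out-of-sample MSE : {mse(overfit):.2f}")
```

The interpolating model is perfect in-sample yet far worse out-of-sample: it has absorbed the noise as structure and, like the over-fitted model in the quote, sways with the smallest statistical breeze.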
3. “Probabilistic weights from the past may be a fragile guide to the future”
John von Neumann and Oskar Morgenstern established that optimal decision-making involved probabilistically-weighting all possible future outcomes.
In an uncertain environment, where statistical probabilities are unknown, however, these approaches to decision-making may no longer be suitable. Probabilistic weights from the past may be a fragile guide to the future. Weighting may be in vain. Strategies that simplify, or perhaps even ignore, statistical weights may be preferable.
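A stylised sketch of weighting in vain (my construction, with assumed numbers; the paper's own flagship example is the 1/N portfolio rule): five assets with identical true expected returns, so any ranking of their past sample means is pure noise. A rule that concentrates on the asset with the best estimated history chases that noise; the weight-agnostic 1/N rule diversifies it away.

```python
import random
import statistics

random.seed(1)
N_ASSETS, HISTORY, TRIALS = 5, 24, 2000

def one_trial():
    # Past returns: same true mean (5%) and vol (20%) for every asset,
    # so estimated differences between assets are entirely noise.
    history = [[random.gauss(0.05, 0.20) for _ in range(HISTORY)]
               for _ in range(N_ASSETS)]
    best = max(range(N_ASSETS), key=lambda i: statistics.mean(history[i]))
    # One out-of-sample period:
    future = [random.gauss(0.05, 0.20) for _ in range(N_ASSETS)]
    concentrated = future[best]            # trust the estimated weights
    equal_weight = sum(future) / N_ASSETS  # ignore them: the 1/N rule
    return concentrated, equal_weight

results = [one_trial() for _ in range(TRIALS)]
var_conc = statistics.variance(r[0] for r in results)
var_eq = statistics.variance(r[1] for r in results)
print(f"estimated-weights rule, return variance : {var_conc:.4f}")
print(f"simple 1/N rule, return variance        : {var_eq:.4f}")
```

Both rules earn the same expected return here, but the rule built on estimated weights carries several times the variance: the weighting effort buys nothing and costs robustness.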
4. “Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies”
The choice of optimal decision-making strategy depends importantly on the degree of uncertainty about the environment – in statistical terms, model uncertainty. A key factor determining that uncertainty is the length of the sample over which the model is estimated. Other things equal, the smaller the sample, the greater the model uncertainty and the better the performance of simple, heuristic strategies.
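The sample-size point can be made concrete with a small simulation (my sketch, with assumed parameters, not the paper's experiment): forecasting the means of ten related series, comparing a "complex" rule that estimates one parameter per series against a one-parameter heuristic that uses a single grand mean for all of them.

```python
import random

random.seed(0)

def trial(n, k=10, spread=0.5, noise_sd=2.0):
    """One experiment: k series, n noisy observations each.
    Returns (MSE of per-series sample means, MSE of the grand-mean
    heuristic), measured against the true series means."""
    mus = [random.gauss(0, spread) for _ in range(k)]
    samples = [[mu + random.gauss(0, noise_sd) for _ in range(n)] for mu in mus]
    per_series = [sum(s) / n for s in samples]    # complex: k parameters
    grand = sum(map(sum, samples)) / (k * n)      # simple: 1 parameter
    mse_complex = sum((est - mu) ** 2 for est, mu in zip(per_series, mus)) / k
    mse_simple = sum((grand - mu) ** 2 for mu in mus) / k
    return mse_complex, mse_simple

def avg_mse(n, trials=2000):
    cx = sp = 0.0
    for _ in range(trials):
        c, s = trial(n)
        cx += c
        sp += s
    return cx / trials, sp / trials

small = avg_mse(2)    # tiny sample
large = avg_mse(100)  # long sample
print(f"n=2   -> complex MSE {small[0]:.2f}, simple MSE {small[1]:.2f}")
print(f"n=100 -> complex MSE {large[0]:.2f}, simple MSE {large[1]:.2f}")
```

With two observations per series, the per-series estimates are dominated by noise and the crude grand-mean heuristic wins; with a hundred observations, the estimates stabilise and the complex rule takes over. Other things equal, the smaller the sample, the better the simple rule performs.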
5. “Complex rules may cause people to manage to the rules, for fear of falling foul of them”
There is a final, related but distinct, rationale for simple over complex rules. Complex rules may cause people to manage to the rules, for fear of falling foul of them. They may induce people to act defensively, focussing on the small print at the expense of the bigger picture.
JC comments: I find this article particularly interesting since it is written by authors on the actual front lines of making decisions under uncertainty (rather than merely writing about the topic from an academic perspective).
While financial regulation is the context for this paper, it is also relevant to climate policy. “Robustness to ignorance” is the key point here. In this context, I would like to remind readers of a previous post, Can we make good decisions under ignorance?