Think Another Crash Is “Impossible”? Think Again
If there are limits on what we can know of the future–and clearly there are–this sets limits on our ability to quantify and mitigate risk. Longtime correspondent Lew G. submitted this thought-provoking riff on the system’s intrinsic capacity for cascading decisions (for example, selling everything not nailed down) that upend our understanding of risk.
Here is Lew’s commentary on risk:
There is a tradeoff of importance, detail and distance into the future, sort of a Heisenberg’s uncertainty principle. The cost of certainty and detail at any future time goes up as the economic importance goes up because so many more people are trying to understand and control that small part of the future. This introduces more variables and more uncertainty because many of those variables will be linked in unknown ways. (emphasis added by CHS)
Estimating risk is inherently a matter of dealing with those links. The number of paths through a net which must be evaluated to determine risk is exponential, as cascades can start from anywhere and rapidly take down connected nodes in the network.
One would expect that a bank's internal risk-analysis programs try to do some of this, but they can only model the bank's own point of view, not the exposure of any other financial nodes in the network, or the fantasy level of their accounting.
Because everyone is super-optimistic in a boom, crashes are inevitable, and the size of the crash is proportional to the fuzz/misinformation in information flows, e.g. mark-to-fantasy entries in banks’ assets, details of all the outstanding derivatives, swaps and contracts, etc.
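Lew's point about cascades starting anywhere and taking down connected nodes can be made concrete with a toy model. The sketch below is purely illustrative, with hypothetical institutions, exposures, and loss buffers invented for the example: a node fails once its summed exposure to already-failed counterparties exceeds its buffer, and the cascade is traced from each possible starting point.

```python
# Toy contagion model (hypothetical numbers, for illustration only).
# Nodes are institutions; exposures[node] maps each counterparty to the
# size of node's exposure to it.  A node fails when its total exposure
# to already-failed counterparties exceeds its loss buffer.
from collections import deque

exposures = {
    "A": {"B": 0.6, "C": 0.5},
    "B": {"C": 0.7, "D": 0.4},
    "C": {"D": 0.8},
    "D": {"A": 0.3},
}
buffers = {"A": 1.0, "B": 0.5, "C": 0.6, "D": 0.7}

def cascade(seed):
    """Return the set of nodes that fail if `seed` fails first."""
    failed = {seed}
    queue = deque([seed])
    while queue:
        queue.popleft()  # each newly failed node triggers a re-scan
        for node, exp in exposures.items():
            if node in failed:
                continue
            loss = sum(e for peer, e in exp.items() if peer in failed)
            if loss > buffers[node]:
                failed.add(node)
                queue.append(node)
    return failed

for seed in exposures:
    print(seed, "->", sorted(cascade(seed)))
```

Even in this four-node sketch, the failure of A or B stays contained while the failure of D takes down the entire network, and knowing which is which requires evaluating every path through the web of exposures, which is exactly the combinatorial burden Lew describes.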
Thank you, Lew, for describing the complex nature of risk. There are a number of ways to unpack the dynamics of risk addressed here, and I'd like to focus on two specific instances.
1. Lew conjectures a risk-equivalent of Heisenberg’s uncertainty principle, which holds that the more precisely the position of a particle is determined, the less precisely its momentum can be known, and vice versa.
In a financial setting, a great many analysts and programs are seeking precise forecasts on GDP, capital flows, future profits and revenues, valuations and so on–all the components needed to forecast the likely range of outcomes.
This information is needed to properly hedge risk, and to profit from markets moving in the anticipated direction.
But if Lew is right about a Heisenberg analog in finance, it suggests that the greater the certainty of the forecast, the less accurately the risks can be estimated.
This limitation in our knowledge of the future helps us understand why “impossible” cascades of selling and the implosion of “safe” assets occur with regularity, despite the widespread use of sophisticated risk-management tools.
(Given the concentration of talent seeking certainty in a narrow slice of financial data, the potential for group-think, both in humans and in the software written by humans, is another potential source of risk.)
2. If the linkages between variables and nodes are incompletely known or understood, disruptive cascades can arise in areas thought to be low-risk. As the cascade spreads through the network, it starts taking out nodes that participants suddenly discover are connected in unexpected ways to other nodes that were considered only lightly connected to risky nodes.
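The gap between the links participants think exist and the links that actually exist can be illustrated with another toy sketch, again using invented institutions and exposure figures. The same cascade rule is run twice: once on the link map a risk model knows about, and once on the true map, which differs by a single understated exposure.

```python
# Hypothetical illustration: identical cascade logic run on the links a
# risk model *knows about* versus the links that actually exist.  One
# understated exposure (C's exposure to D) turns a contained failure
# into a system-wide one.
from collections import deque

buffers = {"A": 1.0, "B": 0.5, "C": 0.6, "D": 0.7}

# links[node] = {counterparty: exposure}
known_links = {
    "A": {"B": 0.6, "C": 0.5},
    "B": {"C": 0.7},
    "C": {"D": 0.4},   # modeled as only lightly connected to D
    "D": {},
}
true_links = {
    "A": {"B": 0.6, "C": 0.5},
    "B": {"C": 0.7},
    "C": {"D": 0.8},   # the real exposure is twice as large
    "D": {},
}

def cascade(links, seed):
    """Nodes that fail, under `links`, if `seed` fails first."""
    failed = {seed}
    queue = deque([seed])
    while queue:
        queue.popleft()
        for node, exp in links.items():
            if node not in failed:
                loss = sum(e for p, e in exp.items() if p in failed)
                if loss > buffers[node]:
                    failed.add(node)
                    queue.append(node)
    return failed

print("model says:", sorted(cascade(known_links, "D")))
print("reality:   ", sorted(cascade(true_links, "D")))
```

On the known map, D's failure is contained to D alone; on the true map, the one hidden link pulls down C, then B, then A. The risk model is not merely imprecise, it is confidently wrong, which is the signature of the "unexpected connections" described above.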
In a Heisenberg analog network, risk can never be nailed down with any certainty. The confidence that risk can be quantified and mitigated is misplaced. A system riddled with various forms of misinformation and links that are not apparent until the cascade has begun is intrinsically prone to crashes that take down every node in the network.
This is not “impossible”; it is an excellent description of what happened in 2008-09.