Cyborgs and the Big Crunch
Everything fails. The known universe started about 13.8 billion years ago with the Big Bang and may eventually end in a collapse called the Big Crunch. This is not a pessimistic assessment, even though Stephen Hawking was once asked in the Far East not to mention the Big Crunch because of the effect it might have on the stock market. (The market crashed anyhow.)[i] With the Big Crunch, if it comes at all, many billions of years out, there is plenty of time to enjoy oneself in the interim.
Our sun was born 4.6 billion years ago, and it will eventually run out of fuel and start to turn into a red giant roughly five billion years from now. Given that Andromeda will collide with our own galaxy in around four billion years, the sun’s demise might actually not be such a big deal. While computer simulations of Andromeda bumping into “us” are fascinating to watch, the cosmic timescales are irrelevant. But failure isn’t.
Will Durant, who together with his wife, Ariel, popularized both history and philosophy between 1935 and 1975, once said:
Civilization is an interlude between ice ages.[ii]
—Will Durant (1885–1981), American writer, historian, and philosopher
Fortunately, the next ice age is not in Homo sapiens’ immediate future: our planet is heating up. The predictions by some atmospheric physicists in the 1970s that a new ice age was just around the corner turned out to be wrong.
Ice ages are not particularly friendly to complex life forms. “Too cold” is much worse than “too warm,” from both a humanitarian and a biodiversity perspective. However, cosmological and geological timescales are irrelevant to avoiding folly in financial matters. Whether we will be finished off by cyborgs long before the next cosmological or geological catastrophe, an idea that both Stephen Hawking and James Lovelock of Gaia fame entertained, is open to debate. (Fun fact: James Lovelock [1919–2022] once held the infant Stephen Hawking [1942–2018] in his arms while visiting the Hawkings.) The point is that one law ties together the long term and the short term: failure.
Excellence and Failure
Most empires and most businesses fail.[iii] Extinction is common in politics, business, and life. Ninety-nine point ninety-nine per cent of all biological species that have ever existed are now extinct. On a somewhat shorter timescale, empires come and go. On an even shorter timescale, roughly 80 per cent of businesses do not survive beyond the first year. Most of the firms in Tom Peters and Robert Waterman’s In Search of Excellence from the 1980s have failed by now. They went Kodak. British economist Paul Ormerod, in a very commendable book called Why Most Things Fail, calls this the iron law of failure:
The Iron Law of Failure appears to extend from the world of biology into human activities, into social and economic organisations. The precise mathematical relationship, which describes the link between the frequency and size of the extinction of companies, for example, is virtually identical to that which describes the extinction of biological species in the fossil record. Only the timescales differ.[iv]
—Paul Ormerod (b. 1950), British economist
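Ormerod’s “precise mathematical relationship” is a scaling law: small extinction events are frequent, large ones rare, and frequency falls off as a power of event size. The following minimal sketch in Python illustrates the signature pattern; the exponent and sizes are placeholders chosen for illustration, not Ormerod’s estimates.

```python
import random

# Illustrative only: draw synthetic "extinction sizes" from a power-law
# distribution, f(s) ~ s^(-alpha), via inverse-transform sampling.
# ALPHA = 2.0 is a made-up exponent, not an empirical estimate.
ALPHA = 2.0
S_MIN = 1.0  # smallest event size

def sample_power_law(alpha: float, s_min: float) -> float:
    """Inverse-transform sample from a Pareto-type power law."""
    u = random.random()
    return s_min * (1.0 - u) ** (-1.0 / (alpha - 1.0))

random.seed(42)
sizes = [sample_power_law(ALPHA, S_MIN) for _ in range(100_000)]

# For a power law, the frequency of events exceeding a threshold
# falls roughly linearly in the threshold on log-log axes.
for threshold in (1, 2, 4, 8, 16, 32):
    freq = sum(s >= threshold for s in sizes) / len(sizes)
    print(f"P(size >= {threshold:>2}) = {freq:.4f}")
```

Plotted on log–log axes, these survival frequencies fall on a straight line. Ormerod’s point is that firm extinctions and fossil-record extinctions trace out essentially the same line; only the timescales differ.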
As Jim Rogers, who co-founded the Quantum Fund with George Soros in the 1970s, put it in the afterword to his bestselling Investment Biker, a book on his 65,067-mile motorbike tour around the world from March 1990 to August 1992:
If there’s one thing I’ve learned in going around the world, it’s that societies become rich, swagger around for a few years, decades, or centuries, and then their hour is done.[v]
—Jim Rogers (b. 1942), American investment biker and adventure capitalist
Failure is, of course, not a new concept. Gustave Le Bon, writing in 1895, ends his classic book on a cheerful note:
To pass in pursuit of an ideal from the barbarous to the civilized state, and then, when this ideal has lost its virtue, to decline and die, such is the cycle of the life of a people.[vi]
—Gustave Le Bon (1841–1931), French polymath
In essence, Stein’s law applies: what cannot go on forever won’t. The parallels, in terms of failure, between species, peoples, solar systems, firms, governments, political unions, reserve currencies, and so on are striking. They are all complex entities trying to survive in dynamic environments; they evolve over time but eventually fail.
Despite striking parallels between the social and economic world and the world of biology, there is a fundamental difference between the two. Ormerod writes: “The process of evolution in biological species cannot be planned. Species cannot act with the intent of increasing their fitness to survive. In contrast, in human society, individuals, firms and governments all strive consciously to devise successful strategies for survival. They adapt these strategies over time to avoid failure and alter their plans as circumstances change.”[vii]
John Maynard Keynes made the point most succinctly. The following is one of my top ten quotations of all time:
When circumstances change, I change my mind—what do you do?[viii]
—John Maynard Keynes (1883–1946), British economist
Being Wrong and Staying Wrong
Dennis Gartman, a former trader and editor of the Gartman Letter, used to publish his most important trading rules every year. Keynes’s rule was nearly always on the list. Responding to change is key. (Knowing which changes to overlook is key, too; managing wealth and risk would be too easy otherwise.) The practical relevance is that when circumstances change, so does one’s investment thesis. When it does, it is best to reconsider one’s risk and, potentially, reposition one’s portfolio accordingly. This is how one writer put it when reminiscing about a famous stock operator:
I cannot fear to be wrong because I never think I am wrong until I am proven wrong.[ix]
—Edwin Lefèvre (1871–1943), American journalist and writer
The Keynes quotation says one should respond to changing circumstances before the iron law of failure applies. However, planning has limits. Austrian economist Friedrich August von Hayek, co-winner of the 1974 Nobel Memorial Prize in Economic Sciences, was an early critic of conventional economic analysis.
Human reason can neither predict nor deliberately shape its own future. Its advances consist in finding out where it has been wrong.[x]
—F. A. Hayek (1899–1992), Austrian economist
While most twentieth-century proponents of the dismal science suggested economics should be conducted like physics, in which theories depict mechanical systems that mathematics can describe precisely, Hayek’s views were much more rooted in biology. He believed individual behaviour is not fixed, like a screw or cog in a machine, but evolves in response to the behaviour of others.
According to Paul Ormerod, Hayek, unlike most twentieth-century economists, understood and admired the achievements of other intellectual disciplines, especially anthropology. In Hayek’s view, the complex interactions between individuals give rise to inherent limits to knowledge of how systems behave at the aggregate level. No matter how smart the planner is or how much information he gathers, there are inescapable limits to how much can be known about such systems.
The limits of knowledge were not lost on Bruce Lee:
Knowledge will give you power, but character respect.[xi]
—Bruce Lee (1940–73), Hong Kong and American martial artist
Whether Bruce Lee, who had a philosophical bent, was paraphrasing Francis Bacon, I do not know:
The philosopher Francis Bacon once said that knowledge is power. And that’s true. But wisdom is perspective. And that’s even more important than power.[xii]
—Thomas Morris (b. 1952), American philosopher and author of Philosophy for Dummies
Accidents and Cleopatra’s Nose
In a book called Normal Accidents, sociologist Charles Perrow examines failures of man-made systems (power plants, airplanes, etc.). He points out that finding someone to blame for an accident is human nature. We want to know the “cause.” However, Perrow argues that the cause of an accident in a man-made system is to be found in the complexity of the system.
The odd term normal accident is meant to signal that, given the system characteristics, multiple and unexpected interactions of failures are inevitable.[xiii]
—Charles Perrow (1925–2019), American sociologist
An accident that results in a catastrophe is often a series of small events that, viewed by themselves, seem trivial; it is the interaction of multiple failures that explains the accident.[xiv] Patient accident reconstruction often reveals the banality and triviality behind most catastrophes. In other words, great events can have small beginnings.
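A back-of-the-envelope simulation with made-up probabilities illustrates why interactions matter. If five independent safeguards each fail one per cent of the time, simultaneous failure of all five is a one-in-ten-billion event; add a single common cause that couples the safeguards, in the spirit of Perrow’s unexpected interactions, and system failure becomes ordinary. The parameters below are purely illustrative.

```python
import random

random.seed(1)

P_FAIL = 0.01      # each safeguard fails 1% of the time
P_COMMON = 0.001   # hypothetical shared shock disabling all safeguards
TRIALS = 1_000_000

def system_fails(coupled: bool) -> bool:
    """Catastrophe requires all five safeguards to fail at once."""
    if coupled and random.random() < P_COMMON:
        return True  # one common cause takes out every safeguard
    return all(random.random() < P_FAIL for _ in range(5))

for coupled in (False, True):
    failures = sum(system_fails(coupled) for _ in range(TRIALS))
    label = "coupled" if coupled else "independent"
    print(f"{label:>11}: {failures} system failures in {TRIALS:,} trials")
```

With independence, the expected count over a million trials is a ten-thousandth of a failure; with the common cause, it is about a thousand. The optimism of the naive estimate comes from the independence assumption, not from the individual failure rates.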
Chaos theory suggests, among other things, that meaningful events, accidents, disturbances, etc., can have a trivial beginning. This has been known for a while:
The beginnings of all things are small.[xv]
—Cicero (106–43 bc), Roman politician, orator, and philosopher
In the practitioner’s literature, it is argued that, for example, the flatulence of a butterfly in the Amazon can cause a tornado in Texas. (In the academic literature, a butterfly’s wing flap causes the disturbance in Texas.) Cleopatra’s nose was trivial; the actions taken by Julius Caesar and Mark Antony to win her over were not.
Cleopatra’s nose, had it been shorter, the whole face of the world would have changed.[xvi]
—Blaise Pascal (1623–62), French mathematician
The idea of chaos theory suggests that what appears to be an overly complex, turbulent system (origins of life on Earth, weather, financial markets, etc.) can begin with simple components (amino acids, water, day traders, etc.), operating under a few simple rules (photosynthesis, evaporation, buy low/sell high, etc.). One of the characteristics of such a system is that a small change in the initial conditions, often too small to measure, can ultimately lead to radically different outcomes or behaviour.
Sensitivity to initial conditions is popularly known as the butterfly effect, after the title of a talk Edward Lorenz, the American mathematician, meteorologist, and pioneer of chaos theory, gave in 1972 to the American Association for the Advancement of Science in Washington, DC: “Predictability: Does the Flap of a Butterfly’s Wings in Brazil Set Off a Tornado in Texas?” The flapping wing represents a small change in the system’s initial conditions, one that can cause a chain of events leading to large-scale phenomena. Had the butterfly not flapped its wings, the system’s trajectory might have been vastly different.
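To make sensitivity to initial conditions concrete, here is a minimal sketch using the logistic map, a standard textbook chaotic system; it illustrates the general idea rather than Lorenz’s weather model. Two trajectories starting one part in ten billion apart end up bearing no resemblance to each other.

```python
# Sensitivity to initial conditions in the logistic map:
# x_{n+1} = r * x_n * (1 - x_n), which is chaotic at r = 4.0.
R = 4.0

def logistic_trajectory(x0: float, steps: int) -> list:
    """Iterate the logistic map from x0 for the given number of steps."""
    xs = [x0]
    for _ in range(steps):
        xs.append(R * xs[-1] * (1.0 - xs[-1]))
    return xs

# Two starting points differing by one part in ten billion.
a = logistic_trajectory(0.4, 60)
b = logistic_trajectory(0.4 + 1e-10, 60)

for n in (0, 10, 20, 30, 40, 50, 60):
    print(f"step {n:>2}: |difference| = {abs(a[n] - b[n]):.6f}")
```

After roughly forty iterations, the initial discrepancy of 10⁻¹⁰ has grown to order one, at which point a forecast based on the measured starting value is worthless.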
The butterfly effect is very much applicable to financial systems. Richard Bookstaber held chief risk officer roles on both the buy-side, at Moore Capital and Bridgewater, and the sell-side, at Morgan Stanley and Salomon, and from 2009 to 2015 he served in the public sector at the SEC and the US Treasury. Writing on systemic risk in a book published before the 2008 financial crisis, he put it simply:
Systems with high levels of interactive complexity are subject to failures that seem to come out of nowhere or that appear unfathomably improbable.[xvii]
—Richard Bookstaber (b. 1950), American risk manager and risk researcher
Markets and economies are systems with high levels of interactive complexity. The next time you hear someone predict the stock market, interest rates, or inflation one year from now, you will know how seriously to take the forecast(er).
The Devil and the Details
Big-picture thinking is all laudable and grand, but the devil is in the details. Yet spending all one’s energy on the details risks missing the big picture, as in not seeing the wood for the trees. Andrew Haldane of the Bank of England, who in 2014 was named among the one hundred most influential people by Time magazine, summed it up well in a 2012 Jackson Hole speech with the wonderfully apt title “The Dog and the Frisbee” and the subtitle “Ignorance is Bliss”:[xviii]
The general message here is that the more complex the environment, the greater the perils of complex control. The optimal response to a complex environment is often not a fully state-contingent rule. Rather, it is to simplify and streamline. In complex environments, decision rules based on one, or a few, good reasons can trump sophisticated alternatives. Less may be more.[xix]
—Andrew G. Haldane (b. 1967), chief economist at the Bank of England
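Haldane’s “less may be more” can be demonstrated with a toy experiment; this is a sketch of the general point, not his calculation. Fit a noisy relationship once with a simple rule and once with a heavily parameterized one, then compare both on fresh data: the complex rule typically wins in-sample and loses out-of-sample.

```python
import numpy as np

rng = np.random.default_rng(7)

# A noisy, modestly nonlinear "environment": y = sin(x) + noise.
def make_data(n: int):
    x = rng.uniform(-3.0, 3.0, n)
    return x, np.sin(x) + rng.normal(0.0, 0.5, n)

x_train, y_train = make_data(20)    # scarce data, as in real life
x_test, y_test = make_data(5000)    # "the future"

for degree in (1, 9):  # simple rule vs heavily parameterized rule
    coeffs = np.polyfit(x_train, y_train, degree)
    mse_in = np.mean((np.polyval(coeffs, x_train) - y_train) ** 2)
    mse_out = np.mean((np.polyval(coeffs, x_test) - y_test) ** 2)
    print(f"degree {degree}: in-sample MSE {mse_in:.3f}, "
          f"out-of-sample MSE {mse_out:.3f}")
```

The many-parameter fit hugs the noise in the small training sample, while the crude rule generalizes better, which is exactly Haldane’s case for simple heuristics in complex environments.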
In biology, we know extinction will occur in the future, and we can estimate its probability. However, we do not know which species will become extinct, or when. I believe the same applies to finance: we know there will be failures, collapses, market mayhem, and wealth destruction, and we can assess the probabilities, but we cannot pinpoint the next failure precisely. We can avoid errors by becoming savvier; avoiding folly is a survival technique. Ayn Rand, author of Atlas Shrugged, founder of Objectivism, enthusiastic stamp collector, and long-time friend of Alan Greenspan, has some advice on avoiding folly:
You can avoid reality, but you cannot avoid the consequences of avoiding reality.[xx]
—Ayn Rand (1905–82), Russian American novelist and philosopher
Failure and survival are two sides of the same coin. Who will survive? It is not entirely random who survives in stressful situations or hostile environments. In mountaineering, it is not the best climbers who survive an accident but those who are best prepared and have no mismatch between perceived risk and true risk.
Avoiding reality is folly in mountaineering and elsewhere. As nearly everywhere else in the universe and in human affairs, chance also plays a role. Louis Pasteur, whose scientific breakthroughs in vaccination and disease prevention reduced human suffering on an astronomical scale, put it well:
Chance favours only the prepared mind.[xxi]
—Louis Pasteur (1822–95), French chemist and microbiologist
Readiness also matters for any economic entity: company, asset manager, bank, and so on. Any institution can get into dire straits under stress (or, the other way around, get under stress in dire straits) or have the market “turn against it,” wherein the environment becomes hostile. Institutions either fail or endure, and those with an edge in aligning true risk with perceived risk may improve their chances of survival. Those with an edge have optionality. The conservationist behind the Svalbard Global Seed Vault in Norway, a facility built to preserve a wide variety of plant seeds against large-scale disasters, most likely understands options:
In the game of life, less diversity means fewer options for change. Wild or domesticated, panda or pea, adaptation is the requirement for survival.[xxii]
—Cary Fowler (b. 1949), American agriculturalist
Bottom Line
The iron law of failure states that everything eventually fails. There is a gravitational pull to becoming toast. The universe, your government, your life, your toaster, etc., all will eventually fail. One key insight of the law is that it takes some of the uncertainty out of the equation: there is no uncertainty about the if, only about the when.
Circumstances change. Changing circumstances can be analyzed to determine whether they are bringing a system or entity closer to failure or moving it further away. In the game of survival, paying attention to changing circumstances is paramount; adapting to them comes next.
Some man-made systems are chaotic, meaning a small disturbance can have a large impact. If a butterfly’s wing flap in the Amazon can cause a tornado in Texas, or the nose of an Egyptian queen can change world history, forecasting outcomes in a chaotic system is foolhardy.
[i] The Big Crunch anecdote is from Stephen Hawking, Brief Answers to the Big Questions (London: John Murray Publishers, 2018), 63–64.
[ii] Will Durant, The Story of Civilization, Volume 1: Our Oriental Heritage (New York: Simon & Schuster, 1935). The full quotation is worth your consideration, suggesting, of course, that it is global cooling, not global warming, that is the big killer: “Certain factors condition civilization, and may encourage or impede it. First, geological conditions. Civilization is an interlude between ice ages: at any time the current of glaciation may rise again, cover with ice and stone the works of man, and reduce life to some narrow segment of the earth.”
[iii] I have been applying Ormerod’s iron law of failure to finance since around 2007. Some of the material in this article draws on earlier work from the 2000s. Parts of this article are from Applied Wisdom (2021).
[iv] Paul Ormerod, Why Most Things Fail…And How to Avoid It: Evolution, Extinction, and Economics (London: Faber & Faber, 2005), x.
[v] He then adds: “The other thing I’ve learned is that even when all the wealth is gone, life goes on.” From Jim Rogers, Investment Biker: Around the World with Jim Rogers (Chichester: John Wiley & Sons, 2000), 381. First published in 1994 by Beeland Interest (Miami).
[vi] Gustave Le Bon, The Crowd: A Study of the Popular Mind, 2nd ed. (Atlanta: Cherokee Publishing Company, 1982), last paragraph of book, 219. First published in 1895 in French (Psychologie des Foules).
[vii] Paul Ormerod, Why Most Things Fail…And How to Avoid It: Evolution, Extinction, and Economics (London: Faber & Faber, 2005), xi.
[viii] Parliamentary Debates: House of Commons Official Report, vol. 317 (1936), 423.
[ix] Edwin Lefèvre, Reminiscences of a Stock Operator, Investment Classics (New York: John Wiley & Sons, 1993), 228. First published in 1923 by George H. Doran and Company (New York).
[x] F. A. Hayek, The Constitution of Liberty (Chicago: University of Chicago Press, 1960), ch. 3, “The Common Sense of Progress.”
[xi] As quoted in John Little, Striking Thoughts: Bruce Lee’s Wisdom for Daily Living (Clarendon: Tuttle Publishing, 2000).
[xii] Tom Morris, Philosophy for Dummies (New York: Wiley Publishing, 1999), 307.
[xiii] Charles Perrow, Normal Accidents: Living with High-Risk Technologies (Princeton: Princeton University Press, 1999), 5. Emphasis in the original. First published in 1984 by Basic Books (New York).
[xiv] Charles Perrow, Normal Accidents, 7.
[xv] From Cicero, De Finibus Bonorum et Malorum (45 bc), book V, chapter 58. Variant translation: “Everything has a small beginning.” Another variant, sourced from Orationes Philippicae V, reads: “The most important events are often determined by very trivial causes.”
[xvi] Blaise Pascal, Pensées (1658), no. 32.
[xvii] Richard Bookstaber, A Demon of Our Own Design: Markets, Hedge Funds, and the Perils of Financial Innovation (Hoboken: John Wiley & Sons, 2007), 155.
[xviii] “Ignorance is bliss” is a proverb. It means the lack of knowledge results in happiness; it is more comfortable not to know certain things. A Hungarian proverb, somewhat related, states: “The believer is happy; the doubter is wise.”
[xix] Andrew G. Haldane, “The Dog and the Frisbee,” speech at the Federal Reserve Bank of Kansas City’s thirty-sixth economic policy symposium, “The Changing Policy Landscape,” August 31, 2012, Jackson Hole, WY.
[xx] The quotation used here is a variant or a derivative from a 1961 speech Ayn Rand gave at a symposium titled “Ethics in Our Time” held at the University of Wisconsin in Madison. The original reads: “He is free to make the wrong choice, but not free to succeed with it. He is free to evade reality, he is free to unfocus his mind and stumble blindly down any road he pleases, but not free to avoid the abyss he refuses to see. Knowledge, for any conscious organism, is the means of survival; to a living consciousness, every ‘is’ implies an ‘ought.’ Man is free to choose not to be conscious, but not free to escape the penalty of unconsciousness: destruction.” From Quote Investigator (May 15, 2017). Sometimes, the quotation I use is paraphrased as “You can ignore reality, but you cannot ignore the consequences of ignoring reality.” This is from her interestingly titled book The Virtue of Selfishness (New York: New American Library, 1964).
[xxi] There are variations to this quotation. Wikiquote (December 1, 2016) gives the original as “Dans les champs de l’observation le hasard ne favorise que les esprits préparés,” translates this as “in the fields of observation chance favors only the prepared mind,” and references the quotation to a lecture at the University of Lille on December 7, 1854.
[xxii] Cary Fowler, “Of Pandas and Peas: Saving the Diversity Within Species,” Huffington Post, June 11, 2010.