New York, January 15, 2009: US Airways flight 1549, with one hundred and fifty-five people on board, takes off from LaGuardia Airport at 3:26 p.m. Two minutes later, a flock of Canada geese collides with the plane, killing both engines (and the geese). Four minutes after that, Captain Chesley Sullenberger lands the plane on the Hudson River. What saved the lives of the one hundred and fifty passengers and five crew members was, among other things, checklists. In the four minutes between the bird strike and the safe landing on the Hudson, the decision-making was good, i.e., the risk management procedure worked, and checklists played a central role. Gerd Gigerenzer, author of the very commendable book Risk Savvy, summarized the miracle as follows:
It was the combination of teamwork, checklists, and smart rules of thumb that made the miracle possible.[i]
—Gerd Gigerenzer (b. 1947), German psychologist and author
The gaze heuristic was the first rule of thumb, or heuristic, that Captain Sullenberger used when the geese killed both engines: fix your gaze on an object, and adjust your speed so that the angle of gaze remains constant. Dogs use it to catch Frisbees. Pilots are trained to use the heuristic, too. In the case of US 1549, it allowed the pilots to decide quickly that flying back to LaGuardia was not an option. Again, Gigerenzer, who argues that complex problems do not require complex solutions, sums it up well:
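The heuristic can be sketched in a few lines. This is an invented illustration (not from Gigerenzer's book), using the fly-ball version of the rule: watch the ball's elevation angle and adjust your running speed so the angle stays constant.

```python
import math

# Minimal sketch of the gaze heuristic for catching a fly ball.
# The catcher stands at catcher_x with the ball in front (ball_x < catcher_x).

def gaze_angle(catcher_x, ball_x, ball_height):
    """Elevation angle (radians) from the catcher's eye to the ball."""
    return math.atan2(ball_height, catcher_x - ball_x)

def adjust_velocity(velocity, prev_angle, curr_angle, gain=5.0):
    """Keep the gaze angle constant: if it rises, the ball will carry past,
    so back away (positive direction); if it falls, it will drop short,
    so run toward it (negative direction)."""
    return velocity + gain * (curr_angle - prev_angle)
```

The point of the heuristic is that a single observed quantity, the gaze angle, drives the correction. No trajectory estimation is needed, which is what makes the rule fast enough for dogs, outfielders, and pilots alike.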
Intelligent decision making entails knowing what tool to use for what problem. Intelligence is not an abstract number such as an IQ, but similar to a carpenter’s tacit knowledge about using appropriate tools….Experts often search for less information than novices do, using heuristics instead… The important point is that ignoring information can lead to better, faster, and safer decisions.[ii]
—Gerd Gigerenzer, director at the Max Planck Institute for Human Development
One aspect of this comparison relates to the skin-in-the-game concept. Pilots have skin in the game: in commercial aviation, pilots are not allowed parachutes on board, for obvious reasons. Medical staff have much less, or no, skin in the game. According to Gigerenzer, who works with doctors in a consulting capacity, doctors can have highly inflated egos. They are well trained, and that extensive training can inflate the ego into overconfidence. In Risk Savvy, Gigerenzer tells some interesting stories of how doctors, among the most respected professionals in nearly any society, can be extremely risk illiterate.
Failure and Persistence
It goes without saying that the learning-from-mistakes idea goes far beyond the realm of finance; it is a truism with very wide application. Survival is nature's number one rule, adapting to changing circumstances is the key survival strategy, and learning from nonfinal mistakes plays a key part. (Dying from one's mistakes also has a learning effect; it is just that the lessons are not applicable.) Michael Jordan, the billionaire athlete with a fear of water, on the topic:
I’ve missed more than 9000 shots in my career. I’ve lost almost 300 games. 26 times, I’ve been trusted to take the game winning shot and missed. I’ve failed over and over and over again in my life. And that is why I succeed.[iii]
—Michael Jordan (b. 1963), American basketball player
The relaxed American attitude toward failure, sometimes referred to as an "error culture," and its inherent optimism are captured well in the following quotation.
Fail early, fail often and fail forward….You fail your way to the top.[iv]
—Will Smith (b. 1968), American actor
Dishonor from failure is limited, and self-sacrifice is not required for atonement. Failure is progress:
Even if you fall on your face, you’re still moving forward.
—Victor Kiam (1926–2001), American entrepreneur
It is persistence that matters to great achievers:
Do not judge me by my successes, judge me by how many times I fell down and got back up again.[v]
—Nelson Mandela (1918–2013), South African politician and anti-apartheid revolutionary
Life, including investment life, sometimes punches one metaphorically in the face. So what? Failure that one survives is often not all bad; what is wrong is failing to learn from the failure. The idea of not staying down is old. It even exists as a Japanese proverb:
Nana korobi ya oki. (Fall seven times, stand up eight.)
—Japanese proverb
Ali and Confucianism
The idea of learning from failure was not lost on Confucius:
Our greatest glory is not in never falling, but in rising every time we fall.[vi]
—Confucius (551–479 bc), Chinese philosopher
Cassius Marcellus Clay Jr., then, was paraphrasing Confucius:
Inside of a ring or out, ain’t nothing wrong with going down. It’s staying down that’s wrong.[vii]
—Muhammad Ali (1942–2016), American boxer
Resilience, perseverance, and the endurance of pain are key characteristics of Japan's complex culture. In a chapter called "How to Be Stupid," Nassim Taleb describes how little is required once one has optionality:
If you “have optionality,” you don’t have much need for what is commonly called intelligence, knowledge, insight, skills, and these complicated things that take place in our brain cells. For you don’t have to be right that often. All you need is the wisdom to not do unintelligent things to hurt yourself (some acts of omission) and recognize favorable outcomes when they occur.[viii]
—Nassim Nicholas Taleb (b. 1960), Lebanese American risk analyst and author
George Soros, who has been learning the craft of investment management for roughly six decades, put it as follows:
Once we realize that imperfect understanding is the human condition, there is no shame in being wrong, only in failing to correct our mistakes.[ix]
—George Soros (b. 1930), Hungarian-born American hedge fund manager
There is no folly in being wrong and taking a punch. The folly is staying wrong and watching the losses grow, or worse, adding to a losing position. Remember:
A step backward after making a wrong turn, is a step in the right direction.[x]
—Kurt Vonnegut (1922–2007), American writer
There is a saying that one ought not to throw good money after bad, a saying that appears in many lists of investors' "golden rules." Early in his career, Paul Tudor Jones, the American hedge fund manager, kept a handwritten note on the office wall behind his desk as a reminder of it:
Losers average losers.
—Paul Tudor Jones (b. 1954), founder of Robin Hood Foundation
Here, "average" means averaging down, i.e., lowering the average entry price by adding to a losing position at a lower price. Averaging down does not change the initial investment thesis; it is probably closer to investor hubris than to investment wisdom. (That said, there are exceptions to almost any rule.)
One prominent example was Enron. Once the stock had halved, one ought not to have doubled the position, even with Enron's own president endorsing the stock. Losers averaging losers is also part of poker wisdom.
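The arithmetic behind "losers average losers" is easy to sketch. The numbers below are invented for illustration, not from any real trade, but they mimic the Enron pattern: the stock halves and the position is doubled.

```python
def average_cost(lots):
    """lots: list of (shares, price) purchases; returns average cost per share."""
    total_shares = sum(shares for shares, _ in lots)
    total_cost = sum(shares * price for shares, price in lots)
    return total_cost / total_shares

# Buy 100 shares at $90; the stock halves; "average down" with 100 more at $45.
lots = [(100, 90.0), (100, 45.0)]
avg = average_cost(lots)                          # 67.50: the entry level looks better...
loss = sum(s * p for s, p in lots) - 200 * 45.0   # ...but the unrealized loss is $4,500
```

The lower average cost flatters the position on paper, while nothing about the thesis has improved and the exposure to it has doubled, which is exactly why the note hung on the wall.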
Positive and Negative Error Culture
According to the Bureau of Aircraft Accidents Archives, commercial aviation accidents cause, on average, one thousand twenty-two deaths per year globally.[xi] Those one thousand twenty-two deaths compare with one hundred and sixty-one thousand five hundred deaths per year from avoidable errors in US hospitals alone.[xii]
Gigerenzer distinguishes between positive and negative error cultures. A positive error culture learns from past mistakes, while a negative error culture does not. The contrast between commercial aviation and hospitals is quite extreme. The commercial aviation industry is an extreme example of a positive error culture, while hospitals are an extreme example of a negative error culture. The following one-liner is a good summary of Gigerenzer’s findings:
If we had the safety culture of a hospital, we would crash two planes a day.[xiii]
—Head of risk management of an international airline
Checklists can alleviate risk illiteracy, laxity, and overconfidence. Commercial airline pilots have checklists; hospitals have them to a much lesser degree, and where checklists do exist, they are often not followed, whether due to time pressure, laxity, or the egos of the medical staff. A case in point is one hospital checklist:
Doctors are supposed to:
1. wash their hands with soap;
2. clean the patient’s skin with chlorhexidine antiseptic;
3. put sterile sheets over the entire patient;
4. wear a sterile mask, hat, gown, and gloves; and
5. put a sterile dressing over the catheter site once the line is in.
According to research cited in Gigerenzer’s Risk Savvy, at least one step is skipped for around a third of all patients. A checklist allows for a positive error culture.
In commercial aviation, every single error is analyzed and shared with peers. It is central to the industry that these errors are not repeated. If a short circuit in the onboard coffee machine caused a cable fire in the cockpit that brought down a plane, that must never happen again: the causal links behind the failure enter a new safety rule or maintenance checklist. This is why the pieces of a crashed plane are painstakingly put back together after a crash: to learn what went wrong. We err and learn:
We are human. We fail. And, crucially, we keep learning.[xiv]
—Richard Branson (b. 1950), English serial entrepreneur
It is for this reason that I introduced checklists into my risk management research a couple of years ago: they institutionalize a positive error culture. I probably got the idea from Charlie Munger:
How can smart people so often be wrong? They don’t do what I’m telling you to do: use a checklist to be sure you get all the main models and use them together in a multimodular way.[xv]
—Charlie Munger (1924–2023), American investor and vice chairman of Berkshire Hathaway
Akin to learning from past errors, a checklist for risk management in finance institutionalizes past warning signals. It allows for more discipline and more efficiency in asset allocation, portfolio rebalancing, and risk management. Below is an example of a checklist related to the US economy.
All these factors have a warning characteristic, i.e., the red flag needs to pop up prior to market mayhem. In a perfect world, all factors would work all of the time.
We do not live in a perfect world. Checklists still helped Sullenberger land safely on the Hudson River, though.
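The warning-flag mechanism behind such a checklist can be sketched in code. To be clear, the factors and thresholds below are invented placeholders, not the author's actual checklist; the point is the mechanism, in which each factor is a yes/no red flag evaluated on a schedule.

```python
# Illustrative only: invented factors and thresholds standing in for a
# real risk management checklist. Each entry pairs a flag name with a
# predicate over a data snapshot.
WATCHLIST = [
    ("yield curve inverted", lambda d: d["10y_yield"] < d["2y_yield"]),
    ("credit spreads stressed", lambda d: d["credit_spread_bps"] > 500),
]

def raised_flags(data):
    """Return the names of all red flags raised by the current snapshot."""
    return [name for name, check in WATCHLIST if check(data)]

# Hypothetical snapshot: curve inverted, spreads still calm.
snapshot = {"10y_yield": 1.5, "2y_yield": 2.0, "credit_spread_bps": 320}
```

Running the same fixed list against every new snapshot is what gives the checklist its discipline: no factor is quietly dropped because it feels irrelevant this time.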
[i] Gerd Gigerenzer, Risk Savvy: How to Make Good Decisions (New York: Viking, 2014), 29.
[ii] Gerd Gigerenzer, Risk Savvy, 31.
[iii] As quoted in Robert Goldman and Stephen Papson, Nike Culture: The Sign of the Swoosh (Thousand Oaks, CA: Sage Publications, 1998), 49.
[iv] Michael Klein, “Alternative Investment Summit: Fresh Prince Rocks the Kimpton,” Cayman Compass, February 11, 2018. The quotes are from Will Smith’s YouTube channel.
[v] As quoted in Ann Kannings, Nelson Mandela: His Words (Morrisville, NC: Lulu Press, 2014).
[vi] Garson O’Toole at Quote Investigator (December 23, 2020) has some reservations to sourcing the quote to Confucius. He credits Irish author Oliver Goldsmith in the 1760s as the earliest appearance of the content of the quote. Goldsmith wrote a series of letters under the pseudonym of an imaginary Chinese traveler based in London named Lien Chi Altangi in the Public Ledger magazine of London.
[vii] As quoted in Wally Phillips, Way to Go: Surviving in This World until Something Better Comes Along (New York: William Morrow, 1985), 200.
[viii] Nassim Nicholas Taleb, Antifragile: Things That Gain from Disorder (New York: Random House, 2012), 180. Emphasis in the original.
[ix] George Soros, Soros on Soros: Staying Ahead of the Curve (New York: John Wiley & Sons, 1995), 11.
[x] Kurt Vonnegut, Player Piano (New York: Macmillan, 1952).
[xi] Average from 2000 to 2019, from Wikipedia, https://en.wikipedia.org/wiki/Aviation_accidents_and_incidents, retrieved December 23, 2020.
[xii] Matt Austin and Jordan Derk, “Lives Lost, Lives Saved: An Updated Comparative Analysis of Avoidable Deaths at Hospitals Graded by the Leapfrog Group,” Armstrong Institute for Patient Safety and Quality Johns Hopkins Medicine (May 2019).
[xiii] Gerd Gigerenzer, Risk Savvy: How to Make Good Decisions (New York: Viking, 2014), 51.
[xiv] Richard Branson, @richardbranson, Twitter (June 20, 2017).
[xv] Charles T. Munger, Poor Charlie’s Almanack: The Wit and Wisdom of Charles T. Munger, ed. Peter D. Kaufman, expanded 3rd ed. (Virginia Beach: The Donning Company Publishers, 2008), 320.