In our highly technological society we cannot do without experts. We accept this fact of life, but not without anxiety. There is much truth in the definition of the specialist as someone who “knows more and more about less and less.” But there is another side to the coin of expertise. A really great idea in science often has its birth as apparently no more than a particular answer to a narrow question; it is only later that it turns out that the ramifications of the answer reach out into the most surprising corners. What begins as knowledge about very little turns out to be wisdom about a great deal.
So it was with the development of the theory of probability. It all began in the seventeenth century, when one of the minor French nobility asked the philosopher-scientist Blaise Pascal to devise a fair way to divide the stakes in an interrupted gambling game. Pascal consulted his friend, the lawyer-mathematician Pierre de Fermat, and the two of them quickly laid the foundations of probability theory. Out of a trivial question about gambling came profound insights that later bore splendid fruit in physics and biology, in the verification of the causes of disease, the calculation of fair insurance premiums, and the achievement of quality control in manufacturing processes. And much more.
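The puzzle Pascal and Fermat solved, now called the problem of points, reduces to a short recursion: each player's fair share is the stake times his probability of winning had the game continued. Here is a minimal sketch of that idea (the function names and the 100-unit stake are illustrative, not from the source):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def p_first_wins(a, b, p=0.5):
    """Probability the first player wins the match when he still needs
    `a` points, the opponent needs `b`, and each round is won with
    probability `p` by the first player."""
    if a == 0:
        return 1.0
    if b == 0:
        return 0.0
    return p * p_first_wins(a - 1, b, p) + (1 - p) * p_first_wins(a, b - 1, p)

def fair_split(stake, a, b, p=0.5):
    """Divide the stake in proportion to each player's chance of winning."""
    share = p_first_wins(a, b, p)
    return share * stake, (1 - share) * stake

# The classic interrupted game: one player needs 1 more point, the other 2.
print(fair_split(100, 1, 2))  # → (75.0, 25.0)
```

The 75/25 split for that case is the answer Pascal and Fermat arrived at; the recursion simply enumerates every way the remaining rounds could have gone.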
The service of experts is indispensable even though we are poor at ascertaining when they add value, when they add noise, and when they are harmful. Garrett Hardin cautions that each new expertise introduces “new possibilities of error.”
It is unfortunately true that experts are generally better at seeing their particular kinds of trees than the forest of all life.
Thoughtful laymen — that’s us — can, however, “become very good at seeing the forest, particularly if they lose their timidity about challenging the experts. … In the universal role of laymen we all have to learn to filter the essential meaning out of the too verbose, too aggressively technical statements of the experts. Fortunately this is not as difficult a task as some experts would have us believe.”
Filters Against Folly is Hardin’s attempt “to show [that] there are some rather simple methods of checking on the validity of the statements of experts.”
In his book Applied Minds: How Engineers Think, Guru Madhavan explores the mental tools of engineers that allow engineering feats. His framework is built around a flexible intellectual tool kit called modular systems thinking.
The core of the engineering mind-set is what I call modular systems thinking. It’s not a singular talent, but a melange of techniques and principles. Systems-level thinking is more than just being systematic; rather, it’s about the understanding that in the ebb and flow of life, nothing is stationary and everything is linked. The relationships among the modules of a system give rise to a whole that cannot be understood by analyzing its constituent parts.
*** Thinking in Systems
Thinking in systems means that you can deconstruct (breaking down a larger system into its modules) and reconstruct (putting it back together).
The focus is on identifying the strong and weak links—how the modules work, don’t work, or could potentially work—and applying this knowledge to engineer useful outcomes.
There is no single engineering method, so modular systems thinking varies with context.
Engineering Dubai’s Burj Khalifa is different from coding the Microsoft Office Suite. Whether used to conduct wind tunnel tests on World Cup soccer balls or to create a missile capable of hitting another missile midflight, engineering works in various ways. Even within a specific industry, techniques can differ. Engineering an artifact like a turbofan engine is different from assembling a megasystem like an aircraft, and by extension, a system of systems, such as the air traffic network.
*** The Three Essential Properties of the Engineering Mind-Set
1. The ability to see structure where there’s nothing apparent.
From haikus to high-rise buildings, our world relies on structures. Just as a talented composer “hears” a sound before it’s put down on a score, a good engineer is able to visualize—and produce—structures through a combination of rules, models, and instincts. The engineering mind gravitates to the piece of the iceberg underneath the water rather than its surface. It’s not only about what one sees; it’s also about the unseen.
A structured systems-level thinking process would consider how the elements of the system are linked in logic, in time, in sequence, and in function—and under what conditions they work and don’t work. A historian might apply this sort of structural logic decades after something has occurred, but an engineer needs to do this preemptively, whether with the finest details or top-level abstractions. This is one of the main reasons why engineers build models: so that they can have structured conversations based in reality. Critically, envisioning a structure involves having the wisdom to know when a structure is valuable, and when it isn’t.
Consider, for example, the following catechism by George Heilmeier—a former director of the U.S. Defense Advanced Research Projects Agency (DARPA), who also engineered the liquid crystal displays (LCDs) that are part of modern-day visual technologies. His approach to innovation is to employ a checklist-like template suitable for a project with well-defined goals and customers.
What are you trying to do? Articulate your objectives using absolutely no jargon.
How is it done today, and what are the limits of current practice?
What’s new in your approach and why do you think it will be successful?
Who cares? If you’re successful, what difference will it make?
What are the risks and the payoffs?
How much will it cost? How long will it take?
What are the midterm and final “exams” to check for success?
This type of structure “helps ask the right questions in a logical way.”
2. Adeptness at designing under constraints.
The real world is full of constraints that make or break potential.
Given the innately practical nature of engineering, the pressures on it are far greater compared to other professions. Constraints—whether natural or human-made—don’t permit engineers to wait until all phenomena are fully understood and explained. Engineers are expected to produce the best possible results under the given conditions. Even if there are no constraints, good engineers know how to apply constraints to help achieve their goals. Time constraints on engineers fuel creativity and resourcefulness. Financial constraints and the blatant physical constraints hinging on the laws of nature are also common, coupled with an unpredictable constraint—namely, human behavior.
“Imagine if each new version of the Macintosh Operating System, or of Windows, was in fact a completely new operating system that began from scratch. It would bring personal computing to a halt,” Olivier de Weck and his fellow researchers at the Massachusetts Institute of Technology point out. Engineers often augment their software products, incrementally addressing customer preferences and business necessities—which are nothing but constraints. “Changes that look easy at first frequently necessitate other changes, which in turn cause more change. … You have to find a way to keep the old thing going while creating something new.” The pressures are endless.
3. Understanding trade-offs.
The ability to hold alternative ideas in your head and make considered judgments.
Engineers make design priorities and allocate resources by ferreting out the weak goals among stronger ones. For an airplane design, a typical trade-off could be to balance the demands of cost, weight, wingspan, and lavatory dimensions within the constraints of the given performance specifications. This type of selection pressure even trickles down to the question of whether passengers like the airplane they’re flying in. If constraints are like tightrope walking, then trade-offs are inescapable tugs-of-war among what’s available, what’s possible, what’s desirable, and what the limits are.
We live in a digital time, which Schwartz and Loehr capture eloquently:
We live in digital time. Our rhythms are rushed, rapid fire and relentless, our days carved up into bits and bytes. We celebrate breadth rather than depth, quick reaction more than considered reflection. We skim across the surface, alighting for brief moments at dozens of destinations but rarely remaining for long at any one. We race through our lives without pausing to consider who we really want to be or where we really want to go. We’re wired up but we’re melting down.
Most of us are just trying to do the best that we can. When demand exceeds our capacity, we begin to make expedient choices that get us through our days and nights, but take a toll over time. We survive on too little sleep, wolf down fast foods on the run, fuel up with coffee and cool down with alcohol and sleeping pills. Faced with relentless demands at work, we become short-tempered and easily distracted. We return home from long days at work feeling exhausted and often experience our families not as a source of joy and renewal, but as one more demand in an already overburdened life.
We walk around with day planners and to-do lists, Palm Pilots and BlackBerries, instant pagers and pop-up reminders on our computers—all designed to help us manage our time better. We take pride in our ability to multitask, and we wear our willingness to put in long hours as a badge of honor. The term 24/7 describes a world in which work never ends.
“Energy, not time, is the fundamental currency of high performance.”
“Every one of our thoughts, emotions and behaviors has an energy consequence,” they write. “The ultimate measure of our lives is not how much time we spend on the planet, but rather how much energy we invest in the time that we have.”
There are undeniably bad bosses, toxic work environments, difficult relationships and real life crises. Nonetheless, we have far more control over our energy than we ordinarily realize. The number of hours in a day is fixed, but the quantity and quality of energy available to us is not. It is our most precious resource. The more we take responsibility for the energy we bring to the world, the more empowered and productive we become. The more we blame others or external circumstances, the more negative and compromised our energy is likely to be.
To be fully engaged, we need to be fully present. To be fully present we must be “physically energized, emotionally connected, mentally focused and spiritually aligned with a purpose beyond our own immediate self-interest.”
Conventional wisdom holds that if you find talented people and equip them with the right skills for the challenge at hand, they will perform at their best. In our experience that often isn’t so. Energy is the X factor that makes it possible to fully ignite talent and skill.
*** You Must Become Fully Engaged
Here are the four key energy management principles that drive performance.
Principle 1: Full engagement requires drawing on four separate but related sources of energy: physical, emotional, mental and spiritual.
Human beings are complex energy systems, and full engagement is not simply one-dimensional. The energy that pulses through us is physical, emotional, mental, and spiritual. All four dynamics are critical, none is sufficient by itself and each profoundly influences the others. To perform at our best, we must skillfully manage each of these interconnected dimensions of energy. Subtract any one from the equation and our capacity to fully ignite our talent and skill is diminished, much the way an engine sputters when one of its cylinders misfires.
Energy is the common denominator in all dimensions of our lives. Physical energy capacity is measured in terms of quantity (low to high) and emotional capacity in quality (negative to positive). These are our most fundamental sources of energy because without sufficient high-octane fuel no mission can be accomplished.
The importance of full engagement is most vivid in situations where the consequences of disengagement are profound. Imagine for a moment that you are facing open-heart surgery. Which energy quadrant do you want your surgeon to be in? How would you feel if he entered the operating room feeling angry, frustrated and anxious (high negative)? How about overworked, exhausted and depressed (low negative)? What if he was disengaged, laid back and slightly spacey (low positive)? Obviously, you want your surgeon energized, confident and upbeat (high positive).
Imagine that every time you yelled at someone in frustration or did sloppy work on a project or failed to focus your attention fully on the task at hand, you put someone’s life at risk. Very quickly, you would become less negative, reckless and sloppy in the way you manage your energy. We hold ourselves accountable for the ways that we manage our time, and for that matter our money. We must learn to hold ourselves at least equally accountable for how we manage our energy physically, emotionally, mentally and spiritually.
Principle 2: Because energy capacity diminishes both with overuse and with underuse, we must balance energy expenditure with intermittent energy renewal.
We rarely consider how much energy we are spending because we take it for granted that the energy available to us is limitless. … The richest, happiest and most productive lives are characterized by the ability to fully engage in the challenge at hand, but also to disengage periodically and seek renewal. Instead, many of us live our lives as if we are running in an endless marathon, pushing ourselves far beyond healthy levels of exertion. … We, too, must learn to live our own lives as a series of sprints— fully engaging for periods of time, and then fully disengaging and seeking renewal before jumping back into the fray to face whatever challenges confront us.
Principle 3: To build capacity, we must push beyond our normal limits, training in the same systematic way that elite athletes do.
Stress is not the enemy in our lives. Paradoxically, it is the key to growth. In order to build strength in a muscle we must systematically stress it, expending energy beyond normal levels. … We build emotional, mental and spiritual capacity in precisely the same way that we build physical capacity.
Principle 4: Positive energy rituals—highly specific routines for managing energy— are the key to full engagement and sustained high performance.
Change is difficult. We are creatures of habit. Most of what we do is automatic and nonconscious. What we did yesterday is what we are likely to do today. The problem with most efforts at change is that conscious effort can’t be sustained over the long haul. Will and discipline are far more limited resources than most of us realize. If you have to think about something each time you do it, the likelihood is that you won’t keep doing it for very long. The status quo has a magnetic pull on us.
Look at any part of your life in which you are consistently effective and you will find that certain habits help make that possible. If you eat in a healthy way, it is probably because you have built routines around the food you buy and what you are willing to order at restaurants. If you are fit, it is probably because you have regular days and times for working out. If you are successful in a sales job, you probably have a ritual of mental preparation for calls and ways that you talk to yourself to stay positive in the face of rejection. If you manage others effectively, you likely have a style of giving feedback that leaves people feeling challenged rather than threatened. If you are closely connected to your spouse and your children, you probably have rituals around spending time with them. If you sustain high positive energy despite an extremely demanding job, you almost certainly have predictable ways of insuring that you get intermittent recovery. Creating positive rituals is the most powerful means we have found to effectively manage energy in the service of full engagement.
To make money in the markets, you have to think independently and be humble. You have to be an independent thinker because you can’t make money agreeing with the consensus view, which is already embedded in the price. Yet whenever you’re betting against the consensus there’s a significant probability you’re going to be wrong, so you have to be humble.
Early in my career I learned this lesson the hard way — through some very painful bad bets. The biggest of these mistakes occurred in 1981–’82, when I became convinced that the U.S. economy was about to fall into a depression. My research had led me to believe that, with the Federal Reserve’s tight money policy and lots of debt outstanding, there would be a global wave of debt defaults, and if the Fed tried to handle it by printing money, inflation would accelerate. I was so certain that a depression was coming that I proclaimed it in newspaper columns, on TV, even in testimony to Congress. When Mexico defaulted on its debt in August 1982, I was sure I was right. Boy, was I wrong. What I’d considered improbable was exactly what happened: Fed chairman Paul Volcker’s move to lower interest rates and make money and credit available helped jump-start a bull market in stocks and the U.S. economy’s greatest ever noninflationary growth period.
What’s important isn’t that Ray Dalio was wrong; it’s what the experience taught him and how he implemented those lessons at Bridgewater.
This episode taught me the importance of always fearing being wrong, no matter how confident I am that I’m right. As a result, I began seeking out the smartest people I could find who disagreed with me so that I could understand their reasoning. Only after I fully grasped their points of view could I decide to reject or accept them. By doing this again and again over the years, not only have I increased my chances of being right, but I have also learned a huge amount.
There’s an art to this process of seeking out thoughtful disagreement. People who are successful at it realize that there is always some probability they might be wrong and that it’s worth the effort to consider what others are saying — not simply the others’ conclusions, but the reasoning behind them — to be assured that they aren’t making a mistake themselves. They approach disagreement with curiosity, not antagonism, and are what I call “open-minded and assertive at the same time.” This means that they possess the ability to calmly take in what other people are thinking rather than block it out, and to clearly lay out the reasons why they haven’t reached the same conclusion. They are able to listen carefully and objectively to the reasoning behind differing opinions.
When most people hear me describe this approach, they typically say, “No problem, I’m open-minded!” But what they really mean is that they’re open to being wrong. True open-mindedness is an entirely different mind-set. It is a process of being intensely worried about being wrong and asking questions instead of defending a position. It demands that you get over your ego-driven desire to have whatever answer you happen to have in your head be right. Instead, you need to actively question all of your opinions and seek out the reasoning behind alternative points of view.
The big general objection to economics was the one early described by Alfred North Whitehead when he spoke of the fatal unconnectedness of academic disciplines, wherein each professor didn’t even know of the models of the other disciplines, much less try to synthesize those disciplines with his own … The nature of this failure is that it creates what I always call ‘man with a hammer’ syndrome. To a man with only a hammer, every problem looks pretty much like a nail. And that works marvellously to gum up all professions, and all departments of academia, and indeed most practical life. So, what do we do, Charlie? The only antidote for being an absolute klutz due to the presence of a man with a hammer syndrome is to have a full kit of tools. You don’t have just a hammer. You’ve got all the tools.
The more models you have from outside your discipline and the more you iterate through them when faced with a challenge in a checklist sort of fashion, the better you’ll be able to solve problems.
Models are additive. Like LEGO. The more you have the more things you can build, the more connections you can make between them and the more likely you are to be able to determine the relevant variables that govern the situation.
And when you learn these models you need to ask yourself under what conditions will this tool fail? That way you’re not only looking for situations where the tool is useful but also situations where something interesting is happening that might warrant further attention.
Now for the final step in the design of the mentally choiceful stance: the search engine, as in ‘How did I solve these problems?’ ‘Obviously,’ you will answer yourself, ‘I was using a simple search engine in my mind to go through checklist style, and I was using some rough algorithms that work pretty well in many complex systems.’ What does a search engine do? It searches. And how do you organize an efficient search? Well, algorithm designers tell us you have to have an efficient organization of the contents of whatever it is you are searching. And a tree structure allows you to search more efficiently than most alternative structures.
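The efficiency claim about tree structures is the standard result from algorithm design: a sorted, tree-like organization lets you discard half the remaining candidates at each step, so search cost grows logarithmically rather than linearly. A quick illustrative sketch (the comparison-counting wrappers are mine, not from the source):

```python
def linear_search(items, target):
    """Scan front to back, counting comparisons as we go."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1
        if x == target:
            return i, comparisons
    return -1, comparisons

def binary_search(items, target):
    """Halve the sorted search space on every comparison."""
    comparisons = 0
    lo, hi = 0, len(items)
    while lo < hi:
        comparisons += 1
        mid = (lo + hi) // 2
        if items[mid] < target:
            lo = mid + 1
        else:
            hi = mid
    if lo < len(items) and items[lo] == target:
        return lo, comparisons
    return -1, comparisons

items = list(range(1024))
print(linear_search(items, 1000))  # → (1000, 1001)
print(binary_search(items, 1000))  # → (1000, 10)
```

Ten comparisons instead of a thousand over 1,024 items: that is the payoff of organizing the contents of whatever you are searching into a tree-like structure before you search it.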
Extreme success is likely to be caused by some combination of the following factors: a) Extreme maximization or minimization of one or two variables. Example[:] Costco, or, [Berkshire Hathaway’s] furniture and appliance store. b) Adding success factors so that a bigger combination drives success, often in nonlinear fashion, as one is reminded of the concept of breakpoint or the concept of critical mass in physics. You get more mass, and you get a lollapalooza result. And of course I’ve been searching for lollapalooza results all my life, so I’m very interested in models that explain their occurrence. [Remember the Black Swan?] c) an extreme of good performance over many factors. Examples: Toyota or Les Schwab. d) Catching and riding some big wave.
A good search algorithm allows you to make your mental choices clear. It makes it easier for you to be mentally choiceful and to understand the reasons why you’re making these mental choices.
Now, what should go on the branches of your tree of mental models? Well, how about basic mental models from a whole bunch of different disciplines? Such as: physics (non-linearity, criticality), economics (what Munger calls the ‘super-power’ of incentives), the multiplicative effects of several interacting causes (biophysics), and collective phenomena – or ‘catching the wave’ (plasma physics). How’s that for a science that rocks, by placing at the disposal of the mind a large library of forms created by thinkers across hundreds of years and marshalling them for the purpose of detecting, building, and profiting from Black Swans?
The ‘tree trick’ has one more advantage – a big one: it lets you quickly visualize interactions among the various models and identify cumulative effects. Go northwest in your search, starting from the ’0’ node, and the interactions double with every step. Go southwest, on the other hand, and the interactions decrease in number at the same rate. Seen in this rather sketchy way, Black Swan hunting is no longer as daunting a sport as it might seem at first sight.
You can’t deal with ignorance if you can’t recognize its presence. If you’re suffering from primary ignorance it means you probably failed to consider the possibility of being ignorant or you found ways not to see that you were ignorant.
You’re ignorant and unaware, which is worse than being ignorant and aware.
The best way to avoid this, suggest Joy and Zeckhauser, is to raise self-awareness.
Ask yourself regularly: “Might I be in a state of consequential ignorance here?”
If the answer is yes, the next step should be to estimate base rates. That should also be the next step if the starting point is recognized ignorance.
Of all situations such as this, how often has a particular outcome happened? Of course, this is often totally subjective, and its underpinnings are elusive. It is hard to know what the sample of relevant past experiences has been, how to draw inferences from the experience of others, etc. Nevertheless, it is far better to proceed to an answer, however tenuous, than to simply miss (primary ignorance) or slight (recognized ignorance) the issue. Unfortunately, the assessment of base rates is challenging and substantial biases are likely to enter.
When we don’t recognize ignorance, we drastically underestimate the base rate. When we do recognize ignorance, we face “duelling biases; some will lead to underestimates of base rates and others to overestimates.”
Three biases come into play while estimating base rates: overconfidence, salience, and selection biases.
So we are overconfident in our estimates. We estimate things that are salient – that is, “states with which (we) have some experience or that are otherwise easily brought to mind.” And “there is a strong selection bias to recall or retell events that were surprising or of great consequence.”
Our key lesson is that as individuals proceed through life, they should always be on the lookout for ignorance. When they do recognize it, they should try to assess how likely they are to be surprised—in other words, attempt to compute the base rate. In discussing this assessment, we might also employ the term “catchall” from statistics, to cover the outcomes not specifically addressed.
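The base-rate-plus-catchall idea can be sketched numerically: tally past cases, reserve a bucket for outcomes you never specifically enumerated, and add a small pseudocount so nothing is assigned zero probability. The function and sample history below are hypothetical, invented to illustrate the mechanics:

```python
from collections import Counter

def base_rates(observed, known_outcomes, pseudocount=1):
    """Smoothed frequencies over the named outcomes plus a 'catchall'
    bucket for everything else, so no outcome gets probability zero."""
    counts = Counter(observed)
    buckets = {k: counts.get(k, 0) for k in known_outcomes}
    buckets["catchall"] = sum(v for k, v in counts.items() if k not in known_outcomes)
    total = sum(buckets.values()) + pseudocount * len(buckets)
    return {k: (v + pseudocount) / total for k, v in buckets.items()}

# Ten past projects: seven succeeded, two failed, one ended in a lawsuit
# nobody had on their list of possible outcomes.
history = ["success"] * 7 + ["failure"] * 2 + ["lawsuit"]
print(base_rates(history, ["success", "failure"]))
# success ≈ 0.62, failure ≈ 0.23, catchall ≈ 0.15
```

The point of the nonzero catchall is exactly Joy and Zeckhauser's: a tenuous estimate of how often you will be surprised beats silently assuming the answer is never.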
It’s incredibly interesting to view literature through the lens of human decision making.
Crime and Punishment is particularly interesting as a study of primary ignorance. Raskolnikov deploys his impressive intelligence to plan the murder, believing, in his ignorance, that he has left nothing to chance. In a series of descriptions not for the squeamish or the faint-hearted, the murderer’s thoughts are laid bare as he plans the deed. We read about his skills in strategic inference and his powers of prediction about where and how he will corner his victim; his tactics at developing complementary skills (what is the precise manner in which he will carry the axe? what strategies will help him avoid detection?) are revealed.
But since Raskolnikov is making decisions under primary ignorance, his determined rationality is tightly “bounded.” He “construct[s] a simplified model of the real situation in order to deal with it; … behaves rationally with respect to this model, [but] such behavior is not even approximately optimal with respect to the real world” (Simon 1957). The second-guessing, fear, and delirium at the heart of Raskolnikov’s thinking as he struggles to gain a foothold in his inner world show the impact of a cascade of Consequential Amazing Developments (CADs), none predicted, none even contemplated. Raskolnikov anticipated an outcome in which he would dispatch the pawnbroker and slip quietly out of her apartment. He could not have possibly predicted that her sister would show up, a characteristic CAD that challenges what Taleb (2012) calls our “illusion of predictability.”
Joy and Zeckhauser argue we can draw two conclusions.
First, we tend to downplay the role of unanticipated events, preferring instead to expect simple causal relationships and linear developments. Second, when we do encounter a CAD, we often counter with knee-jerk, impulsive decisions, the equivalent of Raskolnikov committing a second impetuous murder.
References: Ignorance: Lessons from the Laboratory of Literature (Joy and Zeckhauser).
Think of how we make decisions in organizations — we often do what standard decision theory would ask of us.
We create a PowerPoint deck that identifies the future desired state, identify what might happen, attach weighted probabilities to said outcomes, and make a choice. Perfectly rational. Right?
One of the problems with this approach is the risk charts and matrices that accompany this analysis. In my experience these charts are rarely discussed in detail and become more about checking the ‘I thought about risk’ box than anything else. We conveniently pin things into categories of low, medium, or high risk with a corresponding “impact” scale.
What gets most of the attention is high-risk, high-impact. Perhaps deservedly so. But you have to ask yourself how did we arrive at these arbitrary scales? Is one person’s look at risk the same as someone else’s? Are there hidden incentives to nudge risk one way or another? What biases come into play?
Often we can’t even identify everything. Rarely do people ever go back and look at what happened and how accurate those “risk” tables were. From the ones I’ve seen, the “low risk” stuff happens a lot more often than people imagined. And a lot of things happen that never even made the chart in the first place.
On the occasion when people do go back, and I’ve seen this firsthand, hindsight bias creeps in. “Oh, we discussed that but it didn’t make it in the document. But we knew about it.” Yes, of course you did.
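One cheap remedy is to make the back-test mechanical: store the register with its predictions, record what actually happened, and compare bucket by bucket, including the events that never made the chart. A hypothetical sketch, with register entries invented purely for illustration:

```python
# A risk register as it might look a year later: each entry records the
# predicted risk bucket and whether the event actually occurred.
register = {
    "supplier delay":  {"predicted": "low",    "occurred": True},
    "key hire quits":  {"predicted": "low",    "occurred": True},
    "data breach":     {"predicted": "high",   "occurred": False},
    "budget overrun":  {"predicted": "medium", "occurred": True},
}
# Events that hit the project but were never on the chart at all.
unlisted_events = ["regulator inquiry", "office flood"]

# Group outcomes by the bucket they were predicted into.
occurred_by_bucket = {}
for risk in register.values():
    occurred_by_bucket.setdefault(risk["predicted"], []).append(risk["occurred"])

for bucket in ("low", "medium", "high"):
    hits = occurred_by_bucket.get(bucket, [])
    print(f"{bucket}: {sum(hits)} of {len(hits)} occurred")
print(f"{len(unlisted_events)} events were never on the chart at all")
```

Even this toy version surfaces the two failure modes described above: the “low risk” bucket firing more often than anything else, and a tail of consequential events the matrix never contained.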
Ignorant and unknowing.
We’re largely ignorant; that is, we operate in a state of the world where some possible outcomes are unknown. However, we’ve prepared for a world where outcomes and probabilities can be estimated. There is a mismatch between our training and reality. You can’t even hope to accurately estimate probabilities if the range of outcomes is unknown.
There are two types of ignorance.
The first category is when we do not know we are ignorant. This is primary ignorance. The second category is when we recognize our ignorance. This is called recognized ignorance.
[The Empty Suit/Fragilista] defaults to thinking that what he doesn’t see is not there, or what he does not understand does not exist. At the core, he tends to mistake the unknown for the nonexistent.
That, my friends, is primary ignorance. And it’s not limited to empty suits and fragilistas. Consider Anna Karenina:
Primary ignorance ruins the life of one of fiction’s most famous characters, Anna Karenina. Readers of Anna Karenina (1877/2004) know that, in this novel, a train bookends bad news. Anna alights from one train as the novel begins and throws herself under another one as it ends. As she enters the glittering world of pre-Revolutionary Saint Petersburg, Anna catches the eye of the aristocratic bachelor Count Vronsky and quickly falls under his spell. But there is a problem: she is married to the rising politician Karenin, the two have a son Seryozha, and society will not take kindly to the conspicuous adultery of a prominent citizen. Indulging in an extra-marital affair, especially when one’s husband is a respected member of society, promotes the likelihood of unpleasant (events). But her passion for Vronsky dulls Anna’s capacities for self-awareness. She becomes pregnant out of wedlock, a disastrous condition for a woman in nineteenth-century Russia. Anna consistently displays an unfortunate propensity to take action without recognizing that a terrible consequential outcome is possible. That is, she operates in primary ignorance.
Anna demonstrates all the characteristics of primary ignorance. She fails to consider all the possible scenarios that will occur from her impulsive decision making. She risks her marriage with Karenin, a kind if undemonstrative husband, who is willing to forgive and even offers to raise her illegitimate child as his own. Leaving Seryozha with Karenin, she and Vronsky escape to Italy and then to his Russian country estate. Ultimately, she finds that while Vronsky continues to be accepted socially, living his life exactly as he pleases, the door of society slams shut in her face. No one will associate with her and she is insulted as an adulterer wherever she goes. It is only when she is completely isolated socially and cut off from her beloved son that Anna recognizes the dangers of primary ignorance: she risked her family and her reputation for too little. … She realizes she was ignorant of the possible outcomes that jumping headlong into an illicit relationship would bring.
Ignorance, primary or recognized, is only important if the expected consequences are significant. Otherwise we can be ignorant without consequence.
While human irrationality factors into all decisions, it hits us hardest when we are unknowingly ignorant. Rational decision making becomes harder as we move along the continuum: known outcomes → risk → uncertainty/ignorance.
If we cannot consider all possible outcomes, preventing failure becomes nearly impossible. Further complicating matters, situations of ignorance often take years to play out. Joy and Zeckhauser write:
One could argue … that a rational decision maker should always consider the possibility of ignorance, thus ruling out primary ignorance. But that is a level of rationality that very few achieve.
If we could do this, we’d always operate in the space of recognized ignorance, which is at least better than primary ignorance.
“Fortunately,” write Joy and Zeckhauser, “there is a group of highly perceptive chroniclers of human decision-making who observe individuals and follow their paths, often over years or decades. They are the individuals who write fiction: plays, novels, and short stories describing imagined events and people (or fictional characters).”
Joy and Zeckhauser argue that these works offer “deep insights” into the way we approach decisions, “both great and small.”
In the Poetics, a classical treatise on the principles of literary theory, Aristotle argues that art imitates life. We refer here to Aristotle’s idea of mimesis, or imitation. Aristotle claims one of art’s functions is the representation of reality. “Art” here includes creative products of the human imagination and, therefore, any work of fiction. Indeed, a crevice, not a canyon, separates fact and fiction.
For centuries, authors have attempted to depict situations of ignorance. In Greek literature, Sophocles’s Oedipus and Creon, and Homer’s Odysseus, all seek the forecasting skills of the blind prophet Tiresias, who is doomed by Zeus to “speak the truth no man may believe.”
For all its status as one of literature’s most enduring love stories, Jane Austen’s Pride and Prejudice begins rather unpromisingly: the hero and the heroine cannot stand each other. The arrogant Mr. Darcy claims Elizabeth Bennet is “not handsome enough to tempt me”; Elizabeth offers the equally withering riposte that she “may safely promise … never to dance with him.” Were we to encounter them after these early skirmishes, we (like Elizabeth and Darcy themselves) would be ignorant of the possibility of an ultimate romance.
In Gustave Flaubert’s Madame Bovary (1856/2004), Charles Bovary is a stolid rural doctor who is ignorant of the true character of the woman he is marrying. Dazzled by her youth and beauty, he ends up with an adulterous wife who plunges him into debt. His wife Emma, the titular “Madame Bovary,” is equally ignorant of the true character of her husband. Her head filled with romantic fantasies, she yearns for a sophisticated partner and the glamor of city life, but finds herself trapped in a somnolent marriage with a rustic man.
K., the land surveyor and protagonist of Franz Kafka’s The Castle, attempts, repeatedly and unsuccessfully, to gain access to the mysterious authorities of a castle but is frustrated by an authoritarian bureaucracy and by ambiguous responses that defy rational interpretation. He begins and ends the novel (as does the reader) in ignorance.
Joy and Zeckhauser’s choice to study ignorance through stories makes sense.
Stories offer “simulations of the social world,” according to psychologists Raymond Mar and Keith Oatley, through abstraction, simplification, and compression. Stories afford us a kind of flight simulator: we can test-run new situations, observe, and learn, at little economic or social cost. Joy and Zeckhauser believe “that characters in great works of literature reproduce the behavioral propensities of real-life individuals.”
While we’ll likely never encounter situations as fascinating as those we find in stories, this doesn’t mean stories aren’t a useful tool for learning about choice and consequence.
“In a sense,” Joy and Zeckhauser write, “this is why great literature will never get dated: these stories observe the details of human behavior, and present such behavior awash with all the anguish and the splendor that is the lot of the human predicament.”
Characters in a fictitious world do exactly what our intelligence allows us to do in the real world. We watch what happens to them and mentally take notes on the outcomes of the strategies and tactics they use in pursuing their goals.
If we assume we live in a world where we are, to some extent, ignorant, then the best course is “thoughtful action or prudent information gathering.” Yet, as the stories show, “we frequently act in ways that violate such advice.”
So reading fiction can help us adapt and deal with the world of uncertainty.
For the sake of argument, let’s break decisions down into a few categories.
There are decisions where:
Outcomes are known. This is the easiest way to make decisions. If I hold out my hand and drop a ball, it will fall to the ground.
Outcomes are unknown, but probabilities are known. This is risk. Think of going to Vegas and gambling: before you ever set foot at the table, all of the outcomes are known, as are the probabilities of each. No outcome surprises an objective third party.
Outcomes are unknown and probabilities are unknown. This is uncertainty.
We often think we’re making decisions in #2 but we’re really in #3.
Ignorance is a state of the world where some possible outcomes are unknown: when we’ve moved from #2 to #3.
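The gap between #2 and #3 can be sketched with a small simulation. This is a hypothetical illustration of my own, not from the text: under risk (the roulette bet) the odds are known in advance and the expected value is computable; under uncertainty we can only sample from a process we don’t fully understand, and a rare outcome may never show up in our data at all.

```python
import random

random.seed(42)

# Risk: a $1 bet on red in European roulette. Every outcome and its
# probability is known in advance, so the expected value is computable.
P_RED = 18 / 37                                    # 18 red pockets of 37
expected_value = P_RED * 1 + (1 - P_RED) * (-1)    # = -1/37, about -0.027
print(f"Known expected value per $1 bet: {expected_value:.4f}")

# Uncertainty: we only observe draws from an unknown process. We can
# estimate a value from data, but nothing guarantees the sample has
# revealed every possible outcome.
def unknown_process():
    # The observer does not know this distribution; the rare -100
    # outcome may never appear in a small sample.
    r = random.random()
    if r < 0.001:
        return -100    # the surprise the data may never show us
    return 1 if r < 0.5 else -1

sample = [unknown_process() for _ in range(1000)]
estimate = sum(sample) / len(sample)
print(f"Estimated value from 1,000 draws: {estimate:.4f}")
```

In the risk case an objective third party can audit the arithmetic; in the uncertainty case the estimate looks just as precise, but it silently depends on whether the sample happened to include the rare outcome.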
One way to realize how ignorant we are is to look back, read some old newspapers, and see how often the world did something that wasn’t even imagined.
Some examples include the Arab Spring, the collapse of the Soviet Union, and the 2008 financial meltdown.
We’re prepared for a world much like #2, the world of risk, with known outcomes and probabilities that can be estimated, yet we live in a world that more closely resembles #3.
Part of the argument that Fooled by Randomness presents is that when we look back at things that have happened we see them as less random than they actually were.
It is as if there were two planets: the one in which we actually live and the one, considerably more deterministic, on which people are convinced we live. It is as simple as that: Past events will always look less random than they were (it is called the hindsight bias). I would listen to someone’s discussion of his own past realizing that much of what he was saying was just backfit explanations concocted ex post by his deluded mind.
*** The Courage of Montaigne
Writing on Montaigne as the role model for the modern thinker, Taleb also addresses his courage:
It certainly takes bravery to remain skeptical; it takes inordinate courage to introspect, to confront oneself, to accept one’s limitations—scientists are seeing more and more evidence that we are specifically designed by mother nature to fool ourselves.
Fooled by Randomness is about probability, not in a mathematical way but as skepticism.
In this book probability is principally a branch of applied skepticism, not an engineering discipline. …
Probability is not a mere computation of odds on the dice or more complicated variants; it is the acceptance of the lack of certainty in our knowledge and the development of methods for dealing with our ignorance. Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem or a brain teaser. Mother nature does not tell you how many holes there are on the roulette table, nor does she deliver problems in a textbook way (in the real world one has to guess the problem more than the solution).
“Outside of textbooks and casinos, probability almost never presents itself as a mathematical problem” which is fascinating given how we tend to solve problems. In decisions under uncertainty, I discussed how risk and uncertainty are different things, which creates two types of ignorance.
Most decisions are not risk-based, they are uncertainty-based and you either know you are ignorant or you have no idea you are ignorant. There is a big distinction between the two. Trust me, you’d rather know you are ignorant.
This problem manifests itself most frequently in the lucky fool, “defined as a person who benefited from a disproportionate share of luck but attributes his success to some other, generally very precise, reason.”
Such confusion crops up in the most unexpected areas, even science, though not in such an accentuated and obvious manner as it does in the world of business. It is endemic in politics, as it can be encountered in the shape of a country’s president discoursing on the jobs that “he” created, “his” recovery, and “his predecessor’s” inflation.
These lucky fools are often fragilistas — they have no idea they are lucky fools. For example:
[W]e often have the mistaken impression that a strategy is an excellent strategy, or an entrepreneur a person endowed with “vision,” or a trader a talented trader, only to realize that 99.9% of their past performance is attributable to chance, and chance alone. Ask a profitable investor to explain the reasons for his success; he will offer some deep and convincing interpretation of the results. Frequently, these delusions are intentional and deserve to bear the name “charlatanism.”
This does not mean that all success is luck or randomness. There is a difference between “it is more random than we think” and “it is all random.”
Let me make it clear here: Of course chance favors the prepared! Hard work, showing up on time, wearing a clean (preferably white) shirt, using deodorant, and some such conventional things contribute to success— they are certainly necessary but may be insufficient as they do not cause success. The same applies to the conventional values of persistence, doggedness and perseverance: necessary, very necessary. One needs to go out and buy a lottery ticket in order to win. Does it mean that the work involved in the trip to the store caused the winning? Of course skills count, but they do count less in highly random environments than they do in dentistry.
No, I am not saying that what your grandmother told you about the value of work ethics is wrong! Furthermore, as most successes are caused by very few “windows of opportunity,” failing to grab one can be deadly for one’s career. Take your luck!
That last paragraph connects to something Charlie Munger once said: “Really good investment opportunities aren’t going to come along too often and won’t last too long, so you’ve got to be ready to act. Have a prepared mind.”
Taleb thinks of success in terms of degrees, so mild success might be explained by skill and labour, but outrageous success “is attributable to variance.”
*** Luck Makes You Fragile
One thing Taleb hits on that really stuck with me is that “that which came with the help of luck could be taken away by luck (and often rapidly and unexpectedly at that). The flipside, which deserves to be considered as well (in fact it is even more of our concern), is that things that come with little help from luck are more resistant to randomness.” How Antifragile.
Taleb argues this is the problem of induction: “it does not matter how frequently something succeeds if failure is too costly to bear.”
…the literary mind can be intentionally prone to the confusion between noise and meaning, that is, between a randomly constructed arrangement and a precisely intended message. However, this causes little harm; few claim that art is a tool of investigation of the Truth— rather than an attempt to escape it or make it more palatable. Symbolism is the child of our inability and unwillingness to accept randomness; we give meaning to all manner of shapes; we detect human figures in inkblots.
All my life I have suffered the conflict between my love of literature and poetry and my profound allergy to most teachers of literature and “critics.” The French thinker and poet Paul Valéry was surprised to listen to a commentary of his poems that found meanings that had until then escaped him (of course, it was pointed out to him that these were intended by his subconscious).
If we’re concerned about situations where randomness is confused with non-randomness, should we also be concerned with situations where non-randomness is mistaken for randomness, which would result in signal being ignored? Taleb answers:
First, I am not overly worried about the existence of undetected patterns. We have been reading lengthy and complex messages in just about any manifestation of nature that presents jaggedness (such as the palm of a hand, the residues at the bottom of Turkish coffee cups, etc.). Armed with home supercomputers and chained processors, and helped by complexity and “chaos” theories, the scientists, semiscientists, and pseudoscientists will be able to find portents. Second, we need to take into account the costs of mistakes; in my opinion, mistaking the right column for the left one is not as costly as an error in the opposite direction. Even popular opinion warns that bad information is worse than no information at all.