Super Thinking – Book Notes

Super Thinking: The Big Book of Mental Models
Gabriel Weinberg and Lauren McCann

Thinking about a problem from an inverse perspective can unlock new solutions and strategies.

Antifragility is beyond resilience or robustness. The resilient resists shocks and stays the same; the antifragile gets better.

The central mental model to help you become a chef with your thinking is arguing from first principles. It’s the practical starting point to being wrong less, and it means thinking from the bottom up, using basic building blocks of what you think is true to build sound (and sometimes new) conclusions. First principles are the group of self-evident assumptions that make up the foundation on which your conclusions rest—the ingredients in a recipe or the mathematical axioms that underpin a formula.

First principles is kind of a physics way of looking at the world. . . . You kind of boil things down to the most fundamental truths and say, “What are we sure is true?” . . . and then reason up from there.

When arguing from first principles, you are deliberately starting from scratch. You are explicitly avoiding the potential trap of conventional wisdom, which could turn out to be wrong. Even if you end up in agreement with conventional wisdom, by taking the first-principles approach, you will gain a much deeper understanding of the subject at hand.

Just as the concept of first principles is universally applicable, so is de-risking. You can de-risk anything: a policy idea, a vacation plan, a workout routine. When de-risking, you want to test assumptions quickly and easily.

Unfortunately, people often make the mistake of doing way too much work before testing assumptions in the real world. In computer science this trap is called premature optimization, where you tweak or perfect code or algorithms (optimize) too early (prematurely). If your assumptions turn out to be wrong, you’re going to have to throw out all that work, rendering it ultimately a waste of time.

“If you’re not embarrassed by the first version of your product, you’ve launched too late.”

Ockham’s razor helps here. It advises that the simplest explanation is most likely to be true. When you encounter competing explanations that plausibly explain a set of data equally well, you probably want to choose the simplest one to investigate first.

You not only have a natural tendency to think something specific is more probable than something general, but you also have a similarly fallacious tendency to explain data using too many assumptions. The mental model for this second fallacy is overfitting, a concept from statistics. Adding in all those overly specific dating requirements is overfitting your dating history. Similarly, believing you have cancer when you have a cold is overfitting your symptoms.
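The statistical version of this trap is easy to see in a toy fit. A minimal sketch with hypothetical data, using a least-squares line as the simple model and an exact-interpolation polynomial standing in for the "too many assumptions" model:

```python
# Hypothetical data: y is roughly 2*x plus a little noise.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 1.9, 4.2, 5.8, 8.1]

def linear_fit(x):
    # Simple model: least-squares line (two parameters).
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((a - mx) * (b - my) for a, b in zip(xs, ys)) \
        / sum((a - mx) ** 2 for a in xs)
    return my + slope * (x - mx)

def overfit(x):
    # Overfit model: degree-4 Lagrange polynomial through every point
    # (five parameters for five points, so it "explains" the noise too).
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

# The overfit model is perfect on the data it has already seen...
train_err_overfit = max(abs(overfit(x) - y) for x, y in zip(xs, ys))
train_err_linear = max(abs(linear_fit(x) - y) for x, y in zip(xs, ys))

# ...but on a new point (true value is about 10) it extrapolates badly,
# while the simple line stays close.
print(round(linear_fit(5.0), 2), round(overfit(5.0), 2))
```

The more parameters a model (or a dating checklist) has relative to the data, the better it fits what already happened and the worse it predicts what comes next.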

Another concept you will find useful when making purchasing decisions is anchoring, which describes your tendency to rely too heavily on first impressions when making decisions. You get anchored to the first piece of framing information you encounter. This tendency is commonly exploited by businesses when making offers.

Just having the print-only option—even though no one chooses it—anchors readers to a much higher value for the print-and-web version. It feels like you are getting the web version for free, causing many more people to choose it and creating 43 percent more revenue for the magazine by just adding a version that no one chooses!

Anchoring isn’t just for numbers. Donald Trump uses this mental model, anchoring others to his extreme positions, so that what seem like compromises are actually agreements in his favor. He wrote about this in his 1987 book Trump: The Art of the Deal: “My style of deal-making is quite simple and straightforward. I aim very high, and then I just keep pushing and pushing to get what I’m after. Sometimes I settle for less than I sought, but in most cases I still end up with what I want.”

Availability bias stems from overreliance on your recent experiences within your frame of reference, at the expense of the big picture.

When you put many similar filter bubbles together, you get echo chambers, where the same ideas seem to bounce around the same groups of people, echoing around the collective chambers of these connected filter bubbles. Echo chambers result in increased partisanship, as people have less and less exposure to alternative viewpoints. And because of availability bias, they consistently overestimate the percentage of people who hold the same opinions.

Another tactical model to help you have greater empathy is the veil of ignorance, put forth by philosopher John Rawls. It holds that when thinking about how society should be organized, we should do so by imagining ourselves ignorant of our particular place in the world, as if there were a veil preventing us from knowing who we are. Rawls refers to this as the “original position.”

Individuals still hang on to old theories in the face of seemingly overwhelming evidence—it happens all the time in science and in life in general. The human tendency to gather and interpret new information in a biased way to confirm preexisting beliefs is called confirmation bias.

Confirmation bias is so hard to overcome that there is a related model called the backfire effect that describes the phenomenon of digging in further on a position when faced with clear evidence that disproves it. In other words, it often backfires when people try to change your mind with facts and figures, having the opposite effect on you than it should; you become more entrenched in the original, incorrect position, not less.

When our bathroom scale delivers bad news, we hop off and then on again, just to make sure we didn’t misread the display or put too much pressure on one foot. When our scale delivers good news, we smile and head for the shower. By uncritically accepting evidence when it pleases us, and insisting on more when it doesn’t, we subtly tip the scales in our favor.

F. Scott Fitzgerald once described something similar to thinking gray when he observed that the test of a first-rate mind is the ability to hold two opposing thoughts at the same time while still retaining the ability to function.

Using mental models is a slow and steady way to become more antifragile, making you better able to deal with new situations over time.

One way to accelerate building up useful intuition like this is to try consistently to argue from first principles. Another is to take every opportunity you can to figure out what is actually causing things to happen. The remaining mental models in this chapter can help you do just that.

When you try to incentivize behavior by setting a measurable target, people focus primarily on achieving that measure, often in ways you didn’t intend. Most importantly, their focus on the measure may not correlate to the behavior you hoped to promote.

The Streisand effect applies to an even more specific situation: when you unintentionally draw more attention to something when you try to hide it.

Sometimes collateral damage can impact the entity that inflicted the damage in the first place, which is called blowback.

Consider the consequences of short-term decisions. For any decision, ask yourself: What kind of debt am I incurring by doing this? What future paths am I taking away by my actions today?

[Peter] used to insist at PayPal that every single person could only do exactly one thing. And we all rebelled, every single person in the company rebelled to this idea. Because it’s so unnatural, it’s so different than other companies where people wanted to do multiple things, especially as you get more senior, you definitely want to do more things and you feel insulted to be asked to do just one thing. Peter would enforce this pretty strictly. He would say, I will not talk to you about anything else besides this one thing I assigned you. I don’t want to hear about how great you are doing over here, just shut up, and Peter would run away. . . . The insight behind this is that most people will solve problems that they understand how to solve. Roughly speaking, they will solve B+ problems instead of A+ problems. A+ problems are high-impact problems for your company, but they are difficult. You don’t wake up in the morning with a solution, so you tend to procrastinate them. So imagine you wake up in the morning and create a list of things to do today, there’s usually the A+ one on the top of the list, but you never get around to it. And so you solve the second and third. Then you have a company of over a hundred people, so it cascades. You have a company that is always solving B+ things, which does mean you grow, which does mean you add value, but you never really create that breakthrough idea. No one is spending 100% of their time banging their head against the wall every day until they solve it.

“What is important is seldom urgent and what is urgent is seldom important.”

In personal situations, most people discount the future implicitly at relatively high discount rates. And they do so in a manner that is not actually fixed over time, which is called hyperbolic discounting. In other words, people really, really value instant gratification over delayed gratification, and this preference plays a central role in procrastination, along with other areas of life where people struggle with self-control, such as dieting, addiction, etc.

Professor Leon Megginson, paraphrasing Darwin, put it like this in a 1963 speech to the Southwestern Social Science Association: “It is not the most intellectual of the species that survives; it is not the strongest that survives; but the species that survives is the one that is able best to adapt and adjust to the changing environment in which it finds itself.” That is, you need to change color like the peppered moth did.

That’s because as the company grows, what is required of its executives changes, moving initially from building a product (design, creation, etc.) to building a company (managing people, defining structure, etc.), to building a sustainable business (financial models, managing managers, etc.).

The most successful (and adaptive) people and organizations are constantly refining how they work and what they work on to be more effective.

Think of the stodgy person at your office or school who is always talking about the “way it’s always been done,” constantly anxious about change and new technology. That person embodies the Shirky principle. You do not want to be that person.

On a shorter time scale, any personal or professional project can be viewed from the perspective of a flywheel. It is slow when you get started on the project, but once you gain some momentum, it seems easier to make progress. And as we discussed in Chapter 3, when we multitask, we never get enough momentum on any one task for it to start to feel easier. Instead, we are constantly spending energy starting and restarting the wheel rather than taking advantage of its momentum once we get it to start spinning.

The flywheel model tells you your efforts will have long-term benefits and will compound on top of previous efforts by yourself and others. It’s the tactical way to apply the concepts of momentum and inertia to your advantage.

Adopt an experimental mindset, looking for opportunities to run experiments and apply the scientific method wherever possible. Respect inertia: create or join healthy flywheels; avoid strategy taxes and trying to enact change in high-inertia situations unless you have a tactical advantage such as discovery of a catalyst and a lot of potential energy. When enacting change, think deeply about how to reach critical mass and how you will navigate the technology adoption life cycle. Use forcing functions to grease the wheels for change. Actively cultivate your luck surface area and put in work needed to not be subsumed by entropy. When faced with what appears to be a zero-sum or black-and-white situation, look for additional options and ultimately for a win-win.

However, the answer is not to dismiss all statistics or data-driven evidence as nonsense, leaving you to base decisions solely on opinions and guesses. Instead, you must use mental models to get a deeper understanding of an issue, including its underlying research, enabling you to determine what information is credible.

You can also use data from your life and business to derive new insights. Insights based on true patterns, such as those found in market trends, customer behavior, and natural occurrences, can form the basis for major companies and scientific breakthroughs. They can also provide insight in everyday life.

Probability and statistics are the branches of mathematics that give us the most useful mental models for these tasks. As French mathematician Pierre-Simon Laplace wrote in his 1812 book Théorie Analytique des Probabilités: “The most important questions of life are indeed, for the most part, really only problems of probability.”

These are all examples of drawing incorrect conclusions using anecdotal evidence, informally collected evidence from personal anecdotes. You run into trouble when you make generalizations based on anecdotal evidence or weigh it more heavily than scientific evidence. Unfortunately, as Michael Shermer, founder of the Skeptics Society, points out in his 2011 book The Believing Brain, “Anecdotal thinking comes naturally, science requires training.”

What is often overlooked when this fallacy arises is a confounding factor, a third, possibly non-obvious factor that influences both the assumed cause and the observed effect, confounding the ability to draw a correct conclusion. In the case of the flu vaccine, the cold and flu season is that confounding factor. People get the flu vaccine during the time of year when they are more likely to get sick, whether they have received the vaccine or not.

If you set out to collect or evaluate scientific evidence based on an experiment, the first step is to define or understand its hypothesis, the proposed explanation for the effect being studied (e.g., drinking Snapple can reduce the length of the common cold). Defining a hypothesis up front helps to avoid the Texas sharpshooter fallacy. This model is named after a joke about a person who comes upon a barn with targets drawn on the side and bullet holes in the middle of each target. He is amazed at the shooter’s accuracy, only to find that the targets were drawn around the bullet holes after the shots were fired.

A similar concept is the moving target, where the goal of an experiment is changed to support a desired outcome after seeing the results.

In figuring out who goes where, you must appreciate the nuanced differences between people, and in particular, appreciate each individual’s unique set of strengths, goals, and personality traits so you can craft roles for them that best utilize those characteristics and motivate them.

As Thiel wrote in his 2014 book, Zero to One: Great companies can be built on open but unsuspected secrets about how the world works. Consider the Silicon Valley startups that have harnessed the spare capacity that is all around us but often ignored. Before Airbnb, travelers had little choice but to pay high prices for a hotel room, and property owners couldn’t easily and reliably rent out their unoccupied space. Airbnb saw untapped supply and unaddressed demand where others saw nothing at all. The same is true of private car services Lyft and Uber. Few people imagined that it was possible to build a billion-dollar business by simply connecting people who want to go places with people willing to drive them there. We already had state-licensed taxicabs and private limousines; only by believing in and looking for secrets could you see beyond the convention to an opportunity hidden in plain sight.

Unfortunately, even knowing a secret at the right time still isn’t enough to guarantee success. People with great, timely insights often fail to achieve great returns due to poor execution. In this section we will explore mental models that can improve your chances of successful execution. The title of this section is a modern take on an old Japanese proverb, “Vision without action is a daydream. Action without vision is a nightmare.”

Successful, world-changing ideas almost always involve changing the behavior of a large group of people: how they live, work, entertain themselves, or even how they think. For example, as noted earlier, Airbnb has changed the way many people travel. Whether your idea is business-focused or not, you can think of the people whose behavior it seeks to change as your “customers.”

In this context, your secret is the insight you have on how the behavior of your customers should be changed, e.g., people should be able to rent out rooms directly from one another. Your “product” is therefore how you specifically are using your secret to cause a behavioral change in your customers, e.g., creating a marketplace of rentable rooms over the internet.

One way to increase your chances of getting to product/market fit is through customer development, a product development model established by entrepreneur Steve Blank that focuses you on taking a customer-centric view. Customer development’s goal is to help you find a sustainable business model by applying the scientific method (see Chapter 4) through rapid experimentation with your customers. You set up a quick feedback loop with them to learn as much as you can about their needs, resulting in a repeatable process to acquire and retain them.

you want to de-risk an idea by testing your assumptions as cheaply as possible. Customer development is one way to do that, by talking directly to customers or potential customers. As Blank says, “There are no facts inside the building so get the hell outside!” If you can ask the right questions, you can find out whether you have something people really want, signaling product/market fit.

Of course, you probably won’t make something people really want on the first shot. That’s why you build an MVP (again, see Chapter 1) and run experiments with customers to see how it is actually used (if at all), continually refining your product as you incorporate real-world feedback via this rapid experimentation process.

Customer development works in a wide variety of situations: Talk to residents before you move somewhere. Interview current employees before you take a job. Poll a community before enacting a new policy. For any idea you have, think about who the “customer” is and then go talk to them directly about your “product.” Think focus groups, surveys, interviews, etc.

When you are trying to act on a secret by delivering a product or service, you are in a race against your competition for product/market fit. To give yourself the best chance of winning this race, you must engage in customer development the fastest. A model from the military can help: the OODA loop, which is a decision loop of four steps—observe, orient, decide, act (OODA).

What Type of Customer Are You Hunting?

Janz notes that to get to $100 million in revenue, a business would need 10 million “flies” paying $10 per year, or 1,000 “elephants” paying $100,000 per year. Believe it or not, there are successful $100 million revenue businesses across the entire spectrum, from those seeking “amoebas” (at $1 per year) to those seeking “whales” (at $10 million per year).

Janz’s framing steers you toward a particular quantitative evaluation: How many “customers” will it take to achieve success? And what exactly do you need them to “pay” (or do)? Once you answer these questions, you can then ask whether there are enough of these types of customers out there. If not, you might consider pivoting toward bigger or smaller types of customers.
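The arithmetic behind Janz’s framing is simple to sketch; the $100 million target and the animal labels come from the text, while the helper name is an assumption of mine:

```python
TARGET = 100_000_000  # $100M in annual revenue

def customers_needed(annual_price):
    # Customers required at a given annual price to hit the target.
    return TARGET // annual_price

# Janz's menagerie, from "amoebas" to "whales":
for label, price in [("amoebas", 1), ("flies", 10),
                     ("elephants", 100_000), ("whales", 10_000_000)]:
    print(f"{label}: {customers_needed(price):,} customers at ${price:,}/year")
```

Running the loop makes the pivot question concrete: if 10 million $10-per-year customers are out of reach, can you find 1,000 who will pay $100,000?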

What kind of people are your customers exactly—what are their demographics, likes versus dislikes, and hobbies? If you did customer development right, your personas should be modeled on characteristics of real people you’ve already met. Once constructed (say Bob and Sally are your personas), you can ask yourself: Would Bob and/or Sally do X?

A good founder is capable of anticipating which turns lead to treasure and which lead to certain death. A bad founder is just running to the entrance of (say) the “movies/music/filesharing/P2P” maze or the “photosharing” maze without any sense for the history of the industry, the players in the maze, the casualties of the past, and the technologies that are likely to move walls and change assumptions.

It doesn’t matter where the missile is aimed pre-launch. Successful entrepreneurs are constantly collecting data—and constantly looking for bigger and better targets, adjusting course if necessary. And when they find their target, they’re able to lock onto it—regardless of how crowded the space becomes.

These same moat types can apply to your personal place in an organization or field as well. For example: You can have the biggest personal network (exclusive access to relationships). You can build a personal following (strong, trusted brand). You can become the expert in an in-demand area (unique qualifications). You can create a popular blog (substantial control of a distribution channel). Each of these and more can create a moat that protects your place in a competitive landscape.

As Richard Feynman famously wrote in his 1988 book, What Do You Care What Other People Think?, “I learned very early the difference between knowing the name of something and knowing something.”

In my whole life, I have known no wise people (over a broad subject area) who didn’t read all the time—none, zero. You’d be amazed at how much Warren [Buffett] reads—and how much I read. My children laugh at me. They think I’m a book with a couple of legs sticking out. Since the really big ideas carry 95% of the freight, it wasn’t at all hard for me to pick up all the big ideas in all the disciplines and make them a standard part of my mental routines. Once you have the ideas, of course, they’re no good if you don’t practice. If you don’t practice, you lose it. So I went through life constantly practicing this multidisciplinary approach. Well, I can’t tell you what that’s done for me. It’s made life more fun. It’s made me more constructive. It’s made me more helpful to others. It’s made me enormously rich. You name it. That attitude really helps.