5 Mental Models To Get Better At Decision-Making

Give yourself the leverage you need to navigate this maze of a world

Othmane Senhaji Rhazi
10 min read · Feb 1, 2021

In most areas of life, you need to get the right tools to get the job done. Want to fix a clogged toilet? Get a plunger. Want to write a novel? Grab a pen or a laptop. Want to build a homemade rocket? Take some baking soda, vinegar, pencils & a plastic bottle.

As an entrepreneur, the more complex my job gets, the more tools I need, because there are more factors to deal with.

The same goes for decision-making — only here, the tools I need are mental models.

Mental models are frameworks mapped in my mind that enable me to better navigate reality and ultimately act on it by taking the optimal decision.

In this article, I am going to share with you my toolbox of mental models to make better decisions.

You should read it if:

  • You want to get better at decision-making
  • You are keen on how our brain works
  • You want to load up tools to get things done

You’ll learn to:

Understand your blind spots through common biases

Solve creatively by reasoning from First Principles

Use probabilistic thinking to weigh your decisions more precisely

Deal with complexity and cut through the noise

Account for temporality in your decision-making

Before getting into it, if you don’t want to miss my future publications, make sure to hit the follow button

1. Cognitive biases


Biases are patterns of deviation from norm or rationality.

They influence our minds and affect our decision-making process.

Cognitive biases are blind spots in our subjective reality that quietly dictate our behavior.

Knowing them is therefore the first step to clearing them.

Hereunder is an overview of the main biases we all share and tips to correct them:

  • Confirmation bias

What a man wishes, he also believes.

We tend to believe information that confirms our preconceptions (moral, social, financial…) and to discredit information that does not support our views.

We often only listen to or respect the data that aligns with our own viewpoints.

This leads us to reject any information that opposes our beliefs.

Confirmation bias is deeply rooted in our minds because it is both energy-conserving (no further research to be done) and comfortable (reinforces what we think is right or true).

To fight it, ensure you take information from varied sources, consider different perspectives, and discuss your opinions with people who think differently

  • Availability bias

The availability bias is misjudging the frequency and magnitude of events based on how easily examples come to mind — typically recent or vivid ones.

Availability bias, as theorized by Daniel Kahneman and Amos Tversky, is a cognitive shortcut that saves us time when estimating risk.

We tend to heavily weigh our judgments towards more recent information, making new opinions biased toward that latest news.

Just because we read an article about lottery winners, we should not overestimate our own likelihood of winning the jackpot.

Just because there are rising shark attacks on the shore, we should not think that such incidents are relatively common or a major cause of death.

Just because the last 5 occurrences on Roulette in a casino were black, we should not expect the next one to be red.

… Anyways, you got it ;)

To fight it, pause your immediate answer and challenge it with more research to better understand the scope of your shortcut and the exhaustiveness of your risk assessment.
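The roulette example above can be checked with a quick simulation. This is an illustrative sketch, not a claim about any real casino: it draws spins from a European wheel (18 red, 18 black, 1 green) and estimates the chance of red immediately after five consecutive blacks — which, because spins are independent, is no different from any other spin.

```python
import random

random.seed(42)

def spin():
    # European roulette: 18 red, 18 black, 1 green (zero)
    return random.choice(["red"] * 18 + ["black"] * 18 + ["green"])

# Collect the spins that immediately follow a run of 5 consecutive blacks
streak, after_streak = 0, []
for _ in range(2_000_000):
    s = spin()
    if streak >= 5:
        after_streak.append(s)
    streak = streak + 1 if s == "black" else 0

p_red = after_streak.count("red") / len(after_streak)
print(f"P(red | 5 blacks in a row) ≈ {p_red:.3f}")  # stays near 18/37 ≈ 0.486
```

The estimate hovers around 18/37, the same as for a spin with no history at all — the wheel has no memory, only our recollection of the last few spins does.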

  • First-conclusion bias

As Charlie Munger, Warren Buffett’s partner, famously pointed out, the mind works a bit like a sperm and egg: the first idea gets in and then the mind shuts off.

Like many other tendencies, this is probably an energy-saving device. Our tendency to settle on first conclusions leads us to accept many erroneous results and cease asking questions; it can be countered with some simple and useful mental routines.

To fight it, the trick is to doubt your first conclusion, no matter how good it seems. Send it back and consider other alternatives, even if you end up reverting back to your original conclusion. This is often a framework used by Chess World Champion Magnus Carlsen when making moves in his games.

  • Hindsight bias

It is the “I-knew-it-all-along” bias, i.e. the inclination to see past events as having been predictable.

Once we know the outcome, it’s nearly impossible to turn back the clock mentally. Our narrative instinct leads us to reason that we knew it all along when in fact we are often simply reasoning ex-post with information not available to us before the event.

To fight it, it’s wise to keep a journal of important decisions for an unaltered record and to re-examine our beliefs when we convince ourselves that we knew it all along.

  • Overconfidence bias

Our inclination to trust our capabilities by overrating our abilities and skills as decision-makers.

It is also the syndrome of thinking that your contribution is more important than it really is.

This bias occurs when people hold a false and misleading assessment of their skills, intellect, or talent.

It is dangerous as it can mask you from the truth and cause you to take risks, certain that you’re correct in your assumptions.

Overconfidence bias can go hand-in-hand with anchoring; with limited knowledge or experience, an idealistic faith in your own decisions can lead you to act hastily or on hunches.

If you walk down Wall Street and ask 1,000 analysts about their ability to predict the market’s evolution, most of them will rate their analytical skills as above average. Yet it is a statistical impossibility for most analysts to be above the average analyst.

To fight it, try to be self-aware and ask yourself questions: what information did you use to form your decision? Was this information based on facts or intuition? Was it gathered methodically? If the data was in any way compromised, how can you make it objective?

2. First Principles Thinking


What do Aristotle’s dialectic, Gutenberg’s printing press, and Musk’s “affordable” rockets have in common?

They all rely on a fundamental approach of thinking: First Principles.

A First Principle is a basis from which a thing is known.

The underlying idea is that everything we do is underpinned by a foundational belief, or First Principle.

It is an assumption that can’t be deduced any further.

The fact is that, when solving problems, our brains are naturally wired to reason by analogy, i.e. by copying what others do with slight variations, thereby relying on multiple layers of assumptions.

We project the current form (how we do things) forward rather than the function (why we do things).

And truth be told, we often have to. Otherwise, our mental load would quickly become overwhelming and therefore more of a handicap than a solution.

Yet adopting a First Principles approach, understanding a problem’s underlying causes rather than addressing its effects, is useful for coming up with a creative solution that fixes the root cause.

We break down a problem into its granular pieces and then put them back together in a different, more effective way.

Let’s take Elon Musk’s example.

In his quest to send rockets to Mars, he was faced with the astonishing $65 million price tag of a rocket, making it economically unviable for his ultimate purpose: settling life on Mars.

So he reasoned: “What is a rocket made of? Aluminum, titanium, copper, and carbon fiber. What do these materials cost? About 2% of the total price tag. Can I access a supply at scale? Yes. Let’s build our own.”

Hence, SpaceX was born.

In practice, it took thousands of experiments, failures, and tests, but it ultimately worked.

Besides, it doesn’t always take atomic-level granularity to reap the benefits of First Principles; often 2 to 3 layers of abstraction get you closer to disruption than most people ever go.

The First Principles approach challenges conventions, forms, and dogmas that are often accepted without question and that set boundaries on our creativity.

Abandoning our constructed opinions to optimize the function, i.e. the outcome of why we do things, is how we think for ourselves and make decisions that lead to creative solutions and therefore greater impact.

3. Bayesian Updating


Imagine opening tomorrow’s newspaper and reading “Violent Crimes Skyrocketing for the Past Year”. Would you lock yourself up and stay at home?

If you’re a Bayesian thinker you will remember that violent crimes have been declining for the past decade thanks to economic prosperity, better living standards, and social consciousness.

As a result, even if crime doubled this past year, it probably just moved from 0.01% to 0.02%, making it unfortunate for 2 people out of 10,000 but certainly not alarming enough for you to panic or lock yourself up.

Bayesian thinking is therefore your ability and desire to assign probabilities of truth and accuracy to anything you think you know and then being willing to update those probabilities when new information comes in.

Going back to our example, the headline should only slightly weaken your belief that the crime rate is low — it’s a tiny bit higher than you thought, so you adjust your belief by a tiny bit as well. Yet if such headlines keep coming in, you should keep adjusting your belief accordingly.
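The updating rule behind this is Bayes’ theorem. Here is a minimal sketch of the headline example; the prior and the two likelihoods are illustrative numbers I chose, not data:

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    evidence = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / evidence

# Hypothesis H: "violent crime is genuinely surging"
belief = 0.05  # prior: a decade of declining crime makes a surge unlikely
# An alarmist headline is weak evidence: assume such headlines appear
# often when crime surges (80%) but also fairly often when it doesn't (30%)
for _ in range(3):  # the same headline keeps appearing
    belief = bayes_update(belief, 0.80, 0.30)
    print(f"P(surge) = {belief:.2f}")  # 0.12, then 0.27, then 0.50
```

One weak headline barely moves the needle, but repeated independent evidence compounds — exactly the “adjust a tiny bit, then keep adjusting” behavior described above.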

We live in a world where we’re constantly flooded by information and data. Therefore our brain has developed a hardwired tendency to believe what we hear or read with little to no questioning.

A simple hack to fight this cognitive shortcut is to learn Thinking in Bets, as Annie Duke puts it, and challenge your beliefs with a simple question: “Would you bet $100 on it?”

Bayesian updating helps to fight our tendency to either dismiss new evidence or embrace it as though nothing else matters.

Bayesians try to weigh both the old hypothesis and the new evidence sensibly, thereby also avoiding the availability bias discussed earlier.

4. Occam’s Razor


Imagine waking up to a terrible headache, a runny nose, and fatigue. Naturally, you’d google your symptoms — which, you discover, can be caused by the flu, COVID-19, or Ebola. Which explanation should you believe?

The answer matters: it’s the difference between going back to bed to rest and going into quarantine.

In a situation like this, you’re no longer weighing a single piece of evidence as you’d do in probabilistic thinking; you need to decide between competing explanations. To cut through a situation like that, you can use Occam’s razor.

Occam’s razor is a principle of logic and problem solving that suggests: when given two explanations that account for facts equally well, the simpler one has a higher likelihood of being true.

Indeed, the more complicated your explanation, the more variables it has to account for, and with each added variable the explanation becomes less and less likely to be true.

If your friend doesn’t show up at your party, he might have gotten into a car accident, been kidnapped, or even died — but he might just as well have simply lost track of time and be running late.

Occam’s razor is therefore a law of parsimony that shaves off unlikely explanations.

IMPORTANT: statistically, Black Swan events happen, and their likelihood shouldn’t be dismissed. Using this mental model doesn’t mean disregarding any event or situation involving complexity (multiple variables), it is simply a heuristic, a rule of thumb, that suggests what might likely be true. Whenever faced with multiple “ifs”, Occam’s razor should trigger your alarm bells to investigate further.
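The “each added variable makes it less likely” point follows directly from the product rule of probability: an explanation built on several independent assumptions can only be as likely as the product of their individual probabilities. A small sketch with purely illustrative numbers for the missing-friend example:

```python
from math import prod

# Hypothetical plausibility of each independent assumption an
# explanation requires; the numbers are illustrative, not data.
explanations = {
    "lost track of time, running late": [0.30],
    "got into a car accident": [0.01],
    "kidnapped, escaped, lost his phone": [0.001, 0.5, 0.1],
}

for name, assumptions in explanations.items():
    # The joint likelihood is the product of the assumptions' probabilities,
    # so every extra "if" multiplies the estimate down.
    print(f"{name}: {prod(assumptions):.5f}")
```

Each extra “if” multiplies the estimate downward, which is precisely why the razor favors the single-assumption story — while the caveat above still applies: unlikely is not impossible.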

5. Temporal Discounting or Regret Minimization Framework


I describe myself as a “night guy”. I used to like staying up late without worrying about running on little to no sleep the next day. I used to think that was Morning Othmane’s problem, not Night Othmane’s.

No wonder Morning Othmane hates Night Othmane: he always screws him over.

As funny and common as that might be, it embodies something we all do, temporal discounting.

Temporal discounting is making decisions that favor our immediate desires at the expense of our future selves.
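Behavioral economists often model this with hyperbolic discounting (Mazur’s model: a reward’s felt value is V = A / (1 + kD), where D is the delay and k a personal impatience parameter). The sketch below uses illustrative values of k and the rewards to show the classic preference reversal — “large & late” wins when both options are far away, but “small & soon” wins the moment the small reward is within reach:

```python
def discounted_value(amount, delay_days, k=0.05):
    # Hyperbolic discounting (Mazur's model): V = A / (1 + k * D)
    return amount / (1 + k * delay_days)

# Two options: $50 available "now-ish" vs $100 thirty days later
for today in (180, 0):  # decide six months in advance vs. decide on the day
    small_soon = discounted_value(50, today)
    large_late = discounted_value(100, today + 30)
    pick = "small & soon" if small_soon > large_late else "large & late"
    print(f"deciding at day {today}: {pick}")
```

Night Othmane is simply the “deciding at day 0” case: the immediate reward of staying up looms larger than tomorrow’s bigger payoff of being rested.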

Thankfully, there are a few things we can do to remedy this tendency.

Imagining future outcomes is one of them. Imagined futures aren’t random; they are built from past memory. When our brains imagine what the future will be like, they draw the consequences from our past experiences — and those imagined consequences weigh in the decision we’re about to make.

Staying up late is linked to waking up tired and being unproductive, a memory that might nudge us to go to bed earlier.

Another way to project ourselves is Suzy Welch’s 10–10–10 rule, which brings the future into the immediate present when we face a decision: we ask ourselves how we’d feel about it in 10 minutes, 10 months, and 10 years.

We imagine being accountable to our future selves and pre-empt any regret we might feel, because the present moment and immediate future are always more vivid and thus more powerful to us.

On the same note, Jeff Bezos walked us through his regret minimization framework here, as a mental model for temporal discounting that led him to start what is one of today’s greatest companies: Amazon.

We can never control uncertainty but we can discount it with our projections and backcast our roadmap to achievement.

Reality is a complex maze.

Mental models are maps that help us navigate through it.

Each model has its specific uses, strengths, and limitations. Their purpose is to sharpen our thinking, develop our knowledge, increase our understanding of the world, get to a better version of ourselves, and ultimately get better at decision-making.

If you enjoyed reading this, make sure to hit the clap button, as that gives me the motivation to keep sharing.

Feel free to reach out on LinkedIn or Twitter.

Always happy to connect and share.

With love,

O

