Masterclass Series

A Digital Field Guide to
Cognitive Bias & Heuristics.

" In situations where the possible consequences are large, try to be as reasonable and rational as possible when making a decision. In a situation where the consequences are small, let intuition take over. "
— Rolf Dobelli

Module 01

Foundations of Thought

Before analyzing specific errors, we must understand the academic and evolutionary framework that dictates how human beings process information.

Behavioral Economics

The study of why humans consistently deviate from "rational" or "optimal" decisions. Unlike neoclassical economics, which assumes we are logical actors with perfect self-control, this field recognizes that we are subject to emotion, impulsivity, and environmental "nudges." Pioneered by Kahneman, Tversky, and Thaler, it identifies our "predictable irrationality."

Big Data vs. Evolution

Big Data processes are exceptional at codifying the past, but they are incapable of inventing the future. They rely on historical patterns, which can often bake in past biases. Human decision-making, while flawed, possesses the unique virtue of being able to evolve and consciously break from the past.

System 1 vs. System 2

Thinking clearly requires navigating our dual-process brain. System 1 (gut response) is fast and intuitive but falls for simple logic traps. System 2 (analytical reflection) is slow but necessary for accuracy. The Cognitive Reflection Test (CRT) measures the tendency to override a gut response and engage in further reflection. Low scorers tend to accept the first intuitive answer; high scorers show stronger impulse control and are better able to resist instant gratification. For high-stakes decisions, use System 2 to override impulsive first impressions.
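
A minimal illustration of the kind of item the CRT uses is the famous "bat and ball" question. The sketch below (illustrative Python, not part of the test itself) shows the arithmetic System 2 must do to override System 1's answer:

```python
# CRT item: "A bat and a ball cost $1.10 in total. The bat costs $1.00
# more than the ball. How much does the ball cost?"
# System 1 blurts out "10 cents"; System 2 solves the actual equation:
#   ball + (ball + 1.00) = 1.10  =>  2 * ball = 0.10  =>  ball = 0.05

ball = (1.10 - 1.00) / 2
bat = ball + 1.00
print(f"Ball: ${ball:.2f}, Bat: ${bat:.2f}, Total: ${bat + ball:.2f}")
# Ball: $0.05, Bat: $1.05, Total: $1.10 -- not the intuitive $0.10
```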

Module 02

The Master Latticework

An interactive syllabus of 60+ cognitive biases and logical fallacies. Use the sidebar to navigate topics, or search directly for a concept.

1. Ego & Self-Perception

How we protect our self-image, overestimate our abilities, and fabricate our motivations.

Success in complex fields is often a matter of luck masquerading as skill. Beginner's Luck occurs when early success is misunderstood as advanced talent in the domain, creating dangerous overconfidence. This feeds the Illusion of Skill, where we credit a person's "genius" for a result that was actually the product of random chance. True skill can only be claimed if success is repeated consistently across many different environments over time.

Catharsis is the misconception that venting your anger (shouting, punching a bag) reduces stress. In reality, studies show that venting often increases aggressive behavior. By "blowing off steam," you keep the anger "on the stove," reinforcing the neural pathways for aggression. Additionally, you can become accustomed to blowing off steam, resulting in a dependence on it. The more effective approach is to simply stop and allow the emotion to dissipate naturally.

We have an innate drive to protect our self-image. Cognitive Dissonance is the mental pain we feel when our actions don't match our beliefs. This usually gets resolved by "reinterpreting" the facts (e.g., "I didn't want that promotion anyway," echoing the fox who decided the grapes were sour). This is reinforced by Consistency Bias, where we assume we've always felt the way we do now, and Self-Serving Bias, where we take credit for success but blame failure on external factors. These biases prevent us from taking accountability and learning from our mistakes. To improve, we must be brave enough to admit when we've failed rather than rewriting reality to save face.

The line "If the only tool you have is a hammer, every problem looks like a nail" is popularly attributed to Mark Twain, though it originates with Abraham Maslow. Either way, it captures this bias perfectly: the tendency to view the world through the narrow lens of our own specific expertise. To be effective, we must build a "latticework of mental models" from many different disciplines so we don't try to solve every problem with the same limited tool.

Domain Dependence is the surprising inability to transfer knowledge or logic from one field to another. A person might be a disciplined investor at work but a highly impulsive gambler in their personal life, or a doctor can be a chain smoker despite warning their patients to reduce their reliance on tobacco. Our brains often store lessons in "silos," and we must work hard to build bridges between them.

The Dunning-Kruger Effect is the misconception that we can accurately estimate our own competence in a subject or skill. In reality, the less you know about a domain, the more confident you tend to be, because you lack the expertise to even recognize your own ignorance. Confidence often dips as actual skill increases and you realize the complexity of the task. The opposite of this effect is Impostor Syndrome.

The Forecast Illusion is the delusional belief that "experts" can accurately predict the future of complex systems like the economy. Research shows expert predictions are often no better than "monkeys throwing darts." Forecasts tend to get blurrier and less reliable as the complexity of a topic increases and the timeline moves further away. Reliance on “expert” forecasts is exacerbated by the Overconfidence Effect, which is the systematic tendency to overestimate our own knowledge and skills. We should assume our projects will take longer than we think (see planning fallacy) and view any "certain" prediction with deep skepticism.

The Introspection Illusion is the belief that we understand our motivations and desires, our likes and dislikes. It is fed by the belief that we can find "truth" by looking inward. In reality, our conscious mind often has no access to the origin of certain emotional states. When this happens, we simply fabricate a logical story to explain our subconscious impulses—a process called Confabulation.

Learned Helplessness is a state of mind that results from being stuck in a seemingly impossible situation. When you feel like you aren’t in control of your destiny, you will give up and accept whatever situation you are in. Experiments have shown that those who couldn't escape shocks eventually stopped trying even when an exit was provided. When you are able to succeed at easy tasks, hard tasks feel possible.

Not-Invented-Here Syndrome is the tribal ego trap where we fool ourselves into thinking that anything we create is unbeatable and superior to external ideas. Take a step back and examine your ideas to see which were truly amazing, putting aside pride to adopt the best practices of others.

Reactance is the urge to do the opposite of what someone is trying to make you do. When we feel our liberty is being constrained, our inclination is to resist; however, in doing so, we often overcompensate and lose our own objectivity, acting in ways counter to our own wellbeing. Wisdom springs from reflection, not reaction.

Self-Handicapping happens when we create conditions for failure ahead of time to protect our ego. It is an "anticipatory rationalization" (e.g., partying the night before a test) so that if we fail, we can blame the circumstance rather than our own ability.

We are prone to believing vague, positive statements are true if they address us personally. The Forer Effect (or Barnum Effect) shows that if a statement is ambiguous, we will resolve the ambiguity by finding ways to match it to our own traits. This explains why pseudoscience (astrology, tarot) works. Be wary of broad, general statements.

2. Social & Tribal Dynamics

How we interact with, conform to, and judge others within our groups.

We are evolutionarily trained to defer to rank and status. Authority Bias (or the HiPPO effect) leads us to follow leaders even when data contradicts them. This is made dangerous by Chauffeur Knowledge—the ability to "put on a show" without deep understanding. This is often masked by the Twaddle Tendency, where reams of words disguise intellectual laziness or underdeveloped ideas. "You would not believe how difficult it is to be simple and clear. People are afraid that they may be seen as a simpleton. In reality, just the opposite is true." - Jack Welch. Mastery is marked by clarity, curiosity, and the ability to say "I don't know."

"Because" Justification reveals that when behavior is justified, you encounter more tolerance and helpfulness, regardless of the excuse's actual value. Research shows that using the word "because" dramatically increases compliance—even if the reason is hollow (e.g., "Can I cut in line because I need to get to the front?"). The word "because" greases the wheels of human interaction. Be wary of people who use hollow justifications to bypass your critical thinking.

The Bystander Effect is the social phenomenon where individuals are less likely to offer help to a victim when other people are present. This is caused by a "diffusion of responsibility"—everyone assumes someone else will act. In an emergency, the best strategy is to point to a specific person and give a direct command ("You in the blue shirt, call 911") to break the cycle of deindividuation.

The Curse of Knowledge is a cognitive bias that occurs when an individual assumes that others have the background knowledge needed to understand them. Once we know something, it is remarkably difficult to imagine the state of not knowing it. This leads to a breakdown in communication, especially in management and education, as the expert cannot fathom why the 'obvious' is not apparent to the novice.

Envy is fueled by proximity; we feel resentment when a colleague gets a slightly better office, not when a billionaire gets a yacht. This leads to Social Comparison Bias, the tendency to withhold help from someone who might outdo us. This cycle eventually cripples organizations as managers hire people who are less qualified than themselves, watering down the talent across the team.

The False-Consensus Effect is the belief that everyone else thinks and feels similarly to us. An implication is that when people do not share our opinion, we categorize them as "abnormal" or uninformed. Don't assume your worldview is shared by others, and be open to those who have a differing opinion to maintain clear judgment.

We judge people through interconnected ego-driven errors. The Fundamental Attribution Error (or Correspondence Bias) comes from the inclination to attribute people’s behavior to the way they are rather than to the situation they are in. This is magnified by the Halo Effect, where one striking positive trait—like beauty—"outshines" all other characteristics. These feed into Liking Bias, our tendency to help those we find attractive or similar to us.

We have a drive to conform to our "tribe." Social Proof makes us feel a behavior is correct if others are doing it (As W. Somerset Maugham stated, “If 50 million people say something foolish, it is still foolish.”). This leads to the Bandwagon Effect where ideas grow as more people adopt them. Within a team, this becomes Groupthink, where members suppress doubts to maintain "harmony." To prevent this, appoint a "Devil's Advocate."

In-group Out-group Bias is the instinct to favor our own group and view outsiders with suspicion. This manifests as "out-group homogeneity," where we see "those others" as a monolithic mass while perceiving our own group as diverse. This is the foundation of stereotypes and prejudice. This also results in surprise when members of your own group demonstrate contrasting opinions.

The Just-World Fallacy is the tendency to believe that the world is inherently fair—that "good things happen to good people." This bias leads us to blame victims for their misfortune and ignore the massive role that random chance and unfair circumstances play in life.

The Law of Triviality (coined by C. Northcote Parkinson) suggests that members of an organization give disproportionate weight to trivial issues while avoiding more complex ones. The term "Bike-Shedding" comes from Parkinson's example of a committee that approves the plans for a nuclear power plant in minutes, yet spends the majority of its time debating the materials for the employees' bike shed, because the shed is a topic everyone understands.

The Public Goods Game illustrates that without regulation, "slackers and cheaters" will crash economic systems because people don't want to feel like suckers. The "tragedy" is that a tiny amount of greed from one exploiter can deplete a shared resource for everyone.

Reciprocity is the human drive to return a favor—the "glue" of society. It is frequently exploited by marketers (e.g., a "free" pen or calendar) trying to incentivize a purchase or donation. Learn to accept a gift as a gift without the feeling of obligation.

A Self-Fulfilling Prophecy occurs when just believing a future event will happen causes it to happen, provided the event depends on human behavior (the Thomas Theorem). This manifests in Labeling Theory: if a teacher believes a student is a "genius," they give them extra attention, which leads the student to actually perform better.

Social Loafing occurs when individuals in a group put in less effort because they feel their contribution won't be measured. This "diffusion of responsibility" means that as a team gets larger, its efficiency often drops. Every project should have clear, individual responsibilities. This was famously recounted in the story of the village residents who all contributed water instead of wine for a festival, each thinking their shortfall would not be noticed.

The Ultimatum Game shows that we base decisions on status and fairness rather than pure logic. We are willing to get nothing (refusing a low offer) if it ensures fair treatment in the future.

3. Logic, Probability, & Statistics

How our brains misinterpret math, randomness, and scale.

Most people judge success based on the single path they took, but a rational analysis must account for Alternative Paths—the invisible outcomes that could have happened but didn't. This failure to account for hidden risk leaves us blindsided by The Black Swan (a concept popularized by Nassim Nicholas Taleb), an unthinkable, high-impact event that is impossible to predict but rationalized as "obvious" after the fact. We must build robust, "antifragile" systems that survive chaos rather than relying on the "luck" of our current path.

We are prone to drawing massive conclusions from tiny sets of data. Base-Rate Neglect occurs when a vivid description entices us to overlook general probability (e.g., assuming a man is a professor because he looks like one, ignoring that there are far more truck drivers than professors). This is exacerbated by The Law of Small Numbers, where we draw universal certainties from a small sampling pool that is statistically volatile.
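
A minimal sketch of the base-rate arithmetic behind the professor-versus-truck-driver example; the counts and percentages below are invented purely for illustration:

```python
# Assumed numbers: 400,000 truck drivers for every 5,000 professors,
# and a "looks like a professor" description that fits 40% of
# professors but only 1% of truck drivers.
professors, drivers = 5_000, 400_000
p_match_prof, p_match_driver = 0.40, 0.01

# Bayes' rule via simple counting:
matching_profs = professors * p_match_prof      # 2,000 people
matching_drivers = drivers * p_match_driver     # 4,000 people
p_prof = matching_profs / (matching_profs + matching_drivers)
print(f"P(professor | looks like one) = {p_prof:.0%}")  # 33%
```

Even with a strongly diagnostic description, the sheer number of truck drivers means the odds still favor "truck driver."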

Assuming that because Event B followed Event A, A must have caused B is a fundamental logical error (for example, does the swimmer have that physique because they swim, or did they choose swimming because they had the right body for it?). Causation vs. Correlation reminds us that linked events may both be caused by a third factor. False Causality is the root of many superstitions and medical quackery. Sequence is not the same as consequence.

The Conjunction Fallacy is the error of believing that a "specific" condition is more likely than a "general" one. This is driven by the Representativeness Heuristic, where we jump to conclusions based on how closely a person or event matches a preconceived character type. The more details we hear that match our mental models, the more "probable" the story feels, even though extra details make it mathematically less likely.
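
The underlying rule is simple probability: a conjunction can never be more likely than either of its parts. A minimal sketch with assumed illustrative numbers:

```python
# P(A and B) = P(A) * P(B | A) <= P(A), for any values whatsoever.
p_teller = 0.05                 # assumed: P(Linda is a bank teller)
p_feminist_given_teller = 0.80  # assumed: P(feminist | bank teller)

p_both = p_teller * p_feminist_given_teller
assert p_both <= p_teller       # holds no matter what numbers you pick
print(p_teller, p_both)         # 0.05 vs 0.04: the detailed story is LESS likely
```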

Our brains think linearly, but the world often moves exponentially, leading us to underestimate how fast a virus spreads or a debt spirals. This is compounded by the Planning Fallacy, our habit of underestimating the time of future tasks, and Declinism, the tendency to remember the past as better than it was while expecting the future to be worse than evidence suggests.
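
A quick sketch of why exponential processes defeat linear intuition; the step sizes are arbitrary assumptions chosen for illustration:

```python
# Linear growth (+100 per day) vs exponential growth (doubling per day).
linear, exponential = 100, 1
for day in range(1, 15):
    linear += 100
    exponential *= 2
print(f"Day 14: linear = {linear}, exponential = {exponential}")
# Day 14: linear = 1500, exponential = 16384
```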

Events are typically blamed on a single factor, but in reality no single "smoking gun" exists. Attributing a complex result to a single factor is a dangerous oversimplification. "When an apple ripens and falls, what makes it fall? Is it that it is attracted to the ground, is it that the stem withers, is it that the sun has dried it up, that it has grown heavier, that the wind shakes it, that the boy standing underneath wants to eat it? No one thing is the cause." - Tolstoy (War and Peace).

Induction Thinking is the process of drawing a universal rule from individual observations (e.g., "The sun has risen every day, so it will always rise"). While necessary for survival, it can lead to catastrophic failures when we assume the future will always mirror the past. We must always be prepared for the "Black Swan."

This is a statistical error where failures are "removed" from the data so they don't drag down the success rate. If a company claims its new hires are all stars but ignores those who quit in the first week, they are committing this error. To get an honest measurement, you must include everyone who started the process.
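
A minimal sketch of the honest-versus-flattering calculation, with assumed cohort numbers:

```python
# Assumed: 100 people hired, 40 quit in the first week,
# 30 of the remaining 60 turn out to be "stars".
hired, quit_early, stars = 100, 40, 30

flattering = stars / (hired - quit_early)  # failures removed: 50%
honest = stars / hired                     # everyone who started: 30%
print(f"Reported: {flattering:.0%}   Honest: {honest:.0%}")
```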

Neglect of Probability occurs when we respond to the expected magnitude of an event rather than its likelihood. We are drawn to the potential result (the size of a jackpot, the severity of a disaster) rather than the actual odds of it occurring. To make rational decisions, we must force ourselves to calculate the percentages, not just fixate on the emotional weight of the outcome.
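
A minimal expected-value sketch, with an assumed jackpot, odds, and ticket price:

```python
# Assumed: a $100M jackpot at 1-in-300M odds, for a $2 ticket.
jackpot, p_win, ticket = 100_000_000, 1 / 300_000_000, 2

expected_winnings = jackpot * p_win
print(f"Expected winnings: ${expected_winnings:.2f} per ${ticket} ticket")
# Expected winnings: $0.33 -- the vivid magnitude hides the tiny probability.
```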

Extreme performance is usually a "lucky" combination of factors that will eventually return to the average (Regression to the Mean). A common misunderstanding of this is the Gambler's Fallacy, the belief that if an independent event happens more frequently than normal, it will happen less frequently in the future. The universe has no memory; every spin is a fresh start.
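
A minimal simulation of the "no memory" point: the chance of red after five reds in a row is still about 50%. (The simulation parameters are arbitrary.)

```python
import random

random.seed(42)
spins = [random.random() < 0.5 for _ in range(1_000_000)]  # True = red

after_streak = [spins[i] for i in range(5, len(spins))
                if all(spins[i - 5:i])]                    # five reds just occurred
print(f"P(red | five reds before) ~ {sum(after_streak) / len(after_streak):.3f}")
# ~0.500: the wheel is not "due" for black
```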

We often feel singled out for bad luck. This is Self-Selection Bias: if you are in the traffic, you are by definition part of the group that is stuck. You don't notice the thousands of times you weren't stuck because there was nothing to notice.

Stage Migration (the Will Rogers phenomenon) is a statistical illusion where moving an element from one set to another raises the average of both sets. This can happen whenever the moved element is below the average of its old group but above the average of its new one. We must be wary of "improvements" that are merely the result of moving data points between different buckets. The name comes from a quip attributed to the comedian Will Rogers: "When the Okies left Oklahoma and moved to California, they raised the average intelligence level in both states."
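
A minimal arithmetic sketch of the effect, with invented IQ-like numbers:

```python
# The mover (90) is below Oklahoma's average (105) but above
# California's average (80), so the move raises BOTH averages.
oklahoma = [100, 110, 120, 90]   # mean 105.0
california = [70, 80, 90]        # mean 80.0

oklahoma.remove(90)
california.append(90)
print(sum(oklahoma) / len(oklahoma))       # 110.0
print(sum(california) / len(california))   # 82.5
```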

Survivorship Bias means we systematically overestimate our chances of success by focusing only on the "winners." A classic example comes from WWII bomber aircraft: the military wanted to armor the areas with bullet holes, but the mathematician Abraham Wald realized they should armor the areas without holes, because the planes hit there were the ones that never came back.

The Problem with Averages arises because averages often mask the underlying distribution. A power law (where a few extreme outliers dominate the distribution) can render the average meaningless. If Bill Gates walks into a bar of ten people, the "average" net worth in the bar suddenly runs into the billions. With averages, always take the underlying distribution, or the median, into account.
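
A minimal sketch of the bar example, with assumed net-worth figures:

```python
from statistics import mean, median

bar = [40_000] * 10            # assumed net worth of the ten regulars
bar.append(100_000_000_000)    # Bill Gates walks in (illustrative figure)

print(f"Mean:   ${mean(bar):,.0f}")    # ~ $9.1 billion
print(f"Median: ${median(bar):,.0f}")  # still $40,000
```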

4. Memory, Perception, & Patterns

How we misremember the past, see illusions, and let sensory data cloud our thinking.

Our brains are "hyperactive pattern seekers." The Clustering Illusion causes us to see trends in random noise, while Pareidolia leads us to see meaningful patterns in stimuli. We then package these into a neat Story Bias, editing messy facts to fit a narrative. Additionally, Personification reveals that statistics don't stir action; we are moved by a situation more when we can relate to it. When you encounter human stories, be careful to check if the facts and statistical distributions measure up to the emotional value of the story.

Confirmation Bias is the tendency to seek out and prioritize information that supports our existing beliefs. Myside Bias is a specific flavor of this where we overlook flaws in our own arguments while easily noticing them in others. In its most aggressive form, this leads to the Backfire Effect, where being presented with contradictory evidence actually causes us to double down on our original beliefs rather than questioning them.

Contagion Bias is our irrational inability to ignore the historical or emotional "essence" we perceive in objects. Even if an item is physically identical, we feel it has been "infected" by its past (e.g., refusing to wear a sweater once owned by a murderer). This "magical thinking" leads us to value original artifacts over identical replicas based purely on association.

The Einstellung Effect occurs when the first idea that comes to mind—triggered by familiar features of a problem—prevents a better solution from being found. Our past success with a specific method becomes a trap, making us "set" in our ways and blind to more innovative approaches.

Embodied Cognition is the finding that our environment "primes" our thoughts. We translate physical sensations into words and then believe them (e.g., holding a warm cup of coffee makes you perceive a stranger as "warmer" and more generous). Texture, temperature, and weight all influence our subconscious judgments.

Expectations are thoughts that have tangible effects on reality. Our happiness is determined not by the outcome, but by the "gap" between the outcome and our expectations. The secret to happiness is high standards for your own character but low expectations for things you cannot control.

Our memories are "Wikipedia pages" that we edit at any time. Falsification of History describes how our brains rewrite memories to match current beliefs. This creates Hindsight Bias (the "I knew it all along" effect), where an event seems inevitable once it has happened.

Framing occurs when the way information is presented profoundly alters how it is received. A surgery with a "90% survival rate" sounds much better than one with a "10% mortality rate," even though the data is identical. We respond to the spin and context surrounding a fact rather than the fact itself. To think clearly, one must strip away the narrative frame and look strictly at the raw, unspun information. Many marriages would run more smoothly if requests were framed as "Please empty the bin" rather than the more ambiguous "The trash is full."

We believe we see everything and control everything, but both are illusions. Illusion of Attention (inattentional blindness) means we miss obvious things right in front of us (the Invisible Gorilla). Illusion of Control is the belief that we can influence random events (like throwing dice harder).

Information Bias is the belief that "more data is always better." Extra information is often merely "noise." This is best seen in the News Illusion: news is to the mind what sugar is to the body—easy to consume but lacking nutritional value. It focuses on sensational events that distract us from reality.

The Misinformation Effect shows that our memories are highly permeable to influences from the present. We easily incorporate the memories or suggestions of others into our own heads, evolving the narrative of our past to match what we hear today.

The Negativity Bias is the evolutionary hard-wiring that causes bad news, bad emotions, and negative feedback to have a much greater impact than positive ones. We are biologically designed to prioritize threats for survival, meaning we remember one insult longer than ten compliments.

Neomania is the obsession with "the new." We assume the newest gadget is the most important. However, the Lindy Effect suggests that a technology that has already survived X years will likely survive another X years. Flashy new gadgets will come and go; fifty years from now, daily life will look much more like today than our forecasts of futuristic trends suggest.

Normalcy Bias is the tendency to stall during a crisis, pretending everything will continue to be fine. In a disaster, many people become abnormally calm and "wait and see" rather than acting. Those who defeat this bias move while others are still deciding whether they should.

Optimism Bias is the tendency to overestimate the likelihood of positive outcomes, while Pessimism Bias is the tendency to overestimate negative ones. Pessimism is often a defense mechanism against disappointment, while optimism can lead to risky judgments. Ironically, rational judgments often provide more genuine reasons for a positive outlook than blind optimism.

The Placebo Effect is when a fake treatment produces a real improvement because the patient expects it to work. While it can be effective for conditions influenced by the mind (like pain or stress), it cannot cure viruses or broken bones. We must rely on evidence-based medicine rather than just positive expectations.

Experiences are not recorded evenly. The Primacy Effect (first impression) and the Recency Effect (most recent impression) dominate our memory. Similarly, the "peak-end" rule means we judge an experience by its most intense moment and its ending, overriding everything that happened in the middle.

Priming is the process where a past stimulus affects your current behavior or perception without you realizing it. Every perception sets off a chain of related ideas in your unconscious mind (e.g., hold a warm cup, feel "warmly" toward a stranger).

The Sleeper Effect is where a message from an untrustworthy source becomes more credible over time because we remember the "fact" but forget the "liar." Over time, the source fades while the message remains. Always ask, "Where did I first hear this?"

Tachypsychia is the distortion of time perception during a high-stress event. While it feels like the world is moving slowly, it is actually just the brain recording more information than usual.

The Current Self is happy while experiencing things in the flow, while the Remembering Self is happy when reflecting on a life that feels content. To be truly happy, you must satisfy both; go get the ice cream (Current) but do so in a way that creates a meaningful memory (Remembering).

The Zeigarnik effect is a psychological phenomenon where people tend to remember unfinished or interrupted tasks better than completed ones, due to a sense of tension or cognitive dissonance caused by the incomplete state. To "close the loop" and relax, you just need a concrete, written plan for how you will finish the task. Once the brain sees a "path to completion," it stops nagging you.

5. Economic & Value Judgments

How we irrationally value time, money, and action.

These two biases represent opposite ends of the same spectrum: our inability to judge the value of doing something versus doing nothing. Action Bias is the compulsion to "do something" in high-pressure or new situations, even when inaction is more rational, fueled by the fear that being still will be seen as incompetence ("All of humanity's problems stem from man's inability to sit quietly in a room alone" - Blaise Pascal). Conversely, Omission Bias is the tendency to see harmful inaction as less morally wrong than a harmful action that produces the same result. We prefer to let a problem fester rather than risk an intervention that might fail (this is tested in the Trolley Problem).

Alternative Blindness is the tendency to view choices as a binary "Yes or No" rather than comparing a specific offer against the next best alternative. This is compounded by the Paradox of Choice, where an abundance of options leads to "analysis paralysis." We must learn to be "satisficers" who settle on an option that meets our criteria, rather than "maximizers" who obsessively search for a perfection that doesn't exist.

Ambiguity Aversion shows that we will often accept a worse deal simply because its odds are clearly defined, preferring known risks to unknown ones. This is compounded by the Fear of Regret, where we behave conservatively to avoid the potential shame of being wrong, and Zero-Risk Bias, where we irrationally prefer the absolute elimination of a tiny risk (e.g., going from 1% to 0%) over a much larger reduction in total risk.
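
A minimal sketch of why zero-risk framing misleads, using assumed casualty-risk numbers:

```python
# Assumed: a population of 1,000,000 facing two possible interventions.
population = 1_000_000

saved_a = population * (0.01 - 0.00)  # eliminate a 1% risk entirely
saved_b = population * (0.50 - 0.30)  # cut a 50% risk to 30%

print(f"Option A saves {saved_a:,.0f}; Option B saves {saved_b:,.0f}")
# Option A saves 10,000; Option B saves 200,000 -- yet A "feels" safer.
```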

Our sense of value is never absolute; it is always relative to the first information we receive. The Anchor Effect is a bias where an initial piece of information—even a random number—serves as a mental benchmark for all subsequent judgments. This is amplified by the Contrast Effect, where we judge beauty, cost, or size in relation to what is nearby rather than in isolation.

Automation Bias is the tendency to favor suggestions from automated systems (like algorithms or GPS) even when they are flawed. This is paired with The Google Effect (Digital Amnesia), our tendency to forget information that can be easily accessed online. Because we know we can "look it up," our brains de-prioritize internal encoding, leading to a state where we "know" less and rely entirely on external systems.

Making decisions uses up mental "willpower." Decision Fatigue leads us to either make impulsive choices or take the path of least resistance (the default) as the day goes on. This can lead to Decision Paralysis, where an abundance of options freezes us, causing us to retreat to a default plan or avoid the decision entirely.

The Decoy Effect occurs when consumers change their preference between two options when presented with a third option that is asymmetrically dominated. Marketers use 'the decoy' (Option C) not for purchase, but to make Option A look like a superior value compared to Option B. It exploits our relative judgment to steer behavior.

The Default Effect explains our powerful tendency to stick with the standard option provided to us. Humans are remarkably reluctant to exert the mental energy required to "opt-out" of a pre-set path. This is a massive lever for policy-makers (e.g., organ donation rates), as the "path of least resistance" usually wins. (see Richard Thaler’s work Nudge)

We assign higher value to things we have worked hard to create or achieve. The IKEA Effect suggests we love things more when we have successfully completed them ourselves. If the furniture falls apart, the effect vanishes; it is the combination of labor and achievement that creates the disproportionate bond.

We are irrationally anchored to what we already possess or have already spent. Endowment Effect: Overvaluing things simply because we own them. Loss Aversion: The pain of a loss is twice as powerful as the joy of a gain. Sunk Cost Fallacy: Continuing a failing action because of prior investment. This leads to Escalation of Commitment, where we double down on failing plans to soothe our egos and validate past decisions.

The Hedonic Treadmill is our tendency to return to a stable "baseline" of happiness despite major life changes. We think a promotion will make us "permanently" happier, but after a few months, it becomes the "new normal." Focus on intrinsic goals (relationships, health) rather than extrinsic ones (status, wealth) for more lasting satisfaction. We work harder, advance, and can afford nicer things, yet this does not make us happier.

The House-Money Effect is the tendency to treat "found" money differently than "earned" money. We might gamble away a bonus we didn't expect, even though we would never spend our salary in the same way. This is an error; money is perfectly fungible. $100 is $100 regardless of the source.

The Inability to Close Doors is our psychological obsession with keeping options open, sometimes at great cost. By trying to be a "jack of all trades," we diffuse our focus. Learn to close doors and focus on important options. Document what NOT to pursue, making calculated decisions in advance.

People are "reward-seeking missiles"; they will do exactly what they are incentivized to do. However, Motivation Crowding shows that money doesn't always motivate, and often does the opposite. Financial incentives (bonuses) work in industries with uninspiring jobs. When someone is passionate, money can seem like a bribe, and could undermine the ultimate goal.

Outcome Bias is the tendency to evaluate a decision based on its result rather than the quality of the process. A lucky fool is called a genius; a careful strategist who suffers a rare break is called a failure. Judge decisions by the logic available at the time of the choice.

Procrastination is a failure of "self-regulation" fueled by Present Bias. This is mathematically defined by Hyperbolic Discounting, where our 'emotional interest rate' skyrockets as a reward becomes immediate. Use "commitment devices" to force your present self to behave for your future self.
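
A minimal sketch contrasting the standard hyperbolic discount curve V = A / (1 + kD) with steady exponential discounting; the parameter values are assumptions chosen only for illustration:

```python
def hyperbolic(amount, delay_days, k=0.1):
    """Hyperbolic discounting: value collapses steeply near 'now'."""
    return amount / (1 + k * delay_days)

def exponential(amount, delay_days, r=0.01):
    """Exponential discounting: a constant rate per day."""
    return amount * (1 - r) ** delay_days

for d in (0, 1, 7, 30, 365):
    print(f"{d:3d} days: hyperbolic ${hyperbolic(100, d):6.2f}   "
          f"exponential ${exponential(100, d):6.2f}")
# The hyperbolic curve loses ~9% of its value in the first day alone,
# then flattens out: the "emotional interest rate" spikes near the present.
```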

The Scarcity Error is an irrational response to "limited" items, triggering a panic that overrides our rational centers. Marketers use "only 3 left!" to trigger this impulse. Stay rational by judging an item's value based on its utility, not its rarity.

This is the tendency to "say what is necessary" (lie) to get a job or project approved, assuming the details can be fixed later. When listening to a pitch, don't look at promises; look for past evidence where the person had "skin in the game."

Volunteer's Folly is the illusion that donating your physical time is always the best way to help a cause. Volunteering at a task outside your expertise is inefficient: the value of "six hours building birdhouses" could be funded with the income from fewer hours of your own work, and the building done better by expert tradesmen. "Giving" should be judged by its objective impact, not by how virtuous the manual labor makes you feel.

In an auction, the "winner" is often the biggest loser because they are usually the one who overestimated the value of the item the most. Set your "maximum bid" based on objective value before the excitement begins.

6. Argumentation & Debating Fallacies

How we manipulate logic to win arguments rather than find the truth.

The Ambiguity Fallacy occurs when a double meaning or an unclear use of language is used to mislead or misrepresent the truth. It is a favorite tool of politicians who wish to remain "technically" honest while leading the audience to a false conclusion. To counter this, one must demand precise definitions and refuse to accept "vague" as an answer.

The Anecdotal Fallacy occurs when we use personal experience or an isolated example to justify an argument, especially to dismiss statistics. Because our brains are "narrative processors," a single vivid story about a grandfather who smoked and lived to 97 feels more "true" than a meta-analysis of ten thousand clinical studies. One must remember that your personal experience is a data point of one, and "data" is not the plural of "anecdote."

Begging the Question is a circular argument in which the conclusion is included in the premise (e.g., "The word of Zorbo is flawless because it says so in the Infallible Book of Zorbo"). It arises when people have an assumption so deeply ingrained that they take it as a given. Circular reasoning is fundamentally incoherent because it provides no external evidence to support its claims.

Belief Bias occurs when we judge the strength of an argument based on how plausible its conclusion seems, rather than how strongly the evidence supports it. If a conclusion aligns with our existing worldview, we will overlook logical fallacies in the argument used to reach it. Conversely, we will reject perfectly logical arguments if they lead to a conclusion we dislike.

The Burden of Proof fallacy occurs when someone makes a claim and, instead of proving it, challenges someone else to disprove it (e.g., "You can't prove ghosts don't exist"). The responsibility to prove a claim always lies with the person making the assertion. A lack of disproof does not constitute evidence of truth.

The Composition Fallacy assumes that what is true of the part must be true of the whole (e.g., "Atoms are invisible, I am made of atoms, therefore I am invisible"). The Division Fallacy is the reverse, assuming what is true of the whole must apply to every part. We must show evidence for why a consistency will exist rather than assuming it.

These are tactics used to win arguments rather than find truth.
Ad Hominem: Attacking the person's character rather than their claim.
Straw Man: Reframing an opponent's position to make it easier to attack.
Tu Quoque ("You Also"): Answering criticism with criticism to avoid defending your own position.
Genetic Fallacy: Judging something as good or bad solely on the basis of its origin.
Motte-and-Bailey: Defending a controversial position by retreating to an easily defensible one when challenged.

The Fallacy Fallacy is the presumption that because a claim has been poorly argued, or a fallacy has been made, the claim itself must be wrong. It is entirely possible to make a claim that is true but justify it with a poor argument. One must judge the validity of a fact independently of the logic used by a specific person to defend it.

The It'll-Get-Worse-Before-It-Gets-Better Fallacy is a rhetorical trap used by consultants to buy time for failing strategies. By claiming things "must get worse" first, they create a win-win: if things stay bad, they were right; if things improve, they are geniuses. Unless a clear mechanism for the "rebound" is provided, be suspicious.

A Loaded Question is one that has an assumption built into it so that it cannot be answered without appearing guilty. These are particularly effective at derailing rational debates because they compel the recipient to defend themselves, putting them on the back foot (e.g., "Are you still having problems with that fungal infection?").

The Middle Ground Fallacy is the claim that a compromise, or middle point, between two extremes must be the truth. While the truth often lies in the middle, some things are simply untrue, and a compromise between the truth and a lie is still a lie.

We have a hidden prejudice against those who achieved success through grit and a preference for "naturals." This is reinforced by the Appeal to Nature, the argument that because something is 'natural' it is therefore valid, justified, or good. Nature is not a moral guide; many natural things are horrific, and many artificial things are life-saving.

The No True Scotsman fallacy is an appeal to purity used to dismiss relevant criticisms. When a counterexample is provided to a universal claim, the speaker changes the criteria to exclude that example (e.g., "No Scotsman puts sugar on porridge." "My uncle does." "Well, no true Scotsman puts sugar on porridge").

Personal Incredulity occurs when someone says that because they find a concept difficult to understand (like biological evolution or quantum physics), it therefore cannot be true. This fallacy is used in place of actual understanding or investigation of the evidence.

The Slippery Slope argument warns that "if we allow A to happen, next thing we know Z will happen." It assumes a chain reaction without providing proof of the mechanism. Unless the link between A and Z is proven, the argument is baseless conjecture.

Special Pleading occurs when someone's claim is proven false, but instead of accepting the evidence, they move the goalposts or make up a specific exception to cling to their old belief (e.g., "The psychic test only failed because you didn't have faith").

The Texas Sharpshooter Fallacy is the fallacy of painting a bull's-eye over a cluster of random bullet holes. We tend to ignore random chance when the results "look" meaningful. If Hindsight Bias and Confirmation Bias had a baby, it would be this fallacy.

Module 03

Building a Practice

Via Negativa: The Negative Path
A central philosophy in Rolf Dobelli's book suggests that we often know more about what is wrong than what is right. Success is frequently the residue of avoided catastrophes. By systematically eliminating downsides, thinking errors, and logical fallacies, the "upsides" of life often take care of themselves.

The Five Mindtraps

The cognitive biases we just reviewed are often symptoms of our brains falling into broader comfort zones. In her book Unlocking Leadership Mindtraps, Jennifer Garvey Berger identifies five core "Mindtraps" that snag us. To escape them, we must pair a new habit with a specific question.

📖
Simple Stories

Blinded to the real story.

The Habit

Carry three different stories.

The Question

How is this person a hero?

⚖️
Rightness

Feels right, but isn't.

The Habit

Listen to learn, not to win.

The Question

How could I be wrong?

🤝
Agreement

Consensus robs you of ideas.

The Habit

Disagree to expand.

The Question

Could this conflict deepen the relationship?

🎮
Control

Strips you of influence.

The Habit

Experiment at the edges.

The Question

What can I help enable?

🎭
Ego

Shackled to who you are now.

The Habit

Listen to learn from yourself.

The Question

Who do I want to be next?

The Ladder: Climbing Out of the Traps

You cannot escape these traps entirely; they are built into our biology. However, you can build a "ladder" to scamper out of them as soon as you realize you've fallen in by actively building these four connections:

01.

Connect to Purpose

A purpose is about something bigger than you. It gives you a shortcut to make decisions so you aren't swamped by complexity.

02.

Connect to the Body

Our bodies give us signals we often ignore (tight chest, racing heart). Stop and ask: "What is my body feeling right now?"

03.

Connect to Emotions

Use granularity to name your emotions. Imagine an emotion as a braided rope; unpick the strands to see shame, indignation, or gratitude.

04.

Connect to Compassion

Self-compassion allows you to look at your flaws with forgiveness. Compassion for others helps you connect without judgment.

Epilogue

Wisdom and Rationality

"

Warren Buffett on Avoiding Problems

Warren Buffett and Charlie Munger didn't succeed by being "heroes" who solved difficult business problems; they succeeded by avoiding them. This strategy suggests that wisdom is not about fixing disasters, but about structuring your life and choices so that disasters are never given a chance to happen. Avoiding a "swamp" is always more efficient than learning how to swim through one.

"

Plato on Rationality

Plato used the metaphor of a charioteer (Reason) trying to steer two wildly galloping horses (Emotions). Rationality is not the absence of emotion, but the constant effort of the "rider" to keep those emotions moving in a productive direction. If we stop paying attention, the "horses" will run us into a ditch. Thinking clearly is a practice of constant, conscious steering.