Saturday afternoon, January 10th. Someone asks you a complex question. "Should I take this job?" "Is this relationship right for me?" "Is AI good or bad?" You feel pressure to give a clear answer. Yes or no. Good or bad. Right or wrong. This feels like wisdom—being decisive, cutting through complexity. But every time you compress reality into a binary, you're not revealing truth. You're obscuring it. Here's what nobody wants to hear: The smarter you get, the fewer definitive answers you have.

The Thesis

We treat certainty as a virtue. Being sure, being decisive, having strong opinions—these signal intelligence and confidence. Uncertainty signals weakness, confusion, ignorance.

This is backwards. Certainty is often the opposite of intelligence. It's what you get when you ignore enough complexity to force a simple answer. The world is relentlessly non-binary. Your need for binary answers isn't wisdom—it's compression. You're losing data to gain comfort.

The controversial claim: Smart people have more uncertainty, not less. They see more variables, more trade-offs, more context-dependence. Dumb answers are easy because they ignore everything that makes questions hard. The person who always knows what to do isn't wise—they're not paying attention.

The Compression Problem

Binary thinking is lossy compression:

Reality: High-resolution, multi-dimensional, context-dependent, probabilistic, dynamic.
Your brain: Needs to make decisions with finite processing power.
Solution: Compress reality into simple categories.

Good/bad. Right/wrong. Us/them. Yes/no.

This compression is necessary for action. You can't hold all of reality in your head. You need heuristics, shortcuts, simplified models.

The problem: Most people forget they're compressing. They confuse the compressed version (their simplified mental model) with the uncompressed version (actual reality). They think their binary is truth, not convenience.

This is the certainty tax. Every time you compress for simplicity, you lose fidelity. The question is whether you're aware you're paying it, and whether the trade-off is worth it.
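
To make the loss concrete, here is a toy sketch in Python. The traits and the 0.5 threshold are invented for illustration; the point is only that the one-bit answer discards every per-context estimate that produced it.

```python
# Toy illustration of lossy compression, with invented numbers:
# a context-dependent judgment squeezed into a single bit.

trust = {
    "keeps secrets":        0.9,
    "repays loans on time": 0.3,
    "shows up in a crisis": 0.7,
}

# The binary: average everything, threshold it, discard the rest.
trustworthy = sum(trust.values()) / len(trust) > 0.5

print(trustworthy)  # True -- but "should I lend them money?" is gone
```

The compressed answer is usable. But notice what it cost: the question "trustworthy about what?" can no longer be asked.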

Where Certainty Comes From

Insufficient information:

The less you know, the easier it is to be certain. Few data points fit many simple stories. "Is coffee healthy?" Easy to answer if you've read one study. Impossible to answer definitively if you've read fifty.

This means: The person most certain is often least informed. They haven't encountered the contradictions yet. The expert saying "it depends" isn't wishy-washy—they've seen enough cases to know the answer isn't universal.

Motivated reasoning:

You want a particular answer. So you stop looking for evidence when you find enough to justify the answer you wanted.

Example: "Should I move to this city?" If you want to move, you'll find reasons it's great. If you don't, you'll find reasons it's terrible. Both sets of reasons are real. Certainty comes not from having better information, but from stopping your search when you like what you found.

Social pressure:

People reward certainty. "I don't know" sounds weak. "It's complicated" sounds evasive. "Here's exactly what you should do" sounds authoritative.

The incentive: Perform certainty whether or not you feel it. Signal decisiveness. Make predictions confidently. This is rewarded even when you're wrong, because confidence itself is valued.

Anxiety avoidance:

Uncertainty is uncomfortable. Your brain wants closure. Open questions feel unresolved. Multiple possible answers feel unstable.

Certainty is anxiolytic. It doesn't matter if it's false certainty—the psychological relief is the same. You'll pay the certainty tax (losing accuracy) to avoid the discomfort of ambiguity.

The Cost of Cheap Certainty

You stop learning:

Once you're certain, you stop looking. Why gather more information if you already know the answer?

This means: Certainty is the end of learning. The person who says "I know exactly how this works" won't notice when the system changes. The person who says "I think it works like this, but I could be wrong" stays calibrated.

Premature certainty locks you into your first decent answer. You never get to the better answers that require more data, more thought, more uncertainty tolerance.

You become predictable:

If your answers are always binary and certain, you're running on cached heuristics. You're not thinking through each instance—you're pattern-matching to a category and applying the standard answer.

This is efficient. It's also stupid. Most interesting questions don't have standard answers. The situation matters. The context matters. The particulars matter.

The person who can only give certain, simple answers is intellectually automated. They're running on autopilot. They stopped actually thinking years ago.

You can't handle complexity:

Binary thinking works well for binary reality. The problem: Almost nothing is actually binary.

Examples of non-binary reality:

  • "Is this person trustworthy?" (Trustworthy about what? In what context? With what incentives?)
  • "Is nuclear power safe?" (Compared to what? Over what timeframe? Counting which risks?)
  • "Should I quit my job?" (To do what? Given what alternatives? With what financial buffer?)

If you force these into yes/no answers, you're not being decisive—you're being reductive. You're pretending complexity doesn't exist so you can feel certain. Reality doesn't care. The complexity still matters whether you acknowledge it or not.

You generate confident bullshit:

Certainty without sufficient evidence isn't confidence—it's bullshit. You've decided on an answer before you have grounds to decide.

The danger: This feels identical to legitimate confidence. From the inside, unjustified certainty and justified certainty feel the same. You can't tell the difference unless you track your reasoning process.

Most people don't. They feel certain, so they assume they must have good reasons. They generate explanations post-hoc. This is how you become someone who's confidently wrong about everything.

The Alternative: Probabilistic Thinking

Instead of "yes/no", think "likely/unlikely":

"Should I take this job?" becomes:

  • 70% confident it improves work-life balance
  • 40% confident it advances my career trajectory
  • 90% confident the company culture is better
  • 20% confident I'll stay longer than two years

This is more accurate than "yes" or "no". It captures uncertainty. It allows for different factors to have different confidence levels. It maps better to reality.
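
A minimal sketch of what this looks like as arithmetic, in Python. The four confidences come from the list above; the weights are invented assumptions about what this particular person cares about.

```python
# Combining per-factor confidence without collapsing to yes/no.
# Confidences are from the list above; the weights are assumptions.

factors = {
    # factor: (confidence it improves, how much you care)
    "work_life_balance": (0.70, 0.4),
    "career_trajectory": (0.40, 0.3),
    "company_culture":   (0.90, 0.2),
    "stay_two_years":    (0.20, 0.1),
}

score = sum(conf * weight for conf, weight in factors.values())
print(f"weighted confidence: {score:.2f}")  # 0.60: lean yes, not "yes"
```

The output is a leaning, not a verdict. And changing a weight shows you immediately which factor your decision actually hinges on.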

Instead of certainty, track confidence:

"I'm 95% sure this is true" vs "I'm 60% sure this is true"

This is more honest. It also makes you calibratable. If you're wrong 40% of the time on things you called 60% confidence, you're well-calibrated. If you're wrong 40% of the time on things you called 95% confidence, you're overconfident.

Most people are wildly miscalibrated. They say they're certain about things they should be 60% on. They don't track their error rate. They don't learn.
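
Checking this takes a dozen lines. Here is a minimal sketch, assuming a hypothetical log of predictions recorded as (stated confidence, did it happen) pairs.

```python
# A minimal calibration check over an invented prediction log.
from collections import defaultdict

predictions = [
    (0.6, True), (0.6, False), (0.6, True), (0.6, True), (0.6, False),
    (0.9, True), (0.9, True), (0.9, False), (0.9, True), (0.9, True),
]

buckets = defaultdict(list)
for confidence, came_true in predictions:
    buckets[confidence].append(came_true)

for confidence, outcomes in sorted(buckets.items()):
    hit_rate = sum(outcomes) / len(outcomes)
    # Well-calibrated: hit_rate lands near the stated confidence.
    print(f"said {confidence:.0%}, right {hit_rate:.0%} of the time")
```

In this invented log, the 60% claims land at 60% (calibrated) while the 90% claims land at 80% (overconfident). That gap is the thing you cannot feel from the inside.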

Instead of permanent positions, hold provisional beliefs:

"Given what I currently know, I think X" vs "X is true"

The first is defensible. The second is a hostage to fortune. New information will eventually contradict any strong permanent position.

Smart people update. They hold beliefs provisionally. They're ready to change their mind when evidence changes. This isn't weakness—it's intellectual honesty.

Instead of simple answers, articulate cruxes:

"It depends on X" is more honest than "definitely yes."

If the answer actually depends on context, say so. Identify what it depends on. "Should you work at a startup? Depends on your risk tolerance, financial runway, career stage, and what you want to learn."

This is useful. The person now knows what variables matter. They can evaluate those variables for their situation. They get a better answer than your false certainty would have given them.

When Certainty Is Appropriate

Certainty isn't always wrong:

Some things are legitimately settled. You can be certain the earth is approximately spherical. You can be certain humans need oxygen. You can be certain 2+2=4 in base 10.

The question is: How do you know if certainty is warranted?

Certainty is earned when:

  1. You have overwhelming evidence - Not just one study, or one experience, or one argument. You've seen it from multiple angles, over time, in different contexts, all pointing the same direction.

  2. The question is simple enough - Some questions genuinely are binary. "Did this event happen?" is often genuinely yes/no. The answer might be uncertain, but the structure is binary.

  3. The stakes are low - Being certain about your lunch order? Fine. Being certain about a career decision? Probably unwarranted. Match certainty to consequence.

  4. You've stress-tested it - You've actively looked for disconfirming evidence. You've steelmanned the opposing view. You've checked for motivated reasoning. And you're still confident. Now certainty might be justified.

Most everyday certainty doesn't meet these bars. We're certain because it's comfortable, not because it's justified.

The Benefits of Uncertain Thinking

You actually update:

When you hold beliefs probabilistically, you can update incrementally. New evidence shifts your probabilities. You don't have to completely reverse position—you just adjust confidence.

This is how learning actually works. Binary thinking forces you into either stubbornness (defend your position) or whiplash (complete reversals). Probabilistic thinking allows smooth adjustment.
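
One concrete mechanism for that smooth adjustment is Bayes' rule. A minimal sketch, with all numbers invented for illustration:

```python
# Incremental updating via Bayes' rule. All numbers are invented.

def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Return P(hypothesis | evidence) given a prior and two likelihoods."""
    numerator = p_evidence_if_true * prior
    denominator = numerator + p_evidence_if_false * (1 - prior)
    return numerator / denominator

belief = 0.70  # "70% chance X is true"
# New evidence, twice as likely if X is false as if X is true:
belief = bayes_update(belief, p_evidence_if_true=0.2, p_evidence_if_false=0.4)
print(f"{belief:.0%}")  # 54%: adjusted, not reversed
```

The belief moved from 70% to 54%. A real update, with no stubbornness and no whiplash.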

You can disagree productively:

"I think there's a 70% chance X is true."
"I think there's a 40% chance X is true."

You disagree, but you're not in conflict. You can investigate together what would shift the probabilities. This is collaborative. Binary disagreement is adversarial.

You're more persuadable:

People with high certainty are hard to convince. They've committed to a position. Changing it feels like losing.

People who express uncertainty are easy to convince—just show them evidence. They haven't committed to being right. They've committed to being accurate.

Paradox: Expressing uncertainty makes you more credible, not less. People trust someone who says "I'm not sure, but here's what I think" more than someone who says "I'm absolutely certain" about inherently uncertain things.

You can hold contradictions:

"This person is generous AND selfish." "This policy helps AND hurts." "This is true in context A AND false in context B."

Binary thinking forbids this. People must be one or the other. Policies must be good or bad.

Reality doesn't care about your mental discomfort. People are genuinely contradictory. Policies genuinely have trade-offs. Uncertainty thinking lets you see this. Binary thinking forces you to deny it.

How to Think With Less Certainty

Track your confidence explicitly:

When you form a belief, write down your confidence level. "70% sure this project will succeed."

Later, check if you were right. If you were right 70% of the time on "70% sure" predictions, you're calibrated. If not, you're learning where you're overconfident.

This seems tedious. It's also the only way to improve calibration. Your gut feeling of certainty is uncalibrated. Measurement fixes this.

Seek disconfirming evidence actively:

When you form a belief, actively search for evidence against it. Not to be contrarian. To test if your certainty is justified.

If you can't find any counter-evidence, you're either right, or you're not looking hard enough. On most interesting questions, there's always counter-evidence. If you're not finding it, you're in an echo chamber.

Practice saying "I don't know":

This is uncomfortable. Do it anyway.

When asked for an opinion on something you haven't thought through: "I don't know. Let me think about it."

When asked about something genuinely uncertain: "I don't think anyone knows this yet."

This signals intellectual honesty. It also saves you from generating bullshit. Most bad thinking comes from feeling pressure to have an answer before you've done the thinking.

Unpack binary questions:

"Should I do X?" → "What am I optimizing for? What are the trade-offs? What would change my answer?"

Most yes/no questions are shorthand for complex multi-variable decisions. Unpack them. The answer is almost never a simple yes or no—it's "yes if A, no if B, depends on C."
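
As a toy sketch, here is that unpacking in code. The variables and thresholds are invented stand-ins; the point is that the honest return value is conditional, not boolean.

```python
# A toy unpacking of "should I join a startup?". Every variable and
# threshold here is an invented stand-in for "yes if A, no if B,
# depends on C".

def should_join_startup(risk_tolerance: str, months_of_runway: int,
                        wants_breadth: bool) -> str:
    if months_of_runway < 6:
        return "no: not enough buffer to absorb a failure"
    if risk_tolerance == "low":
        return "probably not: the variance will cost you sleep"
    if wants_breadth:
        return "leaning yes: startups trade stability for range"
    return "it depends: what are you optimizing for?"

print(should_join_startup("high", months_of_runway=12, wants_breadth=True))
```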

Hold positions lightly:

Imagine someone offers you strong evidence against your position. How would you react?

If your reaction is defensive, you're too certain. You've made it about ego.

If your reaction is curious, you're holding it lightly. You want truth more than you want to be right.

Practice the second reaction. "Interesting, I didn't know that. Let me think about how that changes my view."

Takeaways

Core insight: Certainty is expensive. You pay for it in accuracy, learning, and truth. Most certainty is unearned—it comes from insufficient information, not superior understanding. Smart people are less certain because they see more complexity, not because they're confused.

What's actually true:

  1. Binary thinking is cognitive compression—necessary for action, but lossy
  2. Most interesting questions don't have binary answers, only binary-shaped approximations
  3. Confidence without calibration is just noise—you need to track your error rate
  4. "I don't know" is often the most intelligent answer available
  5. The person who's certain about everything has stopped thinking

What to do:

If you find yourself very certain:

  • Ask: "What would change my mind?"
  • If the answer is "nothing," you're not thinking—you're committed
  • Look for disconfirming evidence specifically
  • Write down your confidence level and check it later
  • Consider: Am I certain because the evidence is overwhelming, or because I stopped looking?

If someone demands certainty from you:

  • Resist the pressure to compress complexity
  • Say: "It depends on..." and articulate what it depends on
  • Give probabilities, not binaries ("60% confident" not "definitely yes")
  • Explain your reasoning, including your uncertainty
  • Remember: They might reward false certainty, but reality won't

If you want to get better at this:

  • Start a calibration journal (predictions + confidence + outcomes)
  • Practice saying "I'm not sure" in low-stakes situations
  • Deliberately seek views that contradict yours
  • When you update a belief, write down why (this teaches your brain that updating is good)
  • Stop using absolute language ("always", "never", "definitely") unless you mean it literally

The uncomfortable truth:

The smartest person in the room is usually the least certain. They've thought about it long enough to see the complications. They've encountered enough edge cases to distrust simple rules. They know what they don't know.

You want to be the person with all the answers. But having all the answers usually means you haven't understood the questions. Real expertise looks like saying "it depends" and then being able to articulate on what, and why, and how much.

The goal isn't to have no beliefs. It's to hold them proportionate to your evidence, to update them when evidence changes, and to remain capable of being surprised.

Because reality is surprising. If your beliefs make you unsurprisable, they're not tracking reality—they're protecting you from it.

Pay the certainty tax consciously. Sometimes you need a yes/no answer to act. Fine. Compress deliberately. But know what you're losing. Know you're choosing comfort over accuracy. Know you're optimizing for closure, not truth.

And on important questions—questions that actually matter—resist the tax entirely.

Stay uncertain. Stay learning. Stay wrong often enough to know you're still in contact with reality.

Today's Sketch

January 10, 2026