How to escape false certainty


by Andres Kuusk, author of “Unlocking the Puzzle of Success”

At any given moment, you are operating in a version of reality that seems correct to you. Your decisions make sense. Your interpretations seem accurate. Your conclusions seem justified.

But sometimes none of this is true. Not because you lack intelligence. Not because you lack information. But because the assumptions underlying your thinking are wrong.

The most dangerous mistakes are not caused by poor execution. They are caused by false certainties.

The problem with “obvious”

Most bad decisions don’t feel like mistakes when they’re made. They seem obvious. This is what makes them difficult to detect.

If something seems uncertain, we question it. If something seems obvious, we don’t.

In strategic environments, this is where errors often come from. A move looks sound. A conclusion seems clear. An explanation feels complete. So it goes unquestioned.

But “obvious” is often just a disguised, unverified hypothesis.

Living inside the pattern

We do not experience reality directly. We experience a model of reality.

This model is built from:

  • past experiences
  • learned rules
  • social signals
  • internal biases

Most of the time, this model works quite well. But when it doesn’t, the failure is hard to detect, because we use the same model to evaluate the model. This creates a closed loop:

  • The model validates its own assumptions.
  • Those conclusions reinforce the model.
  • And the errors become invisible.

The moment of doubt

Progress often starts with something subtle. A small inconsistency. A detail that doesn’t completely fit. A result that seems slightly off.

This is the crunch point – the moment where most people move on. They rationalize. They ignore. They assume it will resolve itself.

But this moment is precious. This is a signal that your model may be incomplete.

The Agent Mulder method

There is a simple way to work with this signal. A structured way to move from assumption to verification. This approach is inspired by a simple idea that I associate with Agent Fox Mulder of The X-Files: when something doesn’t add up, question it and test it.

This mindset forms the basis of what I call the Agent Mulder Method. Think of it as three steps: Spot. Challenge. Test.

1. Spot the assumption (the Mulder moment).

Notice when something doesn’t quite fit. It doesn’t have to be outright wrong – just slightly inconsistent.

The key question is: What do I take as given here? This is the moment that most people miss.

2. Challenge the assumption (the Scully challenge).

Once spotted, the assumption must be actively challenged. Not passively acknowledged – but contested. This step introduces friction into automatic thinking.

Ask:

  • Is this necessarily true?
  • What if the opposite were true?
  • What evidence supports this belief?

3. Test the assumption (the Skinner shot).

The last step is verification. No discussion. No speculation. Just a test.

What action would reveal whether the assumption holds? In practice, this often means:

  • try a different approach
  • collect specific data
  • expose the idea to reality

Without this step, the method remains theoretical. Reality must have the last word.

Why it works

Most people operate in cycles of interpretation. They think. They decide. They explain. But they rarely test the underlying assumptions.

The Agent Mulder Method interrupts this cycle by forcing you to move from assumption to verification. It introduces:

  • awareness (spot)
  • friction (challenge)
  • reality (test)

This shifts thinking from reactive to deliberate.

A short example from practice

Years ago, I helped test an early version of an online board game. At first glance, the product looked impressive. The graphics were polished. The board could be rotated in every direction.

But something was wrong. There was no standard top-down view – a basic feature in virtually every board game interface.

That was the moment of doubt. I noticed the inconsistency, but initially dismissed it. I assumed the developers knew what they were doing.

A few months later, the problem was still there. This time I challenged that assumption. I asked a simple question: “Have any of you ever played a board game?”

None of them had. That was the test – and the answer. The problem wasn’t a missing feature. It was a flawed assumption about the people building the product.

In hindsight, the error was not in the software. It was in my thinking. I noticed the signal, but I didn’t act on it.

A practical example

Let’s take a simple workplace scenario.

A project is delayed. The immediate assumption: “The team is slow.”

It seems plausible. It may even be partially true. But applying the method changes the analysis.

Spot: Is this an observation – or an assumption?

Challenge: Could the delay be due to unclear requirements? Conflicting priorities? Structural bottlenecks?

Test: Clarify the scope. Adjust the workflow. Remove a constraint. Observe what changes. Often the initial assumption turns out to be incomplete or wrong.

The cost of skipping steps

Most people sometimes notice inconsistencies. Fewer challenge them. Almost no one tests them systematically.

This is where errors persist. If you skip:

  • Step 1 → you never see the problem
  • Step 2 → you accept the wrong explanation
  • Step 3 → reality corrects you later

And when reality provides the correction, it often costs more.

A different kind of trust

The goal is not to eliminate uncertainty. That’s impossible. The goal is to relate to it differently.

Confidence is not believing that your assumptions are correct. It’s being ready to test them.

Pay attention to the three steps

You cannot avoid operating within a model of reality, but you can choose with what awareness you operate within it.

When something seems obvious, pause.
When something doesn’t quite fit, pay attention.
When a decision matters, test it.

Because the biggest advantage is not having the right answers. It’s knowing when to question them – and having a method for testing them.

Andres Kuusk

Andres Kuusk is a seven-time Pentamind World Champion, game theory professor and senior executive. His work focuses on strategic decision-making, cognitive biases and performance architecture. Drawing on competitive mental sports and business leadership, he explores how sound reasoning extends across all fields. He is the author of “Unlocking the Puzzle of Success”. Learn more at andreskuusk.com.
