Why You Should Care about the Distinction between Mistake and Failure
By Amy Edmondson

A failure is not the same as a mistake. Maybe, you think, this is just semantics. I see it as an important distinction that matters for today’s leaders. First, words are the leader’s primary tool. Leadership is the art of harnessing the ideas, talents, and efforts of others to achieve challenging goals. And the harnessing is largely accomplished through words – written communication, town halls, team meetings, or one-on-one conversations. Words matter. Words convey meaning, and meaning serves as the lens through which we see reality.

But why might distinguishing between mistakes and failures be particularly important? Briefly, and simply, because in a context of enormous uncertainty, failures are inevitable. And, perhaps counterintuitively, uncertainty does not necessarily bring mistakes. To understand why, we need some definitions.

A mistake (synonymous with error) is an unintended deviation from known procedures, standards, rules, or protocols. By definition, a mistake occurs only in familiar territory, where prior knowledge is available. Examples include charging a customer the wrong amount for a product or entering incorrect data into an accounting system. Mistakes occur across a variety of activities – when we’re not paying attention or when we lack the knowledge or training to do it right. Fortunately, mistakes don’t always lead to serious consequences; some do, but many don’t. And uncertainty doesn’t increase the likelihood of not paying attention – it may have the opposite effect. A failure, in contrast, is an outcome that deviates from desired results, regardless of intent or cause.

Yes, it is the case that many failures are caused by mistakes. But not all failures are caused by mistakes, and not all mistakes cause failures. If you mistakenly add the wrong amount of an ingredient while following a trusted recipe, but the result is delicious, no failure has occurred! If a blind date turns out to be a failure, but the close friend who made the introduction had good reason to believe it might succeed, no mistake has occurred. In fact, I call this one an intelligent failure!

In my research, I’ve identified three types of failure:

  • Basic failure: An undesired result caused by a mistake, in familiar territory.
  • Complex failure: An undesired result of the interaction of multiple factors, none of which on its own would have led to a failure.
  • Intelligent failure: An undesired result of an action in new territory, driven by a hypothesis, in pursuit of a goal, where care has been taken to minimize unnecessary risk.

Whereas basic failures are usually caused by human error, intelligent failures are not. And complex failures, those perfect storms that happen when multiple small things come together to produce havoc, can often be prevented with vigilance, but in our increasingly complex and interconnected world, they’re on the rise. We should do our best to prevent as many as possible and learn from the rest. Intelligent failures, although disappointing at the time, bring valuable new knowledge.

Relationships among uncertainty, preventability, mistakes, and failures

As depicted in the figure below, basic failures are the most preventable of the three, followed by complex failures. Intelligent failures, in life or in work, are not preventable. As uncertainty goes up, the necessity of intelligent failures goes up (because the necessity of smart experimentation goes up). The path toward progress in science or innovation is paved with intelligent failures – but not with mistakes.

Source: Figure 3.1 in Right Kind of Wrong.

Why shouldn’t we expect more mistakes as uncertainty goes up? Consider a lab: When a scientist tests an informed hypothesis, having read the relevant literature, and that hypothesis is not supported by the data, that is both an intelligent failure and a useful step forward in a program of research. However, when that scientist mistakenly uses the wrong chemical or the wrong equipment in an experiment and it fails, she has learned nothing new other than to pay more attention when conducting an experiment.

Excellent organizations therefore strive to minimize mistakes. But they also recognize that “to err is human,” and that some mistakes will slip through – so they work hard to catch and correct the errors that do occur before they cause harm.

Psychological and organizational consequences of conflating mistake and failure

When organizations and their managers conflate mistakes and failures, employees fear repercussions from any undesired outcome – whether or not it brings valuable new information. They hide bad news and cover up mistakes, allowing small problems to spiral out of control into larger, preventable failures. Worse, they avoid risk-taking: people become unwilling to do anything outside the tried and true, inhibiting innovation and dooming the organization to obsolescence in the long run.

By distinguishing between mistakes and failures, leaders take an important step toward building a climate where failures are reported, discussed, and recognized as essential for innovation – and where both failures and mistakes are seen as learning opportunities.

Lastly, mistakes and failures call for different responses. Mistakes should trigger corrective action and process improvements: work processes can be redesigned to make error less likely, and training can be increased or improved. Failures – especially those resulting from experimentation – should instead prompt reflection and pivots. Leaders must reward thoughtful risk-taking and sanction careless action.

Making the distinction between failures and mistakes thus contributes to psychological safety: it reduces the stigma around failure and increases the chances that people will report mistakes promptly, helping correct minor issues before they escalate. The distinction is not mere semantics but foundational to a way of thinking that is scientific, systematic, and compassionate – a way of thinking that helps organizations learn and innovate. By understanding and conveying this crucial difference, leaders create psychological safety, support error reporting, and reward responsible risk-taking in support of sustainable success.

About the author:

Amy Edmondson is Professor of Leadership and Management at Harvard Business School; author of Right Kind of Wrong and The Fearless Organization; and Honorary Fellow of the Global Peter Drucker Forum.
