Taking any perspective or action and expecting to not be wrong is like jumping into the ocean expecting not to get wet. Being wrong is an unavoidable aspect of life, an outcome of the inherent limitations of our knowledge and the intricate nature of, well, most everything. Despite its inevitability, many of us harbor a deep-seated fear of admitting mistakes, driven by anxieties about being judged, losing credibility, and appearing weak.
At the core of our aversion to being wrong lies a fundamental drive to protect our self-concept. We often mistakenly equate being wrong about something in particular with being inherently flawed in general, leading to a fixed mindset that dreads mistakes. Realizing you’re wrong can be an uncomfortable experience—and this discomfort is by design. Our ancestors survived by making the right decisions. They evaded predators, gathered resources, and lived long enough to reproduce because they avoided errors. This evolutionary pressure has ingrained in us a deep aversion to being wrong, a protective mechanism that shaped our minds over millennia.
Double-edged sword
While most adaptive traits provide survival advantages, they can sometimes produce maladaptive effects at the individual level, especially when expressed in the wrong context or to an extreme degree. The fight-or-flight response, designed to enable survival by readying the body for action, becomes problematic when it’s frequently activated in non-threatening scenarios. It can cause ongoing stress and anxiety, contributing to severe conditions such as hypertension and heart disease. Our ancestral craving for calorie-rich foods is another trait that becomes detrimental in a context of abundant food supply, causing obesity, diabetes, and cardiovascular complications. Similarly, cognitive mechanisms that help us maintain self-esteem and social standing can lead to problematic behaviors such as biased decision-making, rationalization, or outright dismissal of contradictory evidence.
Self-deceptive processes like self-serving biases and self-enhancement evolved to promote confidence and the appearance of competence, thereby increasing reproductive success. The tradeoff is that an individual convinced of their infallibility is likely to become overconfident, deny their flaws, and ignore constructive criticism, which will stunt their ability to learn and grow. When we encounter group disagreements, interpersonal strategies come into play. To reduce dissonance, we might try to persuade others to share our views or align ourselves with like-minded individuals. While achieving consensus can be comforting, it often comes at a cost. Group polarization and the reinforcement of biases can occur, making us resistant to alternative perspectives. Echo chambers grow and partisan divides widen as people seek validation from those who already agree with them.
Falsifiability
Theoretical physicist Richard Feynman once said, “We can only be sure we’re wrong.” In other words, nothing can be proven with absolute certainty; it can only fail to be disproven by the available evidence. There’s always the possibility that future observations under different conditions could contradict the theory. Philosopher of science Karl Popper, known for advocating empirical falsification, used the famous example that the statement “All swans are white” cannot be verified as true by observing any number of white swans, yet the observation of a single black swan would falsify it.
When I hear someone (including myself) say “That’s right!” I often envision the word “wright” — a portmanteau of “wrong” and “right.” That is, it’s mostly right and a little bit wrong too. “Wright” acknowledges that while something may be mostly correct, there are always minor imperfections or errors.
Power of error
In his book Think Again, Adam Grant recounts an exchange with renowned psychologist Daniel Kahneman at a conference. After Grant presented his research, Kahneman approached him and exclaimed, “That was wonderful! I was wrong.” Intrigued by his upbeat response, Grant asked if he enjoyed being wrong. Kahneman shared that being wrong was the only way he felt sure he had learned anything, and that his attachment to his ideas was provisional. With a smile he asserted, “There’s no unconditional love for them.” (Popper would be proud.)
When faced with a poor outcome, it’s tempting to judge the preceding decision as a mistake. However, that approach can mislead you into believing a decision was wrong when in fact it was sound. In Thinking in Bets, Annie Duke introduces the concept of “resulting,” a fallacy that equates the quality of a decision with the quality of its outcome. She emphasizes that even decisions with a high probability of success can lead to negative outcomes due to unforeseen circumstances. Conversely, poor decisions can sometimes result in positive outcomes purely by luck.
To avoid falling into the trap of resulting, focus on the decision-making process itself. The quality of a decision should be judged by the information available at the time it was made, before the outcome is known, not solely by whether the outcome happened to be successful. This shift in focus acknowledges that luck and external factors can significantly influence results and underscores the importance of a sound decision-making process. With this approach, you are better able to identify what’s within your control and make adjustments that increase your odds of success next time around.
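To make the distinction concrete, here is a minimal simulation sketch (the 70% and 30% success rates are hypothetical numbers chosen for illustration, not figures from Duke’s book):

```python
import random

def simulate(p_success: float, trials: int = 10_000) -> float:
    """Return the fraction of trials with a good outcome for a decision
    that succeeds with probability p_success."""
    return sum(random.random() < p_success for _ in range(trials)) / trials

random.seed(42)  # reproducible illustration

good_decision = simulate(0.70)  # a sound decision: succeeds 70% of the time
bad_decision = simulate(0.30)   # a poor decision: succeeds 30% of the time

print(f"Good decision succeeded in {good_decision:.0%} of trials")
print(f"Bad decision succeeded in {bad_decision:.0%} of trials")

# On any single trial, the good decision still fails about 30% of the time
# and the bad decision still succeeds about 30% of the time. Grading either
# decision by one observed outcome ("resulting") would misjudge it roughly
# a third of the time in this setup.
```

Across many repetitions the quality of the process shows up in the aggregate, but no single outcome reliably reveals it.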
Of course, no one enjoys being wrong, but you can genuinely enjoy discovering that you were wrong because it means you’re now less wrong than before. Marcus Aurelius, in his Meditations, reflected, “If anyone can refute me—show me I’m making a mistake or looking at things from the wrong perspective—I’ll gladly change. It’s the truth I’m after, and the truth never harmed anyone. What harms us is to persist in self-deceit and ignorance.” His quote encapsulates the essence of intellectual humility: valuing truth over ego and accepting errors as part of the learning process. The path to understanding begins with acknowledging what you don’t know. Each mistake is another gateway. It’s better to admit your gaps in understanding and risk appearing foolish temporarily than to remain ignorant forever by ignoring them.
Instead of shying away from mistakes, seek out opportunities that challenge your beliefs and expose you to new perspectives. Approach each error as a chance to learn and grow, not as a reflection of your worth. When we only seek situations that mirror our beliefs, we limit our chances of finding breakthroughs, which, by definition, include an element of surprise. Make a conscious effort to welcome disconfirming evidence, engage with diverse viewpoints, and celebrate the moments when you inevitably discover you were wrong. As Miles Davis put it, “If you hit a wrong note, it’s the next note that you play that determines if it’s good or bad.”
Notes
With the advent of social media, the fear of public embarrassment or ridicule is amplified, as errors can be widely broadcast and remain accessible indefinitely.
Stanford psychologist Carol Dweck coined the terms “fixed mindset” and “growth mindset” based on decades of research on student attitudes and achievement. People with a fixed mindset, who view abilities and intelligence as static, tend to avoid challenges and fear mistakes, which hinders growth. In contrast, those with a growth mindset embrace mistakes as valuable opportunities for learning and development, understanding that errors are a natural part of the learning process.
Self-serving biases refer to the tendency to attribute positive outcomes or successes to one’s own abilities or efforts, while attributing negative outcomes or failures to external factors beyond one’s control. This bias serves to protect and enhance one’s self-esteem by preserving a positive self-image. For example, a student might attribute a good grade on an exam to their hard work and intelligence but blame a poor grade on the difficulty of the test or an unfair professor.
Self-enhancement refers to the tendency to view oneself in an unrealistically positive light, exaggerating one’s positive qualities and minimizing negative ones. This strategy helps maintain a favorable self-concept and reduces dissonance arising from information that contradicts one’s positive self-image. For example, a person might selectively focus on and remember instances of success while minimizing or forgetting failures and mistakes.
Richard Feynman delivered a lecture called Seeking New Laws as part of the Messenger Lectures on The Character of Physical Law at Cornell University on November 9, 1964. During the lecture Feynman said the following: “If you have a definite theory and a real guess, from which you can really compute consequences, which could be compared to experiment, then in principle, we can get rid of any theory. We can always prove any definite theory wrong. Notice, however, we never prove it right. Suppose that you invent a good guess, calculate the consequences, and discover that every consequence that you calculate agrees with experiment. Your theory is then right? No, it is simply not proved wrong. Because in the future, there could be a wider range of experiments, you can compute a wider range of consequences. And you may discover, then, that the thing is wrong... But it can never be proved right, because tomorrow’s experiment may succeed in proving what you thought was right, wrong. So we never are right. We can only be sure we’re wrong.”
For example, Newton’s law of universal gravitation, proposed in 1687, was accepted for centuries but failed to explain anomalies in Mercury’s orbit, notably the precession of its perihelion, which was about 43 arcseconds per century greater than predicted. Einstein’s general relativity, introduced in 1915, accurately explained this anomaly by incorporating the curvature of spacetime. General relativity is now the accepted explanation for these subtleties in Mercury’s orbit—until further notice.
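For reference, the standard general-relativity formula for the extra perihelion advance per orbit, evaluated with Mercury’s published orbital parameters (the back-of-the-envelope arithmetic below is my own, not drawn from the sources above), reproduces that figure:

```latex
% Extra perihelion advance per orbit predicted by general relativity,
% with GM_sun ~ 1.327e20 m^3 s^-2, a ~ 5.79e10 m, e ~ 0.206, c ~ 3e8 m/s:
\Delta\phi \;=\; \frac{6\pi G M_{\odot}}{c^{2}\, a \left(1 - e^{2}\right)}
\;\approx\; 5.0 \times 10^{-7}\ \text{rad per orbit}
```

Multiplied by Mercury’s roughly 415 orbits per century, that comes to about 43 arcseconds per century, in line with the observed discrepancy.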
Empirical falsification, a concept proposed by Karl Popper, asserts that scientific theories and hypotheses cannot be conclusively proven true but can be proven false. For a theory to be considered scientific, it must be testable and falsifiable, meaning there must be possible observations or experiments that could show the theory to be incorrect. This principle emphasizes that scientific knowledge is always tentative and open to revision.
I considered “wronght,” which looks and sounds more interesting; however, it also seems like it belongs alongside onomatopoeic words like “bonk,” “thunk,” or “slump.” It felt too heavy on the wrongness and too light on the rightness to capture the proportions I needed it to. Still, maybe it exists on the spectrum: wrong, wronght, wright, right.
The concept of “errorless learning” traditionally advocates for educational settings where mistakes are minimized in an attempt to ensure that incorrect information isn't learned. However, research by Nate Kornell and colleagues from UCLA challenges this view, showing that making errors can actually enhance learning. According to their findings, people retain information better and for longer periods if they are tested rigorously and allowed to fail initially. This research suggests that the act of attempting and failing to retrieve information helps deepen memory retention. These insights are not only pivotal for educational practices but also offer valuable strategies for anyone engaged in learning new material.
Many major discoveries and inventions, including penicillin, X-rays, the microwave oven, and the first artificial sweetener (saccharin), came about by accident, with some estimates suggesting that around half of all inventions arise unexpectedly.