Why We Justify
Foolish Beliefs, Bad Decisions,
and Hurtful Acts


A review by Daniele Procida
(teacher of philosophy at Cardiff University)
of Mistakes Were Made (But Not by Me)
by Carol Tavris and Elliot Aronson

There is a vast body of literature on how to do well, how to be happy, what to choose for one's own benefit and that of others. We are not short of such analyses or guidance.

In contrast, the body of work which considers our failure to do well and be good is decidedly smaller, and also rather lamer in its power to explain why we make bad decisions and commit hurtful acts.

We remain opaque to ourselves, thinking, acting and responding in ways which are harmful, counter-productive and baffling.

Most baffling of all is our propensity to continue in these patterns, to compound error with error. To try to get out of a hole by digging deeper.

Attempts to explain all this self-destructive behavior tend to invoke selfish egoism, stupidity, or mindless evil, or else contentious psychological theories which assign no responsibility at all.

Mistakes Were Made (But Not by Me) offers an alternative to these by describing a simple process of mind and behavior which, by its very nature, is hidden from our view.

This process is self-justification, and it is driven by cognitive dissonance, which is the discomfort we feel when we sense the gap between our self-image and our less attractive reality.



It works like this: I do something that I should not have done, and this troubles me, because I'm not the kind of person who does that sort of thing. So, to soothe this nagging complaint of the soul, I convince myself that there was nothing wrong with what I did, and I confirm this by reinforcing it at the earliest possible opportunity.

So, although I hate it when people treat menials rudely, I fail to speak out when the boss I'm very anxious to impress humiliates and bullies the waitress.

But I'm not the kind of person who is too weak to stand up to injustice! A gulf yawns between self-image and reality; the dissonance is unpleasant and unsettling.

Harmony will most easily be restored, the gap most painlessly closed, by retelling, or reframing, the story.

"Heh heh," I chuckle at the boss's nasty joke, adding: "She's lucky you're not the kind of guy who'd want to get her in real trouble for that!"

Or: I'm not the kind of person who allows smooth-talking salesmen to get the better of me, but somehow I seem to have foolishly ordered 1000 leather business cards; naturally, I'm troubled by the dissonance. There's no way out of this without having to admit to myself and others that in fact I am too susceptible to basic techniques of persuasion, so instead: the very next time someone presents me with their own card, I feel mild disgust for the pitiful rectangle of paper I've just been handed.

This simple process, argue the authors, explains to a great extent why, after we make mistakes, we continue making them.

According to dissonance theory, what we should expect from someone who finds themselves in a hole are not efforts to climb out, but energetic digging in the wrong direction.

The analysis is counter-intuitive, but human behavior seems to defy common sense.

There are two essential aspects of the process which bring people to do and continue doing harmful things, all the while justifying it to themselves.

First, there is the cognitive dissonance which leaves them in urgent need of something that will make them more comfortable with what they've done.

Then, in the spiral of self-justification, they do more things to make themselves feel better, but those things require further justification, and so it continues.



In addition, the authors identify other mechanisms. There is the blind spot.

The biggest blind spot of all falls over our own integrity.

It is what permitted Dr Andrew Wakefield to accept large sums of money from lawyers representing the parents of autistic children in order to conduct research on those children, and then to fail to disclose the fact to the Lancet when it published a paper by his team reporting a correlation between autism and childhood vaccination.

The paper has been discredited, Wakefield is currently facing a General Medical Council hearing for professional misconduct and dishonest behavior, and his reputation hangs in the balance if it is not already destroyed.

Wakefield continues to maintain his faith in the paper and his actions, resolutely denying that a conflict of interest existed.

We do not need to demonize him, or even think him a liar. At some point he needed to reduce the dissonance between his self-image and his actions. How could a researcher of independence and integrity accept a large sum from lawyers? How could he not disclose this to the editors of the journal? Using his blind spot of assured moral rectitude, justification would kick in: Of course the money won't affect my judgments - my professional integrity will see to that! Of course I would have disclosed a conflict of interest - but obviously this wasn't one. And he would genuinely and sincerely believe this.

Our intuitions, our common sense, and other people warn us to beware of evil people trying to do wrong.

But they are mistaken.

We should be most wary of decent people, people like us, like Andrew Wakefield, who are sure they are doing right.

It is this last insight which can be most useful to us. We too believe we are doing the right things and that we are justified in our actions. But we too are driven to justify to ourselves the things we do, and we too have a blind spot which hides this from us, and we too will resist being confronted both with the blind spot and what it hides.

Our blind spots are always ready to lead us into anything, from looking corrupt to looking ridiculous.



Another aspect of self-justification is what the authors call the "pyramid of choice."

Our first step into dissonance is usually a tiny one, and leaves us only a short distance from where we were before. But we have started a slide, and the next step will be in the same direction, and the next, until we find ourselves at the bottom, far from where we started.

Often, standing at the top of the pyramid, we are faced not with a black-and-white, but with a gray choice whose consequences are shrouded.

The first steps along the path are morally ambiguous, and the right decision is not always clear. We make an early, apparently inconsequential decision, and then we justify it to reduce the ambiguity of the choice.

This starts a process of entrapment - action, justification, further action - that intensifies our commitment, and may end up taking us far from our original intention or principles.

In endeavors where successful work is marked by the capturing of a prize - the conviction of a suspect, the uncovering of a repressed memory, the bagging of a result in one form or another - the pressure is on to do just that.

Any setbacks along the way, any obstacles or difficulties, provoke discomfort and dissonance, and the urge to reduce it; the more the work-culture demands that prize, the greater the chance that the first step off the top of the pyramid will be in a dangerous direction.

One might expect an error-averse culture to produce fewer serious errors, but in fact this is not the case. Once again, dissonance theory predicts the unexpected outcome: the more error-averse the culture, the more likely it is that dissonance will push practitioners down the wrong side of the pyramid, with error compounding error, and every step making it harder to climb back up again.



But Mistakes Were Made serves as a warning to look more closely at basic assumptions and practices of our own, not just on the scale of cultures and professions, but even in our own personal lives.

Dissonance theory explains why people who are given the opportunity to vent their anger will afterwards feel more, and not less, animosity towards its object.

Aggression requires justification, which in turn justifies further aggression. Controlling our anger, rather than expressing it, is what will return our blood pressure and equilibrium to normal, and allow us to let go of the unhealthy ire.

(edited by David Van Alstyne)
