



So what did the book change your mind about?

"I'm resolving to be better about acknowledging when I make mistakes, and correcting them. I'm resolving to be better about acknowledging when people I disagree with make good points. And when I'm in one-on-one debates with people, I'm resolving to think, not just about why I'm right and they're wrong, but about what kind of argument is likely to persuade them."

I think I try to do those things as much as I can anyway, but I haven't read the book. (Not that I'm not tempted to.) I guess what I mean is, didn't you think those things were good ideas before you read the book? I would guess your answer is along the same lines as eating or drinking unhealthily: you already know it's bad, but you need someone to point out just HOW bad before you actually change your mind about doing it.

John W. Ratcliff

I don't know if the book you read got into it, but the issues of relativistic belief systems, individual reality tunnels, and quantum psychology have a lot to do with this topic.

Most atheists I have encountered presume that there is a single, fixed, 'etic' (external) reality, often fully reducible to entirely materialistic properties. This can never be proven, of course; since each of us filters reality through an individual neurolinguistic grid, we all experience it in unique ways.

While it is fine to talk about the scientific method and how it is the greatest tool we have for discovering what we can best, collectively, share as knowledge about the Universe, it does not, in and of itself, define the limits of that reality.

At a certain point you have to acknowledge that people who have an experience, unique to themselves, have a perfectly rational and logical reason to modify their beliefs to match their experience.

The simplest example of this kind of thing would be a UFO experience. Now, personally I have never seen a UFO and, to my knowledge, UFOs have never been reproducible on demand under laboratory conditions. This would suggest then that UFOs 'are not real'.

However, if I were to have witnessed a UFO (not some distant blurry blob, but let us say a structured craft, a flying saucer if you will, and it had not only been witnessed visually but had been touched, felt, and had left a residual physical effect on the environment), then I would be crazy, illogical, and irrational not to believe in them.

Given this specific example we have two people; one who thinks it would be an irrational and illogical leap to believe in UFOs and another who would feel, justifiably so, just the opposite.

Once we fully embrace the implications of quantum psychology and try, more often, to employ E-Prime in our speech, I believe we will take a major step forward in how we communicate as a species.


(References to Quantum Psychology, etc. come from the works of Robert Anton Wilson.)

Greta Christina

"So what did the book change your mind about?"

Good question. Here's what it changed my mind about:

a) It pointed out a number of specific ways that this process works that I wasn't aware of before -- thus, I hope, making me better able to recognize it.

b) It pointed out a number of specific strategies for dealing with the process that I hadn't known about -- thus, I hope, making me better able to cope with it.

And it also pointed out the degree to which this process is usually completely unconscious. That *was* news to me. I'd tended to think of rationalizations as the more conscious, deliberate variety. Knowing just how powerfully resistant the process is to consciousness and introspection is making me more vigilant about trying to recognize the signs of it. And maybe more importantly, it's making me less critical, and more empathetic, when other people do it.

To some extent, it is, as you suggested, a matter of learning just how harmful this problem is, and just how difficult it is to address. But it's also got many specific, pragmatic pointers on dealing with it that I hadn't had before.



I don't deny that a person can have such an experience; what I deny is that their interpretation of said experience is correct. If something is entirely subjective and unverifiable, then it is useless. That person may wholeheartedly believe in it, but it is still useless.

Also, particularly relevant: if said claim defies what we know about the universe (I'm not talking about stuff that is remotely possible, I'm talking about claims that are simply ridiculous), then we have all the reason in the world to dismiss it as rubbish.

Thus the rub to your little tirade: if it is entirely subjective, and unverifiable, then skeptics are quite justified in dismissing it as nonsense. To do otherwise is to fall prey to all sorts of woo and bullshit.


"A mule with guns" ... and coffee on the keyboard.

So without giving away the whole book, what is an example of the sort of strategy you're referring to for dealing with rationalization in others?

I believe I have gotten pretty good at spotting at least some of my own rationalizations (though I'm not at all averse to having additional strategies). However, by and large my success at helping other people recognize their own rationalization has been pretty hit and miss. I'm not talking about deconversion - just getting someone to see that their premises aren't necessarily givens - in any number of situations. Sometimes I have been very effective, sometimes quite ineffective.

So I intend to pick up the book soon (next time I'm near a bookshop if possible), but for right now I'm curious to understand at least something of what you're talking about.


You had me at the title, actually. It's been delightful to read your thoughts on it; as compelling an endorsement of a book as any I've ever read.

Thank you.


Another great post! I must read that book.

Explaining religion as a defense strategy against cognitive dissonance when beliefs don't match reality is a useful way to go, but only so far. It can explain a lot, but like all theories of religion, it cannot explain everything. It is too complex a phenomenon for that.

I haven't read it, but a book from several decades ago by a chap named Festinger addressed cognitive dissonance when apocalyptic predictions proved unreliable.

One guy on my thesis examination committee, Robert Carroll (now deceased) took up this idea and used it to explain religious change that resulted in rewriting early versions of what are now biblical prophetic books: The earlier "prophecies" did not come true, so the texts were edited, supplemented and reinterpreted to preserve their "reliability" and to project fulfillment onto the future.

I think what Greta Christina is pointing to is a frequent strategy for reinforcing belief, but also for reinforcing communities when they are threatened by "outside" ideas. I'm not sure it is an explanation for religion per se. Religion seems to me to be more of a symbolic projection of community identity and values, one that is constantly evolving and changing while affirming its own timelessness. There are many processes of group boundary definition and maintenance. The bigger the perceived threat, the higher the walls surrounding the group get and the more desperate the need to maintain solidarity even in the face of "reality".

Anyway, Greta Christina, well done -- a thoughtful post to start a cold Canadian morning, and now for more coffee.


John W. Ratcliff


I'm surprised at your response. The theme of this post, I thought, was about facilitating communication between people who hold differing belief systems.

I feel it is important to acknowledge the simple fact that each individual human being forms a belief of reality in a distinctly unique way.

As wonderful as the scientific method is, it does not sustain every belief that an individual holds. Their beliefs are always going to be formed from their personal experience first and the consensus view second.

Simply because someone has an experience that cannot be reproduced on demand under laboratory conditions does not make it any less 'real' to that individual.

I believe it is important to acknowledge this and grant that each individual is going to interpret reality in their own unique, and quite relativistic way.

Given your previous statement it seems to me that you believe strongly that reality is confined to what the scientific method can, and has, revealed about it. I might remind you that reality is under no specific restrictions to abide by these rules.

Last I checked, given our best understanding of modern physics, there is still no such thing as a thing (see 'The Matter Myth' by Gribbin and Davies).

Who collapses the quantum wave function, or, is that an illusion in and of itself?



Very interesting commentary. This is why I so appreciated Sam Harris's talk that he gave to the Atheist Alliance. When I made my break with religion, I found it very hard to embrace atheism because to me it seemed like another mule (your great metaphor) of a different color.


John, that was my response. I'm well aware that people form their subjective interpretation of reality from what they experience, but, like I said, a subjective experience (a misfiring of neurons, an irrational interpretation of an experience, whatever) that is not verifiable is likely not real. Not to say that it *cannot* be real, but it likely is not. There's a reason why there are skeptics, and I feel that skepticism is a necessary thing.

Hard-core skeptic, here.


Based on your recommendation, I picked up this book last night. Reading it, I was struck by how much their theory of cognitive dissonance and rationalization applies to political discourse on the web. In particular, I've noticed that people will trash someone on the "other side" for certain actions/behavior, while ignoring or giving a free pass to someone on their own side who does the same thing. Conversely, when someone attacks someone on the other side for a certain action/behavior, a defender from the other side will invariably answer with "how come you weren't saying the same thing when your guy did it." The attacker will either ignore this, or will come up with some reason why what their guy did wasn't so bad/wasn't the same thing, and the defender will often not actually defend the behavior of their own guy.

Of course, I'm not saying this always happens: there are some people who hold the people on their side to the same standards, and who don't brush off attacks with counter-attacks or weak justifications. But I do see this a lot, and it very neatly follows the patterns laid out in the book.

Greta Christina

David: I totally agree with you. And it's not just discourse on the Web. It's discourse, period. Heck, it's life, period. I have definitely found myself cutting people a lot more slack if I like them, and cutting people no slack at all if I don't. When I'm ragging on someone I don't like for doing something that annoys me, I often have to stop and ask myself, "Would this have bothered me if anyone else had done it?"

Greta Christina

"So without giving away the whole book, what is an example of the sort of strategy you're referring to for dealing with rationalization in others?"

Well, the authors explain this better than I can, and in more depth. But to give an example... Well, let me just quote. Here, they're talking about what to do -- and what not to do -- if you have a relative who's fallen victim to a con artist and is rationalizing themselves into believing that it's not a con:

"Therefore, says Pratkanis, before a victim of a scam will inch back from the precipice, he or she needs to feel respected and supported. Helpful relatives can encourage the person to talk about his or her values and how those values influenced what happened, while they listen uncritically. Instead of irritably asking 'How could you possibly have believed that creep?' you say 'Tell me what appealed to you about the guy that made you believe him.' Con artists take advantage of people's best qualities -- their kindness, politeness, and their desire to honor their commitments, reciprocate a gift, or help a friend. Praising the victim for these worthy values, says Pratkanis, even if they got the person into hot water in this particular situation, will offset feelings of insecurity and incompetence."

They also talk a lot about teaching children that it's okay to make mistakes... and teaching them that making mistakes doesn't reflect on their character. They have a whole section on how in America, we tend to think -- and to teach our children -- that intelligence and ability are natural, inherent character traits. So when we make mistakes, we tend to take it personally, to see it as reflecting on our innermost character. Other cultures see intelligence and ability more as something you acquire through hard work... so they're more likely to see mistakes as part of that hard work, instead of a personal failing. So we can encourage each other (and ourselves) to see mistakes in this more positive light.


You hooked me. I just picked the book up at Borders.

The Ridger

I'll be getting it today.

Christian Bachmann

Strongly agree with your idea that atheists should be skeptical of their own beliefs and behaviours. I am a fan of Karl Popper's idea that hypotheses can never be proven but only falsified. We atheists have the hypothesis that no god exists. Therefore we should take easy opportunities to look for evidence for god(s), trying to falsify our hypothesis, and this requires taking discussion with theists seriously. Of course, falsification will fail. You might consider this a waste of time, but I think it will pay off in the form of better-founded reasoning and a better ability to draw the undecided to our side.

Shaun R. Connell


Thanks for nothing that you are going to be more open about your mistakes and rationalizations. Hopefully both sides of the God debate will learn from your example.

Shaun R. Connell

Oops, I mean, "thanks for noting" -- a single typo can completely switch my meaning around. :P


Thanks, Greta.

Good example, because I see right away how that connects directly to reasons for someone rationalizing in the first place.

I guess if you understand what feelings led to the need to hold to a given rationalization, you can support the feelings independently of the rationalization, making it easier to examine and hence perhaps let go of the rationalization.


Great book -- I read it about a month ago. The thing that made the most impression on me was how we become invested in our choices: how we can rationally compare several quite similar items, but once we choose one we start recasting it in our minds as the BEST choice, oh so much better than those other inferior choices.
Books like this challenge you to be on guard against your own "instinctive" reactions. Another good one is "Don't Believe Everything You Think."


Incidentally, a lot of these ideas also appear in "Stumbling on Happiness" by Daniel Gilbert. His book is more about how we're really bad at predicting the future because of the irrational mistakes we make when remembering, perceiving, or predicting... but it's a lot of these same topics. If you haven't read it, I'd recommend it.


Hmm, yes, you've talked me into going out and getting a copy of this too.

Cognitive dissonance is so insidious, and it's easy to take a holier-than-thou approach as many atheists do (should that be rationaller-than-thou?).

I think admitting your own mistakes and confusions and weaknesses can be a really powerful way of getting others to look at the extent to which they do the same -- often much more effective than shouting at them.

(I particularly like the World Question Centre's collection of people talking about how they've changed their minds.)

Anyway, thanks - must add you to my blogroll as there's lots of interesting stuff here!



Have you read Robert Cialdini's "Influence -- Science and Practice"? That was one of the life-changing books for me, the first that comes to mind when I think of "books everyone should read". The full text is available online. The book you described reminded me very powerfully of the chapter "Commitment and Consistency". Actually, the reason I'm hesitant to buy "Mistakes Were Made" is that in your whole review I haven't found anything they say that Cialdini doesn't... while he actually suggests one technique for fighting rationalization that I find very useful. It goes like this: try earnestly to distance yourself from the decision -- ask yourself the question "If I were put back in time, would I honestly do the same thing again?" and then listen carefully to your first response, the one that comes from your heart of hearts.
It's not a perfect idea, but a damn good one...

Concerning how to speak to someone who's been rationalizing something for a long time, I think you'll enjoy Steven Hassan's book "Releasing the Bonds", about helping people who were brainwashed by destructive cults. Excerpts are available on his site -- a great site about cults and fighting cult mind control.
