The Darker Side of the Trust Mechanism

A nice bookend to Alasdair Palmer’s report on Daniel Kahneman (see the P3 story) is Chris Mooney’s report on Liu and Ditto.

Chris is trying really hard to squeeze a political spin out of this, but the main story seems to be:

Jonathan Haidt… postulates that our views of what is right and wrong are rooted in gut emotions, which fire rapidly when we encounter certain moral situations or dilemmas — responding far more quickly than our rational thoughts. Thus, we evaluate facts, arguments and new information in a way that is subconsciously guided, or motivated, by our prior moral emotions. What this means — in Haidt’s famed formulation — is that when it comes to evaluating facts that are relevant to our deep-seated morals or beliefs, we don’t act like scientists. Rather, we act like lawyers, contorting the evidence to support our moral argument.

But are we all equally lawyerly? The new paper, by psychologists Brittany Liu and Peter Ditto of the University of California-Irvine, suggests that may not actually be the case.

Liu and Ditto found a strong correlation, across all of the issues, between believing something is morally wrong in all cases — such as the death penalty — and also believing that it has low benefits (e.g., doesn’t deter crime) or high costs (lots of innocent people getting executed).

However, not everyone was equally susceptible to this behavior. Rather, the researchers found three risk factors, so to speak, that seem to worsen the standard human penchant for contorting the facts to one’s moral views. Having a strong moral view about a topic makes one’s inclination toward “moral coherence” worse, as does knowing a lot about the subject (across studies, knowledge simply seems to make us better at maintaining and defending what we already believe), [and the] third risk factor is … political conservatism.

Call me a skeptic on these results. I suspect they are culturally conditioned. But the key point remains: no matter how well formulated your argument, most people will not care.

But I didn’t need psychologists telling me that. I’ve known it since not long after I got onto Usenet in 1989.

We believe what we trust. What comforts us. What our friends want us to believe.

What I don’t know is how to overcome it. Expertise needs a role in governance. Somehow it used to have one. Somehow the iconoclasts of my generation (my g-g-g-generation) ruined that. History will not treat us kindly after all.



  1. Somehow the iconoclasts of my generation (my g-g-g-generation) ruined that.

    Hardly that simple, surely? Nixon was forced to enroll large numbers of scientists into government when he formed the EPA; Dick didn't do that because he was a nice guy — rather, he was pushed into it by a coalition composed in part of dirty smelly hippies and slide-rule-wielding scientist experts working together.

    The ongoing and painful lobotomizing of the government is mostly the outcome of decades of effort by myriad outfits like the Club for Growth.


    Confusion has its cost

    or how about

    So we cheated and we lied
    And we never failed to fail

    (Crosby, Stills and Nash)

  2. If people are only convinced by emotional arguments, then that is what must be used. Fear is the strongest emotion. That is what should be used to persuade people that action is needed on global warming. The problem is that the scientists see fear-mongering as cheating. So long as they are unwilling to overcome their scruples and face up to that unpleasant scientific fact, then "war (we are) all doomed" 🙁

    Cheers, Alastair.

  3. > But I didn’t need psychologists telling me that.

    This has been acknowledged by Art Markman:

    > Now, a lot of times psychology research is criticized for demonstrating things that people already know are true.

    Here's the spoiler of his blogpost:

    > [W]hen you are hot, it gets easier to imagine a world that is suffering the effects of global warming, and that increases your belief in global warming. When it is hot, it may also become easier to think about heat conceptually, but that conceptual ease does not seem to translate into changes in beliefs about global warming.

    So it seems that people tend to trust their most vivid perceptions.

    So we might need people that are hot.

    No, I'm not telling who I'm thinking about right now.

  4. Here's a good one on science thinking (from Tenney's blog* to my eye to P3):

    *found at:
