Thursday, April 12, 2012

blogging and the halo effect

As heuristics go, one of the ones we use to get through the day has the handy moniker "the halo effect". Daniel Kahneman, in Thinking, Fast and Slow, describes it as a process of exaggerated emotional coherence: we extrapolate from incomplete data and build massive narratives on a few details observed in a short interaction.

Kahneman describes simply but vividly how this propensity to impute to the whole from a sample led him to some disturbing discoveries about his own teaching and grading. He used to grade in the usual way, working through each student's test from start to finish. When he began to question whether this truly assessed his students' answers, he changed his approach: instead of grading test by test, student by student, he graded each test question by question across all students. He began to see that a student who aced the first question might immediately crash and burn on the second. He had been grading under the halo effect, a realization that deeply troubled him.

Kahneman describes being distraught and bewildered by what the new grading approach revealed, but he also concluded that this was good for him: it compelled him to rethink how to grade tests. Readers of the book will know his shorthand of System 1 and System 2. System 1 forms immediate impressions and guesstimates, and is generally remarkably accurate. System 2 is, to simplify things a bit, the formally analytical set of cognitive processes; it can definitely analyze things, but it is frequently tempted to go along with whatever sensible, coherent explanation System 1 has already come up with. Decorrelating error, as Kahneman puts it, is foundational to making more accurate assessments in testing.
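To make "decorrelating error" a little more concrete, here is a minimal sketch of the grading example; it's my own toy model with invented numbers, not anything from Kahneman's book. If one early impression of a student colors every answer the same way, the errors are correlated and never cancel; if each question is judged on its own, the errors are independent and tend to average out.

```python
import random

random.seed(42)

NUM_STUDENTS = 1000
NUM_QUESTIONS = 10
TRUE_SCORE = 70.0   # every student's true per-question score (assumed)
NOISE = 10.0        # spread of the grader's error (assumed)

def average_grading_error(shared_halo: bool) -> float:
    """Return the average absolute error in students' final grades."""
    total_error = 0.0
    for _ in range(NUM_STUDENTS):
        if shared_halo:
            # Student-by-student grading: one impression, formed early,
            # colors every question the same way (correlated error).
            halo = random.gauss(0, NOISE)
            errors = [halo for _ in range(NUM_QUESTIONS)]
        else:
            # Question-by-question grading: each question is judged on
            # its own, so errors are independent and tend to cancel.
            errors = [random.gauss(0, NOISE) for _ in range(NUM_QUESTIONS)]
        final = TRUE_SCORE + sum(errors) / NUM_QUESTIONS
        total_error += abs(final - TRUE_SCORE)
    return total_error / NUM_STUDENTS

print(f"avg error, student-by-student (halo): {average_grading_error(True):.2f}")
print(f"avg error, question-by-question:      {average_grading_error(False):.2f}")
```

In this toy model the shared-halo error never shrinks no matter how many questions there are, while the independent errors shrink roughly with the square root of the number of questions.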

The halo effect can be offset by group observation but, famously, group observation itself can be marred by the halo effect. It's easy for bloggers and self-styled iconoclasts to claim that the sheeple are easily influenced. Yes, but so are self-styled iconoclasts and bloggers (including those bloggers who complain about bloggers). The halo effect permits a grossly simplistic assessment of whole categories of other people, and comprehensive praise or blame gets imputed through the most tangential of associations.

Kahneman notes in Thinking, Fast and Slow that even unbiased witnesses can still bias other witnesses. Is the halo effect really so powerful that unbiased witnesses can bias each other? Yep. But how? Kahneman explains with a simple boardroom procedure: if people are asked to raise their hands to make a corporate decision, compliance is all but guaranteed by herd behavior. By contrast, a secret ballot, or having everyone write out their thoughts individually before a key decision, drastically reduces the impact of the halo effect on a boardroom decision. As Kahneman spells it out: "The standard practice of open discussion gives too much weight to the opinions of those who speak early and assertively, causing others to line up behind them." (Thinking, Fast and Slow, Daniel Kahneman, page 85)
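A crude way to see why the voting procedure matters, again as my own toy simulation with invented numbers rather than anything from the book: suppose each of nine board members' private judgment is right 70% of the time. With a secret ballot everyone votes independently and the errors tend to cancel; with a sequential show of hands, members who see an early visible majority line up behind it, so one or two wrong votes at the start can carry the whole room.

```python
import random

random.seed(7)

BOARD_SIZE = 9
SIGNAL_ACCURACY = 0.7   # assumed chance each member's own judgment is right
TRIALS = 10000

def private_signal() -> bool:
    """True = this member's own judgment favors the correct option."""
    return random.random() < SIGNAL_ACCURACY

def secret_ballot() -> bool:
    """Everyone votes their own judgment at once; majority wins."""
    votes = [private_signal() for _ in range(BOARD_SIZE)]
    return sum(votes) > BOARD_SIZE / 2

def show_of_hands() -> bool:
    """Members vote in sequence and defer to a visible majority."""
    votes = []
    for _ in range(BOARD_SIZE):
        lead = sum(votes) - (len(votes) - sum(votes))  # correct minus incorrect so far
        if lead >= 2:
            votes.append(True)    # line up behind the early majority
        elif lead <= -2:
            votes.append(False)   # line up behind it even when it's wrong
        else:
            votes.append(private_signal())
    return sum(votes) > BOARD_SIZE / 2

for name, vote in (("secret ballot", secret_ballot), ("show of hands", show_of_hands)):
    correct = sum(vote() for _ in range(TRIALS))
    print(f"{name}: group right {100 * correct / TRIALS:.1f}% of the time")
```

In this toy model the secret ballot comes out right noticeably more often, and the show-of-hands runs tend to end up looking unanimous, which matters for the next point.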

So what might a real world application of this look like?  Well, here's at least a possible case study:

http://joyfulexiles.files.wordpress.com/2012/03/scott-thomas-10-10-2007-no-show-for-trial.pdf

Scott Thomas wrote to Paul Petry: "The elders will submit their vote by show of hands."
This show of hands was going to be, as Jamie Munson put it, "a final and binding decision."

http://joyfulexiles.files.wordpress.com/2012/03/10-16-2007-statement-of-the-elders.pdf

"It was unanimously decided ... ." shows us that a show of hands was a show of all hands.  Not too surprising why Petry asked if the vote was going to be by way of secret ballot, is it?  As Daniel Kahneman puts it in Thinking Fast and Slow the "wisdom of crowds" only works if there is no pervasive bias or steering. The trouble is that even knowing about cognitive biases does not mean you transcend them. Knowledge does not always bring with it the power to overcome elements of yourself. Sometimes knowledge merely provides a label for a bad habit of thought you can't shake.

Kahneman describes a great failure in how our minds work: our brains so easily take the path of least resistance that what you see is all there is. The halo effect has its way. But the halo effect works in both directions and can lead to conspiratorial mind games in which not only the best but also the worst can be assumed on the basis of a small snapshot.

Kahneman explains that the shortcut-seeking habits of the human mind are such that if only one side is presented, the brain will favor that one-sided presentation even when a person is fully capable of imagining another perspective. Why? Because humans prefer a narrative with emotional coherence over a pile of emotionally, intellectually, or morally ambiguous details. Kahneman takes a great deal of time (his whole book) to explain that this rapid ability to assess and react is normally the most amazing part of how our brains help us deal with the world around us.

The trouble, of course, is that there are critical moments when the cognitive shortcuts that let us swiftly assess and react can terribly mislead us. We become overconfident in the coherence of the narrative we have created, often a narrative created reflexively before we have even analyzed what it is. It then becomes the narrative we live by, whether or not it is actually true. We are often oblivious to the reality that how a subject gets framed will influence how we think about it. Journalists and seasoned interview subjects understand what leading questions are, yet many people do not. What the leading question clarifies is that how a question is even asked can carry a bias, or shape a bias, in the response of the recipient.

Still another problem is what Kahneman describes as base-rate neglect. He uses the example of a set of character traits many associate with librarians that can also be true of farmers; most people will peg someone with these traits as a librarian even though there are far more farmers. This can be applied, with some care, to the realm of blogging. For instance, I may see some anonymous joe remarking that such-and-such bloggers must have a low view of pastors and the pastoral office to criticize this or that pastor. That can be taken as base-rate neglect. Want to know who else can be brutally dismissive of pastors? Pastors. If we compared the number of people who identify as secular, liberal, or atheist to the number who describe themselves as religious and serious about their faith, the atheists would not provide the bigger number. But base-rate neglect might inspire a person to presume that anyone who speaks up against pastors must have some issue with spiritual authority, when that isn't even remotely the case.
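To put rough numbers on the librarian/farmer point, here is a minimal sketch; the counts and rates are mine, invented purely for illustration, not Kahneman's. Even if the stereotypical traits are four times more common among librarians than among farmers, farmers so outnumber librarians that a stranger who fits the stereotype is still more likely to be a farmer.

```python
# Hypothetical population counts, invented for illustration.
FARMERS = 2_000_000
LIBRARIANS = 150_000

# Assumed rates: how often the stereotype fits each group.
P_TRAITS_GIVEN_LIBRARIAN = 0.40
P_TRAITS_GIVEN_FARMER = 0.10

# Expected number of people in each group who fit the stereotype.
trait_librarians = LIBRARIANS * P_TRAITS_GIVEN_LIBRARIAN   #  60,000
trait_farmers = FARMERS * P_TRAITS_GIVEN_FARMER            # 200,000

# Bayes: of everyone who fits the stereotype, what share are librarians?
p_librarian = trait_librarians / (trait_librarians + trait_farmers)
print(f"P(librarian | fits stereotype) = {p_librarian:.2f}")   # ~0.23
```

Base-rate neglect is, in effect, ignoring the denominator: the diagnosticity of the traits gets all the attention and the sheer number of farmers gets none.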

Conversely, people committed to the idea that they must be hold-outs against spiritual abuse may make the same sweeping assessments of those they criticize. The halo effect works in both directions, and the temptation to exaggerate the emotional coherence of an entire group of people remains with us even when we know it is a bias we are liable to. When I was at Mars Hill the halo effect was the whole reason the courtship idiocy had any traction. The right guys had the right halo around them at the right time, and it didn't matter that reality pointed in another direction; most people wanted to believe in the exaggerated emotional coherence the courtship fad provided about the church culture and the individuals who stumped for it. A culture of complicity and duplicity didn't just burst forth in 2007 with firings; it was nascent within other self-congratulatory pious-fraud fads in the culture before that point.

Yet if these things are true, how is it that you or I may soldier on with an assessment we know is grossly simplified? Well, to explore that we'd have to discuss the sunk cost fallacy. We get to points where, even if we know things are not as we originally surmised, we've committed too much of ourselves to the enterprise to back out of it now. The positives still outweigh the negatives enough to soldier on. It would cost too much emotionally and socially to concede I may have simplified or misrepresented things, because I don't want to admit that I was ignorant, let alone that I may have willfully misunderstood things to score a point.

Or if I were a dad I might not feel comfortable conceding, until five years after he's married her, that I was blindsided by a young guy who won the heart of my daughter, because if I let on I had no idea what was going on it would make me look ignorant and out of control and, well, I can't have that now, can I? I get that, but to me it's the same process of self-defending simplification and evasion I've seen in how people defend or attack certain churches on the basis of the assessment of a single person.

Now of course groupthink in a boardroom setting, with brainstorming and a show of hands, will reduce rather than promote creativity; it plays to the biggest and loudest mouths who speak first, because people are that way, even the people who claim otherwise, since those people are often the big loud mouths who speak first and steer the conversation in the boardroom. In many of these cases the sunk cost fallacy kicks in and we have an impulse to keep things as simple as we've made them out to be, because we've already invested too much time, feeling, and effort into building something from that assessment. If it happens to be a house of cards, well, we'll just have to glue the cards together.

There's a pretty good chance, no matter how confidently you think you've put things together, that the reality is not quite as simple as you've been making it out to be. I know this won't stop you from saying it's that simple anyway. As I was saying earlier, knowing what a cognitive bias is doesn't stop you from working within it. Just because I know what a cognitive bias is doesn't mean I'm not affected by it. If there's something to be said for the old proverb "iron sharpens iron", it may be that the ancients were no less aware than we are that there are times when you can't see outside your own prejudices, and that the only person who can is, very simply, someone who isn't you.

The halo effect, whether to damn or to praise, is not just something that happens in the minds of other people. It happens in my head and it happens in yours. Don't kid yourself about whether or not you have biases. You have them. Only fools and liars will suggest otherwise about themselves, probably to steer a conversation where they want it to go. What you can attempt to do is find out whether your biases are keeping you from discovering facts or leading you to judgments that can't withstand further investigation.

1 comment:

Anonymous said...

Good thoughts here, Wenatchee. Thanks for sharing. As I went through my own personal ordeal with Mars Hill and ran things up the food chain (CG Pastor, Campus Pastor, Lead Pastor, Preaching Pastor) with multiple requests for some type of appeal, I kept thinking that at some point someone would be objective enough to call out the unbiblical, unkind, un-Christlike things that were said and done. I thought surely someone along the leadership chain would step in and acknowledge that my situation was all a misunderstanding. What I ultimately determined was:

1) The sunk cost fallacy - it seemed no leaders wanted to get their hands dirty and actually look into the facts. Instead the consistent refrain was that they supported the previous decisions - all while disregarding the specifics that I clearly articulated as problematic.

2) The halo effect is real at MHC. In my situation, because multiple pastors agreed that I was prideful (mind you, there was no black and white, tangible sin ever mentioned), it was considered inarguable. Later, pastors admitted that they predisposed each other to perceive me in this light by "warning" them about me before some of them had ever had an opportunity to form their own opinion through firsthand interaction.

3) The leaders involved in my situation were not renegades, but rather "company men" who were following the guidance given from leadership, whether by explicit training or indirect example.