Saturday, October 24, 2015

HT Mockingbird--"You're not as virtuous as you think you are", a long set of excerpts

https://www.washingtonpost.com/opinions/youre-not-as-virtuous-as-you-think/2015/10/15/fec227c4-66b4-11e5-9ef3-fde182507eac_story.html
...
When I ask students whether, as participants, they would have had the courage to stop administering shocks, at least two thirds raise their hands, even though only one third of Milgram’s subjects refused. I’ve come to refer to this gap between how people believe they would behave and how they actually behave as “moral overconfidence.” In the lab, in the classroom and beyond, we tend to be less virtuous than we think we are. And a little moral humility could benefit us all.

Moral overconfidence is on display in politics, in business, in sports — really, in all aspects of life. There are political candidates who say they won’t use attack ads until, late in the race, they’re behind in the polls and under pressure from donors and advisers, and their ads become increasingly negative. There are chief executives who come in promising to build a business for the long term but then condone questionable accounting gimmickry to satisfy short-term market demands. There are baseball players who shun the use of steroids until they age past their peak performance and start to look for something to slow the decline. These people may be condemned as hypocrites. But they aren’t necessarily bad actors. Often, they’ve overestimated their inherent morality and underestimated the influence of situational factors.


Moral overconfidence is in line with what studies find to be our generally inflated view of ourselves. We rate ourselves as above-average drivers, investors and employees, even though math dictates that can’t be true for all of us. We also tend to believe we are less likely than the typical person to exhibit negative qualities and to experience negative life events: to get divorced, become depressed or have a heart attack. [emphasis added]

In some ways, this cognitive bias is useful. We’re generally better served by being overconfident and optimistic than by lacking confidence or being too pessimistic. Positive illusions have been shown to promote happiness, caring, productivity and resilience. As psychologists Shelley Taylor and Jonathon Brown have written, “These illusions help make each individual’s world a warmer and more active and beneficent place in which to live.”

But overconfidence can lead us astray. We may ignore or explain away evidence that runs counter to our established view of ourselves, maintaining faith in our virtue even as our actions indicate otherwise. We may forge ahead without pausing to reflect on the ethics of our decisions. We may be unprepared for, and ultimately overwhelmed by, the pressures of the situation. Afterward, we may offer variations on the excuse: “I was just doing what the situation demanded.”


The gap between how we’d expect ourselves to behave and how we actually behave tends to be most evident in high-pressure situations, when there is some inherent ambiguity, when there are competing claims on our sense of right and wrong, and when our moral transgressions are incremental, taking us down a slippery slope. [emphasis added]

...
We would see fewer headlines about scandal and malfeasance, and we could get our actions to better match our expectations, if we tempered our moral overconfidence with some moral humility. When we recognize that the vast majority of us overestimate our ability to do the right thing, we can take constructive steps to limit our fallibility and reduce the odds of bad behavior.


One way to instill moral humility is to reflect on cases of moral transgression. We should be cautious about labeling people as evil, sadistic or predatory. Of course, bad people who deliberately do bad things are out there. But we should be attuned to how situational factors affect generally good people who want to do the right thing.

Research shows that when we are under extreme time pressure, we are more likely to behave unethically. When we operate in isolation, we are more likely to break rules. When incentives are very steep (we get a big reward if we reach a goal, but much less if we don’t), we are more likely to try to achieve them by hook or by crook.

I teach a case about an incentive program that Sears Auto Centers had in the 1990s. The company began offering mechanics and managers big payments if they met certain monthly goals — for instance, by doing a certain number of brake jobs. To make their numbers, managers and mechanics began diagnosing problems where none existed and making unnecessary repairs. At first, employees did this sporadically and only when it was absolutely necessary to make quota, but soon they were doing unneeded brake jobs on many cars. They may not have set out to cheat customers, but that’s what they ended up doing.

Along with studying moral transgression, we should celebrate people who do the right thing when pressured to do wrong. These would include whistleblowers such as Jeffrey Wigand of the tobacco industry and Sherron Watkins of Enron. But we also can look to the civil rights movement, the feminist movement and the gay rights movement, among others, to find people who used their ingenuity and took great risks to defy conventions or authorities they considered unjust.
It's popular enough for people on the internet to type "LOL don't drink the kool-aid". Yes, well, welcome to the kool-aid drinking club, fellow human.

Nobody thinks they're going to drink the kool-aid, even after it turns out they've chugged down a gallon of it. If anything, ten years connected to Mars Hill has persuaded me that the more certain you are that you won't drink the kool-aid, the more you have probably already had and the more you will probably drink. Doubting the moral compass of others isn't as central to "not drinking the kool-aid" as doubting the rightness of your own moral compass.  In Christian terms that's recognizing that you, yes you, have the capacity to sin, and that sin is not just about all the bad stuff you knowingly do on purpose because you think you can get away with it--it's also about the terrible consequences of the stuff you think is okay to do because your heart's in the right place and, in any case, it's everyone else who needs to fall in line. Whatever shortcut I'm about to take is worth it, the thinking goes, because of the results that will accrue.

Americans like to think they would not be the sheeple in the Milgram experiment, but perhaps you are; you just haven't found the cause or person for which you're willing to be one.
