There are many such examples of what Bazerman and Tenbrunsel would argue are unintentional ethical failings. People fall prey to self-serving bias: an accountant whose future business depends on maintaining the approval of the companies he’s meant to be auditing is genuinely more likely to believe his clients’ books are in order. We discriminate unconsciously against those who aren’t like us, passing them over for promotion or low-balling them in negotiations. And even when we lie, cheat, or steal for personal gain, we often disengage, at least temporarily, from the set of values that would normally lead us to look down upon those who lie, cheat, and steal.
These acts are all done, by and large, unthinkingly. In the terminology of Nobel laureate Daniel Kahneman, they’re processed automatically by our System 1 thinking—that is, the thinking that’s driven by intuition and emotion. If we could only engage the System 2 part of our brain, which reasons logically through decisions, with full appreciation of the many biases that plague our intuitions and instincts, we might behave differently. So a first step is at least equipping business school students (also future lawyers, doctors, accountants, and probably everyone else) with a basic understanding of our psychological frailties and vulnerabilities—greater self-knowledge is at least a start toward a solution.
However, mere knowledge of our flaws isn’t necessarily enough to stave off unconscious temptation. As Kahneman notes in his best-seller, Thinking, Fast and Slow, “My intuitive thinking is just as prone to [cognitive errors] as it was before I made a study of these issues.” No one, Kahneman and Bazerman included, is immune to ethical blind spots.