Think back to recent events when people making unethical decisions grabbed the headlines. How did auditors approve the books of Enron and Lehman Brothers? How did feeder funds sell Bernard Madoff's investments? We would never act as they did, we think. We operate under a higher standard.
But the fact is that while we like to think of ourselves as fair, moral, and lawful, recent science shows us that we are quite capable of committing unethical acts, or approving of the dishonest acts of others, even as we believe we are doing the right thing.
Recognizing why we do this and how we can get out of the trap is the subject of the new book, Blind Spots: Why We Fail to Do What's Right and What to Do about It, by Max H. Bazerman, a professor at Harvard Business School, and Ann E. Tenbrunsel, a professor of business ethics at the University of Notre Dame.
In short, there is a gap between intended and actual behavior, according to the authors. The rapidly developing field of behavioral ethics has described a decision-making process whereby we recognize what we should do—give equal weight to job candidates of all races, for example—but in the end do what we want to do—hire just white candidates.
Such actions are not without consequences. The Challenger space shuttle explosion, steroid use in Major League Baseball, and the financial crash all resulted from unethical decision-making, even though the participants may have believed at the time that they were acting in the right.
We asked Bazerman to discuss some of the ideas behind the book. A book excerpt follows.
Sean Silverthorne: Why did you write this book, and who should read it?
Max Bazerman: Research over the last two decades has documented that good people do bad things without being aware that they are doing anything wrong. Yet ethics training and corporate ethics programs focus on intentional acts. We saw an opportunity to contribute to our understanding of how so many unethical acts occur.
Q: Why don't traditional approaches to thinking about ethics work?
A: Most ethicists define ethics to involve intentional action. Yet, if unethical actions are occurring without intent, we need to solve those problems as well. Our book is an attempt to move in this direction.
Q: What are ethical blind spots, and how do they influence our decisions?
A: There are many. But a few that illustrate the point include having gender or race biases without knowing that you have them, overclaiming credit without meaning to do so, being affected by conflicts of interest, and favoring an in-group—as universities often do when they give preferential treatment to the children of alumni. All of these unethical actions can occur without anyone realizing that they are doing anything wrong.
Q: What is motivational blindness?
A: Motivational blindness is the tendency to not notice the unethical actions of others when it is against our own best interests to notice—such as auditors who fail to notice the faulty accounting practices of their clients, who have the power to fire them if they do notice.
Q: The book is full of examples of people making decisions that they thought were ethical but clearly violated their own standards for ethical behavior—for example, decisions that set the stage for the subprime mortgage crisis. Is there anything in the news more recently that illustrates some of your points?
A: Sure. Despite attempts at auditor reform a decade ago, we see Ernst & Young charged with contributing to the fall of Lehman Brothers. One can certainly ask whether Ernst & Young failed to notice what was wrong with Lehman's books because noticing was not in Ernst & Young's interest. We can tell the same story about the securities rating agencies and their role in the recent financial collapse.
Q: The book is a little down on organizational programs designed to encourage ethical behavior. You say in general they don't work, and in fact can cause unethical behavior. What's the problem?
A: I would say that we see current programs as limited, due to the limited attention given to bounded ethicality—or the ways in which good people do bad things without knowing that they are doing so.
Q: What are some steps individuals and organizations can take to make decisions that are truly in line with their own ethical views?
A: Organizations can monitor how they are creating institutions, structures, and incentives that increase the likelihood of bounded ethicality. When industries allow conflicts of interest to remain, the leaders are responsible for the boundedly unethical actions that follow.
Q: How do you become aware of your blind spots?
A: By looking at the data. If you firmly believe that you want to give women and minorities greater opportunities in your organization, but the data show that you always seem to see the white male as the best candidate, this might provide a hint.
Book excerpt from Blind Spots: Why We Fail to Do What's Right and What to Do about It.
Preparing to Decide: Anticipating the "Want" Self
The "want" self—that part of us which behaves according to self-interest and, often, without regard for moral principles—is silent during the planning stage of a decision but typically emerges and dominates at the time of the decision. Not only will your self-interested motives be more prevalent than you think, but they likely will override whatever "moral" thoughts you have. If you find yourself thinking, "I'd never do that" and "Of course I'll choose the right path," it's likely your planning efforts will fail, and you'll be unprepared for the influence of self-interest at the time of the decision.
One useful way to prepare for the onslaught of the "want" self is to think about the motivations that are likely to influence you at the time you make a decision, as Ann [Tenbrunsel] and her colleagues have demonstrated in their research. In a study based on the sexual harassment research discussed in chapter 4, participants were asked to predict how they would react if a job interviewer asked questions that qualified as sexual harassment. Participants who were induced to think about the motivation they likely would experience at the time of the decision—the desire to get the job—were significantly less likely to predict that they would confront the harasser, and more likely to predict that they would stay silent (just as those in the actual situation did), than were those who were not asked to think about that motivation. As this study suggests, thinking about your motivations at the time of a decision can help bring the "want" self out of hiding during the planning stage and thus promote more accurate predictions.
Narrowing the Gap
To help our negotiation students anticipate the influence of the "want" self on decisions that have an ethical dimension, we ask them to prepare for the very question they hope won't be asked. When preparing for a job negotiation, for example, we encourage them to be ready to field questions about other offers they may have. Otherwise, when a potential employer asks "What's your other salary offer?" an applicant's "want" self might answer "$90,000," when the truthful answer is $70,000. If an applicant has prepared for this type of question, her "should" self will be more assertive during the actual interview, leading her to answer in a way that's in harmony with her ethical principles, yet still strategic: "I'm afraid I'm not comfortable revealing that information."
"You might also precommit to your intended ethical choice by sharing it with an unbiased individual."
Similarly, rehearsing or practicing for an upcoming event, such as a work presentation or an exam, may help you focus on concrete details of the future situation that you might otherwise overlook. In her book Giving Voice to Values, Mary Gentile offers a framework to help managers prepare for difficult ethical decisions by practicing their responses to ethical situations. When you are able to project yourself into a future situation, almost as if you were actually in it, you can better anticipate which motivations will be most powerful and prepare to manage them.
The point of increasing your accuracy in the planning stage of decision making isn't to recognize that you will be influenced by self-interested motives and admit defeat to the "want" self. Rather, it's to arm you with accurate information about your most likely response so that you can engage in proactive strategies to reduce that probability. Knowing that your "want" self will exert undue pressure at the time of the decision and increase the odds that self-interest will dominate can help you use self-control strategies to curb that influence.
One such strategy involves putting in place precommitment devices that bind you to a desired course of action. In one example, Philippine farmers who put their savings in a "lockbox" they could not access saved more money than those who did not, even factoring in the small cost of the lockbox. By eliminating the farmers' ability to spend their money immediately, the lockbox effectively constrained the "want" self. Ann's teaching assistant used a similar precommitment strategy to constrain her "want" self during finals week. Knowing she should study but would be tempted to procrastinate by spending time on Facebook, she had her roommate change her password so that she could not access the social networking site. By doing so, the student constrained her "want" self from acting and allowed her "should" self to flourish. Such precommitment devices explain the popularity of personal trainers at health clubs. By making appointments with a trainer (who might charge up to $100 an hour) under the threat of a cancellation fee, clients precommit to their "should" self, ensuring that they will work out rather than giving in to the strong pull of the "want" self and watching TV instead.
When faced with an ethical dilemma, we can use similar strategies to keep our "want" self from dominating more reasoned decision making. Research on the widespread phenomenon of escalation of commitment—our reluctance to walk away from a chosen course of action—shows that those who publicly commit to a decision in advance are more likely to follow through with the decision than are those who do not make such a commitment. You might also precommit to your intended ethical choice by sharing it with an unbiased individual whose opinion you respect and whom you believe to be highly ethical. In doing so, you can induce escalation of commitment and increase the likelihood that you will make the decision you planned and hoped to make.
This article was provided with permission from Harvard Business School Working Knowledge.