How to turn down the boil on group conflict

Intergroup conflict can grind office productivity to a halt. Jeffrey Lees discusses how understanding psychological stereotypes can help divided parties compromise.

Published: Feb 24, 2020 02:28:26 PM IST
Updated: Feb 24, 2020 02:30:54 PM IST


Even as polarized political discussion appears to have frozen the possibility of compromise, new research suggests that divided sides can come together on many issues to make decisions.

“Our research finds that inaccurate beliefs really drive behavior and contribute to intergroup conflict,” says Jeffrey Lees, a doctoral candidate in Organizational Behavior and Psychology at Harvard Business School.

In actuality, most people have a wildly inflated sense of just how negative the other side feels, according to a new paper that Lees co-wrote with Harvard University Associate Professor of Psychology Mina Cikara. “If you forecast that no matter what you propose, the other side will hate it, then you are going to say compromise is a waste of time,” Lees says.

The paper, “Inaccurate Group Meta-Perceptions Drive Negative Out-Group Attributions in Competitive Contexts,” was published in November 2019 in the journal Nature Human Behaviour.

We want to compromise
In a series of experiments, Lees and Cikara found that people are far more willing to compromise than they assume, but hesitate to try because they believe those on the other side, and even those within their own group, will refuse to go along. The researchers also found encouraging evidence that this mistrust can be overcome on many issues.

Lees first started considering these dynamics in a business context. “I was thinking about how people inside organizations predict how people outside of the organization perceive it, and how they might get that judgment wrong,” Lees says. “It didn’t take me long to realize how that sort of judgment applies in other contexts.”

He teamed up with Cikara, whose lab has studied how people’s perceptions of others change depending on whether they think of them as individuals or as members of groups. “How we attribute motives to other people becomes distorted when we stop thinking of them as individuals and instead move to a framework of ‘us versus them,’” Cikara says.

In a political context, that can quickly lead to conflict.

“People not only have stereotypes of what other people are like, they also have stereotypes of what other people believe,” Cikara says. “‘They hate us for our freedom,’ or ‘they think we’re liberal snowflakes,’ or ‘they’re doing that to be obstructive,’ or ‘they want to ruin our American way of life.’ But when you actually talk to people about their opinions, almost nobody actually talks like that.”

Lees and Cikara’s experiments found that most people are far less negative than the stereotypes each group harbors about the other. For each experiment, the researchers presented real-world scenarios that advantaged one side or the other, and then asked participants to predict how negatively the other side would react.

For example, one scenario, presented to participants who identified as Democrats, explained that Democrats in a state legislature were considering a change to the committees that draw voting district lines. Committee members were currently appointed by the governor, a Republican; the new proposal would give both parties equal representation.

The researchers then asked participants to predict, on a 100-point scale, how much Republicans would dislike or oppose the measure or consider it politically unacceptable. Responses averaged in the 80s, with the largest cluster at 100.

As negative as possible
“The forecasts were pretty much as negative as possible,” Lees says. In reality, however, Republicans’ responses averaged closer to 50, showing that participants had overestimated how negatively the other side would feel.

Lees and Cikara found similar results for other scenarios, involving changes to the selection of judges, campaign financing, and the renaming of a state highway. (The researchers deliberately stayed away from “hot-button” issues such as gun control and abortion, which might spur too much passion.)

The results were consistent for Democrats and Republicans alike, and held even when the researchers presented only an anonymous “Party A” and “Party B.”

“They are totally insensitive to the scope or impact of the issue,” Cikara says. “They just think the other side is going to be upset about anything.”

Even more interesting, people made the same kinds of inaccurate forecasts about members of their own group, believing that their fellow Democrats or Republicans were angry about a measure even when they themselves were only mildly opposed.

Having such polarized views of both political parties naturally leads to less willingness to negotiate and compromise, Lees says.

“If you are a legislator, you are thinking no one across the aisle or in my own tribe will support compromise, but that’s in fact wrong. Both sides might be okay with compromise, but no one’s willing to propose it because of inaccurate forecasts.”

Overcoming bias to reach cooperation
The news from their experiments wasn’t all bad. When the researchers flipped the script to create cooperative scenarios, study participants were much more accurate in their predictions. For example, in the voting districts scenario, the researchers told participants that Democrats were proposing the change to make the committee fairer, even though a Democrat currently held the governor’s office and stood to lose an advantage from the change.

In that case, Democrats and Republicans alike accurately predicted how both sides would feel.

“Suddenly, people’s forecasts become accurate, which is quite an optimistic finding for cooperation,” Lees says. “If you can actually engender cooperation, people are much more likely to have accurate perceptions that might drive reconciliatory behavior.”

In a final experiment, Lees and Cikara showed that people could change their perceptions when confronted with new information. After making their own predictions about how negative the other side would be, participants were shown the other side’s true level of opposition, which on average was much lower than they had assumed. Afterwards, participants were less inclined to believe the other side was engaging in deliberate obstructionism.

“There’s a lot written about how people are totally insensitive to the truth when told that their beliefs are wrong,” Lees says. “This suggests that’s not the case. People are willing to update their beliefs when they are simply told they are inaccurate.”

Better business outcomes
This finding, which indicates the potential for creating cooperation, carries implications for business as well.

“In the context of teams or negotiations, adopting a competitive mindset can lead to undue pessimism about how others feel,” Lees says. “These inaccurate beliefs can lead to missed business opportunities. But if those contexts are reframed as cooperative, accurately forecasting how someone across the negotiation table might respond to a particular proposal becomes easier.”

That’s good news in a society that often seems to be grappling with intractable partisanship on every issue. While some issues may still present a gulf too wide to bridge, the study shows that there is at least some room for compromise and mutual understanding between the parties, if they can just start talking to each other.

“When you’re not talking about hot-button issues, you shouldn’t be afraid to broach the topic with people who have a different position than you,” Cikara says, “because it turns out you most likely have an inaccurate perception about what they think—and they have the same of you. All it takes is one person to break the cycle.”
 

Michael Blanding is a writer based in the Boston area.

This article was provided with permission from Harvard Business School Working Knowledge.
