AI is powerful, but is it fair? This new approach levels the playing field

In the rush to adopt AI, many organizations are overlooking a critical issue: fairness. Ensuring AI systems make decisions that don't unfairly disadvantage individuals or groups is crucial.

Published: Sep 11, 2024 12:04:09 PM IST
Updated: Sep 11, 2024 12:04:41 PM IST

Image: Shutterstock

Generative AI is transforming businesses, offering unprecedented efficiency and productivity gains. This technology automates repetitive tasks, streamlines processes, and analyzes vast amounts of data faster than ever before. As a result, businesses are seeing dramatic improvements in their operations and bottom lines.

But in the rush to adopt AI, many organizations are overlooking a critical issue: fairness. Ensuring AI systems make decisions that don't unfairly disadvantage individuals or groups is crucial.

Ignoring this can introduce unintended biases, worsen inequalities and erode trust. Imagine an AI system that consistently assigns the best shifts or most desirable tasks to certain employees: this could create a hidden layer of workplace discrimination.

Darden Professor Rupert Freeman and his colleagues are addressing this challenge with a new concept called "order symmetry." Their new paper, "Order Symmetry: A New Fairness Criterion for Assignment Mechanisms," proposes a fresh approach to ensuring AI allocates tasks and resources fairly.

Order Symmetry Levels the Playing Field

“Order symmetry means that no one is treated inherently better by the rule than anyone else,” says Freeman, an Assistant Professor of Business Administration. “This ensures that no one is always getting the best items, making the process fair for everyone.”

Unlike other fairness concepts, such as anonymity (treating everyone the same) or envy-freeness (no one prefers someone else's allocation to their own), order symmetry can be achieved without randomizing the allocation process.

This idea isn’t just theory; it has real-world uses in business. Order symmetry helps make job assignments fair, so no one employee always gets the short end of the stick. As companies use more AI in their operations, especially for things such as assigning tasks or resources, making sure these systems are fair becomes crucial. Freeman emphasizes that organizations need to focus on making unbiased decisions and creating fair outcomes for everyone. 

Freeman’s paper, co-authored with Geoffrey Pritchard and Mark Wilson, shows how order symmetry can be used to pick the fairest way to allocate resources. They look at different methods, including one called Top Trading Cycles (TTC). In this system, everyone ranks their choices for available resources. For example, in a team, each worker lists which vendor they would like to work with on a project. 

“You start by making a preliminary assignment of vendors to workers,” Freeman explains. “Then, based on the listed preferences, the algorithm automatically facilitates trades among groups of workers who would mutually benefit. After making a trade, the affected workers and vendors are removed from the pool and the process is repeated until everyone has a match.”
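For readers who want to see the mechanics, the trading loop Freeman describes can be sketched in a few lines of Python. This is a minimal illustration of the Top Trading Cycles idea, assuming each worker submits a strict ranking over vendors and starts from a preliminary assignment; the function name and data layout here are illustrative, not taken from the paper.

```python
def top_trading_cycles(preferences, initial_assignment):
    """Trade along cycles of mutual benefit until every worker is matched.

    preferences: dict mapping each worker to a list of vendors, best first
                 (assumes every worker ranks every vendor).
    initial_assignment: dict mapping each worker to the vendor they start with.
    Returns the final worker -> vendor assignment.
    """
    remaining = set(preferences)        # workers still in the pool
    holding = dict(initial_assignment)  # the preliminary assignment
    final = {}

    while remaining:
        # Each remaining worker "points at" whoever currently holds their
        # favorite vendor among those still in the pool.
        owner = {holding[w]: w for w in remaining}
        points_to = {
            w: owner[next(v for v in preferences[w] if v in owner)]
            for w in remaining
        }

        # Following the pointers from any worker must eventually close a cycle
        # (possibly a worker pointing at themselves).
        path, current = [], next(iter(remaining))
        while current not in path:
            path.append(current)
            current = points_to[current]
        cycle = path[path.index(current):]

        # Workers in the cycle trade: each receives the vendor held by the
        # worker they were pointing at, then the matched workers and vendors
        # leave the pool and the process repeats.
        for w in cycle:
            final[w] = holding[points_to[w]]
        remaining -= set(cycle)

    return final


# Tiny illustration: ana and ben each prefer the other's vendor, so they swap;
# cal keeps v3. Output: ana gets v2, ben gets v1, cal gets v3.
prefs = {"ana": ["v2", "v1", "v3"], "ben": ["v1", "v2", "v3"], "cal": ["v1", "v3", "v2"]}
start = {"ana": "v1", "ben": "v2", "cal": "v3"}
print(top_trading_cycles(prefs, start))
```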


Keeping People Honest

TTC achieves order symmetry by treating all participants equally, regardless of the order in which they are considered. It is also strategy-proof, meaning participants can't improve their outcomes by misreporting their preferences; being truthful is always their best strategy, so no one has an incentive to be dishonest.

TTC also ensures “Pareto efficiency,” an economic state in which resources are distributed so that no one can be made better off without making someone else worse off. In this way, the mechanism makes the most of the resources available.

The popular Serial Dictatorship (SD) rule is an example of a system that can create unfairness. Participants are ranked beforehand and pick their favorite remaining option one by one. It's like being last in line at a buffet: the first person gets the best choices, while those at the end are stuck with whatever's left over.
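To make the contrast concrete, SD fits in a few lines. The sketch below is a hypothetical illustration that assumes everyone ranks every item; whoever appears first in the fixed picking order is guaranteed their top choice, which is exactly the built-in advantage that order symmetry rules out.

```python
def serial_dictatorship(preferences, picking_order):
    """Each participant, in a fixed predetermined order, takes their favorite
    remaining item. Assumes every participant ranks every item.
    The first picker always gets their top choice; the last gets the leftovers.
    """
    available = {item for prefs in preferences.values() for item in prefs}
    assignment = {}
    for person in picking_order:
        # Take the highest-ranked item that is still unclaimed.
        choice = next(item for item in preferences[person] if item in available)
        assignment[person] = choice
        available.remove(choice)
    return assignment
```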

“This predetermined order creates an intrinsic inequality whereby some participants can reasonably claim that they are treated unfairly, as the mechanism is fundamentally based on a hierarchy of choices,” Freeman says.


Freeman’s research concluded that an order symmetry perspective could also be used to enhance the fairness of other resource allocation methods, such as the classic “Boston method,” which uses a predetermined tie-breaking rule when resources are scarce. In the usual implementation, the same tie-breaking rule is used in multiple places in the algorithm.

“The basic idea of order symmetry is that no participant should be given an advantage by the algorithm,” he explains. “Breaking ties in the same way throughout the algorithm violates this principle. Instead, the order symmetry lens suggests that we should use different tie-breaking orders at different points in the algorithm. This is a very simple change, but one that can lead to significantly enhanced fairness.”
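The paper's analysis is more formal than this, but the flavor of the suggestion can be sketched: below is a simplified Boston-style round-based assignment (capacities only, no priorities) that rotates through a list of tie-breaking orders, one per round, instead of reusing a single fixed order throughout. The names and the rotation scheme are assumptions for illustration, not the authors' exact construction.

```python
from itertools import cycle

def boston_rounds(preferences, capacities, tie_break_orders):
    """Simplified Boston-style assignment sketch (capacities only, no priorities).

    In round k, every unmatched worker applies to the k-th vendor on their list,
    and each vendor accepts applicants up to its remaining capacity. Ties among
    a vendor's applicants are broken with a different ordering in each round
    (the order-symmetry suggestion) rather than one fixed order throughout.
    Assumes each tie-breaking order lists every worker.
    """
    unmatched = set(preferences)
    assignment = {}
    remaining = dict(capacities)
    orders = cycle(tie_break_orders)  # a (possibly repeating) order per round

    for rnd in range(max(len(p) for p in preferences.values())):
        if not unmatched:
            break
        order = next(orders)
        # Group this round's applicants by the vendor they apply to.
        applicants = {}
        for worker in sorted(unmatched):
            if rnd < len(preferences[worker]):
                applicants.setdefault(preferences[worker][rnd], []).append(worker)
        for vendor, group in applicants.items():
            group.sort(key=order.index)           # this round's tie-break order
            accepted = group[: remaining.get(vendor, 0)]
            for worker in accepted:
                assignment[worker] = vendor
                unmatched.remove(worker)
            remaining[vendor] = remaining.get(vendor, 0) - len(accepted)
    return assignment
```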

The Business Case for Fair Allocation

Companies are realizing that as they use AI to boost efficiency, they need to build fairness into these systems from the start. Ideas like order symmetry help ensure AI doesn't play favorites or discriminate against anyone.

“By integrating such fairness principles into AI algorithms from the very beginning, organizations can ensure that the resulting systems are not only efficient but also equitable and just,” says Freeman. “This approach helps to prevent biases and unfair treatment that could arise if fairness is only considered as an afterthought.”


Using fair AI practices can improve how companies make decisions, especially in areas such as hiring and allocating resources, notes Freeman. “By prioritizing fairness, organizations can mitigate risks associated with biased decision-making, thereby safeguarding their reputation,” he adds.

In the end, fairness in AI isn’t just about doing the right thing—it’s good for business too.

When companies use AI systems that consistently produce fair results, they build trust with their employees and customers. This trust leads to stronger loyalty and can set a business apart as a leader in responsible technology use. As AI becomes more prevalent in the workplace, companies that prioritize fairness will likely find themselves with a significant competitive advantage.

[This article has been reproduced with permission from the University of Virginia's Darden School of Business. This piece originally appeared on Darden Ideas to Action.]