Professor Ali Makhdoumi designed a data acquisition mechanism that maximizes platforms' utility while compensating privacy-sensitive users
The rise of artificial intelligence and machine learning is increasing the demand for data from app users, device owners, firms, consumers, and even patients.
As data-hungry technologies become increasingly efficient, the key question is how to incentivize data sharing while protecting users’ privacy, said Ali Makhdoumi, an associate professor of decision sciences at Duke University’s Fuqua School of Business.
In a new paper to appear in the journal Operations Research, Makhdoumi and co-authors Alireza Fallah of the University of California, Berkeley, Azarakhsh Malekian of the University of Toronto, and Asuman Ozdaglar of the Massachusetts Institute of Technology argue that the solution may lie in designing a mechanism that measures the privacy sensitivity of users and compensates them for relinquishing personal data.
“In many machine learning applications, we treat data as given,” Makhdoumi said. “However, typically data is provided by individuals who have their own privacy concerns. So, the question is: how should we compensate these privacy-concerned users? And before answering this question, we need to understand how one can guarantee privacy.”
In their research, Makhdoumi and colleagues use “differential privacy,” an approach widely adopted in the tech industry that guarantees the output of a data analysis reveals almost nothing about any single individual’s record.
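Differential privacy works by adding carefully calibrated random noise, so that any one person’s data has only a bounded effect on what gets released. The paper’s acquisition mechanism builds on this guarantee and is considerably more elaborate; as a rough, self-contained illustration (not drawn from the paper, with function names and data that are purely hypothetical), the sketch below shows the standard Laplace mechanism, a textbook way to achieve epsilon-differential privacy for a numeric query:

```python
import numpy as np

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=None):
    """Release `true_answer` with epsilon-differential privacy.

    Adds Laplace(0, sensitivity / epsilon) noise: for any two datasets
    differing in one person's record, the probability of any output
    changes by at most a factor of exp(epsilon). Smaller epsilon means
    stronger privacy but a noisier answer.
    """
    if rng is None:
        rng = np.random.default_rng()
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Illustrative example: privately release the mean of user values known
# to lie in [0, 1]. Swapping one user's value moves the mean by at most
# 1/n, so the sensitivity of the mean query is 1/n.
data = np.array([0.2, 0.9, 0.4, 0.7])
n = len(data)
private_mean = laplace_mechanism(data.mean(), sensitivity=1.0 / n, epsilon=0.5)
print(f"true mean = {data.mean():.3f}, private release = {private_mean:.3f}")
```

The privacy parameter epsilon is what a mechanism like the one in the paper can price: users who are more privacy-sensitive effectively demand a smaller epsilon, and thus more compensation, for the same data.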
[This article has been reproduced with permission from Duke University's Fuqua School of Business. This piece originally appeared on Duke Fuqua Insights]