A Burden Shared is a Burden Halved: A Fairness-Adjusted Approach to Classification
We study fairness in classification, in which automated decisions are made for individuals belonging to different protected groups.
When individuals are classified, the decision errors can be unfairly concentrated in certain protected groups.
We develop a fairness-adjusted selective inference (FASI) framework and data-driven algorithms that achieve statistical parity in the sense that the false selection rate (FSR) is controlled and equalized among protected groups.
The effectiveness of the algorithms is demonstrated through both simulated and real data.
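To make the notion of an equalized false selection rate concrete, the following is a minimal illustrative sketch, not the paper's FASI algorithm: on simulated calibration data, a separate score threshold is chosen for each protected group so that the group's empirical false selection proportion (the fraction of selected individuals whose true label is 0) stays below a target level α. All function names, the data-generating process, and the greedy threshold rule are assumptions introduced here for illustration.

```python
import numpy as np

def group_thresholds(scores, labels, groups, alpha=0.1):
    """For each group, pick the smallest score threshold such that the
    empirical false selection proportion among the selected individuals
    (those with score >= threshold) is at most alpha.

    Illustrative sketch only; not the FASI procedure from the paper.
    """
    thr = {}
    for g in np.unique(groups):
        s, y = scores[groups == g], labels[groups == g]
        order = np.argsort(-s)      # candidates from highest score down
        t = np.inf                  # default: select no one in this group
        false_sel = 0
        for k, i in enumerate(order, start=1):
            false_sel += (y[i] == 0)
            if false_sel / k <= alpha:
                t = s[i]            # top-k selection keeps FSP <= alpha
        thr[g] = t
    return thr

# Toy data: two groups with different noise levels, so a single shared
# threshold would concentrate errors in the noisier group.
rng = np.random.default_rng(0)
n = 2000
groups = rng.integers(0, 2, size=n)
labels = rng.integers(0, 2, size=n)
noise_scale = np.where(groups == 0, 0.5, 1.5)
scores = 2.0 * labels + rng.normal(0.0, noise_scale, size=n)

thr = group_thresholds(scores, labels, groups, alpha=0.1)
for g, t in sorted(thr.items()):
    sel = (groups == g) & (scores >= t)
    fsp = np.mean(labels[sel] == 0) if sel.sum() > 0 else 0.0
    print(f"group {g}: threshold {t:.3f}, selected {sel.sum()}, FSP {fsp:.3f}")
```

The group-specific thresholds adapt to each group's score distribution, which is the sense in which the error burden is shared rather than concentrated in one group; the paper's FASI framework provides the statistical guarantees that this toy construction only gestures at.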
Bradley Rava, Wenguang Sun, Gareth M. James, Xin Tong