Public perception of accuracy-fairness trade-offs in algorithmic decisions in the United States.


Authors: Neil Brigden, Mehdi Mourali, Dallas Novakowski, Ruth Pogacar

Language: English

Classification: 201.4 General classes of religion

Publication: United States: PLoS ONE, 2025


Collection: NCBI

ID: 711330

Abstract: The naive approach to preventing discrimination in algorithmic decision-making is to exclude protected attributes from the model's inputs. This approach, known as "equal treatment," aims to treat all individuals identically regardless of their demographic characteristics, yet it can still produce unequal impacts across groups. Alternative notions of fairness have recently been proposed to reduce unequal impact, but these approaches may require sacrificing predictive accuracy. The present research investigates public attitudes toward these trade-offs in the United States: when are individuals more likely to support equal treatment algorithms (ETAs), characterized by higher predictive accuracy, and when do they prefer equal impact algorithms (EIAs), which reduce performance gaps between groups? A randomized conjoint experiment and a follow-up choice experiment revealed that support for EIAs decreased sharply as their accuracy gap grew, although impact parity was prioritized more when ETAs produced large outcome discrepancies. Preferences also polarized along partisan lines, with Democrats favoring impact parity over accuracy maximization and Republicans displaying the reverse preference. Gender and social justice orientations likewise significantly predicted EIA support. Overall, the findings demonstrate multidimensional drivers of attitudes toward algorithmic fairness, underscoring divisions around equality versus equity principles. Achieving standards for fair AI will require reconciling conflicting human values through good governance.
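To make the ETA/EIA distinction concrete, the following minimal Python sketch illustrates the trade-off the abstract describes. The toy data, threshold values, and names (DATA, evaluate, decide) are assumptions for illustration only and do not come from the paper: the equal treatment rule ignores the protected attribute entirely, while the equal impact rule uses group-specific thresholds to equalize positive-decision rates at a cost in accuracy.

# Illustrative sketch (hypothetical data and thresholds, not from the paper).
# Toy dataset of (score, group, true_label). Base rates differ by group,
# so a single accurate threshold still produces unequal impact.
DATA = [
    (0.9, "A", 1), (0.8, "A", 1), (0.7, "A", 1), (0.2, "A", 0),
    (0.8, "B", 1), (0.3, "B", 0), (0.2, "B", 0), (0.1, "B", 0),
]

def evaluate(decide):
    """Return (overall accuracy, per-group positive-decision rates)."""
    correct = 0
    positives = {"A": 0, "B": 0}
    counts = {"A": 0, "B": 0}
    for score, group, label in DATA:
        decision = decide(score, group)
        correct += int(decision == label)
        positives[group] += decision
        counts[group] += 1
    rates = {g: positives[g] / counts[g] for g in counts}
    return correct / len(DATA), rates

# ETA: the protected attribute is ignored; one threshold for everyone.
eta = evaluate(lambda score, group: int(score >= 0.5))

# EIA: group-specific thresholds chosen so positive-decision rates match.
eia = evaluate(lambda score, group:
               int(score >= (0.75 if group == "A" else 0.25)))

print("ETA: accuracy %.2f, positive rates %s" % eta)
print("EIA: accuracy %.2f, positive rates %s" % eia)

On this toy data the ETA is perfectly accurate (1.00) but selects group A three times as often as group B (0.75 vs. 0.25), while the EIA equalizes selection rates at 0.50 each and drops to 0.75 accuracy. This is exactly the accuracy-for-parity trade-off whose public acceptability the study measures.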
