Measuring and Addressing Indexical Bias in Information Retrieval

Caleb Ziems, William Held, Jane Dwivedi-Yu, Diyi Yang


Abstract
Information Retrieval (IR) systems are designed to deliver relevant content, but traditional systems may not optimize rankings for fairness, neutrality, or the balance of ideas. Consequently, IR can often introduce indexical biases, or biases in the positional order of documents. Although indexical bias can demonstrably affect people’s opinions, voting patterns, and other behaviors, these issues remain understudied, as the field lacks reliable metrics and procedures for automatically measuring indexical bias. Towards this end, we introduce the PAIR framework, which supports automatic bias audits for ranked documents or entire IR systems. After introducing DUO, the first general-purpose automatic bias metric, we run an extensive evaluation of 8 IR systems on a new corpus of 32k synthetic and 4.7k natural documents, with 4k queries spanning 1.4k controversial issue topics. A human behavioral study validates our approach, showing that our bias metric can help predict when and how indexical bias will shift a reader’s opinion.
Anthology ID:
2024.findings-acl.763
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, André Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12860–12877
URL:
https://aclanthology.org/2024.findings-acl.763
DOI:
10.18653/v1/2024.findings-acl.763
Cite (ACL):
Caleb Ziems, William Held, Jane Dwivedi-Yu, and Diyi Yang. 2024. Measuring and Addressing Indexical Bias in Information Retrieval. In Findings of the Association for Computational Linguistics: ACL 2024, pages 12860–12877, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Measuring and Addressing Indexical Bias in Information Retrieval (Ziems et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.763.pdf