Referral Augmentation for Zero-Shot Information Retrieval

Michael Tang, Shunyu Yao, John Yang, Karthik Narasimhan

Abstract
We propose Referral-Augmented Retrieval (RAR), a simple technique that concatenates document indices with referrals: text from other documents that cite or link to the given document. We find that RAR provides significant performance gains for tasks across paper retrieval, entity retrieval, and open-domain question-answering in both zero-shot and in-domain (e.g., fine-tuned) settings. We examine how RAR provides especially strong improvements on more structured tasks, and can greatly outperform generative text expansion techniques such as DocT5Query and Query2Doc, with a 37% and 21% absolute improvement on ACL paper retrieval, respectively. We also compare three ways to aggregate referrals for RAR. Overall, we believe RAR can help revive and re-contextualize the classic information retrieval idea of using anchor texts to improve the representations of documents in a wide variety of corpuses in the age of neural retrieval.
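To make the idea concrete, here is a minimal sketch (not code from the paper) of referral augmentation over a toy corpus: each document's text is concatenated with referral sentences drawn from documents that cite it, and a standard BM25 index is built over the augmented corpus. The dictionary layout, the example documents, the augment helper, and the use of the open-source rank_bm25 package are all assumptions made for illustration.

```python
# Minimal RAR-style sketch, assuming a toy corpus and the rank_bm25 package;
# the paper's own pipeline, data formats, and retrievers differ.
from rank_bm25 import BM25Okapi

# Hypothetical corpus: each document carries its own text plus "referrals",
# i.e., sentences from other documents that cite or link to it.
corpus = {
    "doc_a": {
        "text": "We introduce a graph-based method for entity linking.",
        "referrals": [
            "Doc A proposes a strong graph-based entity linker.",
            "Following Doc A, we adopt graph features for linking.",
        ],
    },
    "doc_b": {
        "text": "A survey of dense retrieval with neural encoders.",
        "referrals": ["Doc B surveys dense retrieval methods."],
    },
}

def augment(doc):
    # Concatenation-style aggregation: append all referral texts to the document.
    return doc["text"] + " " + " ".join(doc["referrals"])

doc_ids = list(corpus)
augmented_docs = [augment(corpus[d]) for d in doc_ids]

# Build a BM25 index over the augmented documents (whitespace tokenization).
tokenized = [d.lower().split() for d in augmented_docs]
bm25 = BM25Okapi(tokenized)

query = "graph-based entity linker"
scores = bm25.get_scores(query.lower().split())
ranked = sorted(zip(doc_ids, scores), key=lambda x: x[1], reverse=True)
print(ranked)  # referral wording helps match query phrasing absent from doc_a itself
```

Plain concatenation is only one possible aggregation scheme; the paper compares three ways of aggregating referrals, and the same indexing step applies to neural retrievers as well as BM25.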
Anthology ID: 2024.findings-acl.798
Volume: Findings of the Association for Computational Linguistics: ACL 2024
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 13452–13461
URL: https://aclanthology.org/2024.findings-acl.798
DOI: 10.18653/v1/2024.findings-acl.798
Cite (ACL): Michael Tang, Shunyu Yao, John Yang, and Karthik Narasimhan. 2024. Referral Augmentation for Zero-Shot Information Retrieval. In Findings of the Association for Computational Linguistics: ACL 2024, pages 13452–13461, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Referral Augmentation for Zero-Shot Information Retrieval (Tang et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-acl.798.pdf