Learning Interpretable Legal Case Retrieval via Knowledge-Guided Case Reformulation

Chenlong Deng, Kelong Mao, Zhicheng Dou


Abstract
Legal case retrieval for sourcing similar cases is critical in upholding judicial fairness. Unlike general web search, legal case retrieval involves processing lengthy, complex, and highly specialized legal documents. Existing methods in this domain often overlook the incorporation of legal expert knowledge, which is crucial for accurately understanding and modeling legal cases, leading to unsatisfactory retrieval performance. This paper introduces KELLER, a legal knowledge-guided case reformulation approach based on large language models (LLMs) for effective and interpretable legal case retrieval. By incorporating professional legal knowledge about crimes and law articles, we enable large language models to accurately reformulate the original legal case into concise sub-facts of crimes, which contain the essential information of the case. Extensive experiments on two legal case retrieval benchmarks demonstrate that KELLER achieves superior retrieval performance and robustness on complex legal case queries compared with existing methods.
Anthology ID:
2024.emnlp-main.73
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
1253–1265
URL:
https://aclanthology.org/2024.emnlp-main.73/
DOI:
10.18653/v1/2024.emnlp-main.73
Bibkey:
Cite (ACL):
Chenlong Deng, Kelong Mao, and Zhicheng Dou. 2024. Learning Interpretable Legal Case Retrieval via Knowledge-Guided Case Reformulation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 1253–1265, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Learning Interpretable Legal Case Retrieval via Knowledge-Guided Case Reformulation (Deng et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.73.pdf