Jina-ColBERT-v2: A General-Purpose Multilingual Late Interaction Retriever

Rohan Jha, Bo Wang, Michael Günther, Georgios Mastrapas, Saba Sturua, Isabelle Mohr, Andreas Koukounas, Mohammad Kalim Wang, Nan Wang, Han Xiao


Abstract
Multi-vector dense models, such as ColBERT, have proven highly effective in information retrieval. ColBERT’s late interaction scoring approximates the joint query-document attention seen in cross-encoders while maintaining inference efficiency closer to that of traditional dense retrieval models, thanks to its bi-encoder architecture and recent optimizations in indexing and search. In this paper, we introduce a novel architecture and a training framework to support long context windows and multilingual retrieval. Leveraging Matryoshka Representation Loss, we further demonstrate that reducing the embedding dimensionality from 128 to 64 has an insignificant impact on the model’s retrieval performance and cuts storage requirements by up to 50%. Our new model, Jina-ColBERT-v2, demonstrates strong performance across a range of English and multilingual retrieval tasks.
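The late interaction scoring the abstract describes is commonly implemented as MaxSim: each query token embedding is matched against every document token embedding, the maximum similarity per query token is taken, and these maxima are summed. The sketch below illustrates this, along with Matryoshka-style dimensionality reduction by truncating embeddings to their leading dimensions; the function names, the random toy embeddings, and the use of NumPy are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def maxsim_score(query_vecs, doc_vecs):
    """Late-interaction (MaxSim) score: for each query token embedding,
    take the maximum cosine similarity over all document token
    embeddings, then sum these maxima over the query tokens."""
    # Normalize rows so that dot products are cosine similarities.
    q = query_vecs / np.linalg.norm(query_vecs, axis=1, keepdims=True)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    sim = q @ d.T                 # (num_query_tokens, num_doc_tokens)
    return sim.max(axis=1).sum()  # max per query token, then sum

def truncate(vecs, k=64):
    """Matryoshka-style reduction (illustrative): keep only the first k
    dimensions of each token embedding, halving storage for k=64 vs 128."""
    return vecs[:, :k]

# Toy example: 8 query tokens and 50 document tokens, 128-dim each.
rng = np.random.default_rng(0)
q = rng.normal(size=(8, 128))
d = rng.normal(size=(50, 128))

score_full = maxsim_score(q, d)                       # 128-dim scoring
score_64 = maxsim_score(truncate(q), truncate(d))     # 64-dim scoring
```

Because each per-token maximum is a cosine similarity bounded by 1, the score of a query against itself equals the number of query tokens, which is a handy sanity check for the implementation.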
Anthology ID:
2024.mrl-1.11
Volume:
Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024)
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Jonne Sälevä, Abraham Owodunni
Venue:
MRL
Publisher:
Association for Computational Linguistics
Pages:
159–166
URL:
https://aclanthology.org/2024.mrl-1.11/
DOI:
10.18653/v1/2024.mrl-1.11
Cite (ACL):
Rohan Jha, Bo Wang, Michael Günther, Georgios Mastrapas, Saba Sturua, Isabelle Mohr, Andreas Koukounas, Mohammad Kalim Wang, Nan Wang, and Han Xiao. 2024. Jina-ColBERT-v2: A General-Purpose Multilingual Late Interaction Retriever. In Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024), pages 159–166, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Jina-ColBERT-v2: A General-Purpose Multilingual Late Interaction Retriever (Jha et al., MRL 2024)
PDF:
https://aclanthology.org/2024.mrl-1.11.pdf