LEMoE: Advanced Mixture of Experts Adaptor for Lifelong Model Editing of Large Language Models

Renzhi Wang, Piji Li


Abstract
Large language models (LLMs) require continual knowledge updates to keep pace with ever-changing world facts, motivating the formulation of the lifelong model editing task. While recent years have witnessed the development of various techniques for single and batch editing, these methods either fail to apply to, or perform sub-optimally in, lifelong editing. In this paper, we introduce LEMoE, an advanced Mixture of Experts (MoE) adaptor for lifelong model editing. We first analyze the factors influencing the effectiveness of a conventional MoE adaptor in lifelong editing, including catastrophic forgetting, inconsistent routing, and order sensitivity. Based on these insights, we propose a tailored module insertion method to achieve lifelong editing, incorporating a novel KV anchor routing that enhances routing consistency between the training and inference stages, along with a concise yet effective clustering-based editing order planning. Experimental results demonstrate the effectiveness of our method in lifelong editing, surpassing previous model editing techniques while maintaining outstanding performance on the batch editing task. Our code will be made available.
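As a rough illustration of the clustering-based editing order planning mentioned in the abstract, the sketch below groups incoming edits by semantic similarity and applies them cluster by cluster. This is a minimal sketch under assumed details: the embedding model, the number of clusters, and the function plan_edit_order are illustrative choices, not the authors' implementation.

# Hypothetical sketch of clustering-based editing order planning (not the authors' code).
# Assumption: each edit is a (prompt, target) pair; prompts are embedded, clustered with
# k-means, and edits are then applied one cluster at a time so that semantically similar
# edits are processed consecutively.
from sklearn.cluster import KMeans
from sentence_transformers import SentenceTransformer

def plan_edit_order(edits, n_clusters=10):
    """Return edits reordered so that semantically similar edits are adjacent."""
    encoder = SentenceTransformer("all-MiniLM-L6-v2")  # any sentence encoder would do
    embeddings = encoder.encode([prompt for prompt, _ in edits])
    labels = KMeans(n_clusters=n_clusters, n_init="auto").fit_predict(embeddings)
    # Sort edit indices by cluster label, keeping the original order within each cluster.
    order = sorted(range(len(edits)), key=lambda i: labels[i])
    return [edits[i] for i in order]

# Usage (hypothetical): reordered = plan_edit_order([("Who is the CEO of X?", "Alice"), ...])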
Anthology ID:
2024.emnlp-main.149
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2551–2575
URL:
https://aclanthology.org/2024.emnlp-main.149/
DOI:
10.18653/v1/2024.emnlp-main.149
Cite (ACL):
Renzhi Wang and Piji Li. 2024. LEMoE: Advanced Mixture of Experts Adaptor for Lifelong Model Editing of Large Language Models. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 2551–2575, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
LEMoE: Advanced Mixture of Experts Adaptor for Lifelong Model Editing of Large Language Models (Wang & Li, EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.149.pdf