LPC: A Logits and Parameter Calibration Framework for Continual Learning

Xiaodi Li, Zhuoyi Wang, Dingcheng Li, Latifur Khan, Bhavani Thuraisingham


Abstract
When the typical fine-tuning paradigm is applied to a sequence of tasks, the model suffers from catastrophic forgetting (i.e., it adjusts old parameters to fit new knowledge, losing previously acquired concepts). Replay-based methods address this by storing old data in extra memory and revisiting it to preserve the parameters associated with old concepts, but this raises privacy concerns and increases memory requirements. In this work, we aim to learn knowledge sequentially/continually without accessing old data. The core idea is to calibrate both the logits (outputs) and the parameters so that preserving old parameters and generalizing to new concepts can be achieved simultaneously. Our proposed framework comprises two major components: Logits Calibration (LC) and Parameter Calibration (PC). LC calibrates the new model's learning against the old model's outputs, while PC preserves the old model's parameters. Together, these operations maintain old knowledge while learning new tasks, without storing previous data. We conduct experiments on various scenarios of the GLUE (General Language Understanding Evaluation) benchmark. The results show that our model achieves state-of-the-art performance in all scenarios.
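To make the two components concrete, here is a minimal NumPy sketch of how a combined objective of this shape could look: a distillation-style KL term that keeps the new model's logits close to the old model's (logits calibration), plus a weighted L2 penalty on parameter drift (parameter calibration). The function names, the `importance` weights, the temperature, and the `alpha`/`lam` hyperparameters are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def softmax(z, temperature=1.0):
    """Temperature-scaled softmax with a max-shift for numerical stability."""
    z = np.asarray(z, dtype=float) / temperature
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def logits_calibration(logits_new, logits_old, temperature=2.0):
    """KL divergence from the old model's predictive distribution to the
    new model's: penalizes drift of the new outputs away from the old
    model (a distillation-style term; details are an assumption here)."""
    p_old = softmax(logits_old, temperature)
    p_new = softmax(logits_new, temperature)
    return float(np.sum(p_old * (np.log(p_old) - np.log(p_new))))

def parameter_calibration(params_new, params_old, importance):
    """Weighted L2 penalty on parameter drift; `importance` (assumed)
    up-weights parameters that mattered for the old tasks."""
    diff = np.asarray(params_new, dtype=float) - np.asarray(params_old, dtype=float)
    return float(np.sum(np.asarray(importance, dtype=float) * diff ** 2))

def lpc_objective(task_loss, logits_new, logits_old,
                  params_new, params_old, importance,
                  alpha=1.0, lam=0.1):
    """New-task loss plus the two calibration penalties."""
    lc = logits_calibration(logits_new, logits_old)
    pc = parameter_calibration(params_new, params_old, importance)
    return task_loss + alpha * lc + lam * pc
```

Note that neither penalty requires old training data: only the old model's logits on current inputs and a frozen copy of the old parameters, which is what allows replay-free continual learning in this style of method.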
Anthology ID:
2022.findings-emnlp.529
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
7142–7155
URL:
https://aclanthology.org/2022.findings-emnlp.529
DOI:
10.18653/v1/2022.findings-emnlp.529
Cite (ACL):
Xiaodi Li, Zhuoyi Wang, Dingcheng Li, Latifur Khan, and Bhavani Thuraisingham. 2022. LPC: A Logits and Parameter Calibration Framework for Continual Learning. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 7142–7155, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
LPC: A Logits and Parameter Calibration Framework for Continual Learning (Li et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.529.pdf
Software:
2022.findings-emnlp.529.software.zip
Video:
https://aclanthology.org/2022.findings-emnlp.529.mp4