Updating Large Language Models’ Memories with Time Constraints

Xin Wu, Yuqi Bu, Yi Cai, Tao Wang


Abstract
By incorporating the latest external knowledge, large language models (LLMs) can modify their internal memory. However, in practical applications, LLMs may encounter outdated information, requiring them to filter out such data and update knowledge beyond their internal memory. This paper explores whether LLMs can selectively update their memories based on the time constraints between internal memory and external knowledge. We evaluate existing LLMs on three types of data that exhibit different time constraints. Our experimental results reveal the challenges most LLMs face with time-constrained knowledge and highlight differences in how various LLMs handle such information. Additionally, to address the difficulties LLMs encounter in understanding time constraints, we propose a two-stage decoupling framework that offloads the identification and computation of time constraints to a symbolic system. Experimental results demonstrate that the proposed framework improves ChatGPT’s performance by over 60% and yields a 12–24% improvement for the state-of-the-art GPT-4.
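As a rough illustration of the two-stage decoupling idea, the Python sketch below stubs the identification stage (normally performed by the LLM) with a lookup table and performs the computation stage symbolically, so the model never does date arithmetic itself. All names, dates, and the toy data are hypothetical assumptions for illustration, not the authors' implementation.

from datetime import date

# Minimal sketch of the two-stage decoupling framework described in the
# abstract. Stage 1 (identification) is stubbed here; in the paper's
# setting an LLM would extract the time constraint from each statement.
# Stage 2 (computation) compares the extracted constraints in a symbolic
# system. All statements and dates below are illustrative only.

def identify_time_constraint(statement: str) -> date:
    """Stage 1: extract the time constraint attached to a statement.
    Stand-in lookup table for an LLM extraction step."""
    extracted = {
        "Lionel Messi plays for FC Barcelona.": date(2021, 6, 30),
        "Lionel Messi plays for Inter Miami.": date(2023, 7, 15),
    }
    return extracted[statement]

def should_update_memory(internal: str, external: str) -> bool:
    """Stage 2: symbolically decide whether the external knowledge
    is more recent than the internal memory."""
    return identify_time_constraint(external) > identify_time_constraint(internal)

internal_memory = "Lionel Messi plays for FC Barcelona."
external_knowledge = "Lionel Messi plays for Inter Miami."

if should_update_memory(internal_memory, external_knowledge):
    print("External knowledge is newer: update the model's answer.")
else:
    print("External knowledge is outdated: keep the internal memory.")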
Anthology ID:
2024.findings-emnlp.801
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
13693–13702
URL:
https://aclanthology.org/2024.findings-emnlp.801/
DOI:
10.18653/v1/2024.findings-emnlp.801
Cite (ACL):
Xin Wu, Yuqi Bu, Yi Cai, and Tao Wang. 2024. Updating Large Language Models’ Memories with Time Constraints. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 13693–13702, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Updating Large Language Models’ Memories with Time Constraints (Wu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.801.pdf
Software:
https://aclanthology.org/attachments/2024.findings-emnlp.801.software.zip
Data:
https://aclanthology.org/attachments/2024.findings-emnlp.801.data.zip