Commonsense Knowledge Editing Based on Free-Text in LLMs

Xiusheng Huang, Yequan Wang, Jun Zhao, Kang Liu


Abstract
Knowledge editing technology is crucial for maintaining the accuracy and timeliness of large language models (LLMs). However, the standard setting of this task overlooks a significant portion of real-world commonsense knowledge expressed as free-text, characterized by broad knowledge scope, long content, and non-instantiation. Previous methods (e.g., MEMIT) edit single tokens or entities, making them unsuitable for commonsense knowledge in free-text form. To address these challenges, we conduct experiments from two perspectives: knowledge localization and knowledge editing. First, we introduce the Knowledge Localization for Free-Text (KLFT) method, revealing the challenges posed by the distribution of commonsense knowledge across the MLP and Attention layers, as well as its decentralized distribution. Next, we propose a Dynamics-aware Editing Method (DEM), which uses a Dynamics-aware Module to locate the parameter positions corresponding to commonsense knowledge and a Knowledge Editing Module to update that knowledge. DEM fully exploits the potential of the MLP and Attention layers and successfully edits commonsense knowledge expressed as free-text. Experimental results indicate that DEM achieves excellent editing performance.
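The locate-then-edit pipeline the abstract describes can be illustrated with a minimal toy sketch. This is not the paper's DEM: the "localization" below is a crude perturbation-sensitivity probe standing in for the Dynamics-aware Module, and the edit is the generic ROME/MEMIT-style rank-one update on a stack of toy linear layers; all names and sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": a stack of linear layers standing in for MLP weights.
layers = [rng.normal(size=(8, 8)) for _ in range(4)]

def forward(h):
    for W in layers:
        h = np.tanh(W @ h)
    return h

# --- Localization (stand-in for the Dynamics-aware Module) ---
# Score each layer by how much a small perturbation to it changes the
# output for a knowledge-bearing input k (a crude sensitivity probe).
k = rng.normal(size=8)
base = forward(k)
scores = []
for i in range(len(layers)):
    saved = layers[i].copy()
    layers[i] = saved + 1e-3 * rng.normal(size=saved.shape)
    scores.append(np.linalg.norm(forward(k) - base))
    layers[i] = saved  # restore the original weights
target_layer = int(np.argmax(scores))

# --- Editing: rank-one update so the located layer maps key -> value ---
# (MEMIT-style update shape: W += (v - W k) k^T / (k^T k))
def rank_one_edit(W, key, value):
    residual = value - W @ key
    return W + np.outer(residual, key) / (key @ key)

key = rng.normal(size=8)
value = rng.normal(size=8)
layers[target_layer] = rank_one_edit(layers[target_layer], key, value)

# After the edit, the chosen layer maps key almost exactly to value.
print(np.allclose(layers[target_layer] @ key, value))
```

The rank-one form guarantees the edited layer satisfies W' k = v exactly while perturbing directions orthogonal to k as little as possible; DEM's actual modules operate on real transformer MLP and Attention parameters rather than this toy stack.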
Anthology ID:
2024.emnlp-main.826
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
14870–14880
URL:
https://aclanthology.org/2024.emnlp-main.826/
DOI:
10.18653/v1/2024.emnlp-main.826
Cite (ACL):
Xiusheng Huang, Yequan Wang, Jun Zhao, and Kang Liu. 2024. Commonsense Knowledge Editing Based on Free-Text in LLMs. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 14870–14880, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Commonsense Knowledge Editing Based on Free-Text in LLMs (Huang et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.826.pdf