KE-GCL: Knowledge Enhanced Graph Contrastive Learning for Commonsense Question Answering

Lihui Zhang, Ruifan Li


Abstract
Commonsense question answering (CQA) aims to choose the correct answers for commonsense questions. Most existing works focus on extracting and reasoning over external knowledge graphs (KG). However, noise in the KG prevents these models from learning effective representations. In this paper, we propose a Knowledge Enhanced Graph Contrastive Learning model (KE-GCL) that incorporates the contextual descriptions of entities and adopts a graph contrastive learning scheme. Specifically, for each QA pair we represent the knowledge from the KG and the contextual descriptions. Then, the representations of contextual descriptions are inserted into the KG as context nodes, forming knowledge-enhanced graphs. Moreover, we design a contrastive learning method on graphs. For each knowledge-enhanced graph, we build augmented views with an adaptive sampling strategy. After that, we reason over the graphs to update their representations by scattering edges and aggregating nodes. To further improve the graph contrastive learning, hard graph negatives are chosen based on incorrect answers. Extensive experiments on two benchmark datasets demonstrate the effectiveness of our proposed KE-GCL, which consistently outperforms previous methods.
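To make the contrastive objective in the abstract concrete, the following is a minimal sketch of an InfoNCE-style graph contrastive loss with hard negatives, written in PyTorch. It is an illustration only, not the authors' exact formulation: the function name, temperature value, and tensor shapes are assumptions, and the "hard negatives" here simply stand in for graph embeddings built from incorrect answer choices.

    import torch
    import torch.nn.functional as F

    def graph_contrastive_loss(anchor, positive, hard_negatives, temperature=0.1):
        """Illustrative InfoNCE-style loss over graph-level embeddings (not the paper's exact loss).

        anchor:         (batch, dim)    embeddings of the knowledge-enhanced graphs
        positive:       (batch, dim)    embeddings of their augmented views
        hard_negatives: (batch, k, dim) embeddings of graphs from incorrect answers
        """
        # Normalize so dot products become cosine similarities
        anchor = F.normalize(anchor, dim=-1)
        positive = F.normalize(positive, dim=-1)
        hard_negatives = F.normalize(hard_negatives, dim=-1)

        # Similarity of each anchor to its positive view and to its k hard negatives
        pos_sim = (anchor * positive).sum(dim=-1, keepdim=True) / temperature       # (batch, 1)
        neg_sim = torch.einsum("bd,bkd->bk", anchor, hard_negatives) / temperature  # (batch, k)

        # Cross-entropy with the positive placed at index 0
        logits = torch.cat([pos_sim, neg_sim], dim=1)
        labels = torch.zeros(anchor.size(0), dtype=torch.long)
        return F.cross_entropy(logits, labels)

    if __name__ == "__main__":
        b, k, d = 4, 3, 128
        loss = graph_contrastive_loss(torch.randn(b, d), torch.randn(b, d), torch.randn(b, k, d))
        print(loss.item())

In this sketch, using graphs constructed from incorrect answer choices as the negative set is what makes the negatives "hard": they share the question context with the anchor, so the model must discriminate finer-grained differences than with random negatives.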
Anthology ID:
2022.findings-emnlp.6
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
76–87
URL:
https://aclanthology.org/2022.findings-emnlp.6
DOI:
10.18653/v1/2022.findings-emnlp.6
Cite (ACL):
Lihui Zhang and Ruifan Li. 2022. KE-GCL: Knowledge Enhanced Graph Contrastive Learning for Commonsense Question Answering. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 76–87, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
KE-GCL: Knowledge Enhanced Graph Contrastive Learning for Commonsense Question Answering (Zhang & Li, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.6.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.6.mp4