Representation Degeneration Problem in Prompt-based Models for Natural Language Understanding

Qingyan Zhao, Ruifang He, Jinpeng Zhang, Chang Liu, Bo Wang


Abstract
Prompt-based fine-tuning (PF), by aligning with the training objective of pre-trained language models (PLMs), has shown improved performance on many few-shot natural language understanding (NLU) benchmarks. However, the word embedding space of PLMs exhibits anisotropy, which is known as the representation degeneration problem. In this paper, we explore self-similarity within the same context and identify anisotropy in the feature embedding space of PF models. Since the performance of PF models depends on feature embeddings, we naturally hypothesize that this anisotropy limits their performance. Based on our experimental findings, we propose CLMA, a Contrastive Learning framework based on the [MASK] token and Answers, to alleviate the anisotropy of the embedding space. Combined with our proposed counter-intuitive SSD, a Supervised Signal based on embedding Distance, our approach outperforms mainstream methods on many NLU benchmarks in few-shot settings. In subsequent experiments, we analyze our method's capability to capture deep semantic cues and the impact of anisotropy in the feature embedding space on the performance of PF models.
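As a minimal illustration of the anisotropy the abstract refers to (this is not the paper's own measure, just a common proxy), one can compute the average pairwise cosine similarity among a set of embeddings: values near 0 suggest an isotropic space, while values near 1 indicate vectors crowded into a narrow cone, i.e. a degenerate representation space.

```python
import numpy as np

def avg_pairwise_cosine(embeddings: np.ndarray) -> float:
    """Average off-diagonal pairwise cosine similarity of row vectors.

    Near 0: isotropic space; near 1: anisotropic (vectors share a
    narrow cone), the symptom of representation degeneration.
    """
    # L2-normalize each row so dot products become cosine similarities.
    normed = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sims = normed @ normed.T
    n = embeddings.shape[0]
    # Subtract the diagonal (self-similarity = 1) before averaging.
    return float((sims.sum() - n) / (n * (n - 1)))

# Random high-dimensional vectors are nearly orthogonal (isotropic)...
rng = np.random.default_rng(0)
isotropic = rng.standard_normal((100, 768))
# ...while a large shared offset crowds them into a cone (anisotropic).
anisotropic = isotropic + 10.0

print(avg_pairwise_cosine(isotropic))    # close to 0
print(avg_pairwise_cosine(anisotropic))  # close to 1
```

In practice one would apply such a measure to the [MASK]-token features a PF model produces across inputs, rather than to synthetic vectors as here.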
Anthology ID:
2024.lrec-main.1217
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
13946–13957
URL:
https://aclanthology.org/2024.lrec-main.1217
Cite (ACL):
Qingyan Zhao, Ruifang He, Jinpeng Zhang, Chang Liu, and Bo Wang. 2024. Representation Degeneration Problem in Prompt-based Models for Natural Language Understanding. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 13946–13957, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Representation Degeneration Problem in Prompt-based Models for Natural Language Understanding (Zhao et al., LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.1217.pdf