Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling

Peijie Jiang, Dingkun Long, Yanzhao Zhang, Pengjun Xie, Meishan Zhang, Min Zhang


Abstract
Boundary information is critical for various Chinese language processing tasks, such as word segmentation, part-of-speech tagging, and named entity recognition. Previous studies usually resorted to high-quality external lexicons, whose items provide explicit boundary information. However, building such lexicons requires substantial human effort, a cost that prior work has largely overlooked. In this work, we suggest unsupervised statistical boundary information instead, and propose an architecture that encodes this information directly into pre-trained language models, resulting in Boundary-Aware BERT (BABERT). We apply BABERT to feature induction for Chinese sequence labeling tasks. Experimental results on ten benchmarks of Chinese sequence labeling demonstrate that BABERT provides consistent improvements on all datasets. In addition, our method complements previous supervised lexicon exploration: further improvements can be achieved when it is integrated with external lexicon information.
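To make "unsupervised statistical boundary information" concrete, the sketch below computes two classic corpus statistics for character n-grams: pointwise mutual information (PMI) and left/right branching entropy, both obtainable from raw unsegmented text with no lexicon. This is a minimal illustrative sketch, not the paper's implementation; the function name boundary_statistics, the max_n parameter, and the unigram-normalized probability estimate are assumptions, and the step that injects such statistics into BERT pretraining is not shown here.

```python
import math
from collections import Counter, defaultdict

def boundary_statistics(corpus, max_n=4):
    """Collect unsupervised boundary statistics for character n-grams.

    For every n-gram (2 <= n <= max_n) in a raw, unsegmented corpus,
    compute the PMI of its weakest internal split and the entropy of
    its left/right neighboring characters. High PMI and high branching
    entropy both suggest the n-gram behaves like a word, i.e. that
    boundaries fall at its edges rather than inside it.
    """
    ngram_counts = Counter()
    left_ctx = defaultdict(Counter)   # n-gram -> counts of the char before it
    right_ctx = defaultdict(Counter)  # n-gram -> counts of the char after it
    total_chars = 0

    for sent in corpus:
        total_chars += len(sent)
        for n in range(1, max_n + 1):
            for i in range(len(sent) - n + 1):
                gram = sent[i:i + n]
                ngram_counts[gram] += 1
                if i > 0:
                    left_ctx[gram][sent[i - 1]] += 1
                if i + n < len(sent):
                    right_ctx[gram][sent[i + n]] += 1

    # Simplification: normalize all n-gram counts by the character total.
    def prob(gram):
        return ngram_counts[gram] / total_chars

    def branching_entropy(ctx):
        total = sum(ctx.values())
        if total == 0:
            return 0.0
        return -sum((c / total) * math.log(c / total) for c in ctx.values())

    stats = {}
    for gram in ngram_counts:
        if len(gram) < 2:
            continue
        # PMI over the weakest split: a low value means the n-gram is
        # easily broken apart, i.e. a boundary likely falls inside it.
        pmi = min(
            math.log(prob(gram) / (prob(gram[:k]) * prob(gram[k:])))
            for k in range(1, len(gram))
        )
        stats[gram] = (pmi,
                       branching_entropy(left_ctx[gram]),
                       branching_entropy(right_ctx[gram]))
    return stats

# Example: n-grams recurring across sentences (e.g. 语言) score higher PMI.
stats = boundary_statistics(["自然语言处理很有趣", "语言模型预训练很流行"])
```

In a BABERT-style setup, statistics like these would be computed over a large raw corpus and then used as soft boundary supervision during pretraining; the sketch covers only the statistics-gathering step.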
Anthology ID:
2022.emnlp-main.34
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
526–537
URL:
https://aclanthology.org/2022.emnlp-main.34
DOI:
10.18653/v1/2022.emnlp-main.34
Cite (ACL):
Peijie Jiang, Dingkun Long, Yanzhao Zhang, Pengjun Xie, Meishan Zhang, and Min Zhang. 2022. Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 526–537, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Boundary-Aware Language Model Pretraining for Chinese Sequence Labeling (Jiang et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.34.pdf
Software:
2022.emnlp-main.34.software.zip