%0 Conference Proceedings
%T Z-Code++: A Pre-trained Language Model Optimized for Abstractive Summarization
%A He, Pengcheng
%A Peng, Baolin
%A Wang, Song
%A Liu, Yang
%A Xu, Ruochen
%A Hassan, Hany
%A Shi, Yu
%A Zhu, Chenguang
%A Xiong, Wayne
%A Zeng, Michael
%A Gao, Jianfeng
%A Huang, Xuedong
%Y Rogers, Anna
%Y Boyd-Graber, Jordan
%Y Okazaki, Naoaki
%S Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
%D 2023
%8 July
%I Association for Computational Linguistics
%C Toronto, Canada
%F he-etal-2023-z
%R 10.18653/v1/2023.acl-long.279
%U https://aclanthology.org/2023.acl-long.279/
%U https://doi.org/10.18653/v1/2023.acl-long.279
%P 5095-5112