Improve Student’s Reasoning Generalizability through Cascading Decomposed CoTs Distillation

Chengwei Dai, Kun Li, Wei Zhou, Songlin Hu


Abstract
Large language models (LLMs) exhibit enhanced reasoning at larger scales, driving efforts to distill these capabilities into smaller models via teacher-student learning. Previous works simply fine-tune student models on teacher-generated Chain-of-Thoughts (CoTs) data. Although these methods enhance in-domain (IND) reasoning performance, they struggle to generalize to out-of-domain (OOD) tasks. We believe that the widespread spurious correlations between questions and answers may lead the model to preset a specific answer, which restricts the diversity and generalizability of its reasoning process. In this paper, we propose Cascading Decomposed CoTs Distillation (CasCoD) to address these issues by decomposing the traditional single-step learning process into two cascaded learning steps. Specifically, by restructuring the training objectives (removing the answer from outputs and concatenating the question with the rationale as input), CasCoD's two-step learning process ensures that students focus on learning rationales without interference from the preset answers, thus improving reasoning generalizability. Extensive experiments demonstrate the effectiveness of CasCoD on both IND and OOD benchmark reasoning datasets.
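
As a rough, hypothetical sketch of the decomposition described in the abstract (not the authors' released code; the function name, field names, and prompt formatting below are assumptions), a teacher-generated (question, rationale, answer) triple could be split into the two cascaded training instances like this:

```python
# Illustrative sketch only: split one teacher CoT example into CasCoD's
# two cascaded learning steps. All names and templates here are assumed.

def build_cascod_instances(question: str, rationale: str, answer: str):
    """Turn one (question, rationale, answer) triple into two training instances."""
    # Step 1: learn the rationale alone. The final answer is removed from the
    # target so the student cannot latch onto question-to-answer shortcuts.
    rationale_step = {
        "input": question,
        "target": rationale,
    }
    # Step 2: learn the answer conditioned on the question *and* the rationale,
    # i.e. the rationale is concatenated onto the input.
    answer_step = {
        "input": f"{question}\n{rationale}",
        "target": answer,
    }
    return rationale_step, answer_step


if __name__ == "__main__":
    r_step, a_step = build_cascod_instances(
        question="Q: A pen costs $2 and a notebook costs $3. How much do 2 pens and 1 notebook cost?",
        rationale="Two pens cost 2 * $2 = $4. Adding the notebook gives $4 + $3 = $7.",
        answer="The answer is $7.",
    )
    print(r_step)
    print(a_step)
```

Under this reading, the student never sees the final answer while learning the rationale, which is what the abstract credits for reducing spurious question-answer correlations and improving OOD generalization.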
Anthology ID:
2024.emnlp-main.875
Volume:
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
15623–15643
URL:
https://aclanthology.org/2024.emnlp-main.875/
DOI:
10.18653/v1/2024.emnlp-main.875
Cite (ACL):
Chengwei Dai, Kun Li, Wei Zhou, and Songlin Hu. 2024. Improve Student’s Reasoning Generalizability through Cascading Decomposed CoTs Distillation. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 15623–15643, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Improve Student’s Reasoning Generalizability through Cascading Decomposed CoTs Distillation (Dai et al., EMNLP 2024)
PDF:
https://aclanthology.org/2024.emnlp-main.875.pdf