Ziyu Wang
2024
PROC2PDDL: Open-Domain Planning Representations from Texts
Tianyi Zhang | Li Zhang | Zhaoyi Hou | Ziyu Wang | Yuling Gu | Peter Clark | Chris Callison-Burch | Niket Tandon
Proceedings of the 2nd Workshop on Natural Language Reasoning and Structured Explanations (@ACL 2024)
Planning in a text-based environment continues to be a significant challenge for AI systems. Recent approaches have utilized language models to predict planning domain definitions (e.g., PDDL) but have only been evaluated in closed-domain simulated environments. To address this, we present Proc2PDDL, the first dataset containing open-domain procedural texts paired with expert-annotated PDDL representations. Using this dataset, we evaluate the task of predicting domain actions (parameters, preconditions, and effects). We experiment with various large language models (LLMs) and prompting mechanisms, including a novel instruction inspired by the zone of proximal development (ZPD), which reconstructs the task as incremental basic skills. Our results demonstrate that Proc2PDDL is highly challenging for end-to-end LLMs, with GPT-3.5’s success rate close to 0% and GPT-4o’s 38%. With ZPD instructions, GPT-4o’s success rate increases to 45%, outperforming regular chain-of-thought prompting’s 34%. Our analysis systematically examines both syntactic and semantic errors, providing insights into the strengths and weaknesses of language models in generating domain-specific programs.
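The task described above asks a model to predict each action's parameters, preconditions, and effects. As a rough illustration of what such a target representation looks like, here is a minimal hypothetical PDDL action (invented for illustration, not taken from the Proc2PDDL dataset):

```pddl
(define (domain firestarting)            ; hypothetical open-domain procedure
  (:requirements :strips :typing)
  (:types agent item location)
  (:predicates
    (at ?a - agent ?l - location)
    (has ?a - agent ?i - item)
    (flammable ?i - item)
    (burning ?i - item))
  ; one domain action: parameters, preconditions, and effects
  (:action light-fire
    :parameters (?a - agent ?i - item ?l - location)
    :precondition (and (at ?a ?l) (has ?a ?i) (flammable ?i))
    :effect (burning ?i)))
```

Given procedural text, the model must produce the `:parameters`, `:precondition`, and `:effect` fields for each such action.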
2020
BUTTER: A Representation Learning Framework for Bi-directional Music-Sentence Retrieval and Generation
Yixiao Zhang | Ziyu Wang | Dingsu Wang | Gus Xia
Proceedings of the 1st Workshop on NLP for Music and Audio (NLP4MusA)