MingyuJ666 / The-Impact-of-Reasoning-Step-Length-on-Large-Language-Models

[ACL'24] Chain-of-Thought (CoT) prompting is significant in improving the reasoning abilities of large language models (LLMs). However, the correlation between the effectiveness of CoT and the length of the reasoning steps in prompts remains largely unknown. To shed light on this, we conducted several empirical experiments to explore this relationship.
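The experiments compare prompts whose demonstrations contain different numbers of reasoning steps. A minimal sketch of that setup, assuming a simple one-shot prompt format (the prompt text, questions, and helper names below are illustrative, not taken from this repository):

```python
# Hypothetical setup: the same demonstration written with one reasoning
# step vs. several smaller steps, to probe how step count affects CoT.
SHORT_COT = "4 pens at 3 dollars each cost 4 * 3 = 12 dollars."
LONG_COT = (
    "Step 1: Each pen costs 3 dollars.\n"
    "Step 2: We need 4 pens.\n"
    "Step 3: The total cost is 4 * 3.\n"
    "Step 4: 4 * 3 = 12, so the answer is 12 dollars."
)

def build_prompt(demonstration: str, question: str) -> str:
    """Assemble a one-shot CoT prompt from a demonstration and a new question."""
    demo_question = "A shop sells pens at 3 dollars each. How much do 4 pens cost?"
    return (
        f"Q: {demo_question}\nA: Let's think step by step. {demonstration}\n\n"
        f"Q: {question}\nA: Let's think step by step."
    )

new_question = "A book costs 7 dollars. How much do 5 books cost?"
short_prompt = build_prompt(SHORT_COT, new_question)
long_prompt = build_prompt(LONG_COT, new_question)
# Both prompts pose the same question; only the demonstration's
# reasoning-step count differs, so any accuracy gap can be attributed
# to step length.
```

Each prompt variant would then be sent to the model, and accuracy compared across step counts.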
