MingyuJ666 / The-Impact-of-Reasoning-Step-Length-on-Large-Language-Models

[ACL'24] Chain of Thought (CoT) prompting significantly improves the reasoning abilities of large language models (LLMs). However, the correlation between the effectiveness of CoT and the length of reasoning steps in prompts remains largely unknown. To shed light on this, we conduct several empirical experiments to explore this relationship.
