MingyuJ666 / The-Impact-of-Reasoning-Step-Length-on-Large-Language-Models

[ACL'24] Chain of Thought (CoT) prompting is significant in improving the reasoning abilities of large language models (LLMs). However, the correlation between the effectiveness of CoT and the length of the reasoning steps in prompts remains largely unknown. To shed light on this, we conducted several empirical experiments exploring this relationship.
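The experimental variable here is the number of reasoning steps in the demonstration part of a CoT prompt. A minimal sketch of how one might construct prompts with shorter or longer reasoning chains (the `build_cot_prompt` helper and its format are hypothetical, not taken from this repository):

```python
def build_cot_prompt(question: str, steps: list[str]) -> str:
    """Assemble a CoT prompt whose demonstration contains len(steps) reasoning steps.

    Hypothetical helper for illustration only; the paper's actual prompt
    templates may differ.
    """
    chain = "\n".join(f"Step {i + 1}: {s}" for i, s in enumerate(steps))
    return (
        f"Q: {question}\n"
        "Let's think step by step.\n"
        f"{chain}\n"
        "Therefore, the answer is:"
    )

# A short chain: two reasoning steps.
short_prompt = build_cot_prompt(
    "What is 3 + 4 * 2?",
    ["4 * 2 = 8", "3 + 8 = 11"],
)

# A longer chain for the same question: four steps, padded with restatements.
long_prompt = build_cot_prompt(
    "What is 3 + 4 * 2?",
    [
        "Multiplication binds tighter than addition.",
        "Compute 4 * 2 = 8.",
        "Add 3 + 8 = 11.",
        "Restate: the result is 11.",
    ],
)
```

Comparing model accuracy across prompts like `short_prompt` and `long_prompt`, with the question and answer held fixed, isolates step length as the variable under study.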
