zhqwqwq / Learning-Parity-with-CoT

[ICLR 2025] This repository contains the code to reproduce the results from our paper "From Sparse Dependence to Sparse Attention: Unveiling How Chain-of-Thought Enhances Transformer Sample Efficiency."

Alternatives and similar repositories for Learning-Parity-with-CoT

Users interested in Learning-Parity-with-CoT are comparing it to the libraries listed below.
