jxiw / BiGS
Official repository of Pretraining Without Attention (BiGS). BiGS is the first model to achieve BERT-level transfer learning on the GLUE benchmark with subquadratic complexity in sequence length (i.e., without attention).
118 · Mar 16, 2024 · Updated 2 years ago

Alternatives and similar repositories for BiGS

Users interested in BiGS are comparing it to the libraries listed below.
