jxiw / BiGS

Official repository of Pretraining Without Attention (BiGS). BiGS is the first model to achieve BERT-level transfer learning on the GLUE benchmark with subquadratic complexity in sequence length, i.e., without attention.
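To make the "subquadratic, no attention" claim concrete, here is a minimal, illustrative sketch (not the official BiGS code) of the core idea: replace attention's O(L²) token mixing with a gated, bidirectional state-space-style linear recurrence that runs in O(L) time. All function names, parameter names, and shapes below (`bigs_style_block`, `W_u`, `W_v`, `W_o`, etc.) are assumptions for illustration only; the real repository implements a more elaborate gated state-space architecture in JAX.

```python
# Hypothetical sketch of a gated bidirectional SSM mixer, O(L) in sequence
# length. Not the official BiGS implementation.
import jax
import jax.numpy as jnp

def ssm_scan(x, a, b):
    """Diagonal linear recurrence h_t = a * h_{t-1} + b * x_t over time."""
    def step(h, x_t):
        h = a * h + b * x_t
        return h, h
    h0 = jnp.zeros_like(x[0])
    _, hs = jax.lax.scan(step, h0, x)
    return hs  # (L, D)

def bigs_style_block(x, params):
    """Gated bidirectional SSM: forward + backward scans, then gating."""
    a = jax.nn.sigmoid(params["log_a"])            # per-channel decay in (0, 1)
    fwd = ssm_scan(x, a, params["b"])              # left-to-right context
    bwd = ssm_scan(x[::-1], a, params["b"])[::-1]  # right-to-left context
    u = jax.nn.gelu(x @ params["W_u"])             # multiplicative gate branch
    v = (fwd + bwd) @ params["W_v"]                # sequence-mixed branch
    return (u * v) @ params["W_o"]                 # gated combination

# Tiny usage example with random parameters.
key = jax.random.PRNGKey(0)
L, D = 16, 8
ks = jax.random.split(key, 5)
params = {
    "log_a": jax.random.normal(ks[0], (D,)),
    "b": jax.random.normal(ks[1], (D,)) * 0.1,
    "W_u": jax.random.normal(ks[2], (D, D)) * 0.1,
    "W_v": jax.random.normal(ks[3], (D, D)) * 0.1,
    "W_o": jax.random.normal(ks[4], (D, D)) * 0.1,
}
x = jax.random.normal(key, (L, D))
y = bigs_style_block(x, params)
print(y.shape)  # (16, 8)
```

The two scans each touch every position once, so the cost grows linearly with sequence length; the gating (`u * v`) stands in for the multiplicative interactions that attention would otherwise provide.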

Alternatives and similar repositories for BiGS

Users interested in BiGS are comparing it to the libraries listed below.
