PiotrNawrot / nanoT5
Fast & Simple repository for pre-training and fine-tuning T5-style models
☆1,009 · Updated last year
Alternatives and similar repositories for nanoT5
Users interested in nanoT5 are comparing it to the libraries listed below.
- Cramming the training of a (BERT-type) language model into limited compute. ☆1,349 · Updated last year
- Ungreedy subword tokenizer and vocabulary trainer for Python, Go & JavaScript. ☆600 · Updated last year
- Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input"