explosion / curated-transformers
🤖 A PyTorch library of curated Transformer models and their composable components
☆893 · Updated last year
Alternatives and similar repositories for curated-transformers
Users interested in curated-transformers are comparing it to the libraries listed below.
- Fast & Simple repository for pre-training and fine-tuning T5-style models (☆1,010 · Updated last year)
- Explore and understand your training and validation data. (☆845 · Updated 9 months ago)
- Organize your experiments into discrete steps that can be cached and reused throughout the lifetime of your research project. (☆565 · Updated last year)
- Ungreedy subword tokenizer and vocabulary trainer for Python, Go & JavaScript (☆603 · Updated last year)
- Cramming the training of a (BERT-type) language model into limited compute. (☆1,349 · Updated last year)
- Blazing-fast framework for fine-tuning similarity learning models (☆657 · Updated last week)
- Public repo for the NeurIPS 2023 paper "Unlimiformer: Long-Range Transformers with Unlimited Length Input" (☆1,062 · Updated last year)
- Fine-tune mistral-7B on 3090s, A100s, H100s (☆714 · Updated 2 years ago)
- Just a bunch of useful embeddings for scikit-learn pipelines (☆517 · Updated 2 weeks ago)
- The repository for the code of the UltraFastBERT paper (☆518 · Updated last year)
- Minimalistic, extremely fast, and hackable researcher's toolbench for GPT models in 307 lines of code. Reaches <3.8 validation loss on wi… (☆349 · Updated last year)
- A Simple Bulk Labelling Tool (☆597 · Updated 2 months ago)
- Prompt programming with FMs. (☆443 · Updated last year)
- Language Modeling with the H3 State Space Model (☆517 · Updated 2 years ago)
- Neural Search (☆363 · Updated 7 months ago)
- String-to-String Algorithms for Natural Language Processing (☆555 · Updated last year)
- Exact structure out of any language model completion. (☆512 · Updated 2 years ago)
- Mistral: A strong, northwesterly wind: Framework for transparent and accessible large-scale language model training, built with Hugging F… (☆574 · Updated last year)
- Extend existing LLMs way beyond the original training length with constant memory usage, without retraining (☆721 · Updated last year)
- An open collection of implementation tips, tricks, and resources for training large language models (☆482 · Updated 2 years ago)
- Puzzles for exploring transformers (☆371 · Updated 2 years ago)
- MinT: Minimal Transformer Library and Tutorials (☆258 · Updated 3 years ago)
- Explore and interpret large embeddings in your browser with interactive visualization! (☆501 · Updated 2 months ago)
- A tiny library for coding with large language models. (☆1,234 · Updated last year)
- Highly commented implementations of Transformers in PyTorch (☆136 · Updated 2 years ago)
- A tool to analyze and debug neural networks in PyTorch. Use a GUI to traverse the computation graph and view the data from many different… (☆290 · Updated 10 months ago)
- SpanMarker for Named Entity Recognition (☆458 · Updated 9 months ago)
- Convolutions for Sequence Modeling (☆899 · Updated last year)
- Inference code for Persimmon-8B (☆413 · Updated 2 years ago)
- git extension for {collaborative, communal, continual} model development (☆215 · Updated 11 months ago)