luyug / magix
Supercharge huggingface transformers with model parallelism.
☆77 · Updated 6 months ago
Alternatives and similar repositories for magix
Users interested in magix are comparing it to the libraries listed below.
- some common Huggingface transformers in maximal update parametrization (µP) ☆87 · Updated 3 years ago
- Anchored Preference Optimization and Contrastive Revisions: Addressing Underspecification in Alignment ☆61 · Updated last year
- minimal pytorch implementation of bm25 (with sparse tensors) ☆104 · Updated 3 months ago
- ☆48 · Updated last year
- A toolkit implementing advanced methods to transfer models and model knowledge across tokenizers. ☆61 · Updated 6 months ago
- PyTorch implementation for MRL ☆20 · Updated last year
- ☆59 · Updated last year
- Code for Zero-Shot Tokenizer Transfer ☆142 · Updated last year
- ☆77 · Updated last year
- Reference implementation for Reward-Augmented Decoding: Efficient Controlled Text Generation With a Unidirectional Reward Model ☆45 · Updated 4 months ago
- Large language models (LLMs) made easy, EasyLM is a one-stop solution for pre-training, finetuning, evaluating and serving LLMs in JAX/Fl… ☆78 · Updated last year
- ReBase: Training Task Experts through Retrieval Based Distillation ☆29 · Updated last year
- ☆41 · Updated last year
- ☆86 · Updated 2 years ago
- A repository for research on medium-sized language models. ☆77 · Updated last year
- Codebase accompanying the Summary of a Haystack paper. ☆80 · Updated last year
- ☆53 · Updated 11 months ago
- Repository containing the SPIN experiments on the DIBT 10k ranked prompts ☆23 · Updated last year
- Aioli: A unified optimization framework for language model data mixing ☆32 · Updated last year
- Improving Text Embedding of Language Models Using Contrastive Fine-tuning ☆64 · Updated last year
- ☆59 · Updated 2 months ago
- Truly flash implementation of the DeBERTa disentangled attention mechanism. ☆74 · Updated last week
- Embedding Recycling for Language models ☆38 · Updated 2 years ago
- The source code of our work "Prepacking: A Simple Method for Fast Prefilling and Increased Throughput in Large Language Models" [AISTATS … ☆60 · Updated last year
- ☆68 · Updated last year
- Utilities for Training Very Large Models ☆58 · Updated last year
- Experiments for efforts to train a new and improved t5 ☆76 · Updated last year
- ☆56 · Updated last year
- ☆53 · Updated 2 years ago
- This is a new metric that can be used to evaluate faithfulness of text generated by LLMs. The work behind this repository can be found he… ☆31 · Updated 2 years ago