facebookresearch / fairseq
Facebook AI Research Sequence-to-Sequence Toolkit written in Python.
★ 31,889 · Updated last month
Alternatives and similar repositories for fairseq
Users interested in fairseq are comparing it to the libraries listed below.
- 🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in text, vision, audio, and multimodal model… ★ 151,652 · Updated last week
- Open Source Neural Machine Translation and (Large) Language Models in PyTorch ★ 6,958 · Updated 2 weeks ago
- Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities ★ 21,798 · Updated 4 months ago
- Unsupervised text tokenizer for Neural Network-based text generation. ★ 11,394 · Updated 3 weeks ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" ★ 6,446 · Updated 6 months ago
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… ★ 9,256 · Updated last week
- An open-source NLP research library, built on PyTorch. ★ 11,882 · Updated 2 years ago
- A scalable generative AI framework built for researchers and developers working on Large Language Models, Multimodal, and Speech AI (Auto… ★ 16,017 · Updated this week
- TensorFlow code and pre-trained models for BERT ★ 39,619 · Updated last year
- Google AI 2018 BERT pytorch implementation ★ 6,492 · Updated 2 years ago
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production ★ 10,196 · Updated 2 weeks ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. ★ 19,959 · Updated this week
- A library for efficient similarity search and clustering of dense vectors. ★ 37,735 · Updated this week
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) ★ 7,719 · Updated 5 months ago
- A PyTorch implementation of the Transformer model in "Attention is All You Need". ★ 9,475 · Updated last year
- Ongoing research training transformer models at scale ★ 13,976 · Updated this week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. ★ 40,538 · Updated this week
- An annotated implementation of the Transformer paper. ★ 6,648 · Updated last year
- XLNet: Generalized Autoregressive Pretraining for Language Understanding ★ 6,180 · Updated 2 years ago
- Fast and memory-efficient exact attention ★ 20,280 · Updated this week
- State-of-the-Art Text Embeddings ★ 17,774 · Updated last week
- Pretrain, finetune ANY AI model of ANY size on 1 or 10,000+ GPUs with zero code changes. ★ 30,349 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. ★ 10,623 · Updated 2 years ago
- Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages ★ 7,642 · Updated this week
- Google Research ★ 36,605 · Updated last week
- Library of deep learning models and datasets designed to make deep learning more accessible and accelerate ML research. ★ 16,650 · Updated 2 years ago
- A set of examples around pytorch in Vision, Text, Reinforcement Learning, etc. ★ 23,521 · Updated 2 months ago
- 🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools ★ 20,801 · Updated this week
- An open source AutoML toolkit for automating the machine learning lifecycle, including feature engineering, neural architecture search, model c… ★ 14,287 · Updated last year
- Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models" ★ 12,843 · Updated 10 months ago