huggingface / datasets
🤗 The largest hub of ready-to-use datasets for ML models with fast, easy-to-use and efficient data manipulation tools
☆19,819 · Updated this week
Alternatives and similar repositories for datasets:
Users interested in datasets are comparing it to the libraries listed below.
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production · ☆9,509 · Updated this week
- Facebook AI Research Sequence-to-Sequence Toolkit written in Python. · ☆31,179 · Updated 2 months ago
- 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX. · ☆141,466 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… · ☆8,497 · Updated last week
- State-of-the-Art Text Embeddings · ☆16,265 · Updated this week
- An open-source NLP research library, built on PyTorch. · ☆11,826 · Updated 2 years ago
- Unsupervised text tokenizer for Neural Network-based text generation. · ☆10,707 · Updated 3 weeks ago
- Code for the paper "Exploring the Limits of Transfer Learning with a Unified Text-to-Text Transformer" · ☆6,302 · Updated 3 weeks ago
- Repository to track the progress in Natural Language Processing (NLP), including the datasets and the current state-of-the-art for the mo… · ☆22,822 · Updated 7 months ago
- Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more · ☆31,708 · Updated this week
- 💫 Industrial-strength Natural Language Processing (NLP) in Python · ☆31,175 · Updated last month
- A library for efficient similarity search and clustering of dense vectors. · ☆33,821 · Updated this week
- A very simple framework for state-of-the-art Natural Language Processing (NLP) · ☆14,112 · Updated this week
- Distributed training framework for TensorFlow, Keras, PyTorch, and Apache MXNet. · ☆14,423 · Updated last month
- BertViz: Visualize Attention in NLP Models (BERT, GPT2, BART, etc.) · ☆7,237 · Updated last year
- Pretrain, finetune ANY AI model of ANY size on multiple GPUs, TPUs with zero code changes. · ☆29,161 · Updated this week
- A minimal PyTorch re-implementation of the OpenAI GPT (Generative Pretrained Transformer) training · ☆21,600 · Updated 7 months ago
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. · ☆17,837 · Updated this week
- Label Studio is a multi-type data labeling and annotation tool with standardized output format · ☆21,293 · Updated this week
- DeepSpeed is a deep learning optimization library that makes distributed training and inference easy, efficient, and effective. · ☆37,533 · Updated this week
- Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conve… · ☆4,162 · Updated 9 months ago
- Trax – Deep Learning with Clear Code and Speed · ☆8,177 · Updated last month
- Train transformer language models with reinforcement learning. · ☆12,591 · Updated this week
- A framework for training and evaluating AI models on a variety of openly available dialogue datasets. · ☆10,516 · Updated last year
- State-of-the-Art Deep Learning scripts organized by models - easy to train and deploy with reproducible accuracy and performance on enter… · ☆14,058 · Updated 7 months ago
- A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework based on decision tree algorithms, used … · ☆17,054 · Updated this week
- A system for quickly generating training data with weak supervision · ☆5,840 · Updated 10 months ago
- XLNet: Generalized Autoregressive Pretraining for Language Understanding · ☆6,184 · Updated last year
- Flax is a neural network library for JAX that is designed for flexibility. · ☆6,421 · Updated this week
- TensorFlow code and pre-trained models for BERT · ☆38,882 · Updated 7 months ago