huggingface / model-evaluator
Evaluate Transformers from the Hub 🔥
☆13 · Updated last year
Alternatives and similar repositories for model-evaluator
Users interested in model-evaluator are comparing it to the libraries listed below.
- ☆10 · Updated 3 years ago
- A starter kit for evaluating benchmarks on the 🤗 Hub ☆14 · Updated last year
- Accelerated inference of 🤗 models using FuriosaAI NPU chips ☆26 · Updated 2 months ago
- **ARCHIVED** Filesystem interface to the 🤗 Hub ☆58 · Updated 2 years ago
- Blazing-fast training of 🤗 Transformers on Graphcore IPUs ☆86 · Updated last year
- Public helpers for huggingface.co. Now lives in https://github.com/huggingface/huggingface_hub ☆12 · Updated 3 years ago
- Teacher–student distillation using DeepSpeed ☆18 · Updated 2 years ago
- Developing tools to automatically analyze datasets ☆74 · Updated 10 months ago
- ☆14 · Updated 2 years ago
- 3rd-party dependencies for the DALI project ☆10 · Updated last week
- The package used to build the documentation of our Hugging Face repos ☆125 · Updated this week
- ☆67 · Updated 3 years ago
- ☆18 · Updated last week
- Techniques used to run BLOOM inference in parallel ☆37 · Updated 2 years ago
- Easy and lightning-fast training of 🤗 Transformers on Habana Gaudi processors (HPU) ☆193 · Updated last week
- ☆19 · Updated last year
- ☆33 · Updated 2 years ago
- Implementation of the specific Transformer architecture from PaLM (Scaling Language Modeling with Pathways) in JAX (Equinox framework) ☆187 · Updated 3 years ago
- Inference code for LLaMA models in JAX ☆118 · Updated last year
- Hugging Face and Pyserini interoperability ☆20 · Updated 2 years ago
- Examples using the 🤗 Hub to share and reload machine learning models ☆32 · Updated 2 years ago
- ☆171 · Updated 6 months ago
- [WIP] A 🔥 interface for running code in the cloud ☆85 · Updated 2 years ago
- 🤗 Disaggregators: curated data labelers for in-depth analysis ☆65 · Updated 2 years ago
- DeepSpeed is a deep learning optimization library that makes distributed training easy, efficient, and effective ☆168 · Updated last month
- Large-scale 4D-parallel pre-training for 🤗 Transformers with Mixture of Experts *(still a work in progress)* ☆87 · Updated last year
- Hugging Face's Zapier integration 🤗⚡️ ☆47 · Updated 2 years ago
- An implementation of model-parallel autoregressive transformers on GPUs, based on the DeepSpeed library ☆23 · Updated 2 years ago
- A Streamlit app to add structured tags to a dataset card ☆22 · Updated 3 years ago
- URL downloader supporting checkpointing and continuous checksumming ☆19 · Updated last year