huggingface / huggingface_hub
The official Python client for the Hugging Face Hub.
⭐2,829 · Updated this week
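For context, a minimal sketch of how this client is typically used, assuming a public model repository; the search query, repo_id, and filename below are illustrative examples, not taken from this page:

```python
# Minimal usage sketch for huggingface_hub; the query, repo_id, and filename
# are illustrative placeholders, not values referenced on this page.
from huggingface_hub import HfApi, hf_hub_download

# Search the Hub for models matching a query (read-only, no token required).
api = HfApi()
for model in api.list_models(search="bert", limit=5):
    print(model.id)

# Download a single file from a public repo into the local cache and print its path.
config_path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(config_path)
```

Both calls work anonymously against public repos; authenticated operations such as uploads or access to private repos go through the same `HfApi` client with a token.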
Alternatives and similar repositories for huggingface_hub
Users interested in huggingface_hub are comparing it to the libraries listed below.
- 🚀 Accelerate inference and training of 🤗 Transformers, Diffusers, TIMM and Sentence Transformers with easy to use hardware optimization… · ⭐3,016 · Updated this week
- 🤗 Evaluate: A library for easily evaluating machine learning models and datasets. · ⭐2,285 · Updated last month
- Docs of the Hugging Face Hub · ⭐425 · Updated this week
- 🚀 A simple way to launch, train, and use PyTorch models on almost any device and distributed configuration, automatic mixed precision (i… · ⭐9,010 · Updated this week
- Use Hugging Face with JavaScript · ⭐2,192 · Updated this week
- The Hugging Face course on Transformers · ⭐3,217 · Updated last week
- 🤗 A list of wonderful open-source projects & applications integrated with Hugging Face libraries. · ⭐992 · Updated last year
- Simple, safe way to store and distribute tensors · ⭐3,393 · Updated this week
- Notebooks using the Hugging Face libraries 🤗 · ⭐4,235 · Updated last week
- Backend that powers the dataset viewer on Hugging Face dataset pages through a public API. · ⭐774 · Updated 2 weeks ago
- Accessible large language models via k-bit quantization for PyTorch. · ⭐7,450 · Updated this week
- An easy-to-use LLMs quantization package with user-friendly apis, based on GPTQ algorithm. · ⭐4,918 · Updated 4 months ago
- Large Language Model Text Generation Inference · ⭐10,413 · Updated last week
- Public repo for HF blog posts · ⭐3,070 · Updated this week
- 🤗 AutoTrain Advanced · ⭐4,471 · Updated 6 months ago
- Train transformer language models with reinforcement learning. · ⭐14,989 · Updated this week
- MTEB: Massive Text Embedding Benchmark · ⭐2,753 · Updated last week
- Ongoing research training transformer models at scale · ⭐13,130 · Updated this week
- A Unified Library for Parameter-Efficient and Modular Transfer Learning · ⭐2,750 · Updated this week
- PyTorch native post-training library · ⭐5,399 · Updated this week
- Efficient few-shot learning with Sentence Transformers · ⭐2,544 · Updated last week
- Welcome to the Llama Cookbook! This is your go to guide for Building with Llama: Getting started with Inference, Fine-Tuning, RAG. We als… · ⭐17,731 · Updated this week
- 🤗 The largest hub of ready-to-use datasets for AI models with fast, easy-to-use and efficient data manipulation tools · ⭐20,473 · Updated last week
- Hackable and optimized Transformers building blocks, supporting a composable construction. · ⭐9,821 · Updated last week
- Transformer related optimization, including BERT, GPT · ⭐6,270 · Updated last year
- 🤗 PEFT: State-of-the-art Parameter-Efficient Fine-Tuning. · ⭐19,252 · Updated last week
- General technology for enabling AI capabilities w/ LLMs and MLLMs · ⭐4,082 · Updated last month
- Fast and memory-efficient exact attention · ⭐18,776 · Updated last week
- 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production · ⭐9,980 · Updated 2 weeks ago
- SGLang is a fast serving framework for large language models and vision language models. · ⭐16,773 · Updated this week