huggingface / optimum-neuron
Easy, fast and very cheap training and inference on AWS Trainium and Inferentia chips.
☆233 · Updated this week
Alternatives and similar repositories for optimum-neuron
Users interested in optimum-neuron are comparing it to the libraries listed below.
- ☆111 · Updated 5 months ago
- Example code for AWS Neuron SDK developers building inference and training applications · ☆148 · Updated last month
- ☆264 · Updated 2 months ago
- Large Language Model Hosting Container · ☆89 · Updated last week
- ☆59 · Updated 2 weeks ago
- ☆62 · Updated 2 months ago
- Powering AWS purpose-built machine learning chips. Blazing fast and cost-effective, natively integrated into PyTorch and TensorFlow and i… · ☆529 · Updated this week
- Toolkit for allowing inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker PyTorch Containers are at h… · ☆141 · Updated 9 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs · ☆18 · Updated 2 weeks ago
- A helper library to connect to Amazon SageMaker with AWS Systems Manager and SSH (Secure Shell) · ☆247 · Updated last week
- ☆199 · Updated last year
- Google TPU optimizations for transformers models · ☆114 · Updated 5 months ago
- The package used to build the documentation of our Hugging Face repos · ☆122 · Updated this week
- Experiments with inference on Llama · ☆104 · Updated last year
- Easy and lightning-fast training of 🤗 Transformers on Habana Gaudi processors (HPU) · ☆190 · Updated this week
- Hands-on workshop for distributed training and hosting on SageMaker · ☆143 · Updated this week
- A tool to configure, launch, and manage your machine learning experiments · ☆171 · Updated this week