huggingface / optimum-neuron
Easy, fast, and very cheap training and inference on AWS Trainium and Inferentia chips.
☆230 · Updated this week
Alternatives and similar repositories for optimum-neuron
Users interested in optimum-neuron are comparing it to the libraries listed below.
- ☆110 · Updated 5 months ago
- Example code for AWS Neuron SDK developers building inference and training applications ☆149 · Updated 2 weeks ago
- Large Language Model Hosting Container ☆89 · Updated 3 weeks ago
- ☆57 · Updated last month
- ☆263 · Updated 2 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆18 · Updated this week
- Toolkit for allowing inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker PyTorch Containers are at h… ☆141 · Updated 8 months ago
- ☆62 · Updated last month
- Powering AWS purpose-built machine learning chips. Blazing fast and cost-effective, natively integrated into PyTorch and TensorFlow and i… ☆524 · Updated last week
- Experiments with inference on Llama ☆104 · Updated last year
- A helper library to connect into Amazon SageMaker with AWS Systems Manager and SSH (Secure Shell) ☆246 · Updated 3 months ago
- Google TPU optimizations for transformers models ☆113 · Updated 5 months ago
- ☆38 · Updated 6 months ago
- The package used to build the documentation of our Hugging Face repos ☆117 · Updated this week
- Foundation model benchmarking tool. Run any model on any AWS platform and benchmark for performance across instance type and serving stac…