deepjavalibrary / djl-serving
A universal, scalable machine learning model deployment solution
☆233 · Updated this week
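For comparison with the serving stacks listed below, here is a minimal sketch of how a client might invoke a model hosted by djl-serving, assuming a locally running server on the default port (8080) and an already-registered model; the model name `resnet` and the input image are illustrative, and the request follows the `/predictions/{model_name}` REST convention djl-serving exposes.

```python
# Minimal sketch (not from the repo): calling a model hosted by DJL Serving
# over its REST inference API. Assumes a server is running locally on the
# default port 8080 and a model named "resnet" has been registered; the
# model name and input file are illustrative.
import requests

with open("kitten.jpg", "rb") as f:
    payload = f.read()

resp = requests.post(
    "http://127.0.0.1:8080/predictions/resnet",  # /predictions/{model_name}
    data=payload,
    headers={"Content-Type": "image/jpeg"},  # match the model's expected input
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # e.g. class labels with probabilities
```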
Alternatives and similar repositories for djl-serving
Users interested in djl-serving are comparing it to the libraries listed below.
- Demo applications showcasing DJL ☆338 · Updated this week
- ☆111 · Updated 8 months ago
- ☆296 · Updated last week
- Easy, fast and very cheap training and inference on AWS Trainium and Inferentia chips. ☆241 · Updated this week
- Example code for AWS Neuron SDK developers building inference and training applications ☆149 · Updated last week
- LLMPerf is a library for validating and benchmarking LLMs ☆1,001 · Updated 9 months ago
- Triton backend that enables pre-processing, post-processing, and other logic to be implemented in Python. ☆638 · Updated last week
- 🆕 Find the k-nearest neighbors (k-NN) for your vector data ☆194 · Updated last week
- Powering AWS purpose-built machine learning chips. Blazing fast and cost effective, natively integrated into PyTorch and TensorFlow and i… ☆540 · Updated this week
- Toolkit for inference and serving with PyTorch on SageMaker. Dockerfiles used for building SageMaker PyTorch Containers are at h… ☆142 · Updated 11 months ago
- The Triton TensorRT-LLM Backend ☆887 · Updated last week
- This repository contains tutorials and examples for Triton Inference Server ☆768 · Updated last week
- ☆266 · Updated 4 months ago
- A high-throughput and memory-efficient inference and serving engine for LLMs ☆20 · Updated last month
- Examples on how to use LangChain and Ray ☆229 · Updated 2 years ago
- Large Language Model Hosting Container ☆91 · Updated this week
- ☆60 · Updated last week
- Triton CLI is an open source command line interface that enables users to create, deploy, and profile models served by the Triton Inferen… ☆67 · Updated last week
- The Triton backend for the ONNX Runtime. ☆161 · Updated last week
- Common source, scripts and utilities for creating Triton backends. ☆347 · Updated last week
- Triton Model Analyzer is a CLI tool that helps you understand the compute and memory requirements of the Triton Inference Serv… ☆490 · Updated last week
- Triton Python, C++ and Java client libraries, and gRPC-generated client examples for Go, Java, and Scala. ☆645 · Updated last week
- A suite of hands-on training materials that shows how to scale CV, NLP, and time-series forecasting workloads with Ray. ☆432 · Updated last year
- Hands-on workshop for distributed training and hosting on SageMaker ☆146 · Updated 3 weeks ago
- 🏋️ A unified multi-backend utility for benchmarking Transformers, Timm, PEFT, Diffusers and Sentence-Transformers with full support of O… ☆315 · Updated this week
- A helper library to connect into Amazon SageMaker with AWS Systems Manager and SSH (Secure Shell) ☆252 · Updated 2 months ago
- ☆107 · Updated last week
- ☆318 · Updated last year
- ☆73 · Updated last year
- AWS Deep Learning Containers are pre-built Docker images that make it easier to run popular deep learning frameworks and tools on AWS. ☆1,105 · Updated this week