clearml / clearml-serving
ClearML - Model-Serving Orchestration and Repository Solution
⭐148 · Updated 3 months ago
Alternatives and similar repositories for clearml-serving:
Users interested in clearml-serving are comparing it to the libraries listed below.
- ClearML Agent - ML-Ops made easy. ML-Ops scheduler & orchestration solution · ⭐263 · Updated 2 weeks ago
- A curated list of MLOps projects, tools and resources · ⭐186 · Updated 11 months ago
- Triton Model Navigator is an inference toolkit designed for optimizing and deploying Deep Learning models with a focus on NVIDIA GPUs. · ⭐199 · Updated 3 months ago
- 🏷️ Git Tag Ops. Turn your Git repository into Artifact Registry or Model Registry. · ⭐146 · Updated 3 weeks ago
- Plugin for deploying MLflow models to TorchServe · ⭐108 · Updated 2 years ago
- aim-mlflow integration · ⭐208 · Updated last year
- MLOps Python Library · ⭐118 · Updated 3 years ago
- ClearML - Auto-Magical CI/CD to streamline your AI workload. Experiment Management, Data Management, Pipeline, Orchestration, Scheduling … · ⭐409 · Updated 3 weeks ago
- Dataset registry DVC project · ⭐74 · Updated 11 months ago
- TorchX is a universal job launcher for PyTorch applications. TorchX is designed to have fast iteration time for training/research and sup… · ⭐358 · Updated this week
- DagsHub client libraries · ⭐93 · Updated last week
- 📈 Log and track ML metrics, parameters, models with Git and/or DVC · ⭐172 · Updated this week
- Helm chart repository for the new unified way to deploy ClearML on Kubernetes. ClearML - Auto-Magical CI/CD to streamline your AI workloa… · ⭐39 · Updated last month
- A tool to package, serve, and deploy any ML model on any platform. Archived to be resurrected one day · ⭐719 · Updated last year
- FIL backend for the Triton Inference Server · ⭐77 · Updated last week
- ⭐32 · Updated 2 years ago
- The Triton backend for the ONNX Runtime. · ⭐140 · Updated this week
- A Streamlit component integrating Label Studio Frontend in Streamlit applications · ⭐70 · Updated 9 months ago
- BentoML Example Projects 🎨 · ⭐138 · Updated 3 months ago
- Toolkit for developing and maintaining ML models · ⭐154 · Updated 10 months ago
- PyTriton is a Flask/FastAPI-like interface that simplifies Triton's deployment in Python environments (see the sketch after this list). · ⭐786 · Updated 2 months ago
- An inference server for your machine learning models, including support for multiple frameworks, multi-model serving and more · ⭐794 · Updated this week
- Management Dashboard for Torchserve · ⭐120 · Updated 2 years ago
- DVC support for Airflow workflows · ⭐6 · Updated 2 years ago
- Triton Model Analyzer is a CLI tool to help with better understanding of the compute and memory requirements of the Triton Inference Serv… · ⭐470 · Updated last month
- Label Studio SDK · ⭐127 · Updated last week
- Deploying PyTorch Model to Production with FastAPI in CUDA-supported Docker · ⭐101 · Updated 3 years ago
- ClearML Fractional GPU - Run multiple containers on the same GPU with driver level memory limitation ✨ and compute time-slicing · ⭐76 · Updated 8 months ago
- The Triton backend that allows running GPU-accelerated data pre-processing pipelines implemented in DALI's python API. · ⭐132 · Updated last week
- ClearML Remote - CLI for launching JupyterLab / VSCode on a remote machine · ⭐25 · Updated 3 months ago
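To give a sense of the "Flask/FastAPI-like" workflow mentioned for PyTriton in the list above, here is a minimal sketch modeled on the patterns shown in the PyTriton README. The function name `infer_fn`, the model name "Doubler", and the tensor names/shapes are illustrative assumptions, and exact signatures may vary across PyTriton versions.

```python
# Minimal PyTriton-style sketch (illustrative, not taken from this listing).
# Assumes the nvidia-pytriton package is installed.
import numpy as np

from pytriton.decorators import batch
from pytriton.model_config import ModelConfig, Tensor
from pytriton.triton import Triton


@batch
def infer_fn(**inputs: np.ndarray):
    # Receives a batched named input array and returns the batched output.
    (data,) = inputs.values()
    return [data * 2.0]  # hypothetical "model": double the input


with Triton() as triton:
    # Bind the Python callable to an endpoint served by Triton Inference Server.
    triton.bind(
        model_name="Doubler",  # illustrative name
        infer_func=infer_fn,
        inputs=[Tensor(name="input", dtype=np.float32, shape=(-1,))],
        outputs=[Tensor(name="output", dtype=np.float32, shape=(-1,))],
        config=ModelConfig(max_batch_size=64),
    )
    triton.serve()  # blocks and serves inference requests
```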