The AzureML Inference Server is a Python package that allows users to easily expose machine learning models as HTTP endpoints. The server is included by default in AzureML's pre-built Docker images for inference.
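To make the HTTP-endpoint workflow concrete, here is a minimal scoring-script sketch following AzureML's `init()`/`run()` entry-point convention. The payload shape and the stand-in "model" are illustrative assumptions, not part of the package itself.

```python
# score.py -- minimal scoring-script sketch for the AzureML Inference Server.
# init() runs once at server startup; run() is invoked per HTTP request.
import json

model = None

def init():
    # Load the model here. In a real deployment, the model files are found
    # under the directory given by the AZUREML_MODEL_DIR environment
    # variable; a trivial placeholder "model" is used in this sketch.
    global model
    model = lambda xs: [x * 2 for x in xs]

def run(raw_data):
    # raw_data is the request body as a string; return a JSON-serializable
    # response. Error handling is omitted for brevity.
    data = json.loads(raw_data)["data"]
    return json.dumps({"result": model(data)})
```

With the package installed, such a script can typically be served locally via `azmlinfsrv --entry_script score.py`, after which requests can be POSTed to the scoring route.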
☆27 · Updated Dec 18, 2025
Alternatives and similar repositories for azureml-inference-server
Users interested in azureml-inference-server are comparing it to the libraries listed below.
- Semantic Ranking Solution for Azure Database for PostgreSQL ☆14 · Updated Apr 29, 2025
- On-premises Azure Application Insights telemetry storage ☆15 · Updated Jan 4, 2021
- Classes and command-line utilities for interacting with BibTeX style databases ☆17 · Updated Nov 12, 2023
- ☆17 · Updated Oct 7, 2021
- ☆13 · Updated Feb 13, 2024
- Implementing CI/CD Using Azure Pipelines, published by Packt ☆13 · Updated Nov 29, 2023
- Benchmarking STT service TTFB and semantic WER for real-time AI applications ☆60 · Updated Mar 20, 2026
- A boilerplate project for getting started with Azure Functions v4 ☆19 · Updated Nov 7, 2022
- A self-hostable base for building and running custom agents that's heavily inspired by Claude Code on the web ☆28 · Updated this week
- ☆25 · Updated May 15, 2023
- Terraform Azure Verified Resource Module for Machine Learning Services Workspace ☆14 · Updated Mar 18, 2026
- Quiz generation from your docs (Streamlit LLM Hackathon project) ☆12 · Updated Sep 30, 2023
- Azure OpenAI benchmarking tool