intel / ai-containers
This repository contains Dockerfiles, scripts, YAML files, Helm charts, etc. used to scale out AI containers with versions of TensorFlow and PyTorch that have been optimized for Intel platforms. Scaling is done with Python, Docker, Kubernetes, Kubeflow, cnvrg.io, Helm, and other container orchestration frameworks for use in the cloud and on-prem…
☆48 · Updated last week
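For context, a minimal sketch of how one of these Intel-optimized containers might be pulled and run from Python via the Docker SDK; the image name and tag here are illustrative assumptions, not taken from this repository:

```python
# Minimal sketch using the Docker SDK for Python (pip install docker).
# The image name/tag below are assumptions for illustration only and are
# not pinned to anything published by intel/ai-containers.
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Pull an Intel-optimized TensorFlow image and run a one-line sanity check.
logs = client.containers.run(
    "intel/intel-optimized-tensorflow:latest",  # assumed image name
    'python -c "import tensorflow as tf; print(tf.__version__)"',
    remove=True,  # clean up the container after it exits
)
print(logs.decode().strip())
```

At cluster scale, the repository's Helm charts and Kubernetes manifests would play the role of this single-host script.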
Alternatives and similar repositories for ai-containers
Users interested in ai-containers are comparing it to the libraries listed below
- Explore our open source AI portfolio! Develop, train, and deploy your AI solutions with performance- and productivity-optimized tools fro… ☆49 · Updated 3 months ago
- OpenVINO Tokenizers extension ☆37 · Updated this week
- AMD-related optimizations for transformer models ☆80 · Updated 3 weeks ago
- A curated list of OpenVINO-based AI projects ☆141 · Updated 2 weeks ago
- Large Language Model Text Generation Inference on Habana Gaudi ☆34 · Updated 3 months ago
- Open Source AI with Granite and Granite Code ☆21 · Updated last month
- No-code CLI designed for accelerating ONNX workflows ☆201 · Updated last month
- ☆20 · Updated last month
- Evaluation, benchmarking, and scorecards targeting throughput and latency performance, accuracy on popular evaluation harnesses, safety… ☆37 · Updated last week
- The Docker VSCode EAP is an extension for VSCode which provides an early preview of new features by Docker. ☆13 · Updated 8 months ago
- vLLM: A high-throughput and memory-efficient inference and serving engine for LLMs ☆87 · Updated this week
- Setup and installation instructions for Habana binaries and Docker image creation ☆25 · Updated last month
- Intel® Extension for DeepSpeed* is an extension to DeepSpeed that brings feature support with SYCL kernels on Intel GPU (XPU) devices. Note… ☆61 · Updated 2 weeks ago
- Easy and lightning-fast training of 🤗 Transformers on the Habana Gaudi processor (HPU) ☆190 · Updated this week
- This repo contains documents of the OPEA project ☆42 · Updated last week
- Accelerate your Gen AI with NVIDIA NIM and NVIDIA AI Workbench ☆174 · Updated 2 months ago
- ☆86 · Updated last week
- Source Code and Usage Samples for the Resources hosted in the NVIDIA AI Enterprise AzureML Registry ☆21 · Updated 11 months ago
- Explainable AI Tooling (XAI). XAI is used to discover and explain a model's prediction in a way that is interpretable to the user. Releva… ☆39 · Updated 2 months ago
- ☆48 · Updated this week
- RAPIDS Documentation Site ☆45 · Updated last week
- Run Generative AI models with a simple C++/Python API using OpenVINO Runtime ☆303 · Updated this week
- Tutorials for running models on first-gen Gaudi and Gaudi2 for training and inference. The source files for the tutorials on https://dev… ☆62 · Updated 3 weeks ago
- For individual users, watsonx Code Assistant can access a local IBM Granite model ☆34 · Updated 3 weeks ago
- ☆26 · Updated this week
- ☆228 · Updated last week
- An Awesome list of oneAPI projects ☆146 · Updated 7 months ago
- GenAI Studio is a low-code platform that enables users to construct, evaluate, and benchmark GenAI applications. The platform also provides c… ☆45 · Updated 2 weeks ago
- GenAI components at the micro-service level; a GenAI service composer to create a mega-service ☆163 · Updated this week
- Pre-built components and code samples to help you build and deploy production-grade AI applications with the OpenVINO™ Toolkit from Intel ☆160 · Updated this week