amd / RyzenAI-SW
AMD Ryzen™ AI Software includes the tools and runtime libraries for optimizing and deploying AI inference on AMD Ryzen™ AI powered PCs.
☆717 · Updated 2 weeks ago
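For context, the usual Ryzen AI deployment path runs a quantized ONNX model through ONNX Runtime with the NPU execution provider that the Ryzen AI Software installer sets up. The sketch below assumes ONNX Runtime's Python API with the Vitis AI execution provider; the model path, `vaip_config.json` location, provider option keys, and input shape are illustrative placeholders, not details taken from this listing.

```python
# Minimal sketch: run an ONNX model with ONNX Runtime, preferring the NPU
# execution provider and falling back to CPU. Paths, provider options, and
# the input shape below are assumptions for illustration only.
import numpy as np
import onnxruntime as ort

session = ort.InferenceSession(
    "model.onnx",                                   # assumed pre-quantized model
    providers=["VitisAIExecutionProvider", "CPUExecutionProvider"],
    provider_options=[{"config_file": "vaip_config.json"}, {}],  # assumed NPU config path
)

input_name = session.get_inputs()[0].name
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)  # assumed input shape
outputs = session.run(None, {input_name: dummy_input})
print(outputs[0].shape)
```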
Alternatives and similar repositories for RyzenAI-SW
Users interested in RyzenAI-SW are comparing it to the libraries listed below.
- ☆502 · Updated this week
- Intel® NPU Acceleration Library · ☆703 · Updated 8 months ago
- The HIP Environment and ROCm Kit - A lightweight open source build system for HIP and ROCm · ☆661 · Updated last week
- Intel® NPU (Neural Processing Unit) Driver · ☆366 · Updated last week
- ☆420 · Updated 8 months ago
- Onboarding documentation source for the AMD Ryzen™ AI Software Platform. The AMD Ryzen™ AI Software Platform enables developers to take… · ☆88 · Updated 2 weeks ago
- build scripts for ROCm · ☆188 · Updated last year
- No-code CLI designed for accelerating ONNX workflows · ☆222 · Updated 6 months ago
- Run LLM Agents on Ryzen AI PCs in Minutes · ☆822 · Updated last week
- Dockerfiles for the various software layers defined in the ROCm software platform · ☆506 · Updated 3 weeks ago
- ☆147 · Updated last week
- A collection of examples for the ROCm software stack · ☆265 · Updated last week
- A small OpenCL benchmark program to measure peak GPU/CPU performance. · ☆270 · Updated last month
- DLPrimitives/OpenCL out of tree backend for pytorch · ☆382 · Updated last month
- ☆155 · Updated last week
- Run Generative AI models with simple C++/Python API and using OpenVINO Runtime · ☆403 · Updated this week
- 8-bit CUDA functions for PyTorch · ☆69 · Updated 3 months ago
- AI Tensor Engine for ROCm · ☆325 · Updated last week
- HIPIFY: Convert CUDA to Portable C++ Code · ☆637 · Updated this week
- AMD related optimizations for transformer models · ☆96 · Updated 2 months ago
- Run LLMs on AMD Ryzen™ AI NPUs in minutes. Just like Ollama - but purpose-built and deeply optimized for the AMD NPUs. · ☆560 · Updated last week
- Fork of LLVM to support AMD AIEngine processors · ☆178 · Updated last week
- A high-throughput and memory-efficient inference and serving engine for LLMs · ☆113 · Updated this week
- AMD (Radeon GPU) ROCm based setup for popular AI tools on Ubuntu 24.04.1 · ☆216 · Updated last month
- Low-bit LLM inference on CPU/NPU with lookup table · ☆903 · Updated 6 months ago
- llama.cpp fork with additional SOTA quants and improved performance · ☆1,399 · Updated this week
- Fast and memory-efficient exact attention · ☆205 · Updated this week
- [DEPRECATED] Moved to ROCm/rocm-systems repo · ☆149 · Updated 2 weeks ago
- See how to play with ROCm, run it with AMD GPUs! · ☆39 · Updated 7 months ago
- AMDGPU Driver with KFD used by the ROCm project. Also contains the current Linux Kernel that matches this base driver · ☆404 · Updated last month