Firstbober / rocm-pytorch-gfx803-docker
A Docker image based on rocm/pytorch with support for gfx803 (Polaris 20/21 XT/PRO/XL: RX 580, RX 570, RX 560) and Python 3.8
☆24 · Updated 2 years ago
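As a rough sketch of how an image like this is typically used (the exact image tag is not given here, and the usual ROCm passthrough flags --device=/dev/kfd --device=/dev/dri --group-add video are assumed when starting the container), the following Python snippet checks from inside the container that the ROCm PyTorch build can actually see the Polaris card:

```python
# Minimal sketch: verify the PyTorch ROCm build inside the container sees the GPU.
# Assumes the container was launched with the usual ROCm device passthrough flags
# (--device=/dev/kfd --device=/dev/dri --group-add video); the image tag itself is
# whatever this repository publishes and is not specified here.
import torch

print("PyTorch:", torch.__version__)
print("GPU visible:", torch.cuda.is_available())      # ROCm builds report through the CUDA API
if torch.cuda.is_available():
    print("Device:", torch.cuda.get_device_name(0))   # e.g. a Polaris card such as the RX 580
```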
Alternatives and similar repositories for rocm-pytorch-gfx803-docker
Users interested in rocm-pytorch-gfx803-docker are comparing it to the repositories listed below.
- Run stable-diffusion-webui with Radeon RX 580 8GB on Ubuntu 22.04.2 LTS · ☆68 · Updated 2 years ago
- ROCm docker images with fixes/support for extra architectures, such as gfx803/gfx1010. · ☆31 · Updated 2 years ago
- ☆234 · Updated 2 years ago
- Install guide for ROCm and TensorFlow on Ubuntu for the RX 580 · ☆129 · Updated last year
- Copy of rocm/pytorch with gfx803 cards compiled in (see https://github.com/xuhuisheng/rocm-build/blob/develop/docs/gfx803.md) · ☆20 · Updated 2 months ago
- AMD (Radeon GPU) ROCm-based setup for popular AI tools on Ubuntu 24.04.1 · ☆216 · Updated 2 weeks ago
- Build scripts for ROCm · ☆188 · Updated last year
- Stable Diffusion Docker image preconfigured for use with AMD Radeon cards · ☆140 · Updated last year
- ROCm docker images with fixes/support for the legacy gfx803 architecture, e.g. Radeon RX 590/RX 580/RX 570/RX 480 · ☆76 · Updated 6 months ago
- Download models from the Ollama library, without Ollama · ☆109 · Updated last year
- Fork of Ollama for Vulkan support · ☆107 · Updated 9 months ago
- General site for the GFX803 ROCm stuff · ☆126 · Updated 2 months ago
- Make PyTorch models at least run on APUs. · ☆57 · Updated last year
- AI inferencing at the edge. A simple one-file way to run various GGML models with KoboldAI's UI, with AMD ROCm offloading · ☆714 · Updated last month
- AMD-related optimizations for transformer models · ☆95 · Updated last month
- Triton for AMD MI25/50/60. Development repository for the Triton language and compiler · ☆32 · Updated 3 weeks ago
- DLPrimitives/OpenCL out-of-tree backend for PyTorch · ☆377 · Updated last year
- Pure C++ implementation of several models for real-time chatting on your computer (CPU & GPU) · ☆746 · Updated this week
- ☆414 · Updated 7 months ago
- Unlock unlimited potential! Share your GPU power across your local network! · ☆69 · Updated 5 months ago
- 8-bit CUDA functions for PyTorch, ROCm compatible · ☆41 · Updated last year
- AMD Ryzen™ AI Software includes the tools and runtime libraries for optimizing and deploying AI inference on AMD Ryzen™ AI powered PCs. · ☆692 · Updated 2 weeks ago
- NVIDIA Linux open GPU with P2P support · ☆78 · Updated 2 weeks ago
- llama.cpp fork with additional SOTA quants and improved performance · ☆1,329 · Updated this week
- Hackable and optimized Transformers building blocks, supporting a composable construction. · ☆32 · Updated last week
- 8-bit CUDA functions for PyTorch · ☆68 · Updated last month
- A Python library to interact with AI-Horde's free generative AI APIs · ☆34 · Updated last week
- My development fork of llama.cpp, currently working on RK3588 NPU and Tenstorrent backends · ☆108 · Updated last week
- Fork of vLLM for AMD MI25/50/60. A high-throughput and memory-efficient inference and serving engine for LLMs · ☆65 · Updated 6 months ago
- Review/check GGUF files and estimate memory usage and maximum tokens per second. · ☆216 · Updated 3 months ago