amd / gaia
Run LLM Agents on Ryzen AI PCs in Minutes
☆529 · Updated 2 weeks ago
Alternatives and similar repositories for gaia
Users interested in gaia are comparing it to the repositories listed below.
- AI PC starter app for AI image creation, image stylizing, and chatbot use on a PC powered by an Intel® Arc™ GPU. ☆593 · Updated last week
- No-code CLI designed for accelerating ONNX workflows ☆210 · Updated 2 months ago
- AMD Ryzen™ AI Software includes the tools and runtime libraries for optimizing and deploying AI inference on AMD Ryzen™ AI powered PCs. ☆617 · Updated 2 weeks ago
- Lemonade helps users run local LLMs with the highest performance by configuring state-of-the-art inference engines for their NPUs and GPUs. ☆1,178 · Updated this week
- Lightweight inference server for OpenVINO ☆202 · Updated last week
- The HIP Environment and ROCm Kit - A lightweight open source build system for HIP and ROCm ☆331 · Updated this week
- llama.cpp fork with additional SOTA quants and improved performance ☆1,111 · Updated this week
- Fully Open Language Models with Stellar Performance ☆247 · Updated last month
- ☆315 · Updated this week
- Intel® AI Assistant Builder ☆98 · Updated 2 weeks ago
- Run LLMs on AMD Ryzen™ AI NPUs. Just like Ollama, but purpose-built and deeply optimized for AMD NPUs. ☆148 · Updated this week
- ☆463 · Updated this week
- LM inference server implementation based on *.cpp (see the request sketch after this list). ☆271 · Updated 2 weeks ago
- Run generative AI models with a simple C++/Python API using OpenVINO Runtime ☆325 · Updated this week
- Review/Check GGUF files and estimate the memory usage and maximum tokens per second. ☆202 · Updated 2 weeks ago
- Docs for GGUF quantization (unofficial) ☆254 · Updated last month
- ☆385 · Updated 4 months ago
- ☆541 · Updated last week
- Download models from the Ollama library, without Ollama ☆95 · Updated 9 months ago
- ☆166 · Updated last week
- VS Code extension for LLM-assisted code/text completion ☆941 · Updated this week
- LLM Benchmark for Throughput via Ollama (Local LLMs) ☆286 · Updated 3 weeks ago
- ☆144 · Updated 3 weeks ago
- Intel® NPU (Neural Processing Unit) Driver ☆310 · Updated this week
- Intel® NPU Acceleration Library ☆689 · Updated 4 months ago
- AI Studio is an independent app for utilizing LLMs. ☆298 · Updated this week
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint. ☆183 · Updated 7 months ago
- 🌟 Yi-Coder is a series of open-source code language models that delivers state-of-the-art coding performance with fewer than 10 billion parameters. ☆427 · Updated 11 months ago
- LM Studio Python SDK ☆613 · Updated last week
- MLPerf Client is a benchmark for Windows and macOS, focusing on client form factors in ML inference scenarios. ☆47 · Updated last month
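Several of the servers listed above (llama.cpp-based servers and LM Studio, for example) expose an OpenAI-compatible HTTP endpoint, so one small client can be pointed at any of them. The following is a minimal sketch, not taken from any of these projects: it assumes such a server is already running locally, and the base URL `http://localhost:8080/v1` and the placeholder model name are assumptions to adjust for your setup.

```python
# Minimal sketch: query a locally running OpenAI-compatible chat endpoint.
# Assumptions (not from the list above): the server listens on localhost:8080
# and accepts a placeholder model name; change both to match your server.
import json
import urllib.request


def chat(prompt: str, base_url: str = "http://localhost:8080/v1") -> str:
    payload = {
        "model": "local-model",  # placeholder; many local servers ignore or remap this
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }
    req = urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    # Send the request and parse the standard chat-completions response shape.
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]


if __name__ == "__main__":
    print(chat("Summarize what an NPU is in one sentence."))
```

Because the request uses the standard chat-completions shape, switching between these local servers usually only means changing the base URL (and, where required, the model name).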