intel / intel-ai-assistant-builder
Intel® AI Assistant Builder
☆106 · Updated this week
Alternatives and similar repositories for intel-ai-assistant-builder
Users interested in intel-ai-assistant-builder are comparing it to the libraries listed below.
- Lightweight Inference server for OpenVINO ☆211 · Updated this week
- No-code CLI designed for accelerating ONNX workflows ☆214 · Updated 3 months ago
- MLPerf Client is a benchmark for Windows and macOS, focusing on client form factors in ML inference scenarios. ☆51 · Updated last month
- ☆43 · Updated this week
- Run LLM Agents on Ryzen AI PCs in Minutes ☆587 · Updated last week
- Run generative AI models with a simple C++/Python API using OpenVINO Runtime ☆342 · Updated this week
- A platform to self-host AI on easy mode ☆167 · Updated this week
- Simple node proxy for llama-server that enables MCP use ☆13 · Updated 4 months ago
- InferX: Inference as a Service Platform ☆135 · Updated this week
- AI PC starter app for AI image creation, image stylizing, and chatbot use on a PC powered by an Intel® Arc™ GPU. ☆612 · Updated last week
- Serving LLMs in the HF-Transformers format via a PyFlask API ☆71 · Updated last year
- Enhancing LLMs with LoRA ☆137 · Updated 2 weeks ago
- ☆100 · Updated last month
- Onboarding documentation source for the AMD Ryzen™ AI Software Platform. The AMD Ryzen™ AI Software Platform enables developers to take… ☆78 · Updated this week
- AI Studio is an independent app for utilizing LLMs. ☆304 · Updated 3 weeks ago
- The HIP Environment and ROCm Kit - A lightweight open-source build system for HIP and ROCm ☆398 · Updated this week
- ☆143 · Updated 2 weeks ago
- llama.cpp fork used by GPT4All ☆56 · Updated 7 months ago
- A curated list of OpenVINO-based AI projects ☆155 · Updated 2 months ago
- LLM Ripper is a framework for component extraction (embeddings, attention heads, FFNs), activation capture, functional analysis, and adap… ☆46 · Updated this week
- Locally running LLM with internet access ☆96 · Updated 2 months ago
- ☆29 · Updated 5 months ago
- An extension that lets the AI take the wheel, allowing it to use the mouse and keyboard, recognize UI elements, and prompt itself :3...no… ☆126 · Updated 11 months ago
- Bring Clippy back to Windows ☆45 · Updated 9 months ago
- Benchmark LLM performance ☆104 · Updated last year
- The Fastest Way to Fine-Tune LLMs Locally ☆321 · Updated 6 months ago
- VSCode AI coding assistant powered by a self-hosted llama.cpp endpoint. ☆183 · Updated 7 months ago
- Native GUI for several AI services plus llama.cpp local AIs. ☆116 · Updated last year
- ☆132 · Updated 5 months ago
- This project benchmarks 41 open-source large language models across 19 evaluation tasks using the lm-evaluation-harness library. ☆72 · Updated 3 weeks ago