withcatai / catai
Run AI ✨ assistant locally! with a simple API for Node.js
☆479 · Updated last year
Alternatives and similar repositories for catai
Users interested in catai are comparing it to the repositories listed below.
- local.ai - Run AI locally on your PC! ☆708 · Updated 2 years ago
- An autonomous AI agent extension for Oobabooga's web UI ☆173 · Updated 2 years ago
- A llama.cpp drop-in replacement for OpenAI's GPT endpoints, allowing GPT-powered apps to run off local llama.cpp models instead of OpenAI… ☆597 · Updated 2 years ago
- BabyAGI to run with GPT4All ☆248 · Updated 2 years ago
- An AI assistant beyond the chat box. ☆328 · Updated last year
- Web UI for Alpaca.cpp - Locally run an Instruction-Tuned Chat-Style LLM ☆77 · Updated 2 years ago
- A prompt/context management system ☆170 · Updated 2 years ago
- An easy way to host your own AI API and expose alternative models, while being compatible with "open" AI clients. ☆332 · Updated last year
- A Gradio web UI for running Large Language Models like GPT-J 6B, OPT, GALACTICA, LLaMA, and Pygmalion. ☆308 · Updated 2 years ago
- C++ implementation for 💫StarCoder ☆455 · Updated 2 years ago
- TheBloke's Dockerfiles ☆307 · Updated last year
- Visual Studio Code extension for WizardCoder ☆148 · Updated 2 years ago
- Uses Auto-GPT with Llama.cpp ☆386 · Updated last year
- Falcon LLM ggml framework with CPU and GPU support ☆247 · Updated last year
- Lord of LLMS ☆294 · Updated last month
- An open source UI for OpenChat models ☆288 · Updated last year
- An Autonomous LLM Agent that runs on Wizcoder-15B ☆333 · Updated last year
- ☆666 · Updated 2 weeks ago
- This repository represents my final assignment for "Module 3 - Android App Development" at Syntax Institut. ☆27 · Updated last year
- Believes in AI democratization. LLaMA for Node.js, backed by llama-rs, llama.cpp and rwkv.cpp; works locally on your laptop CPU. Supports llam… ☆870 · Updated 2 years ago
- Erudito: Easy API/CLI to ask questions about your documentation ☆99 · Updated 2 years ago
- 💬 Chatbot web app + HTTP and WebSocket endpoints for LLM inference with the Petals client ☆316 · Updated last year
- BabyAGI to run with locally hosted models using the API from https://github.com/oobabooga/text-generation-webui ☆87 · Updated 2 years ago
- ☆137 · Updated 2 years ago
- AgentLLM is a PoC for browser-native autonomous agents ☆444 · Updated 2 years ago
- Use a local LLaMA LLM or OpenAI to chat with, discuss, or summarize your documents, YouTube videos, and so on. ☆154 · Updated 10 months ago
- Self-evaluating interview for AI coders ☆597 · Updated 4 months ago
- The vicuna-installation-guide provides step-by-step instructions for installing and configuring Vicuna 13B and 7B ☆282 · Updated 2 years ago
- Minimalistic UI for Ollama LMs - this powerful React interface for LLMs drastically improves the chatbot experience and works offline. ☆342 · Updated last year
- oobabooga text-generation-webui implementation of wafflecomposite's langchain-ask-pdf-local ☆70 · Updated 2 years ago
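Several of the projects above (the llama.cpp drop-in replacement for OpenAI's GPT endpoints, the self-hosted AI API compatible with "open" AI clients) share one pattern: they expose an OpenAI-style chat-completions HTTP endpoint, so a Node.js app can target a local model simply by changing the base URL. A minimal sketch of that request shape; the endpoint URL and model name here are assumptions for illustration, not defaults taken from any specific project above:

```javascript
// Build a standard OpenAI-style chat-completions request aimed at a
// local drop-in server. Pure function: returns the URL and fetch options
// without performing any network I/O.
function buildChatRequest(model, userMessage) {
  return {
    // Assumed local endpoint; each project documents its own host/port.
    url: "http://localhost:8000/v1/chat/completions",
    options: {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        model, // e.g. a local llama.cpp model name
        messages: [{ role: "user", content: userMessage }],
      }),
    },
  };
}

// Usage against a running local server (uncomment to actually send it):
// const { url, options } = buildChatRequest("llama-2-7b-chat", "Hello!");
// const res = await fetch(url, options);
// console.log((await res.json()).choices[0].message.content);
```

Because the request body matches the OpenAI wire format, existing "GPT-powered" client code usually needs only the base URL swapped to move from the hosted API to a local model.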