keldenl / fleece
Llama for VSCode
☆102Updated 2 years ago
Alternatives and similar repositories for fleece
Users interested in fleece are comparing it to the libraries listed below
- Augment GPT-4 Environment Access☆285Updated 2 years ago
- Tensor library for machine learning☆273Updated 2 years ago
- Run inference on replit-3B code instruct model using CPU☆160Updated 2 years ago
- A fast, light, open chat UI with full tool use support across many models☆220Updated 8 months ago
- Structured LLM APIs☆156Updated 2 years ago
- Run GGML models with Kubernetes.☆175Updated 2 years ago
- Turing machines, Rule 110, and A::B reversal using Claude 3 Opus.☆58Updated last year
- Some of the scripts I use for scribepod @ https://scribepod.substack.com/, an automated AI podcast☆174Updated 2 years ago
- A repository fully generated by ChatGPT, making it believe it checked out this repository, which I described like the first line of the …☆120Updated 3 years ago
- ☆113Updated 2 years ago
- GPT-3 on your command line☆131Updated 2 years ago
- Layered, depth-first reading—start with summaries, tap to explore details, and gain clarity on complex topics.☆277Updated 2 years ago
- Call any LLM with a single API. Zero dependencies.☆215Updated 2 years ago
- 🐤 A minimal viable logger for Prompt/LLM Engineering. Use your IDE as Logging UI - a fast, simple, extensible, zero dependency Node.js l…☆147Updated last year
- Tool to create a dataset of semantic segmentation on website screenshots from their DOM☆89Updated 3 years ago
- Supercharge Open-Source AI Models☆349Updated 2 years ago
- LLM plugin providing access to Mistral models using the Mistral API☆206Updated 6 months ago
- Iterate quickly with llama.cpp hot reloading. Use the llama.cpp bindings with bun.sh☆50Updated 2 years ago
- ☆335Updated 3 years ago
- Command-line script for inferencing from models such as falcon-7b-instruct☆75Updated 2 years ago
- ☆107Updated 2 years ago
- All Spellcraft CLI tools.☆77Updated 2 years ago
- Generates grammar files from TypeScript for LLM generation☆38Updated last year
- OpenAI-compatible Python client that can call any LLM☆372Updated 2 years ago
- Ask questions, let GPT do the SQL.☆133Updated 2 years ago
- ☆144Updated 2 years ago
- LLaMA Cog template☆303Updated 2 years ago
- An HTTP serving framework by Banana☆101Updated 2 years ago
- LLaMa retrieval plugin script using OpenAI's retrieval plugin☆323Updated 2 years ago
- Command-line script for inferencing from models such as MPT-7B-Chat☆100Updated 2 years ago