nvms / wingman
Your pair programming wingman. Supports OpenAI, Anthropic, or any LLM on your local inference server.
☆70 Updated 10 months ago
Alternatives and similar repositories for wingman:
Users interested in wingman are comparing it to the libraries listed below.
- Host a GPTQ model using AutoGPTQ as an API compatible with the text generation UI API. ☆91 Updated last year
- No-messing-around sh client for llama.cpp's server ☆31 Updated 8 months ago
- An AI assistant beyond the chat box. ☆326 Updated last year
- ☆130 Updated last week
- Keeping my personal experiments separate from the main repo ☆65 Updated 2 months ago
- A frontend for creative writing with LLMs ☆123 Updated 9 months ago
- Convert Files / Folders / GitHub Repos Into AI / LLM-ready Files ☆155 Updated 2 months ago
- A web app to explore topics using an LLM (less typing and more clicks) ☆66 Updated last year
- Automated fine-tuning of models with synthetic data ☆75 Updated last year
- Host LLM via text-generation-inference ☆15 Updated last year
- Your friendly terminal-based AI pair programmer ☆42 Updated last year
- klmbr - a prompt pre-processing technique to break through the barrier of entropy while generating text with LLMs ☆71 Updated 7 months ago
- Embed anything. ☆29 Updated 11 months ago
- ☆101 Updated 8 months ago
- Experimental LLM Inference UX to aid in creative writing ☆116 Updated 4 months ago
- A fast batching API to serve LLM models ☆182 Updated last year
- A multimodal, function calling powered LLM webui. ☆214 Updated 7 months ago
- Landmark Attention: Random-Access Infinite Context Length for Transformers QLoRA ☆123 Updated last year
- A python package for developing AI applications with local LLMs. ☆147 Updated 3 months ago
- ☆63 Updated 5 months ago
- Automated prompting and scoring framework to evaluate LLMs using updated human knowledge prompts ☆112 Updated last year
- ☆54 Updated last year
- A comprehensive platform for managing, testing, and leveraging Ollama AI models with advanced features for customization, workflow automa… ☆47 Updated last month
- Automatically generate @openai plugins by specifying your API in markdown, smol-developer style ☆120 Updated last year
- A simple experiment in letting two local LLMs have a conversation about anything! ☆110 Updated 9 months ago
- Simple, opinionated benchmark for testing the viability of Efficient Language Models (ELMs) for personal use cases. ☆47 Updated 11 months ago
- ☆45 Updated 11 months ago
- Something similar to Apple Intelligence? ☆60 Updated 9 months ago
- Implements a local LLM selector that picks from your locally installed Ollama models for a specific user query ☆102 Updated last year
- LLM Use Case: LLM Powered, Reusable, Domain Agnostic Autocompletes ☆60 Updated 11 months ago