nova-land / gbnf-compiler
Plug n Play GBNF Compiler for llama.cpp
☆26 · Updated last year
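GBNF is llama.cpp's grammar format for constraining token sampling, and gbnf-compiler is a helper for producing such grammars programmatically. As a rough illustration of what a compiled grammar is used for, here is a minimal sketch using llama-cpp-python's `LlamaGrammar` (an assumption for illustration; it does not show gbnf-compiler's own API, which may differ), with a hypothetical model path:

```python
# Minimal sketch (assumed setup, not gbnf-compiler's own API):
# constrain llama.cpp output with a hand-written GBNF grammar
# via llama-cpp-python. The model path is a placeholder.
from llama_cpp import Llama, LlamaGrammar

# GBNF grammar: force the model to answer with "yes" or "no" only.
GRAMMAR = r'''
root ::= "yes" | "no"
'''

llm = Llama(model_path="model.gguf")         # hypothetical local GGUF model
grammar = LlamaGrammar.from_string(GRAMMAR)  # compile the GBNF text into a grammar object

out = llm(
    "Is GBNF a grammar format for llama.cpp? Answer yes or no: ",
    grammar=grammar,
    max_tokens=8,
)
print(out["choices"][0]["text"])             # output is constrained to "yes" or "no"
```

A grammar compiler like gbnf-compiler targets the same GBNF text format, so the compiled grammar string slots into the `from_string` step above.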
Alternatives and similar repositories for gbnf-compiler
Users interested in gbnf-compiler are comparing it to the libraries listed below.
- A guidance compatibility layer for llama-cpp-python ☆35 · Updated last year
- ☆31 · Updated last year
- Generates grammar files from TypeScript for LLM generation ☆38 · Updated last year
- The one who calls upon functions - Function-Calling Language Model ☆36 · Updated last year
- Experimental LLM Inference UX to aid in creative writing ☆114 · Updated 7 months ago
- A simple speech-to-text and text-to-speech AI chatbot that can be run fully offline. ☆45 · Updated last year
- Unofficial Python bindings for the Rust llm library. 🐍❤️🦀 ☆75 · Updated last year
- GPT-2 small trained on phi-like data ☆66 · Updated last year
- Easily create LLM automation/agent workflows ☆59 · Updated last year
- A web app to explore topics using LLMs (less typing and more clicks) ☆67 · Updated last year
- Python console application designed to provide an engaging and visually appealing LLM chat experience on Unix-like consoles or terminals. ☆24 · Updated 3 weeks ago
- A lightweight, open-source blueprint for building powerful and scalable LLM chat applications ☆28 · Updated last year
- A simple experiment on letting two local LLMs have a conversation about anything! ☆110 · Updated last year
- LMQL implementation of tree of thoughts ☆34 · Updated last year
- Complex RAG backend ☆28 · Updated last year
- A Python library to orchestrate LLMs in a neural network-inspired structure ☆49 · Updated 9 months ago
- An OpenAI API compatible LLM inference server based on ExLlamaV2. ☆25 · Updated last year
- Python package wrapping llama.cpp for on-device LLM inference ☆75 · Updated this week
- Verbosity control for AI agents ☆64 · Updated last year
- ☆44 · Updated last year
- Auto Data is a library designed for quick and effortless creation of datasets tailored for fine-tuning Large Language Models (LLMs). ☆102 · Updated 8 months ago
- ☆66 · Updated last year
- A Pythonic library providing a lightweight interface to LLMs ☆127 · Updated last month
- A fast batching API to serve LLM models ☆183 · Updated last year
- ☆16 · Updated last year
- Embed anything. ☆28 · Updated last year
- A stable, fast and easy-to-use inference library with a focus on a sync-to-async API ☆45 · Updated 9 months ago
- Client-side toolkit for using large language models, including self-hosted ones ☆111 · Updated 7 months ago
- klmbr - a prompt pre-processing technique to break through the barrier of entropy while generating text with LLMs ☆78 · Updated 9 months ago
- One repo to quickly build one Docker file for the HuggingChat front end and back end ☆26 · Updated 2 years ago