Artefact2 / llm-sampling
A very simple interactive demo to understand the common LLM samplers.
☆40 · Updated last year
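The repository itself is an interactive visualization, not a library. As a rough, hedged illustration (not taken from the demo's code), the sketch below shows how the common samplers the demo covers, temperature scaling, top-k, and top-p (nucleus) filtering, are typically applied to a raw logits vector; it assumes NumPy and a toy vocabulary.

```python
# Minimal sketch of common LLM samplers: temperature, top-k, and top-p (nucleus).
# Illustrative only; this is not the demo's own implementation.
import numpy as np

def softmax(logits):
    z = logits - np.max(logits)          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def sample(logits, temperature=1.0, top_k=0, top_p=1.0, rng=None):
    """Apply temperature, then top-k, then top-p, and draw one token id."""
    if rng is None:
        rng = np.random.default_rng()
    logits = np.asarray(logits, dtype=np.float64) / max(temperature, 1e-8)

    if top_k > 0:
        # Keep only the k largest logits; mask everything else out.
        kth = np.sort(logits)[-top_k]
        logits = np.where(logits < kth, -np.inf, logits)

    probs = softmax(logits)

    if top_p < 1.0:
        # Keep the smallest set of tokens whose cumulative probability reaches top_p.
        order = np.argsort(probs)[::-1]
        cum = np.cumsum(probs[order])
        cutoff = np.searchsorted(cum, top_p) + 1
        mask = np.zeros_like(probs, dtype=bool)
        mask[order[:cutoff]] = True
        probs = np.where(mask, probs, 0.0)
        probs /= probs.sum()

    return rng.choice(len(probs), p=probs)

# Example: a toy 5-token vocabulary.
print(sample([2.0, 1.0, 0.5, 0.1, -1.0], temperature=0.8, top_k=3, top_p=0.9))
```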
Alternatives and similar repositories for llm-sampling
Users interested in llm-sampling are comparing it to the repositories listed below.
- Easily view and modify JSON datasets for large language models · ☆84 · Updated 6 months ago
- Create text chunks which end at natural stopping points without using a tokenizer · ☆26 · Updated 2 weeks ago
- AI management tool · ☆121 · Updated last year
- After my server ui improvements were successfully merged, consider this repo a playground for experimenting, tinkering and hacking around… · ☆53 · Updated last year
- ☆49 · Updated 9 months ago
- klmbr - a prompt pre-processing technique to break through the barrier of entropy while generating text with LLMs · ☆86 · Updated last year
- ☆164 · Updated 4 months ago
- Transplants vocabulary between language models, enabling the creation of draft models for speculative decoding WITHOUT retraining. · ☆47 · Updated last month
- ☆117 · Updated 11 months ago
- A frontend for creative writing with LLMs · ☆140 · Updated last year
- cli tool to quantize gguf, gptq, awq, hqq and exl2 models · ☆76 · Updated 11 months ago
- This project is a reverse-engineered version of Figma's tone changer. It uses Groq's Llama-3-8b for high-speed inference and to adjust th… · ☆90 · Updated last year
- Experimental LLM Inference UX to aid in creative writing · ☆127 · Updated last year
- ☆331 · Updated 4 months ago
- entropix style sampling + GUI · ☆27 · Updated last year
- The hearth of The Pulsar App, fast, secure and shared inference with modern UI · ☆59 · Updated last year
- Self-hosted LLM chatbot arena, with yourself as the only judge · ☆41 · Updated last year
- Glyphs, acting as collaboratively defined symbols linking related concepts, add a layer of multidimensional semantic richness to user-AI … · ☆54 · Updated 10 months ago
- Gradio UI for a Cog API · ☆72 · Updated last year
- Serving LLMs in the HF-Transformers format via a PyFlask API · ☆72 · Updated last year
- ☆68 · Updated last year
- Gradio based tool to run opensource LLM models directly from Huggingface · ☆96 · Updated last year
- an auto-sleeping and -waking framework around llama.cpp · ☆12 · Updated 10 months ago
- ☆134 · Updated 7 months ago
- An extension that lets the AI take the wheel, allowing it to use the mouse and keyboard, recognize UI elements, and prompt itself :3...no… · ☆127 · Updated last year
- ☆23 · Updated last year
- Easy to use, High Performant Knowledge Distillation for LLMs · ☆97 · Updated 7 months ago
- Automated LLM novelist · ☆46 · Updated last year
- Python package wrapping llama.cpp for on-device LLM inference · ☆95 · Updated 2 months ago
- "a towel is about the most massively useful thing an interstellar AI hitchhiker can have" · ☆48 · Updated last year