3eeps / cherry-py
Simple prompt script to convert HF/GGML files to GGUF, and to quantize
⭐25 · Updated last year
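Scripts like this typically wrap llama.cpp's own tooling (`convert_hf_to_gguf.py` and the `llama-quantize` binary ship with llama.cpp). A minimal dry-run sketch of the usual two-step flow, with placeholder paths and the commands echoed rather than executed:

```shell
# Dry-run sketch of a typical HF -> GGUF -> quantized-GGUF pipeline.
# All paths and filenames below are placeholder assumptions; the two
# commands are only echoed here, not run.
MODEL_DIR="./my-hf-model"        # HF checkpoint dir (safetensors + tokenizer)
F16_GGUF="model-f16.gguf"        # intermediate full-precision GGUF
QUANT_GGUF="model-q4_k_m.gguf"   # final quantized file

# Step 1: convert the Hugging Face checkpoint to GGUF at f16
echo python convert_hf_to_gguf.py "$MODEL_DIR" --outtype f16 --outfile "$F16_GGUF"

# Step 2: quantize the GGUF (Q4_K_M is a common size/quality trade-off)
echo ./llama-quantize "$F16_GGUF" "$QUANT_GGUF" Q4_K_M
```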
Alternatives and similar repositories for cherry-py:
Users interested in cherry-py are comparing it to the libraries listed below.
- After my server UI improvements were successfully merged, consider this repo a playground for experimenting, tinkering and hacking around… · ⭐56 · Updated 6 months ago
- 100% Private & Simple. OSS Code Interpreter for LLMs · ⭐35 · Updated last year
- ⭐111 · Updated 2 months ago
- Run Ollama & GGUF easily with a single command · ⭐49 · Updated 9 months ago
- ⭐28 · Updated 11 months ago
- Text generation in Python, as easy as possible · ⭐55 · Updated this week
- ⭐16 · Updated 2 months ago
- A Python library to orchestrate LLMs in a neural-network-inspired structure · ⭐46 · Updated 5 months ago
- An unsupervised model-merging algorithm for Transformers-based language models · ⭐106 · Updated 10 months ago
- My version of an LLM web-search agent using a local SearXNG server, because SearXNG is great · ⭐25 · Updated this week
- Discord chatbot interface to train an LLM on user message history · ⭐27 · Updated last year
- Mycomind Daemon: a mycelium-inspired, advanced Mixture-of-Memory-RAG-Agents (MoMRA) cognitive assistant that combines multiple AI models… · ⭐31 · Updated 7 months ago
- Easily view and modify JSON datasets for large language models · ⭐71 · Updated last week
- Gradio-based tool to run open-source LLM models directly from Hugging Face · ⭐91 · Updated 8 months ago
- Model REVOLVER, a human-in-the-loop model-mixing system · ⭐33 · Updated last year
- ⭐20 · Updated last year
- ⭐24 · Updated last month
- Mistral 7B playing DOOM · ⭐28 · Updated 11 months ago
- ⭐27 · Updated last year
- Let's create synthetic textbooks together :) · ⭐73 · Updated last year
- Local LLM inference & management server with built-in OpenAI API · ⭐31 · Updated 10 months ago
- ⭐28 · Updated 5 months ago
- ⭐65 · Updated 9 months ago
- ⭐24 · Updated last year
- An OpenAI API-compatible LLM inference server based on ExLlamaV2 · ⭐25 · Updated last year
- Distributed inference for MLX LLMs · ⭐84 · Updated 7 months ago