matt-c1 / llama-3-quant-comparison
Comparison of the output quality of quantization methods, using Llama 3, transformers, GGUF, EXL2.
☆154 · Updated last year
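As a rough illustration of the kind of output-quality comparison this repository is about (a minimal sketch, not the repository's actual benchmark methodology), the snippet below generates the same greedy completion from a full-precision Llama 3 checkpoint via transformers and from a GGUF quant via llama-cpp-python, so the two outputs can be eyeballed side by side. The model ID and GGUF file path are placeholder assumptions, not files shipped with the repository.

```python
# Hypothetical sketch: compare greedy completions from a full-precision
# checkpoint (transformers) and a GGUF quant (llama-cpp-python).
# The model ID and GGUF path below are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from llama_cpp import Llama

PROMPT = "Briefly explain what post-training quantization does to an LLM."
HF_ID = "meta-llama/Meta-Llama-3-8B-Instruct"          # assumed full-precision reference
GGUF_PATH = "Meta-Llama-3-8B-Instruct.Q4_K_M.gguf"     # assumed local quantized file

# Full-precision (bf16) reference, greedy decoding.
tok = AutoTokenizer.from_pretrained(HF_ID)
model = AutoModelForCausalLM.from_pretrained(HF_ID, torch_dtype=torch.bfloat16, device_map="auto")
inputs = tok(PROMPT, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print("transformers:", tok.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))

# GGUF quant, also greedy (temperature 0).
llm = Llama(model_path=GGUF_PATH, n_ctx=2048, verbose=False)
res = llm(PROMPT, max_tokens=64, temperature=0.0)
print("GGUF:", res["choices"][0]["text"])
```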
Alternatives and similar repositories for llama-3-quant-comparison
Users interested in llama-3-quant-comparison are comparing it to the libraries listed below:
- 1.58-bit LLaMa model ☆81 · Updated last year
- A fast batching API to serve LLM models ☆183 · Updated last year
- ☆94 · Updated 6 months ago
- A multimodal, function calling powered LLM webui. ☆214 · Updated 8 months ago
- ☆301 · Updated 2 months ago
- This is our own implementation of 'Layer Selective Rank Reduction' ☆239 · Updated last year
- Easily view and modify JSON datasets for large language models ☆76 · Updated last month
- ☆77 · Updated this week
- Dataset Crafting w/ RAG/Wikipedia ground truth and Efficient Fine-Tuning Using MLX and Unsloth. Includes configurable dataset annotation … ☆185 · Updated 11 months ago
- An optimized quantization and inference library for running LLMs locally on modern consumer-class GPUs ☆408 · Updated this week
- Low-Rank adapter extraction for fine-tuned transformers models ☆173 · Updated last year
- automatically quant GGUF models ☆184 · Updated last week
- Automated Identification of Redundant Layer Blocks for Pruning in Large Language Models ☆238 · Updated last year
- klmbr - a prompt pre-processing technique to break through the barrier of entropy while generating text with LLMs ☆76 · Updated 9 months ago
- An easy-to-understand framework for LLM samplers that rewind and revise generated tokens ☆140 · Updated 4 months ago
- AI management tool ☆116 · Updated 7 months ago
- A pipeline parallel training script for LLMs. ☆149 · Updated last month
- InferX is an Inference Function as a Service Platform ☆109 · Updated this week
- Experimental LLM Inference UX to aid in creative writing ☆114 · Updated 6 months ago
- An efficient implementation of the method proposed in "The Era of 1-bit LLMs" ☆154 · Updated 8 months ago
- Fast parallel LLM inference for MLX ☆192 · Updated 11 months ago
- Generate Synthetic Data Using OpenAI, MistralAI or AnthropicAI ☆222 · Updated last year
- A local AI companion that uses a collection of free, open source AI models in order to create two virtual companions that will follow you… ☆217 · Updated last week
- ☆157 · Updated 11 months ago
- Open source LLM UI, compatible with all local LLM providers. ☆174 · Updated 9 months ago
- ☆129 · Updated last month
- Transplants vocabulary between language models, enabling the creation of draft models for speculative decoding WITHOUT retraining. ☆31 · Updated 2 months ago
- Guaranteed Structured Output from any Language Model via Hierarchical State Machines ☆136 · Updated 2 weeks ago
- Serving LLMs in the HF-Transformers format via a PyFlask API ☆71 · Updated 9 months ago
- ☆132 · Updated 10 months ago