akx / ggify
Tool to download models from the Hugging Face Hub and convert them to GGML/GGUF for llama.cpp
☆134 · Updated 7 months ago
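As a rough illustration of the pipeline ggify wraps (download from the Hub, then convert to GGUF for llama.cpp), here is a minimal sketch that uses huggingface_hub and llama.cpp's convert_hf_to_gguf.py directly rather than ggify itself. The model name, local paths, and output file are illustrative assumptions, not ggify's actual interface or code:

```python
# Sketch only: download a model repo from the Hugging Face Hub, then convert
# it to GGUF with llama.cpp's conversion script. Assumes huggingface_hub is
# installed and a local llama.cpp checkout provides convert_hf_to_gguf.py.
import subprocess
from pathlib import Path

from huggingface_hub import snapshot_download

repo_id = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # example model (assumption)
llama_cpp_dir = Path("llama.cpp")               # path to a llama.cpp checkout (assumption)
out_path = Path("tinyllama-1.1b-chat.f16.gguf") # example output file (assumption)

# 1. Download the original checkpoint and config files from the Hub.
local_dir = snapshot_download(repo_id=repo_id)

# 2. Convert the downloaded checkpoint to GGUF via llama.cpp's script.
subprocess.run(
    [
        "python",
        str(llama_cpp_dir / "convert_hf_to_gguf.py"),
        local_dir,
        "--outfile", str(out_path),
        "--outtype", "f16",
    ],
    check=True,
)
print(f"Wrote {out_path}")
```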
Alternatives and similar repositories for ggify:
Users interested in ggify are comparing it to the libraries listed below.
- Download models from the Ollama library, without Ollama ☆69 · Updated 5 months ago
- A fast batching API for serving LLMs ☆182 · Updated last year
- Distributed inference for MLX LLMs ☆87 · Updated 8 months ago
- An independent implementation of 'Layer Selective Rank Reduction' ☆235 · Updated 11 months ago
- Train your own small BitNet model ☆67 · Updated 6 months ago
- Python bindings for ggml ☆140 · Updated 7 months ago
- Low-rank adapter extraction for fine-tuned transformer models