esoltys / o1lama
o1lama: Use Ollama with Llama 3.2 3B and other models locally to create reasoning chains that are similar in appearance to OpenAI's o1.
☆ 22 · Updated 4 months ago
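For context, a minimal sketch of the general approach projects like this take: prompt a locally served model (here via the `ollama` Python client) to write out explicit, numbered reasoning steps before giving a final answer. The system prompt, step format, and model tag below are illustrative assumptions, not o1lama's actual implementation.

```python
# Sketch: ask a local Ollama model for an explicit reasoning chain, then an answer.
# Assumes the Ollama server is running locally and the model has been pulled.
import ollama

SYSTEM_PROMPT = (
    "You are an assistant that reasons step by step. "
    "Before answering, write numbered reasoning steps, "
    "then a final line starting with 'Answer:'."
)

def reason(question: str, model: str = "llama3.2") -> str:
    """Return the model's reasoning steps plus final answer as plain text."""
    response = ollama.chat(
        model=model,
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    print(reason("How many r's are in the word 'strawberry'?"))
```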
Alternatives and similar repositories for o1lama:
Users interested in o1lama are comparing it to the libraries listed below.
- Extract structured data from local or remote LLM models ☆ 41 · Updated 7 months ago
- A Python package for serving LLMs on OpenAI-compatible API endpoints with prompt caching using MLX ☆ 70 · Updated 2 months ago
- Simple, fast, parallel Hugging Face GGML model downloader written in Python ☆ 24 · Updated last year
- Local LLM inference & management server with built-in OpenAI API ☆ 31 · Updated 9 months ago
- Embedding models from Jina AI ☆ 58 · Updated last year
- Experimental code for StructuredRAG: JSON response formatting with large language models