drhino / ollama-client
Ollama API client in ECMAScript / JavaScript / ESM.
☆10 · Updated 2 years ago
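For orientation, here is a minimal sketch of what querying a local Ollama server from ESM JavaScript looks like. It calls Ollama's documented HTTP endpoint (POST /api/generate) directly; the model name is only an example, and drhino/ollama-client's own wrapper API may look different.

```js
// Minimal sketch: query a local Ollama server via its documented HTTP API.
// Assumes Ollama is running on the default port (11434) and that the
// "llama3" model has already been pulled; this is not necessarily the
// interface exposed by drhino/ollama-client itself.
const res = await fetch("http://localhost:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "llama3",
    prompt: "Why is the sky blue?",
    stream: false, // return one JSON object instead of streamed chunks
  }),
});

const { response } = await res.json(); // generated text is in "response"
console.log(response);
```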
Alternatives and similar repositories for ollama-client
Users interested in ollama-client are comparing it to the libraries listed below.
- Use Discord as your interface for ollama ☆12 · Updated last year
- Create keyboard shortcuts for an LLM using OpenAI GPT, Ollama, HuggingFace with Automator on macOS. ☆153 · Updated last year
- Ollama function calling demo ☆29 · Updated last year
- A simple Web / UI / App / Frontend to Ollama. ☆84 · Updated 7 months ago
- ☆134 · Updated last year
- The easiest way to run the fastest MLX-based LLMs locally ☆304 · Updated last year
- GenAI & agent toolkit for Apple Silicon Mac, implementing JSON schema-steered structured output (3SO) and tool-calling in Python. For mor… ☆128 · Updated 2 months ago
- Simple front-end interface for querying a local Ollama API server ☆25 · Updated last year
- For inferring and serving local LLMs using the MLX framework ☆109 · Updated last year
- Run an AI-powered Discord bot from the comfort of your laptop. ☆163 · Updated 11 months ago
- Demo of an AI chatbot that predicts the user's message to generate a response quickly. ☆103 · Updated last year
- Converts URL content into JSON with a simple prefix ☆71 · Updated last year
- Start a server from the MLX library. ☆192 · Updated last year
- SiLLM simplifies the process of training and running Large Language Models (LLMs) on Apple Silicon by leveraging the MLX framework. ☆280 · Updated 4 months ago
- AutoNL - Natural Language Automation tool ☆86 · Updated last year
- ☆61 · Updated 2 years ago
- A fast, light, open chat UI with full tool use support across many models ☆219 · Updated 5 months ago
- Gradio-based tool to run open-source LLMs directly from Hugging Face ☆96 · Updated last year
- Client-side toolkit for using large language models, including where self-hosted ☆112 · Updated this week
- Dagger functions to import Hugging Face GGUF models into a local ollama instance and optionally push them to ollama.com. ☆119 · Updated last year
- Automated fine-tuning of models with synthetic data ☆75 · Updated last year
- Chat with MLX is a high-performance macOS application that connects your local documents to a personalized large language model (LLM). ☆175 · Updated last year
- Information on optimizing Python libraries specifically for oobabooga to take advantage of Apple Silicon and the Accelerate Framework. ☆75 · Updated 8 months ago
- A simple Jupyter Notebook for learning MLX text-completion fine-tuning! ☆122 · Updated 11 months ago
- ☆116 · Updated 10 months ago
- ☆47 · Updated last year
- Grammar checker with a keyboard shortcut for Ollama and Apple MLX with Automator on macOS. ☆82 · Updated last year
- ☆15 · Updated 2 years ago
- An LLM agnostic desktop and mobile client. ☆301 · Updated last month
- The library for character-driven AI experiences. ☆89 · Updated last year