lxe / wasm-gpt
Tensor library for machine learning
☆274 · Updated 2 years ago
Alternatives and similar repositories for wasm-gpt
Users interested in wasm-gpt compare it to the libraries listed below.
- JS tokenizer for LLaMA 1 and 2 ☆359 · Updated last year
- WebGPU LLM inference tuned by hand ☆151 · Updated 2 years ago
- 🦜️🔗 A very simple re-implementation of LangChain in ~100 lines of code ☆254 · Updated last year
- Generator for llama.cpp grammars directly from TypeScript interfaces ☆140 · Updated last year
- Add local LLMs to your Web or Electron apps! Powered by Rust + WebGPU ☆104 · Updated 2 years ago
- ☆145 · Updated 2 years ago
- OpenAI-compatible Python client that can call any LLM ☆372 · Updated 2 years ago
- LLaMA Cog template ☆306 · Updated last year
- Easy-to-use headless React Hooks to run LLMs in the browser with WebGPU. Just useLLM(). ☆697 · Updated 2 years ago
- Layered, depth-first reading: start with summaries, tap to explore details, and gain clarity on complex topics. ☆273 · Updated 2 years ago
- Enforce structured output from LLMs 100% of the time ☆250 · Updated last year
- Run GGML models with Kubernetes. ☆174 · Updated last year
- Augment GPT-4 Environment Access ☆285 · Updated 2 years ago
- Web-optimized vector database (written in Rust). ☆255 · Updated 7 months ago
- Vercel and web-llm template to run wasm models directly in the browser. ☆161 · Updated last year
- Revealing example of self-attention, the building block of transformer AI models (a minimal sketch of the idea follows after this list) ☆131 · Updated 2 years ago
- Call any LLM with a single API. Zero dependencies. ☆215 · Updated 2 years ago
- https://ermine.ai -- 100% client-side live audio transcription, powered by transformers.js ☆325 · Updated 2 years ago
- Code ChatGPT Plugin is a TypeScript Code Analyzer that enables ChatGPT to "talk" with YOUR code ☆239 · Updated last year
- A fully in-browser privacy solution to make Conversational AI privacy-friendly ☆230 · Updated 11 months ago
- Latent web browser ☆273 · Updated 7 months ago
- A program synthesis agent that autonomously fixes its output by running tests! ☆465 · Updated last year
- LLaMa retrieval plugin script using OpenAI's retrieval plugin ☆324 · Updated 2 years ago
- An implementation of bucketMul LLM inference ☆223 · Updated last year
- Marsha is a functional, higher-level, English-based programming language that gets compiled into tested Python software by an LLM ☆470 · Updated last year
- 🐤 A minimal viable logger for Prompt/LLM Engineering. Use your IDE as Logging UI - a fast, simple, extensible, zero dependency Node.js l… ☆142 · Updated last year
- Command-line script for running inference with models such as MPT-7B-Chat ☆100 · Updated 2 years ago
- An AI-driven tool to analyze your profile and gain insights into how ChatGPT interprets your personality. ☆183 · Updated 2 years ago
- Simple repo that compiles and runs llama2.c on the Web ☆56 · Updated last year
- Next-token prediction in JavaScript: build fast language and diffusion models. ☆143 · Updated last year
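One of the entries above is an educational example of self-attention. As a quick companion, here is a minimal, self-contained sketch of single-head scaled dot-product self-attention in TypeScript. It is not taken from that repository; the helper names (`matmul`, `softmaxRows`, `selfAttention`) and the toy shapes and weights are illustrative assumptions.

```typescript
// Minimal single-head self-attention over toy embeddings.
// Not from any listed repository; shapes and weights are illustrative.

type Matrix = number[][];

// Multiply an [m x n] matrix by an [n x p] matrix.
function matmul(a: Matrix, b: Matrix): Matrix {
  return a.map(row =>
    b[0].map((_, j) => row.reduce((sum, v, k) => sum + v * b[k][j], 0))
  );
}

function transpose(m: Matrix): Matrix {
  return m[0].map((_, j) => m.map(row => row[j]));
}

// Row-wise softmax with the usual max subtraction for numerical stability.
function softmaxRows(m: Matrix): Matrix {
  return m.map(row => {
    const max = Math.max(...row);
    const exps = row.map(v => Math.exp(v - max));
    const sum = exps.reduce((acc, v) => acc + v, 0);
    return exps.map(v => v / sum);
  });
}

// Scaled dot-product self-attention: softmax(Q·Kᵀ / √d) · V,
// where Q, K, V are linear projections of the same input sequence.
function selfAttention(x: Matrix, wq: Matrix, wk: Matrix, wv: Matrix): Matrix {
  const q = matmul(x, wq);
  const k = matmul(x, wk);
  const v = matmul(x, wv);
  const d = q[0].length;
  const scores = matmul(q, transpose(k)).map(row => row.map(s => s / Math.sqrt(d)));
  return matmul(softmaxRows(scores), v); // each output row is a weighted mix of value rows
}

// Toy usage: 3 tokens with 2-dimensional embeddings and hand-picked projections.
const x: Matrix = [[1, 0], [0, 1], [1, 1]];
const w: Matrix = [[0.5, 0.1], [0.2, 0.8]];
console.log(selfAttention(x, w, w, w));
```

Each output row is a weighted mixture of every input position's value vector, which is the behavior the listed example repository sets out to make visible.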