yacineMTB / just-large-models
Just large language models. Hackable, with as little abstraction as possible. Done for my own purposes, feel free to rip.
☆44 · Updated last year
Alternatives and similar repositories for just-large-models:
Users interested in just-large-models are comparing it to the repositories listed below.
- Simple Transformer in Jax ☆136 · Updated 10 months ago
- ☆61 · Updated last year
- ☆38 · Updated 9 months ago
- A really tiny autograd engine ☆92 · Updated last year
- look how they massacred my boy ☆63 · Updated 6 months ago
- ☆48 · Updated last year
- inference code for mixtral-8x7b-32kseqlen ☆99 · Updated last year
- Comprehensive analysis of the performance differences between QLoRA, LoRA, and full fine-tunes ☆82 · Updated last year
- Just a bunch of benchmark logs for different LLMs ☆119 · Updated 9 months ago
- compute, storage, and networking infra at home ☆65 · Updated last year
- papers.day ☆93 · Updated last year
- An introduction to LLM sampling ☆77 · Updated 4 months ago
- Synthetic data derived by templating, few-shot prompting, transformations on public-domain corpora, and Monte Carlo tree search ☆32 · Updated 2 months ago
- Turing machines, Rule 110, and A::B reversal using Claude 3 Opus ☆59 · Updated 11 months ago
- Simplex Random Feature attention, in PyTorch ☆74 · Updated last year
- ☆22 · Updated last year
- ☆27 · Updated 9 months ago
- prime-rl is a codebase for decentralized RL training at scale ☆85 · Updated this week
- Stream of my favorite papers and links ☆41 · Updated last month
- Simple embedding -> text model trained on a small subset of Wikipedia sentences ☆153 · Updated last year
- Full fine-tuning of large language models without large memory requirements ☆94 · Updated last year
- Cerule - A Tiny Mighty Vision Model ☆67 · Updated 8 months ago
- ☆20 · Updated last year
- an implementation of Self-Extend, which expands the context window via grouped attention ☆119 · Updated last year
- an open-source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆98 · Updated last month
- A repository of prompts and Python scripts for intelligent transformation of raw text into diverse formats ☆30 · Updated last year
- ☆49 · Updated last year
- ☆55 · Updated 2 months ago
- Helpers and such for working with Lambda Cloud ☆51 · Updated last year
- A tree-based prefix cache library that allows rapid creation of looms: hierarchical branching pathways of LLM generations ☆68 · Updated 2 months ago