attentionmech / tensorlens — TensorLens
☆10 · Updated this week

Alternatives and similar repositories for tensorlens:
Users interested in tensorlens compare it to the libraries listed below.
- Optimizing causal LMs through GRPO with weighted reward functions and automated hyperparameter tuning using Optuna ☆39 · Updated 2 months ago
- ☆38 · Updated 8 months ago
- An alternative way to calculate self-attention ☆18 · Updated 10 months ago
- Collection of autoregressive model implementations ☆85 · Updated 2 months ago
- Train, tune, and infer the Bamba model ☆88 · Updated 3 months ago
- look how they massacred my boy ☆63 · Updated 6 months ago
- ☆52 · Updated last month
- ☆48 · Updated 5 months ago
- [WIP] Transformer to embed Danbooru labelsets ☆13 · Updated last year
- Implementation of https://arxiv.org/pdf/2312.09299 ☆20 · Updated 9 months ago
- Simple GRPO scripts and configurations ☆58 · Updated 2 months ago
- Training hybrid models for dummies ☆20 · Updated 3 months ago
- Latent large language models ☆17 · Updated 8 months ago
- Very minimal (and stateless) agent framework ☆42 · Updated 3 months ago
- ☆61 · Updated last year
- Using multiple LLMs for ensemble forecasting ☆16 · Updated last year
- A fast, local, and secure approach to training LLMs for coding tasks using GRPO with WebAssembly and interpreter feedback ☆22 · Updated 2 weeks ago
- NanoGPT speedrunning for the poor T4 enjoyers ☆62 · Updated this week
- A public implementation of the ReLoRA pretraining method, built on Lightning-AI's PyTorch Lightning suite ☆33 · Updated last year
- An introduction to LLM sampling ☆77 · Updated 4 months ago
- An open-source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆96 · Updated last month
- ☆49 · Updated last year
- ☆41 · Updated 2 months ago
- Testing PaliGemma 2 finetuning on a reasoning dataset ☆18 · Updated 3 months ago
- NanoGPT (124M) quality in 2.67B tokens ☆28 · Updated this week
- ☆22 · Updated last year
- ☆27 · Updated 9 months ago
- NanoGPT (124M) in 5 minutes ☆9 · Updated 2 months ago
- Train a SmolLM-style LLM on fineweb-edu in JAX/Flax with an assortment of optimizers ☆17 · Updated last month
- ☆16 · Updated last month