ksadov / FREUD
Visualization and sparse autoencoder training for mechanistic interpretability on audio models
☆20 · Updated 2 months ago
Alternatives and similar repositories for FREUD
Users interested in FREUD are comparing it to the libraries listed below.
- An introduction to LLM Sampling ☆78 · Updated 6 months ago
- smolLM with Entropix sampler on pytorch ☆150 · Updated 7 months ago
- ☆29 · Updated last month
- an open source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆101 · Updated 3 months ago
- Video+code lecture on building nanoGPT from scratch ☆68 · Updated last year
- A simple, hackable text-to-speech system in PyTorch and MLX ☆164 · Updated 4 months ago
- Speaker Diarization with Transformers ☆67 · Updated 2 weeks ago
- A simple MLX implementation for pretraining LLMs on Apple Silicon. ☆80 · Updated last month
- Collection of autoregressive model implementations ☆85 · Updated last month
- Simple Transformer in Jax ☆137 · Updated last year
- ☆66 · Updated last year
- lossily compress representation vectors using product quantization ☆57 · Updated 2 months ago
- ☆114 · Updated 5 months ago
- Joint speech-language model - respond directly to audio! ☆30 · Updated last year
- ☆16 · Updated 4 months ago
- ☆132 · Updated 10 months ago
- NanoGPT-speedrunning for the poor T4 enjoyers ☆66 · Updated 2 months ago
- MLX port for xjdr's entropix sampler (mimics jax implementation) ☆64 · Updated 7 months ago
- Cerule - A Tiny Mighty Vision Model ☆67 · Updated 9 months ago
- Lightweight package that tracks and summarizes code changes using LLMs (Large Language Models) ☆34 · Updated 3 months ago
- ☆114 · Updated 6 months ago
- look how they massacred my boy ☆63 · Updated 8 months ago
- Run GGML models with Kubernetes. ☆173 · Updated last year
- run paligemma in real time ☆131 · Updated last year
- smol models are fun too ☆93 · Updated 7 months ago
- This repository's goal is to precompile all past presentations of the Huggingface reading group ☆48 · Updated 9 months ago
- ☆152 · Updated 6 months ago
- An easy-to-understand framework for LLM samplers that rewind and revise generated tokens ☆140 · Updated 4 months ago
- Full finetuning of large language models without large memory requirements ☆94 · Updated last year
- Training code for Sparse Autoencoders on Embedding models ☆38 · Updated 3 months ago
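For context on the technique FREUD and the embedding-model SAE repository above are built around, the sketch below shows a minimal sparse autoencoder training loop in PyTorch. It is an illustrative sketch only, not FREUD's implementation: the layer sizes, L1 coefficient, learning rate, and the randomly generated stand-in for captured activations are all assumptions.

```python
# Minimal sparse autoencoder (SAE) training sketch.
# Assumed setup: an overcomplete dictionary trained on pre-extracted activation
# vectors with an L1 sparsity penalty; none of these hyperparameters are FREUD's.
import torch
import torch.nn as nn

class SparseAutoencoder(nn.Module):
    def __init__(self, d_model: int, d_hidden: int):
        super().__init__()
        self.encoder = nn.Linear(d_model, d_hidden)
        self.decoder = nn.Linear(d_hidden, d_model)

    def forward(self, x: torch.Tensor):
        features = torch.relu(self.encoder(x))  # sparse feature activations
        recon = self.decoder(features)          # reconstruction of the input
        return recon, features

def train_step(sae, optimizer, acts, l1_coeff=1e-3):
    recon, features = sae(acts)
    recon_loss = torch.mean((recon - acts) ** 2)      # reconstruction error
    sparsity_loss = l1_coeff * features.abs().mean()  # L1 penalty encourages sparsity
    loss = recon_loss + sparsity_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

if __name__ == "__main__":
    d_model, d_hidden = 768, 768 * 8  # assumed sizes: 8x overcomplete dictionary
    sae = SparseAutoencoder(d_model, d_hidden)
    opt = torch.optim.Adam(sae.parameters(), lr=1e-4)
    # Stand-in for activations captured from an audio model's hidden states.
    acts = torch.randn(4096, d_model)
    for step in range(100):
        batch = acts[torch.randint(0, acts.shape[0], (256,))]
        train_step(sae, opt, batch)
```

In practice the random tensor would be replaced by activations hooked out of the audio model, and the learned feature directions are what the visualization side of such a project inspects.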