goodfire-ai / sdxl-turbo-interpretability
☆45 · Updated 5 months ago
Alternatives and similar repositories for sdxl-turbo-interpretability
Users interested in sdxl-turbo-interpretability are comparing it to the repositories listed below.
- ☆142 · Updated last month
- an open source reproduction of NVIDIA's nGPT (Normalized Transformer with Representation Learning on the Hypersphere) ☆107 · Updated 7 months ago
- Simple Transformer in Jax ☆139 · Updated last year
- look how they massacred my boy ☆63 · Updated last year
- Sparse autoencoders for Contra text embedding models ☆25 · Updated last year
- ☆40 · Updated last year
- DeMo: Decoupled Momentum Optimization ☆194 · Updated 10 months ago
- ☆28 · Updated last year
- Plotting (entropy, varentropy) for small LMs ☆98 · Updated 5 months ago
- Applying SAEs for fine-grained control ☆24 · Updated 10 months ago
- WIP ☆93 · Updated last year
- SIMD quantization kernels ☆89 · Updated last month
- Simple RL gym for vision models in JAX ☆104 · Updated this week
- A toy Inspect implementation of the Bliss Attractor eval from the Claude 4 System Card Welfare Assessment ☆33 · Updated 4 months ago
- Minimal (400 LOC) implementation of maximal (multi-node, FSDP) GPT training ☆132 · Updated last year
- A graph visualization of attention ☆57 · Updated 5 months ago
- Modify Entropy Based Sampling to work with Mac Silicon via MLX ☆49 · Updated 11 months ago
- Code for the Fractured Entangled Representation Hypothesis position paper! ☆203 · Updated 5 months ago
- H-Net Dynamic Hierarchical Architecture ☆80 · Updated last month
- Focused on fast experimentation and simplicity ☆75 · Updated 10 months ago
- ☆24 · Updated 5 months ago
- Training-Ready RL Environments + Evals ☆132 · Updated last week
- σ-GPT: A New Approach to Autoregressive Models ☆68 · Updated last year
- Grokking on modular arithmetic in less than 150 epochs in MLX ☆14 · Updated last year
- The Prime Intellect CLI provides a powerful command-line interface for managing GPU resources across various providers ☆100 · Updated this week
- smolLM with Entropix sampler on pytorch ☆150 · Updated 11 months ago
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆64 · Updated 3 weeks ago
- Approximating the joint distribution of language models via MCTS ☆22 · Updated 11 months ago
- Repository to create traveling waves that integrate spatial information through time ☆55 · Updated 2 months ago
- ☆21 · Updated 9 months ago