KhoomeiK / complexity-scaling
gzip Predicts Data-dependent Scaling Laws
☆33 · Updated 8 months ago
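The repository's premise is that the gzip compressibility of a training corpus predicts how its scaling behavior differs across datasets. As a rough illustration only (not the repository's own code), here is a minimal sketch of a gzip compression-ratio metric; the helper name and the plain compressed-size/raw-size ratio are assumptions for demonstration purposes.

```python
# Minimal sketch: estimate dataset "complexity" as a gzip compression ratio.
# The exact metric used in complexity-scaling may differ; this only
# illustrates the general idea of gzip-based data complexity.
import gzip

def gzip_compression_ratio(texts, sample_bytes=1_000_000):
    """Return compressed_size / raw_size for a sample of the corpus.

    Values near 1.0 indicate high-entropy (hard-to-compress) data;
    lower values indicate more redundant, easier-to-model data."""
    raw = "\n".join(texts).encode("utf-8")[:sample_bytes]
    compressed = gzip.compress(raw)
    return len(compressed) / len(raw)

if __name__ == "__main__":
    corpus = ["the cat sat on the mat"] * 1000  # highly redundant toy corpus
    print(f"compression ratio: {gzip_compression_ratio(corpus):.3f}")
```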
Alternatives and similar repositories for complexity-scaling:
Users interested in complexity-scaling are comparing it to the repositories listed below.
- Sparse and discrete interpretability tool for neural networks ☆59 · Updated 11 months ago
- Evaluation of neuro-symbolic engines ☆34 · Updated 5 months ago
- ☆27 · Updated 6 months ago
- ☆118 · Updated last week
- Experiments for efforts to train a new and improved t5 ☆77 · Updated 9 months ago
- Understanding how features learned by neural networks evolve throughout training ☆32 · Updated 3 months ago
- Code for minimum-entropy coupling. ☆31 · Updated 7 months ago
- ☆22 · Updated last year
- Minimum Description Length probing for neural network representations ☆18 · Updated this week
- ☆60 · Updated last year
- ☆37 · Updated 6 months ago
- A MAD laboratory to improve AI architecture designs 🧪 ☆102 · Updated last month
- Functional Benchmarks and the Reasoning Gap ☆82 · Updated 3 months ago
- ☆25 · Updated 9 months ago
- ☆20 · Updated 9 months ago
- ☆81 · Updated 3 months ago
- ☆26 · Updated last year
- ☆50 · Updated 5 months ago
- Q-Probe: A Lightweight Approach to Reward Maximization for Language Models ☆40 · Updated 7 months ago
- ☆53 · Updated last year
- Code for reproducing our paper "Not All Language Model Features Are Linear" ☆66 · Updated 2 months ago
- ☆70 · Updated 5 months ago
- An unofficial implementation of the Infini-gram model proposed by Liu et al. (2024) ☆29 · Updated 7 months ago
- The repository contains code for Adaptive Data Optimization ☆20 · Updated last month
- Transformer with Mu-Parameterization, implemented in Jax/Flax. Supports FSDP on TPU pods. ☆30 · Updated last month
- ☆55 · Updated this week
- Proof-of-concept of global switching between numpy/jax/pytorch in a library. ☆18 · Updated 7 months ago
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ☆53 · Updated 3 months ago
- A mechanistic approach for understanding and detecting factual errors of large language models. ☆39 · Updated 6 months ago
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ☆91 · Updated 2 months ago