KhoomeiK / complexity-scaling
gzip Predicts Data-dependent Scaling Laws
⭐35 · Updated last year
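For context, the idea behind complexity-scaling is that a dataset's gzip compressibility predicts the parameters of its neural scaling law. A minimal sketch of measuring that compressibility is below; the helper name and example corpus are illustrative, not taken from the repo itself:

```python
import gzip

def gzip_compression_ratio(text: str) -> float:
    """Ratio of compressed size to raw size.

    A lower ratio means the data is more compressible, i.e. less
    syntactically complex under gzip's model of redundancy.
    """
    raw = text.encode("utf-8")
    compressed = gzip.compress(raw)
    return len(compressed) / len(raw)

# A repetitive corpus compresses far better than a varied one,
# so it yields a much lower ratio.
print(gzip_compression_ratio("the cat sat on the mat. " * 200))
print(gzip_compression_ratio("".join(chr(33 + (i * 7919) % 90) for i in range(4800))))
```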
Alternatives and similar repositories for complexity-scaling
Users interested in complexity-scaling are comparing it to the libraries listed below.
- A MAD laboratory to improve AI architecture designs 🧪 ⭐125 · Updated 8 months ago
- ⭐139 · Updated last week
- ⭐61 · Updated last year
- ⭐27 · Updated last year
- Scaling is a distributed training library and installable dependency designed to scale up neural networks, with a dedicated module for tr… ⭐64 · Updated 9 months ago
- ⭐82 · Updated last year
- Understand and test language model architectures on synthetic tasks. ⭐222 · Updated last month
- Code to reproduce "Transformers Can Do Arithmetic with the Right Embeddings", McLeish et al. (NeurIPS 2024) ⭐191 · Updated last year
- The simplest, fastest repository for training/finetuning medium-sized GPTs. ⭐153 · Updated 2 months ago
- Experiments for efforts to train a new and improved t5 ⭐76 · Updated last year
- Some common Huggingface transformers in maximal update parametrization (µP) ⭐82 · Updated 3 years ago
- ⭐53 · Updated last year
- ⭐105 · Updated 6 months ago
- ⭐69 · Updated last year
- Sparse and discrete interpretability tool for neural networks ⭐63 · Updated last year
- Official repository of Pretraining Without Attention (BiGS); BiGS is the first model to achieve BERT-level transfer learning on the GLUE … ⭐114 · Updated last year
- ⭐93 · Updated last year
- Code for reproducing our paper "Not All Language Model Features Are Linear" ⭐77 · Updated 9 months ago
- ⭐22 · Updated last year
- ⭐21 · Updated 9 months ago
- Large-scale 4D parallelism pre-training for 🤗 transformers in Mixture of Experts *(still work in progress)* ⭐87 · Updated last year
- Simple Transformer in Jax ⭐139 · Updated last year
- Simplex Random Feature attention, in PyTorch ⭐74 · Updated last year
- ⭐56 · Updated 10 months ago
- nanoGPT-like codebase for LLM training ⭐102 · Updated 3 months ago
- ⭐101 · Updated last month
- Extract full next-token probabilities via language model APIs ⭐247 · Updated last year
- ⭐38 · Updated last year
- ⭐87 · Updated last year
- Yet another random morning idea to be quickly tried and architecture shared if it works; to allow the transformer to pause for any amount… ⭐53 · Updated last year