sap-ient-ai / FFF
FastFeedForward Networks
☆20 · Updated last year
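For context, the fast-feedforward design replaces a transformer's dense feedforward block with a binary tree of learned routers, so each token activates only one small leaf FFN and inference cost grows logarithmically rather than linearly with layer width. Below is a minimal PyTorch sketch of that idea, assuming the construction from the fast-feedforward paper (Belcak & Wattenhofer, 2023); the class and parameter names are illustrative, not this repository's API.

```python
# Minimal sketch of the fast-feedforward idea (illustrative names, not this
# repo's API): a depth-d binary tree of scalar routers picks one of 2^d small
# leaf FFNs per token, so inference touches only O(log width) parameters.
import torch
import torch.nn as nn

class FFFSketch(nn.Module):
    def __init__(self, dim: int, leaf_hidden: int, depth: int):
        super().__init__()
        self.depth = depth
        self.n_leaves = 2 ** depth
        # One scalar gate per internal tree node, laid out as a heap.
        self.routers = nn.Linear(dim, 2 ** depth - 1)
        self.leaves = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, leaf_hidden), nn.GELU(),
                          nn.Linear(leaf_hidden, dim))
            for _ in range(self.n_leaves)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Hard routing (inference style): at each node the sign of its gate
        # decides left vs. right; heap children of node i are 2i+1 and 2i+2.
        logits = self.routers(x)                               # (..., nodes)
        node = torch.zeros(x.shape[:-1], dtype=torch.long, device=x.device)
        for _ in range(self.depth):
            gate = logits.gather(-1, node.unsqueeze(-1)).squeeze(-1)
            node = 2 * node + 1 + (gate > 0).long()
        leaf = node - (2 ** self.depth - 1)                    # heap -> leaf id
        out = torch.zeros_like(x)
        for i, ffn in enumerate(self.leaves):                  # apply chosen leaf
            mask = leaf == i
            if mask.any():
                out[mask] = ffn(x[mask])
        return out

# e.g. FFFSketch(dim=64, leaf_hidden=128, depth=3)(torch.randn(2, 8, 64))
```

Note that the paper trains with a soft mixture over leaves and hardens the routing only at inference; the sketch above shows just the hard path, which is where the logarithmic speedup comes from.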
Alternatives and similar repositories for FFF:
Users interested in FFF are comparing it to the libraries listed below.
- ☆31 · Updated last year
- Official repository for the paper "Approximating Two-Layer Feedforward Networks for Efficient Transformers" · ☆37 · Updated last year
- ☆49 · Updated last year
- Jax-like function transformation engine, but micro: microjax · ☆30 · Updated 6 months ago
- ☆54 · Updated 7 months ago
- ☆79 · Updated last year
- ☆53 · Updated last year
- ☆52 · Updated 6 months ago
- ☆27 · Updated 9 months ago
- Implementation of Spectral State Space Models · ☆16 · Updated last year
- Collection of autoregressive model implementations · ☆85 · Updated 2 months ago
- Implementation of GateLoop Transformer in Pytorch and Jax · ☆87 · Updated 10 months ago
- RWKV-7: Surpassing GPT · ☆83 · Updated 5 months ago
- GoldFinch and other hybrid transformer components · ☆45 · Updated 9 months ago
- Demo of the unit_scaling library, showing how a model can be easily adapted to train in FP8. · ☆45 · Updated 9 months ago
- Memory Mosaics are networks of associative memories working in concert to achieve a prediction task. · ☆40 · Updated 2 months ago
- ☆27 · Updated last year
- ☆94 · Updated 3 months ago
- ☆43 · Updated last year
- Train and evaluate 1.58-bit neural networks · ☆25 · Updated 10 months ago
- Token Omission Via Attention · ☆126 · Updated 6 months ago
- ☆13 · Updated last month
- Latent Large Language Models · ☆17 · Updated 8 months ago
- Here we will test various linear attention designs. · ☆60 · Updated last year
- Research implementation of Native Sparse Attention (arXiv:2502.11089) · ☆53 · Updated 2 months ago
- ☆33 · Updated 10 months ago
- Understanding how features learned by neural networks evolve throughout training · ☆34 · Updated 6 months ago
- train with kittens! · ☆57 · Updated 6 months ago
- Experiments on the impact of depth in transformers and SSMs. · ☆25 · Updated 5 months ago
- Code for the examples presented in the talk "Training a Llama in your backyard: fine-tuning very large models on consumer hardware" given… · ☆14 · Updated last year