omihub777 / MLP-Mixer-CIFAR
PyTorch implementation of Mixer-nano (0.67M parameters, versus 18M for the original Mixer-S/16), trained from scratch to 90.83% accuracy on CIFAR-10.
☆36 · Updated Nov 6, 2021
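For context on the architecture behind the repository, below is a minimal sketch of one MLP-Mixer block in PyTorch, showing the token-mixing and channel-mixing MLPs that the Mixer family alternates. The hyperparameters (64 tokens, hidden dim 128, MLP widths) are illustrative assumptions, not the repository's actual Mixer-nano configuration.

```python
import torch
import torch.nn as nn

class MlpBlock(nn.Module):
    """Two-layer MLP with GELU, applied along the last dimension."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, dim),
        )

    def forward(self, x):
        return self.net(x)

class MixerBlock(nn.Module):
    """One Mixer layer: token-mixing MLP across patches, then channel-mixing MLP."""
    def __init__(self, n_tokens, hidden_dim, tokens_mlp_dim, channels_mlp_dim):
        super().__init__()
        self.norm1 = nn.LayerNorm(hidden_dim)
        self.token_mlp = MlpBlock(n_tokens, tokens_mlp_dim)        # mixes information across patches
        self.norm2 = nn.LayerNorm(hidden_dim)
        self.channel_mlp = MlpBlock(hidden_dim, channels_mlp_dim)  # mixes information across channels

    def forward(self, x):                      # x: (batch, n_tokens, hidden_dim)
        y = self.norm1(x).transpose(1, 2)      # (batch, hidden_dim, n_tokens)
        x = x + self.token_mlp(y).transpose(1, 2)
        x = x + self.channel_mlp(self.norm2(x))
        return x

# Example: a 32x32 CIFAR-10 image split into 4x4 patches gives an 8x8 grid = 64 tokens.
block = MixerBlock(n_tokens=64, hidden_dim=128, tokens_mlp_dim=64, channels_mlp_dim=512)
out = block(torch.randn(2, 64, 128))           # -> torch.Size([2, 64, 128])
```

A full model stacks several such blocks on top of a patch-embedding convolution and finishes with global average pooling and a linear classifier; the small parameter count quoted above comes from shrinking the hidden and MLP dimensions relative to Mixer-S/16.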
Alternatives and similar repositories for MLP-Mixer-CIFAR
Users interested in MLP-Mixer-CIFAR are comparing it to the repositories listed below.
- PyTorch reimplementation of the Mixer (MLP-Mixer: An all-MLP Architecture for Vision)