omihub777 / MLP-Mixer-CIFAR

PyTorch implementation of Mixer-nano (0.67M parameters, vs. 18M for the original Mixer-S/16), reaching 90.83% accuracy on CIFAR-10 when trained from scratch.
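For reference, a minimal PyTorch sketch of a single MLP-Mixer block of the kind such an implementation stacks; the hyperparameters below (4x4 patches on 32x32 CIFAR images, dim=128, MLP widths) are illustrative assumptions, not the repository's exact configuration:

```python
# Minimal MLP-Mixer block sketch (hyperparameters are illustrative, not this repo's exact config).
import torch
import torch.nn as nn


class MlpBlock(nn.Module):
    """Two-layer MLP with GELU, applied over the last dimension."""
    def __init__(self, dim, hidden_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, hidden_dim),
            nn.GELU(),
            nn.Linear(hidden_dim, dim),
        )

    def forward(self, x):
        return self.net(x)


class MixerBlock(nn.Module):
    """Token-mixing MLP across patches, then channel-mixing MLP across features."""
    def __init__(self, num_patches, dim, token_hidden, channel_hidden):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        self.token_mlp = MlpBlock(num_patches, token_hidden)
        self.norm2 = nn.LayerNorm(dim)
        self.channel_mlp = MlpBlock(dim, channel_hidden)

    def forward(self, x):                       # x: (batch, num_patches, dim)
        y = self.norm1(x).transpose(1, 2)       # (batch, dim, num_patches)
        x = x + self.token_mlp(y).transpose(1, 2)   # mix information across patches
        x = x + self.channel_mlp(self.norm2(x))     # mix information across channels
        return x


if __name__ == "__main__":
    # CIFAR-10-sized example: 32x32 images with 4x4 patches -> 64 patches (assumed patch size).
    block = MixerBlock(num_patches=64, dim=128, token_hidden=64, channel_hidden=512)
    out = block(torch.randn(2, 64, 128))
    print(out.shape)  # torch.Size([2, 64, 128])
```

Stacking several such blocks after a patch-embedding layer, then applying global average pooling and a linear classifier, yields the full Mixer model; shrinking the depth and width is how a "nano" variant gets down to well under 1M parameters.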
