omihub777 / MLP-Mixer-CIFAR

PyTorch implementation of Mixer-nano (0.67M parameters, versus 18M for the original Mixer-S/16), reaching 90.83% accuracy on CIFAR-10 when trained from scratch.
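For reference, the core of any MLP-Mixer variant is a block that alternates a token-mixing MLP (applied across patches) with a channel-mixing MLP (applied per patch). The sketch below is a minimal, self-contained PyTorch version of such a block; the dimensions are illustrative and not taken from this repository's actual Mixer-nano configuration.

```python
import torch
import torch.nn as nn

class MixerBlock(nn.Module):
    """One MLP-Mixer block: token-mixing MLP followed by channel-mixing MLP."""
    def __init__(self, num_tokens, dim, token_hidden, channel_hidden):
        super().__init__()
        self.norm1 = nn.LayerNorm(dim)
        # Token mixing: an MLP over the token axis, shared across channels.
        self.token_mlp = nn.Sequential(
            nn.Linear(num_tokens, token_hidden), nn.GELU(),
            nn.Linear(token_hidden, num_tokens),
        )
        self.norm2 = nn.LayerNorm(dim)
        # Channel mixing: an MLP over the channel axis, shared across tokens.
        self.channel_mlp = nn.Sequential(
            nn.Linear(dim, channel_hidden), nn.GELU(),
            nn.Linear(channel_hidden, dim),
        )

    def forward(self, x):
        # x: (batch, tokens, channels)
        y = self.norm1(x).transpose(1, 2)          # (batch, channels, tokens)
        x = x + self.token_mlp(y).transpose(1, 2)  # residual token mixing
        x = x + self.channel_mlp(self.norm2(x))    # residual channel mixing
        return x

# Illustrative sizes only: a 32x32 CIFAR image with 4x4 patches yields 64 tokens.
block = MixerBlock(num_tokens=64, dim=128, token_hidden=64, channel_hidden=512)
out = block(torch.randn(2, 64, 128))
print(tuple(out.shape))  # (2, 64, 128): shape is preserved through the block
```

A full model would stack several such blocks between a patch-embedding layer and a global-average-pooled classifier head.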
