Gsunshine / Enjoy-Hamburger
[ICLR 2021 top 3%] Is Attention Better Than Matrix Decomposition?
☆333 · Updated 2 years ago
Alternatives and similar repositories for Enjoy-Hamburger
Users interested in Enjoy-Hamburger are comparing it to the libraries listed below.
- ☆192 · Updated 2 years ago
- Two simple and effective vision transformer designs, on par with the Swin Transformer ☆605 · Updated 2 years ago
- [ICLR'22 Oral] Implementation of "CycleMLP: A MLP-like Architecture for Dense Prediction" ☆289 · Updated 3 years ago
- An implementation of the efficient attention module. ☆319 · Updated 4 years ago
- Official code for the paper "On the Connection between Local Attention and Dynamic Depth-wise Convolution" (ICLR 2022 Spotlight) ☆186 · Updated 2 years ago
- Official MegEngine implementation of RepLKNet ☆277 · Updated 3 years ago
- [NeurIPS 2021] [T-PAMI] Global Filter Networks for Image Classification ☆482 · Updated 2 years ago
- ConvMAE: Masked Convolution Meets Masked Autoencoders ☆507 · Updated 2 years ago
- Dense Contrastive Learning (DenseCL) for self-supervised representation learning, CVPR 2021 Oral. ☆559 · Updated last year
- ☆216 · Updated 3 years ago
- This is an official implementation for "ResT: An Efficient Transformer for Visual Recognition". ☆287 · Updated 2 years ago
- Unofficial implementation of MLP-Mixer, gMLP, resMLP, Vision Permutator, S2MLP, S2MLPv2, RaftMLP, HireMLP, ConvMLP, AS-MLP, SparseMLP, Co… ☆170 · Updated 3 years ago
- ☆200 · Updated 11 months ago
- Reproduction of semantic segmentation using masked autoencoder (MAE) ☆164 · Updated 3 years ago
- [NeurIPS 2022] HorNet: Efficient High-Order Spatial Interactions with Recursive Gated Convolutions ☆338 · Updated last year
- Unofficial implementation of MLP-Mixer: An all-MLP Architecture for Vision ☆218 · Updated 4 years ago
- Implementation of the 😇 attention layer from the paper "Scaling Local Self-Attention for Parameter Efficient Visual Backbones" ☆199 · Updated 4 years ago
- [ICLR'22] Official implementation of "AS-MLP: An Axial Shifted MLP Architecture for Vision".