Implementation of Axial attention - attending to multi-dimensional data efficiently
☆396 · Updated Aug 26, 2021
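To illustrate the idea behind the repository, here is a minimal, self-contained PyTorch sketch of 2-D axial attention (this is an independent illustration, not the `axial_attention` package's own API): instead of full self-attention over all H×W positions, attention runs along each axis separately, reducing cost from O((HW)²) to O(HW·(H+W)). The `AxialAttention2D` class name and its structure are this sketch's own assumptions.

```python
import torch
import torch.nn as nn

class AxialAttention2D(nn.Module):
    """Sketch of axial attention: attend along the width axis, then the
    height axis, each time treating the other axis as part of the batch."""
    def __init__(self, dim, heads=4):
        super().__init__()
        self.row_attn = nn.MultiheadAttention(dim, heads, batch_first=True)
        self.col_attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        # x: (batch, height, width, dim)
        b, h, w, d = x.shape
        # width-axis attention: every row is an independent length-w sequence
        rows = x.reshape(b * h, w, d)
        rows, _ = self.row_attn(rows, rows, rows)
        x = rows.reshape(b, h, w, d)
        # height-axis attention: every column is an independent length-h sequence
        cols = x.permute(0, 2, 1, 3).reshape(b * w, h, d)
        cols, _ = self.col_attn(cols, cols, cols)
        return cols.reshape(b, w, h, d).permute(0, 2, 1, 3)

x = torch.randn(2, 8, 8, 32)
out = AxialAttention2D(dim=32, heads=4)(x)
print(out.shape)  # torch.Size([2, 8, 8, 32])
```

Because each axial pass only mixes information along one dimension, stacking both passes (or summing their outputs, as some implementations do) is what lets every position eventually attend to every other.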
Alternatives and similar repositories for axial-attention
Users interested in axial-attention are comparing it to the libraries listed below.
- This is a PyTorch re-implementation of Axial-DeepLab (ECCV 2020 Spotlight) ☆460 · Updated Jun 22, 2021
- Implementation of Invariant Point Attention, used for coordinate refinement in the structure module of Alphafold2, as a standalone Pytorch module ☆171 · Updated Nov 25, 2022
- A simple Transformer where the softmax has been replaced with normalization ☆20 · Updated Sep 11, 2020
- Axial Positional Embedding for Pytorch ☆84 · Updated Feb 25, 2025
- Implementation of Token Shift GPT - An autoregressive model that solely relies on shifting the sequence space for mixing ☆49 · Updated Jan 27, 2022
- Official Code for ECCV 2022: Learning Semantic Correspondence with Sparse Annotations ☆18 · Updated Aug 22, 2022
- Implementation of some personal helper functions for Einops, my most favorite tensor manipulation library ❤️ ☆57 · Updated Jan 5, 2023
- A better PyTorch implementation of image local attention which reduces the GPU memory by an order of magnitude ☆141 · Updated Dec 21, 2021
- PyTorch Codes for Haar Graph Pooling ☆11 · Updated Feb 16, 2023
- CCNet: Criss-Cross Attention for Semantic Segmentation (TPAMI 2020 & ICCV 2019) ☆1,484 · Updated Mar 19, 2021
- Implementation of Uformer, Attention-based Unet, in Pytorch ☆96 · Updated Oct 26, 2021
- Locally Enhanced Self-Attention: Rethinking Self-Attention as Local and Context Terms ☆11 · Updated Nov 29, 2021
- Another attempt at a long-context / efficient transformer by me