lucidrains / memory-efficient-attention-pytorch

Implementation of memory-efficient multi-head attention, as proposed in the paper "Self-attention Does Not Need O(n²) Memory".
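
The core trick of the paper is to process keys and values in chunks while maintaining a running softmax (a running maximum plus a rescaled numerator and denominator), so the full n×n attention matrix is never materialized; the largest score block held at once is only q_chunk × k_chunk. Below is a minimal single-head sketch of that idea in plain PyTorch. The function name `chunked_attention`, its chunk-size arguments, and the tensor shapes are illustrative assumptions, not this repository's actual API.

```python
# A minimal sketch of the chunked-attention idea from
# "Self-attention Does Not Need O(n^2) Memory" (Rabe & Staats, 2021).
# Illustrative only; not the API of this repository.

import torch

def chunked_attention(q, k, v, q_chunk=1024, k_chunk=1024):
    """Single-head attention over (batch, seq_len, dim) tensors,
    computed chunk by chunk so the full score matrix never exists."""
    scale = q.shape[-1] ** -0.5
    out = []

    for qc in q.split(q_chunk, dim=1):
        # running accumulators for this query chunk
        num = torch.zeros_like(qc)                              # weighted sum of values
        den = qc.new_zeros(*qc.shape[:2], 1)                    # softmax denominator
        run_max = qc.new_full((*qc.shape[:2], 1), float('-inf'))

        for kc, vc in zip(k.split(k_chunk, dim=1), v.split(k_chunk, dim=1)):
            scores = torch.einsum('b i d, b j d -> b i j', qc, kc) * scale

            chunk_max = scores.amax(dim=-1, keepdim=True)
            new_max = torch.maximum(run_max, chunk_max)

            # rescale previous accumulators to the new max for numerical stability
            correction = (run_max - new_max).exp()
            weights = (scores - new_max).exp()

            num = num * correction + torch.einsum('b i j, b j d -> b i d', weights, vc)
            den = den * correction + weights.sum(dim=-1, keepdim=True)
            run_max = new_max

        out.append(num / den)

    return torch.cat(out, dim=1)
```

A quick sanity check against naive attention (again, illustrative shapes):

```python
q = k = v = torch.randn(1, 2048, 64)
ref = torch.softmax((q @ k.transpose(-1, -2)) * 64 ** -0.5, dim=-1) @ v
print(torch.allclose(chunked_attention(q, k, v), ref, atol=1e-5))  # True
```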
