LukasHedegaard / continual-transformers

Official PyTorch implementation of "Continual Transformers: Redundancy-Free Attention for Online Inference" [ICLR 2023]
☆ 28 · Updated last year

Alternatives and similar repositories for continual-transformers:

Users interested in continual-transformers are comparing it to the libraries listed below.