lucidrains / compositional-attention-pytorch

Implementation of Compositional Attention from MILA, a multi-head attention variant reframed as a two-step attention process with disentangled search and retrieval head aggregation, in PyTorch
50 stars · Updated 2 years ago
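
To make the two-step idea concrete, here is a minimal, self-contained sketch of a search/retrieval attention layer written directly in PyTorch. The class and argument names (`CompositionalAttentionSketch`, `num_searches`, `num_retrievals`, and so on) are illustrative assumptions and are not the package's actual API; see the repository's README for the real usage.

```python
# Sketch of compositional attention: S search heads produce attention
# patterns, R retrieval heads produce value projections, and a second
# soft attention lets each search head pick which retrieval to use.
# All names here are illustrative, not the library's API.

import torch
import torch.nn.functional as F
from torch import nn


class CompositionalAttentionSketch(nn.Module):
    def __init__(self, dim, dim_head=64, num_searches=8, num_retrievals=2):
        super().__init__()
        self.scale = dim_head ** -0.5
        self.num_searches = num_searches
        self.num_retrievals = num_retrievals
        self.dim_head = dim_head

        # search step: one query/key projection per search head
        self.to_q = nn.Linear(dim, num_searches * dim_head, bias=False)
        self.to_k = nn.Linear(dim, num_searches * dim_head, bias=False)
        # retrieval step: one value projection per retrieval head
        self.to_v = nn.Linear(dim, num_retrievals * dim_head, bias=False)

        # second attention: each search head scores the retrievals per position
        self.to_retrieval_q = nn.Linear(dim, num_searches * dim_head, bias=False)
        self.to_retrieval_k = nn.Linear(dim_head, dim_head, bias=False)

        self.to_out = nn.Linear(num_searches * dim_head, dim, bias=False)

    def forward(self, x):
        b, n, _ = x.shape
        s, r, h = self.num_searches, self.num_retrievals, self.dim_head

        q = self.to_q(x).view(b, n, s, h).transpose(1, 2)   # (b, s, n, h)
        k = self.to_k(x).view(b, n, s, h).transpose(1, 2)   # (b, s, n, h)
        v = self.to_v(x).view(b, n, r, h).transpose(1, 2)   # (b, r, n, h)

        # 1) search: one attention matrix per search head
        attn = (q @ k.transpose(-1, -2) * self.scale).softmax(dim=-1)        # (b, s, n, n)

        # 2) retrieve: apply every search pattern to every retrieval's values
        retrieved = torch.einsum('bsij,brjh->bsrih', attn, v)                # (b, s, r, n, h)

        # soft selection over retrievals, per search head and query position
        rq = self.to_retrieval_q(x).view(b, n, s, h).transpose(1, 2)         # (b, s, n, h)
        rk = self.to_retrieval_k(retrieved)                                  # (b, s, r, n, h)
        sel = torch.einsum('bsih,bsrih->bsri', rq, rk) * self.scale
        sel = sel.softmax(dim=2)                                             # softmax over retrievals

        out = torch.einsum('bsri,bsrih->bsih', sel, retrieved)               # (b, s, n, h)
        out = out.transpose(1, 2).reshape(b, n, s * h)
        return self.to_out(out)


# usage of the sketch
attn = CompositionalAttentionSketch(dim=512)
tokens = torch.randn(2, 128, 512)
out = attn(tokens)  # (2, 128, 512)
```

The point of the disentanglement is that the number of search patterns and the number of value retrievals can be set independently, instead of being tied together one-to-one as in standard multi-head attention.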

Alternatives and similar repositories for compositional-attention-pytorch:

Users interested in compositional-attention-pytorch are comparing it to the libraries listed below.