lucidrains / compositional-attention-pytorch

Implementation of "compositional attention" from Mila, a multi-head attention variant reframed as a two-step attention process with disentangled search and retrieval head aggregation, in PyTorch.
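The description above summarizes the two-step mechanism. Below is a minimal PyTorch sketch of the idea: a set of search heads produce attention patterns, a separate set of retrieval heads produce values, and a second soft attention decides, per search head and position, which retrieval to compose with. The class and parameter names (`CompositionalAttentionSketch`, `num_searches`, `num_retrievals`) are illustrative assumptions and do not reflect the repository's actual API.

```python
# Minimal sketch of compositional attention (search + retrieval disentangled).
# Names and layer choices are assumptions for illustration, not the repo's API.
import torch
import torch.nn.functional as F
from torch import nn, einsum

class CompositionalAttentionSketch(nn.Module):
    def __init__(self, dim, dim_head=64, num_searches=8, num_retrievals=4):
        super().__init__()
        self.scale = dim_head ** -0.5
        self.num_searches = num_searches      # S: number of search heads
        self.num_retrievals = num_retrievals  # R: number of retrieval heads

        # step 1 (search): queries / keys define S attention patterns
        self.to_searches_qk = nn.Linear(dim, 2 * num_searches * dim_head, bias=False)
        # step 2 (retrieval): values define R retrievals, shared across all searches
        self.to_retrievals_v = nn.Linear(dim, num_retrievals * dim_head, bias=False)
        # a second attention selects, per search head, which retrieval to use
        self.to_retrieval_query = nn.Linear(dim, num_searches * dim_head, bias=False)
        self.retrieval_key = nn.Linear(dim_head, dim_head, bias=False)

        self.to_out = nn.Linear(num_searches * dim_head, dim, bias=False)

    def forward(self, x):
        b, n, _ = x.shape
        s, r = self.num_searches, self.num_retrievals

        q, k = self.to_searches_qk(x).chunk(2, dim=-1)
        q = q.reshape(b, n, s, -1).transpose(1, 2)                         # (b, s, n, d)
        k = k.reshape(b, n, s, -1).transpose(1, 2)                         # (b, s, n, d)
        v = self.to_retrievals_v(x).reshape(b, n, r, -1).transpose(1, 2)   # (b, r, n, d)

        # search: S independent attention maps over the sequence
        attn = (einsum('b s i d, b s j d -> b s i j', q, k) * self.scale).softmax(dim=-1)

        # every search pattern gathers every retrieval's values -> (b, s, r, n, d)
        gathered = einsum('b s i j, b r j d -> b s r i d', attn, v)

        # retrieval: soft selection over the R candidate retrievals per search head
        rq = self.to_retrieval_query(x).reshape(b, n, s, -1).transpose(1, 2)  # (b, s, n, d)
        rk = self.retrieval_key(gathered)                                     # (b, s, r, n, d)
        retrieval_attn = (einsum('b s i d, b s r i d -> b s r i', rq, rk) * self.scale).softmax(dim=2)

        out = einsum('b s r i, b s r i d -> b s i d', retrieval_attn, gathered)
        out = out.transpose(1, 2).reshape(b, n, -1)
        return self.to_out(out)

# usage (shapes only, hypothetical hyperparameters)
attn = CompositionalAttentionSketch(dim=512, dim_head=64, num_searches=8, num_retrievals=4)
x = torch.randn(1, 128, 512)
out = attn(x)   # (1, 128, 512)
```

The point of the decomposition is that the number of attention patterns (searches) and the number of value projections (retrievals) no longer have to match, unlike in standard multi-head attention where each head owns exactly one value projection.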

Alternatives and similar repositories for compositional-attention-pytorch

Users interested in compositional-attention-pytorch are comparing it to the libraries listed below.
