JohnGiorgi / compact-multi-head-self-attention-pytorch

A PyTorch implementation of the Compact Multi-Head Self-Attention Mechanism from the paper: "Low Rank Factorization for Compact Multi-Head Self-Attention"
24 stars · Updated 5 years ago
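For context on the technique the repository implements, below is a minimal, hypothetical PyTorch sketch of multi-head self-attention whose per-head score matrices are built from low-rank factors rather than full projections. It is not the repository's actual code or the paper's exact formulation; the class name, the `rank` parameter, and the factor matrices `U`/`V` are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class LowRankMultiHeadSelfAttention(nn.Module):
    """Sketch only: each head scores token pairs with a rank-r bilinear form
    q^T (U_h V_h^T) k instead of a full d x d projection (assumed, not the
    repo's implementation)."""

    def __init__(self, embed_dim: int, num_heads: int, rank: int):
        super().__init__()
        self.rank = rank
        # Low-rank factors per head: (num_heads, embed_dim, rank).
        self.u = nn.Parameter(torch.randn(num_heads, embed_dim, rank) * embed_dim ** -0.5)
        self.v = nn.Parameter(torch.randn(num_heads, embed_dim, rank) * embed_dim ** -0.5)
        self.out = nn.Linear(num_heads * embed_dim, embed_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, embed_dim)
        q = torch.einsum("bte,her->bhtr", x, self.u)  # (batch, heads, seq, rank)
        k = torch.einsum("bte,her->bhtr", x, self.v)  # (batch, heads, seq, rank)
        # Attention logits from the low-rank factors, scaled by sqrt(rank).
        scores = torch.einsum("bhqr,bhkr->bhqk", q, k) / self.rank ** 0.5
        attn = F.softmax(scores, dim=-1)              # (batch, heads, seq, seq)
        # Each head attends over the original token embeddings.
        heads = torch.einsum("bhqk,bke->bhqe", attn, x)
        heads = heads.permute(0, 2, 1, 3).reshape(x.size(0), x.size(1), -1)
        return self.out(heads)


if __name__ == "__main__":
    layer = LowRankMultiHeadSelfAttention(embed_dim=64, num_heads=4, rank=8)
    tokens = torch.randn(2, 10, 64)       # (batch, seq_len, embed_dim)
    print(layer(tokens).shape)            # torch.Size([2, 10, 64])
```

With rank much smaller than the embedding dimension, the per-head parameter count drops from O(d^2) to O(d·r), which is the general motivation behind compact, low-rank attention variants.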

Alternatives and similar repositories for compact-multi-head-self-attention-pytorch:

Users interested in compact-multi-head-self-attention-pytorch are comparing it to the libraries listed below.