songlab-cal / factored-attention
This repository contains code for reproducing results in our paper Interpreting Potts and Transformer Protein Models Through the Lens of Simplified Attention.
☆58 · Updated 3 years ago
Alternatives and similar repositories for factored-attention
Users interested in factored-attention are comparing it to the repositories listed below.
- ☆34 · Updated 5 years ago
- Implementation of trRosetta and trDesign for PyTorch, made into a convenient package, for protein structure prediction and design — ☆83 · Updated 4 years ago
- ☆25 · Updated 3 years ago
- PyTorch implementation of trDesign — ☆45 · Updated 4 years ago
- Replication attempt for the protein folding model described in https://www.biorxiv.org/content/10.1101/2021.08.02.454840v1 — ☆37 · Updated 3 years ago
- PyTorch reimplementation of Molecule Attention Transformer, which uses a transformer to tackle the graph-like structure of molecules — ☆58 · Updated 4 years ago
- RITA is a family of autoregressive protein models, developed by LightOn in collaboration with the OATML group at Oxford and the Debora Ma… — ☆98 · Updated 2 years ago
- ☆34 · Updated 6 months ago
- Implementation and replication of ProGen, Language Modeling for Protein Generation, in JAX — ☆112 · Updated 4 years ago
- Predicting protein structure through sequence modeling — ☆112 · Updated 5 years ago
- Energy-based models for atomic-resolution protein conformations