NVIDIA / cuEquivariance
cuEquivariance is a math library providing low-level primitives and tensor operations that accelerate widely used equivariant neural network models such as DiffDock, MACE, Allegro, and NequIP. It also includes kernels for accelerated structure prediction.
☆308 · Updated this week
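As a conceptual illustration (not the cuEquivariance API, whose names and signatures are not shown here), the "equivariant tensor product" such libraries accelerate can be sketched for the simplest case: combining two l=1 vectors, where the l=0 output channel is the dot product and the l=1 channel is the cross product, both of which commute with rotations.

```python
# Illustrative sketch only: a minimal SO(3)-equivariant tensor product
# for two l=1 inputs, to show the property these libraries accelerate.
import numpy as np

def tensor_product_l0_l1(a, b):
    """l=1 (x) l=1 -> l=0 (scalar) and l=1 (vector) output channels."""
    return np.dot(a, b), np.cross(a, b)

def random_rotation(rng):
    """Sample a proper rotation matrix (orthogonal, det = +1) via QR."""
    q, r = np.linalg.qr(rng.normal(size=(3, 3)))
    q *= np.sign(np.diag(r))   # fix the sign ambiguity of the QR factors
    if np.linalg.det(q) < 0:   # force det = +1 (rotation, not reflection)
        q[:, 0] *= -1
    return q

rng = np.random.default_rng(0)
a, b = rng.normal(size=3), rng.normal(size=3)
R = random_rotation(rng)

s, v = tensor_product_l0_l1(a, b)
s_rot, v_rot = tensor_product_l0_l1(R @ a, R @ b)

assert np.isclose(s, s_rot)       # scalar channel: invariant under rotation
assert np.allclose(R @ v, v_rot)  # vector channel: rotates with the inputs
```

In production libraries this decomposition is generalized to arbitrary irreducible representations via Clebsch-Gordan coefficients and fused into GPU kernels; the sketch above only demonstrates the equivariance property itself.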
Alternatives and similar repositories for cuEquivariance
Users interested in cuEquivariance are comparing it to the libraries listed below.
- Official implementation of All Atom Diffusion Transformers (ICML 2025) ☆261 · Updated last month
- Torch-native, batchable, atomistic simulations ☆291 · Updated this week
- [ICLR 2024] EquiformerV2: Improved Equivariant Transformer for Scaling to Higher-Degree Representations ☆292 · Updated 7 months ago
- Training neural network potentials ☆439 · Updated 3 weeks ago
- A collection of QM data for training potential functions ☆183 · Updated 7 months ago
- JAX library for E(3)-equivariant neural networks ☆216 · Updated last month
- Build neural networks for machine learning force fields with JAX ☆125 · Updated 4 months ago
- Neural network force field based on PyTorch ☆281 · Updated last month
- Boltzmann Generators and Normalizing Flows in PyTorch ☆175 · Updated last year
- [ICLR 2024 Spotlight] Official implementation of "Enabling Efficient Equivariant Operations in the Fourier Basis via Gaunt Tensor Product…" ☆66 · Updated 11 months ago
- [ICLR 2025] GotenNet: Rethinking Efficient 3D Equivariant Graph Neural Networks ☆47 · Updated 2 months ago
- OpenEquivariance: a fast, open-source GPU JIT kernel generator for the Clebsch-Gordan tensor product ☆86 · Updated last month
- Allegro: an open-source code for building highly scalable and accurate equivariant deep learning interatomic potentials ☆432 · Updated last month
- OpenMM plugin to define forces with neural networks ☆206 · Updated 7 months ago
- A plugin for using NVIDIA GPUs with the PySCF package ☆221 · Updated last week
- SO3krates and a universal pairwise force field for molecular simulation ☆132 · Updated this week
- [ICLR 2023 Spotlight] Equiformer: Equivariant Graph Attention Transformer for 3D Atomistic Graphs ☆253 · Updated 7 months ago
- E3x: a JAX library for constructing efficient E(3)-equivariant deep learning architectures, built on top of Flax ☆108 · Updated 6 months ago
- Implementation of the Euclidean fast attention (EFA) algorithm ☆55 · Updated 3 weeks ago
- PyTorch differentiable molecular dynamics ☆180 · Updated 3 years ago
- ORB forcefield models from Orbital Materials