fishmoon1234 / Nonlocal-Attention-Operator
Attention mechanism-based neural operator models to solve both forward and inverse problems.
☆16 · Updated 6 months ago
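For orientation, below is a minimal, generic sketch (in PyTorch) of what an attention-based operator layer can look like: self-attention mixes information nonlocally across the sampled points of an input function before projecting back to an output function. This is an illustrative, assumption-based sketch, not the Nonlocal-Attention-Operator implementation; the class names, layer widths, and coordinate-concatenation choice are all hypothetical.

```python
# Hedged sketch only: a generic attention-based neural operator in PyTorch.
# It is NOT the fishmoon1234/Nonlocal-Attention-Operator code; names and sizes are illustrative.
import torch
import torch.nn as nn


class AttentionOperatorLayer(nn.Module):
    """Self-attention block applied across the spatial discretization of a function."""

    def __init__(self, width: int = 64, heads: int = 4):
        super().__init__()
        self.attn = nn.MultiheadAttention(width, heads, batch_first=True)
        self.mlp = nn.Sequential(nn.Linear(width, width), nn.GELU(), nn.Linear(width, width))
        self.norm1 = nn.LayerNorm(width)
        self.norm2 = nn.LayerNorm(width)

    def forward(self, v: torch.Tensor) -> torch.Tensor:
        # v: (batch, n_points, width) -- lifted representation of u(x) at n_points samples
        h = self.norm1(v)
        v = v + self.attn(h, h, h, need_weights=False)[0]  # nonlocal mixing across sample points
        return v + self.mlp(self.norm2(v))


class SimpleAttentionOperator(nn.Module):
    """Maps a sampled input function u(x) to a sampled output function s(x)."""

    def __init__(self, in_channels: int = 1, out_channels: int = 1, width: int = 64, depth: int = 2):
        super().__init__()
        # +1 input channel for the spatial coordinate x appended to u(x)
        self.lift = nn.Linear(in_channels + 1, width)
        self.layers = nn.ModuleList(AttentionOperatorLayer(width) for _ in range(depth))
        self.project = nn.Linear(width, out_channels)

    def forward(self, u: torch.Tensor, x: torch.Tensor) -> torch.Tensor:
        # u: (batch, n_points, in_channels), x: (batch, n_points, 1)
        v = self.lift(torch.cat([u, x], dim=-1))
        for layer in self.layers:
            v = layer(v)
        return self.project(v)


if __name__ == "__main__":
    u = torch.randn(8, 128, 1)                                       # 8 input functions on 128 points
    x = torch.linspace(0, 1, 128).view(1, 128, 1).expand(8, -1, -1)  # shared 1D grid
    print(SimpleAttentionOperator()(u, x).shape)                     # torch.Size([8, 128, 1])
```

The design point this sketch illustrates is that attention acts along the discretization axis, so the same weights can in principle be applied to inputs sampled at different resolutions.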
Alternatives and similar repositories for Nonlocal-Attention-Operator
Users interested in Nonlocal-Attention-Operator are comparing it to the libraries listed below.
- Simple demo of implementing data-driven and physics-informed DeepONets in PyTorch ☆19 · Updated last year
- PyTorch implementation of the Position-induced Transformer for operator learning in partial differential equations ☆25 · Updated 6 months ago
- DON-LSTM: Multi-Resolution Learning with DeepONets and Long Short-Term Memory Neural Networks ☆11 · Updated 3 months ago
- Separable Physics-Informed DeepONets in JAX ☆16 · Updated last year
- ☆12 · Updated last week
- Code for the Mesh Transformer described in the EAGLE dataset ☆42 · Updated 9 months ago
- ☆11 · Updated 11 months ago
- ☆47 · Updated 9 months ago
- ☆29 · Updated 3 years ago
- Code for "Beyond Regular Grids: Fourier-Based Neural Operators on Arbitrary Domains"☆24Updated last year
- Official Code for ICML 2024 paper "TENG: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets Toward Machine Precision"☆14Updated last year
- ☆44Updated 3 years ago
- ☆37Updated 5 months ago
- library for querying the Johns Hopkins Turbulence Database (JHTDB)☆16Updated 3 weeks ago
- ☆18Updated last year
- ☆10Updated 2 years ago
- Learning two-phase microstructure evolution using neural operators and autoencoder architectures☆25Updated last year
- PDE Preserved Neural Network☆58Updated 6 months ago
- GCA-ROM is a library that implements a graph convolutional autoencoder architecture as a nonlinear model order reduction strategy. ☆36 · Updated last month
- Tackling the Curse of Dimensionality with Physics-Informed Neural Networks ☆15 · Updated last year
- Source code of "Learning nonlinear operators in latent spaces for real-time predictions of complex dynamics in physical systems." ☆75 · Updated 7 months ago
- ☆16 · Updated last year
- Reduced-Order Modeling of Fluid Flows with Transformers ☆24 · Updated 2 years ago
- Reliable extrapolation of deep neural operators informed by physics or sparse observations ☆28 · Updated 2 years ago
- ☆13 · Updated 2 years ago
- This repository contains the code used to generate the large-scale results in the HINTS paper. ☆34 · Updated last year
- This repository contains code for two-step training of DeepONets. ☆14 · Updated last year
- ☆45 · Updated this week
- ☆54 · Updated 3 years ago
- Code for "Robust flow field reconstruction from limited measurements vis sparse representation" (J. Callaham, K. Maeda, and S. Brunton 20…☆14Updated 7 years ago