pierreablin / ksddescent
Kernel Stein Discrepancy Descent: a method to sample from unnormalized densities
☆22 · Updated last year
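For context, the method named above moves a set of particles by gradient descent on the squared kernel Stein discrepancy (KSD), which requires only the score of the unnormalized target. The sketch below is a minimal 1-D illustration under stated assumptions: a standard Gaussian target (score `s(x) = -x`), an RBF base kernel, and a finite-difference gradient. It is not the repository's API; all function names here are made up for illustration, and the actual package differentiates the loss with autograd rather than finite differences.

```python
import numpy as np

def stein_kernel(x, y, h=1.0):
    """Stein kernel u_p(x, y) for a standard Gaussian target (score s(x) = -x)
    and RBF base kernel k(x, y) = exp(-(x - y)^2 / (2 h^2)), in 1-D."""
    d = x - y
    k = np.exp(-d ** 2 / (2 * h ** 2))
    # u_p = s(x)s(y)k + s(x) dk/dy + s(y) dk/dx + d2k/dxdy, expanded for this k:
    return k * (x * y + (1 - d ** 2) / h ** 2 - d ** 2 / h ** 4)

def ksd2(x, h=1.0):
    """Squared KSD of the empirical measure on particles x (mean over pairs)."""
    X, Y = np.meshgrid(x, x)
    return stein_kernel(X, Y, h).mean()

def ksd_descent(x0, n_steps=300, lr=0.5, h=1.0, eps=1e-5):
    """Plain gradient descent on particle positions; the gradient of KSD^2
    is approximated by central finite differences (sketch only)."""
    x = np.asarray(x0, dtype=float).copy()
    for _ in range(n_steps):
        grad = np.empty_like(x)
        for i in range(x.size):
            xp, xm = x.copy(), x.copy()
            xp[i] += eps
            xm[i] -= eps
            grad[i] = (ksd2(xp, h) - ksd2(xm, h)) / (2 * eps)
        x -= lr * grad
    return x

# Particles initialized far from the target drift toward it as KSD^2 decreases.
rng = np.random.default_rng(0)
x0 = rng.uniform(2.0, 3.0, size=30)
x = ksd_descent(x0)
```

Driving the KSD to zero both attracts particles to high-density regions and spreads them apart (the cross terms of the Stein kernel act as a repulsion), which is why the method can sample rather than merely find a mode.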
Alternatives and similar repositories for ksddescent
Users interested in ksddescent are comparing it to the libraries listed below.
- Riemannian Convex Potential Maps ☆67 · Updated 2 years ago
- Normalizing Flows with a resampled base distribution ☆47 · Updated 3 years ago
- Source code for Large-Scale Wasserstein Gradient Flows (NeurIPS 2021) ☆39 · Updated 3 years ago
- Demos for the paper Generalized Variational Inference (Knoblauch, Jewson & Damoulas, 2019) ☆20 · Updated 6 years ago
- Implementation of Action Matching for the Schrödinger equation ☆25 · Updated 2 years ago
- Library for normalizing flows and neural flows ☆25 · Updated 3 years ago
- Neural likelihood-free methods in PyTorch ☆39 · Updated 5 years ago
- ☆52 · Updated 2 years ago
- Stochastic Normalizing Flows ☆78 · Updated 4 years ago
- Normalizing Flows using JAX ☆85 · Updated 2 years ago
- Code for "'Hey, that's not an ODE:' Faster ODE Adjoints via Seminorms" (ICML 2021) ☆88 · Updated 3 years ago
- Riemannian Optimization Using JAX ☆53 · Updated 2 years ago
- Code for Gaussian Score Matching Variational Inference ☆35 · Updated 9 months ago
- Code for "Infinitely Deep Bayesian Neural Networks with Stochastic Differential Equations" ☆171 · Updated 3 years ago
- Convex potential flows ☆84 · Updated 4 years ago
- Hamiltonian Dynamics with Non-Newtonian Momentum for Rapid Sampling ☆36 · Updated 4 years ago
- ☆48 · Updated 2 years ago
- Differentiable and numerically stable implementation of the matrix exponential ☆33 · Updated 5 years ago
- ☆37 · Updated 5 years ago
- [NeurIPS'19] Deep Equilibrium Models Jax Implementation ☆42 · Updated 5 years ago
- Regularized Neural ODEs (RNODE) ☆83 · Updated 4 years ago
- PyTorch implementation of the OT-Flow approach in arXiv:2006.00104 ☆57 · Updated last year
- Laplace Redux -- Effortless Bayesian Deep Learning ☆44 · Updated 6 months ago
- [NeurIPS 2020] Task-Agnostic Amortized Inference of Gaussian Process Hyperparameters (AHGP) ☆23 · Updated 5 years ago
- Implicit Deep Adaptive Design (iDAD): Policy-Based Experimental Design without Likelihoods ☆21 · Updated 3 years ago
- ☆54 · Updated last year
- Learning the optimal transport map via input convex neural networks ☆42 · Updated 5 years ago
- A set of tests for evaluating large-scale algorithms for Wasserstein-2 transport map computation (NeurIPS 2021) ☆43 · Updated 3 years ago
- Lightweight MCMC sampling for PyTorch models, a.k.a. My Corona Project ☆51 · Updated 4 months ago
- Code for "Neural Conservation Laws: A Divergence-Free Perspective" ☆41 · Updated 3 years ago