thoglu / jammy_flows
A package to describe amortized (conditional) normalizing-flow PDFs defined jointly on tensor products of manifolds with coverage control. The connection between different manifolds is fixed via an autoregressive structure.
☆43 · Updated 3 weeks ago
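The joint PDF factorizes over the individual manifold factors, and the autoregressive structure means each later factor is conditioned on the samples of the earlier ones, in addition to the amortizing (conditional) input. Below is a minimal conceptual sketch of that idea, not the jammy_flows API: the class and parameter names are hypothetical, and plain Gaussian / von Mises factors stand in for the package's trainable normalizing-flow layers on each manifold.

```python
# Conceptual sketch only -- NOT the jammy_flows API. Class and parameter
# names are hypothetical; simple Gaussian / von Mises factors stand in for
# the package's trainable normalizing-flow layers on each manifold.
import torch
import torch.nn as nn


class ToyAutoregressiveJointPDF(nn.Module):
    """Joint PDF over a Euclidean factor x (in R) and a circular factor phi (on S^1),
    amortized on a conditional input and linked autoregressively."""

    def __init__(self, conditional_dim: int, hidden: int = 32):
        super().__init__()
        # Conditional input -> parameters (mean, log-scale) of the Euclidean factor.
        self.euclidean_net = nn.Sequential(
            nn.Linear(conditional_dim, hidden), nn.Tanh(), nn.Linear(hidden, 2)
        )
        # (Conditional input, Euclidean sample) -> parameters (location,
        # log-concentration) of the circular factor.
        # Feeding x in here is the autoregressive link between the manifolds.
        self.circular_net = nn.Sequential(
            nn.Linear(conditional_dim + 1, hidden), nn.Tanh(), nn.Linear(hidden, 2)
        )

    def log_prob(self, x, phi, cond):
        mu, log_sigma = self.euclidean_net(cond).unbind(-1)
        log_p_x = torch.distributions.Normal(mu, log_sigma.exp()).log_prob(x)

        loc, log_kappa = self.circular_net(
            torch.cat([cond, x.unsqueeze(-1)], dim=-1)
        ).unbind(-1)
        log_p_phi = torch.distributions.VonMises(loc, log_kappa.exp()).log_prob(phi)

        # Autoregressive factorization:
        # log p(x, phi | cond) = log p(x | cond) + log p(phi | x, cond)
        return log_p_x + log_p_phi


# Evaluate the joint log-density for a batch of observations.
model = ToyAutoregressiveJointPDF(conditional_dim=3)
cond = torch.randn(8, 3)
x = torch.randn(8)
phi = torch.rand(8) * 2 * torch.pi - torch.pi
print(model.log_prob(x, phi, cond).shape)  # torch.Size([8])
```

In jammy_flows itself each factor is a normalizing flow defined on its manifold rather than a fixed parametric family, but the conditioning direction between factors follows the same autoregressive pattern.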
Alternatives and similar repositories for jammy_flows:
Users interested in jammy_flows are comparing it to the libraries listed below.
- Probabilistic modeling of tabular data with normalizing flows. ☆55 · Updated last month
- Machine learning assisted marginal likelihood (Bayesian evidence) estimation for Bayesian model selection. ☆60 · Updated last month
- Density Estimation Likelihood-Free Inference with neural density estimators and adaptive acquisition of simulations. ☆108 · Updated last year
- Provides differentiable versions of common HEP operations and objectives. ☆24 · Updated last year
- Gravitational-wave data analysis tools in Jax. ☆66 · Updated last week
- Unleash the true power of scheduling. ☆30 · Updated last week
- A system for scientific simulation-based inference at scale. ☆163 · Updated 11 months ago
- Notebook accompanying a lecture on neural simulation-based inference for the MIT course 8.16: Data Science in Physics. ☆52 · Updated last year
- Normalizing flow models allowing for a conditioning context, implemented using Jax, Flax, and Distrax. ☆15 · Updated last year
- Tutorial code for MLHEP pyprob. ☆18 · Updated last year
- nessai: Nested Sampling with Artificial Intelligence. ☆38 · Updated 2 weeks ago
- Probabilistic Programming and Nested sampling in JAX. ☆172 · Updated 3 months ago
- Investigating the information content in the cosmic web through dark matter halo graphs. arXiv paper: https://arxiv.org/abs/2207.05202 ☆12 · Updated last year
- A differentiable cosmology library in JAX. ☆193 · Updated 3 months ago
- Upstream optimisation for downstream inference. ☆69 · Updated 2 weeks ago
- Bind any function written in another language to JAX with support for JVP/VJP/batching/jit compilation. ☆66 · Updated this week
- Using Graph Neural Networks to regress baryonic properties directly from full dark matter merger trees. ☆24 · Updated last year
- Using neural networks to extract sufficient statistics from data by maximising the Fisher information. ☆31 · Updated last year
- Machine learning-based inference toolkit for particle physics. ☆86 · Updated 4 months ago
- Differentiable (binned) likelihoods with JAX. ☆20 · Updated this week
- JAX bindings for the NVIDIA cuDecomp library. ☆32 · Updated 3 weeks ago
- madjax ☆14 · Updated 4 months ago
- Simulation-based inference in JAX. ☆31 · Updated last month