google / neural-tangents
Fast and Easy Infinite Neural Networks in Python
☆2,321 · Updated 11 months ago
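For context, here is a minimal sketch of the workflow neural-tangents is built around, following the `stax` / `kernel_fn` / `nt.predict` API described in the project README; exact names and signatures may vary across versions:

```python
# Sketch only: build an infinitely wide network and query its closed-form
# NNGP / NTK kernels, then the exact infinite-width posterior mean.
import neural_tangents as nt
from neural_tangents import stax
from jax import random

# Infinite-width fully-connected architecture.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

k1, k2, k3 = random.split(random.PRNGKey(0), 3)
x_train = random.normal(k1, (20, 10))   # toy data, shapes are arbitrary
y_train = random.normal(k2, (20, 1))
x_test = random.normal(k3, (5, 10))

# Closed-form NNGP and NTK kernels between train and test inputs.
kernels = kernel_fn(x_train, x_test, ('nngp', 'ntk'))

# Posterior mean of the infinite-width ensemble trained by gradient descent on MSE.
predict_fn = nt.predict.gradient_descent_mse_ensemble(kernel_fn, x_train, y_train)
y_test_mean = predict_fn(x_test=x_test, get='ntk')
```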
Alternatives and similar repositories for neural-tangents:
Users interested in neural-tangents are comparing it to the libraries listed below.
- JAX-based neural network library ☆2,960 · Updated 3 weeks ago
- Optax is a gradient processing and optimization library for JAX. ☆1,802 · Updated this week
- Flax is a neural network library for JAX that is designed for flexibility. ☆6,342 · Updated this week
- A highly efficient implementation of Gaussian Processes in PyTorch ☆3,643 · Updated last week
- functorch is JAX-like composable function transforms for PyTorch. ☆1,410 · Updated this week
- A Graph Neural Network Library in Jax ☆1,403 · Updated 11 months ago
- A pedagogical implementation of Autograd ☆970 · Updated 4 years ago
- JAX - A curated list of resources https://github.com/google/jax ☆1,694 · Updated this week
- A simple probabilistic programming language. ☆687 · Updated last month
- Bayesian optimization in PyTorch ☆3,174 · Updated this week
- BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient. ☆572 · Updated last month
- torch-optimizer -- collection of optimizers for PyTorch ☆3,081 · Updated 10 months ago
- Differentiable convex optimization layers ☆1,873 · Updated 2 months ago
- Normalizing flows in PyTorch. Current intended use is education not production. ☆855 · Updated 5 years ago
- Hopfield Networks is All You Need ☆1,772 · Updated last year
- This repository contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural P… ☆990 · Updated 4 years ago
- High-quality implementations of standard and SOTA methods on a variety of tasks. ☆1,485 · Updated last week
- A Python toolbox for performing gradient-free optimization ☆3,999 · Updated this week
- Hardware accelerated, batchable and differentiable optimizers in JAX. ☆947 · Updated 5 months ago
- higher is a pytorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual tr… ☆1,606 · Updated 2 years ago
- Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU. ☆2,396 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code (for pytorch, jax, TF and others) ☆8,741 · Updated last week
- Differentiable SDE solvers with GPU support and efficient sensitivity analysis. ☆1,612 · Updated last month
- KErnel OPerationS, on CPUs and GPUs, with autodiff and without memory overflows ☆1,071 · Updated this week
- A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods ☆1,439 · Updated 9 months ago
- PyTorch, TensorFlow, JAX and NumPy — all of them natively using the same code ☆696 · Updated last year
- Probabilistic Torch is a library for deep generative models that extends PyTorch ☆887 · Updated 9 months ago