google / neural-tangents
Fast and Easy Infinite Neural Networks in Python
☆2,277 · Updated 8 months ago
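Neural Tangents is built on JAX and exposes infinite-width (NNGP/NTK) kernels through a `stax`-style layer API. Below is a minimal sketch, assuming the `neural_tangents.stax` constructors and the `kernel_fn` interface documented in the project README; input shapes are made up for illustration:

```python
import jax.random as random
from neural_tangents import stax

# An infinitely wide fully-connected network. stax.serial returns
# (init_fn, apply_fn, kernel_fn), where kernel_fn evaluates the
# closed-form infinite-width kernel of this architecture.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

x_train = random.normal(random.PRNGKey(0), (20, 10))  # 20 points, 10 features
x_test = random.normal(random.PRNGKey(1), (5, 10))

ntk = kernel_fn(x_train, x_test, 'ntk')    # Neural Tangent Kernel matrix
nngp = kernel_fn(x_train, x_test, 'nngp')  # NNGP kernel matrix
```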
Related projects
Alternatives and complementary repositories for neural-tangents
- JAX-based neural network library ☆2,894 · Updated last week
- functorch: JAX-like composable function transforms for PyTorch ☆1,395 · Updated this week
- torch-optimizer: a collection of optimizers for PyTorch ☆3,038 · Updated 7 months ago
- A highly efficient implementation of Gaussian Processes in PyTorch ☆3,570 · Updated last week
- A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods ☆1,395 · Updated 6 months ago
- higher is a PyTorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual tr… ☆1,589 · Updated 2 years ago
- Bayesian optimization in PyTorch ☆3,096 · Updated this week
- Differentiable SDE solvers with GPU support and efficient sensitivity analysis ☆1,580 · Updated 5 months ago
- Code for visualizing the loss landscape of neural nets ☆2,824 · Updated 2 years ago
- Hopfield Networks is All You Need ☆1,724 · Updated last year
- Differentiable ODE solvers with full GPU support and O(1)-memory backpropagation ☆5,567 · Updated last year
- Flax is a neural network library for JAX that is designed for flexibility ☆6,101 · Updated this week
- Optax is a gradient processing and optimization library for JAX ☆1,687 · Updated this week
- JAX: a curated list of resources (https://github.com/google/jax) ☆1,538 · Updated 3 months ago
- A Graph Neural Network Library in Jax ☆1,373 · Updated 7 months ago
- This repository contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural P… ☆987 · Updated 3 years ago
- Constrained optimization toolkit for PyTorch ☆656 · Updated 2 years ago
- KErnel OPerationS, on CPUs and GPUs, with autodiff and without memory overflows ☆1,054 · Updated this week
- A Python toolbox for performing gradient-free optimization ☆3,957 · Updated 3 weeks ago
- Elegant easy-to-use neural networks + scientific computing in JAX. https://docs.kidger.site/equinox/ ☆2,098 · Updated last week
- Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. ☆1,400 · Updated 3 months ago
- Hardware accelerated, batchable and differentiable optimizers in JAX ☆929 · Updated last month
- [NeurIPS'19] Deep Equilibrium Models ☆727 · Updated 2 years ago
- BackPACK: a backpropagation package built on top of PyTorch that efficiently computes quantities other than the gradient ☆561 · Updated 6 months ago
- Profiling and inspecting memory in PyTorch ☆1,018 · Updated 3 months ago
- Tips for releasing research code in Machine Learning (with official NeurIPS 2020 recommendations) ☆2,584 · Updated last year
- A pedagogical implementation of Autograd ☆954 · Updated 4 years ago
- High-quality implementations of standard and SOTA methods on a variety of tasks ☆1,448 · Updated 3 weeks ago
- Probabilistic programming with NumPy powered by JAX for autograd and JIT compilation to GPU/TPU/CPU ☆2,185 · Updated this week
- Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TF and others; a brief usage sketch follows this list) ☆8,500 · Updated 3 weeks ago
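As referenced in the last entry, here is a tiny sketch of the einops `rearrange`/`reduce` pattern that library is known for; the array shapes and variable names are illustrative only:

```python
import numpy as np
from einops import rearrange, reduce

x = np.random.rand(32, 3, 64, 64)  # batch, channels, height, width

# Flatten each image into one feature vector -> shape (32, 3 * 64 * 64).
flat = rearrange(x, 'b c h w -> b (c h w)')

# Global average pooling over the spatial axes -> shape (32, 3).
pooled = reduce(x, 'b c h w -> b c', 'mean')
```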