isaaccorley / jax-enhance
Minimal library for image super-resolution implemented in JAX
☆12 · Updated 3 years ago
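For orientation, here is a minimal sketch of the simplest super-resolution baseline in plain JAX: bicubic upsampling via `jax.image.resize`. It illustrates the task rather than jax-enhance's own API, and the image shape and scale factor are assumptions.

```python
import jax
import jax.numpy as jnp

def upscale(image: jnp.ndarray, scale: int = 4) -> jnp.ndarray:
    """Naive super-resolution baseline: bicubic upsampling of an HWC image."""
    h, w, c = image.shape
    return jax.image.resize(image, (h * scale, w * scale, c), method="bicubic")

# Example: upscale a random 32x32 RGB image to 128x128.
lr = jax.random.uniform(jax.random.PRNGKey(0), (32, 32, 3))
print(upscale(lr).shape)  # (128, 128, 3)
```

A learned model (e.g. an SRCNN- or EDSR-style network, which libraries in this space typically provide) would replace the fixed interpolation kernel with trained convolutions.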
Alternatives and similar repositories for jax-enhance
Users interested in jax-enhance are comparing it to the libraries listed below.
- wavelet implicit neural representations ☆161 · Updated last year
- Implementation of a U-net complete with efficient attention as well as the latest research findings ☆282 · Updated last year
- High-order spline interpolation in PyTorch ☆67 · Updated 8 months ago
- 🚀 A powerful library for efficient training of Neural Fields at scale. ☆29 · Updated last year
- ☆16 · Updated 3 years ago
- Integral Neural Networks in PyTorch ☆125 · Updated 6 months ago
- Run PyTorch in JAX. 🤝 ☆246 · Updated 3 months ago
- FID computation in Jax/Flax. ☆27 · Updated 10 months ago
- Implementation of Uformer, Attention-based Unet, in Pytorch ☆95 · Updated 3 years ago
- Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains ☆47 · Updated 4 years ago
- JAX implementation of Learning to learn by gradient descent by gradient descent ☆27 · Updated 7 months ago
- Dataclasses manipulated as numpy arrays (with batching, reshape, slicing, ...) ☆49 · Updated 8 months ago
- ☆89 · Updated last year
- NumPy-style histograms in PyTorch ☆54 · Updated last year
- Functional N-dimensional convolution in Pytorch, recursively calling convNd until reaching conv3d. ☆79 · Updated last month
- Code for PnP-Flow: Plug-and-Play image restoration with Flow Matching (ICLR 2025) ☆82 · Updated 3 months ago
- Code repository for the ICLR 2022 paper "FlexConv: Continuous Kernel Convolutions With Differentiable Kernel Sizes" https://openreview.ne… ☆115 · Updated 2 years ago
- Differentiable and GPU-enabled fast wavelet transforms in JAX. ☆41 · Updated 11 months ago
- The 2D discrete wavelet transform for JAX ☆43 · Updated 2 years ago
- Official implementation of the Reconstruct Anything Model (RAM) ☆37 · Updated 3 weeks ago
- Implementation of LogAvgExp for Pytorch (a short formula sketch follows after this list) ☆36 · Updated last month
- Implementation of the proposed Adam-atan2 from Google DeepMind in Pytorch ☆106 · Updated 6 months ago
- Wraps PyTorch code in a JIT-compatible way for JAX. Supports automatically defining gradients for reverse-mode AutoDiff. ☆53 · Updated last month
- ☆117 · Updated 6 months ago
- Implementation of "Fourier Features Let Networks Learn High Frequency Functions in Low Dimensional Domains" by Tancik et al.☆93Updated 2 years ago
- Source code for Fathony, Sahu, Willmott, & Kolter, "Multiplicative Filter Networks", ICLR 2021. ☆95 · Updated 4 years ago
- Implementation of Hourglass Transformer, in Pytorch, from Google and OpenAI ☆91 · Updated 3 years ago
- Free-form flows are a generative model training a pair of neural networks via maximum likelihood ☆45 · Updated this week
- Running Jax in PyTorch Lightning ☆102 · Updated 5 months ago
- Implicit Convolutional Kernels for Steerable CNNs [NeurIPS'23] ☆29 · Updated 3 months ago
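For the LogAvgExp entry above, the underlying operation is just a log-sum-exp shifted by the log of the element count, `logavgexp(x) = logsumexp(x) - log(n)`. The linked repository targets PyTorch, so the snippet below is an illustrative JAX translation rather than its API.

```python
import jax.numpy as jnp
from jax.scipy.special import logsumexp

def logavgexp(x: jnp.ndarray, axis: int = -1) -> jnp.ndarray:
    """Numerically stable log of the mean of exponentials along an axis."""
    return logsumexp(x, axis=axis) - jnp.log(x.shape[axis])

# Acts as a smooth maximum: dominated by the largest entry, yet differentiable everywhere.
print(logavgexp(jnp.array([0.0, 1.0, 10.0])))  # ≈ 8.90
```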
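For the two Fourier-features entries above, the mapping from Tancik et al. embeds low-dimensional coordinates as γ(v) = [cos(2πBv), sin(2πBv)] with a fixed Gaussian matrix B before the MLP. The sketch below is a minimal version; the shapes, scale, and function names are illustrative assumptions, not code from either repository.

```python
import jax
import jax.numpy as jnp

def make_fourier_features(key, in_dim: int, num_features: int, scale: float = 10.0):
    """Sample a fixed Gaussian projection B and return the mapping gamma(v)."""
    B = scale * jax.random.normal(key, (num_features, in_dim))  # kept fixed during training

    def gamma(v: jnp.ndarray) -> jnp.ndarray:
        # v: (..., in_dim) coordinates in [0, 1]; returns a (..., 2 * num_features) embedding.
        proj = 2.0 * jnp.pi * v @ B.T
        return jnp.concatenate([jnp.cos(proj), jnp.sin(proj)], axis=-1)

    return gamma

# Example: embed 2D pixel coordinates before feeding them to an MLP.
gamma = make_fourier_features(jax.random.PRNGKey(0), in_dim=2, num_features=256)
coords = jnp.stack(
    jnp.meshgrid(jnp.linspace(0, 1, 32), jnp.linspace(0, 1, 32), indexing="ij"), axis=-1
)
print(gamma(coords).shape)  # (32, 32, 512)
```

The scale of B controls the bandwidth of the fitted function: larger scales let the downstream network capture higher-frequency detail, at the risk of noisier interpolation.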