jloveric / high-order-layers-torch
High-order and sparse layers in PyTorch: Lagrange polynomial, piecewise Lagrange polynomial, piecewise discontinuous Lagrange polynomial (Chebyshev nodes), and Fourier series layers of arbitrary order. The piecewise implementations can be thought of as a 1D grid (for each neuron) where each grid element is a Lagrange polynomial. Both full connecte…
☆44 · Updated last year
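To make the "1D grid of Lagrange polynomials" idea above concrete, here is a minimal, self-contained sketch of a Lagrange-polynomial "activation" on Chebyshev nodes. This is illustrative only and does not use the repository's actual API; the function names (`chebyshev_nodes`, `lagrange_basis`) are hypothetical.

```python
import torch

def chebyshev_nodes(n: int) -> torch.Tensor:
    # Chebyshev-Lobatto nodes cos(pi*k/(n-1)) on [-1, 1]
    k = torch.arange(n, dtype=torch.float64)
    return torch.cos(torch.pi * k / (n - 1))

def lagrange_basis(x: torch.Tensor, nodes: torch.Tensor) -> torch.Tensor:
    # Evaluate all n Lagrange basis polynomials at the points x.
    # x: (m,), nodes: (n,) -> (m, n); basis j is 1 at nodes[j], 0 at the others.
    m, n = x.shape[0], nodes.shape[0]
    diff = x.unsqueeze(1) - nodes.unsqueeze(0)  # (m, n)
    out = torch.ones(m, n, dtype=x.dtype)
    for j in range(n):
        for k in range(n):
            if k != j:
                out[:, j] *= diff[:, k] / (nodes[j] - nodes[k])
    return out

# A degree-4 polynomial per "neuron": the values at the 5 nodes act as the
# learnable parameters, and the forward pass is interpolation.
nodes = chebyshev_nodes(5)
weights = torch.randn(5, dtype=torch.float64)      # one neuron's coefficients
x = torch.linspace(-1.0, 1.0, 7, dtype=torch.float64)
y = lagrange_basis(x, nodes) @ weights             # (7,)
```

A piecewise layer would repeat this on each cell of a grid over the input range; the discontinuous variant simply does not force neighboring cells to share boundary values.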
Alternatives and similar repositories for high-order-layers-torch
Users interested in high-order-layers-torch are comparing it to the libraries listed below.
- ☆92 · Updated last year
- This code implements a Radial Basis Function (RBF) based Kolmogorov-Arnold Network (KAN) for function approximation. ☆29 · Updated last year
- Kolmogorov-Arnold Networks with various basis functions like B-splines, Fourier, Chebyshev, wavelets, etc. ☆35 · Updated last year
- A State-Space Model with Rational Transfer Function Representation. ☆78 · Updated last year
- Clifford-Steerable Convolutional Neural Networks [ICML'24] ☆47 · Updated last month
- Kolmogorov-Arnold networks (KAN) as implicit functions (like NeRF but simpler) ☆14 · Updated last year
- Free-form flows are a generative model training a pair of neural networks via maximum likelihood. ☆45 · Updated last week
- Benchmark for efficiency in memory and time of different KAN implementations. ☆126 · Updated 10 months ago
- Diffusion models in PyTorch ☆104 · Updated last week
- A simple example of VAEs with KANs ☆12 · Updated last year
- Deep Networks Grok All the Time and Here is Why ☆37 · Updated last year
- C++ and CUDA ops for fused FourierKAN ☆79 · Updated last year
- Implementation of Gradient Agreement Filtering, from Chaubard et al. of Stanford, but for single-machine microbatches, in PyTorch ☆25 · Updated 5 months ago
- Implicit Convolutional Kernels for Steerable CNNs [NeurIPS'23] ☆28 · Updated 4 months ago
- Benchmarking and Testing FastKAN ☆78 · Updated last year
- ☆12 · Updated last year
- Kolmogorov–Arnold Networks with modified activation (using an MLP to represent the activation) ☆105 · Updated 7 months ago
- ☆53 · Updated 8 months ago
- Lightning-like training API for JAX with Flax ☆41 · Updated 6 months ago
- 🚀 A powerful library for efficient training of Neural Fields at scale. ☆29 · Updated last year
- Neural Optimal Transport with Lagrangian Costs ☆56 · Updated last month
- PyTorch implementation of a simple way to enable (Stochastic) Frame Averaging for any network ☆50 · Updated 11 months ago
- Convolutional layer for Kolmogorov-Arnold Network (KAN) ☆100 · Updated 3 months ago
- The Gaussian Histogram Loss (HL-Gauss) proposed by Imani et al., with a few convenient wrappers for regression, in PyTorch ☆64 · Updated 3 weeks ago
- ☆16 · Updated 3 years ago
- Kolmogorov-Arnold Networks (KAN) using Jacobi polynomials instead of B-splines. ☆38 · Updated last year
- Flow-matching algorithms in JAX ☆97 · Updated 10 months ago
- ☆40 · Updated last year
- Implementation of MambaFormer in PyTorch ++ Zeta from the paper: "Can Mamba Learn How to Learn? A Comparative Study on In-Context Learnin…" ☆21 · Updated this week
- Kolmogorov-Arnold Networks (KAN) using orthogonal polynomials instead of B-splines. ☆38 · Updated 7 months ago