Klassikcat / KANElectra
Transformer model based on the Kolmogorov–Arnold Network (KAN), an alternative to the Multi-Layer Perceptron (MLP)
☆25 · Updated last week
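The core idea behind KANElectra is to replace the Transformer's MLP feed-forward blocks with KAN layers, where every input-output edge carries its own learnable univariate function. The sketch below is a hypothetical illustration of that idea, not code from this repository: the class name `NaiveKANLinear` is made up for the example, and it uses a simple learnable polynomial basis per edge where the KAN paper uses B-splines.

```python
# Minimal sketch of a KAN-style linear layer that could stand in for an MLP
# block. Illustrative only; the original KAN parameterises each edge with
# B-splines, while this sketch uses a small polynomial basis for brevity.
import torch
import torch.nn as nn
import torch.nn.functional as F


class NaiveKANLinear(nn.Module):
    """Each input-output edge gets its own learnable univariate function,
    parameterised here as a degree-`degree` polynomial plus a SiLU base."""

    def __init__(self, in_features: int, out_features: int, degree: int = 3):
        super().__init__()
        # Polynomial coefficients per edge: (out_features, in_features, degree + 1)
        self.coeffs = nn.Parameter(
            torch.randn(out_features, in_features, degree + 1) * 0.1
        )
        # Residual "base" path, analogous to the base activation in KAN
        self.base_weight = nn.Parameter(torch.randn(out_features, in_features) * 0.1)
        self.degree = degree

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (..., in_features); build powers x^0 ... x^degree per input feature
        powers = torch.stack([x ** k for k in range(self.degree + 1)], dim=-1)
        # Evaluate each edge's polynomial and sum contributions over inputs
        spline = torch.einsum("...ik,oik->...o", powers, self.coeffs)
        base = F.silu(x) @ self.base_weight.T
        return spline + base


if __name__ == "__main__":
    layer = NaiveKANLinear(16, 32)
    out = layer(torch.randn(4, 16))
    print(out.shape)  # torch.Size([4, 32])
```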
Related projects
Alternatives and complementary repositories for KANElectra
- ☆119 · Updated 6 months ago
- Convolutional layer for Kolmogorov-Arnold Network (KAN) ☆80 · Updated 5 months ago
- Fast Convolutional KAN ☆55 · Updated 6 months ago
- Simba ☆182 · Updated 8 months ago
- Official Implementation for Mamba-ND: Selective State Space Modeling for Multi-Dimensional Data ☆50 · Updated 4 months ago
- An implementation of mLSTM and sLSTM in PyTorch. ☆25 · Updated 5 months ago
- This repository contains the codes to replicate the simulations from the paper: "Wav-KAN: Wavelet Kolmogorov-Arnold Networks." It showca… ☆105 · Updated this week
- My implementation of the original transformer model (Vaswani et al.). I've additionally included the playground.py file for visualizing o… ☆41 · Updated 11 months ago
- ☆41 · Updated 7 months ago
- A modified CNN architecture using Kolmogorov-Arnold Networks ☆65 · Updated 6 months ago
- Official repository for CVPR24 Precognition Workshop Paper: VMRNN: Integrating Vision Mamba and LSTM for Efficient and Accurate Spatiotem… ☆101 · Updated 7 months ago
- Implementation of Griffin from the paper: "Griffin: Mixing Gated Linear Recurrences with Local Attention for Efficient Language Models" ☆50 · Updated last week
- ☆62 · Updated last month
- State Space Models ☆63 · Updated 6 months ago
- MNIST example using Kolmogorov-Arnold Networks ☆27 · Updated 6 months ago
- Kolmogorov-Arnold Networks (KAN) using Jacobi polynomials instead of B-splines. ☆32 · Updated 6 months ago
- 🕹️ Toy examples of Kolmogorov-Arnold Networks (get started quickly) ☆72 · Updated 6 months ago
- Pan-Mamba: Effective Pan-Sharpening with State Space Model ☆81 · Updated 8 months ago
- Pure-PyTorch implementations of "bi-mamba2" and "vision-mamba2-torch"; supports 1D/2D/3D/ND inputs and export via jit.script/ONNX ☆187 · Updated this week
- Implementation of xLSTM in PyTorch from the paper: "xLSTM: Extended Long Short-Term Memory" ☆106 · Updated last week
- Official code release of our paper "EViT: An Eagle Vision Transformer with Bi-Fovea Self-Attention" ☆17 · Updated last month
- ☆103 · Updated this week
- Wavelet-Attention CNN for Image Classification ☆22 · Updated 2 years ago
- A Triton kernel for incorporating bi-directionality in Mamba2 ☆50 · Updated 2 months ago
- Minimal Mamba-2 implementation in PyTorch ☆137 · Updated 5 months ago
- An easy-to-use implementation of xLSTM ☆28 · Updated 2 months ago
- Mamba or Transformer for Time Series Forecasting? Mixture of Universals (MoU) Is All You Need. ☆34 · Updated 2 months ago
- Implementation of Switch Transformers from the paper: "Switch Transformers: Scaling to Trillion Parameter Models with Simple and Efficien… ☆55 · Updated last week
- CUDA implementation of Extended Long Short-Term Memory (xLSTM) with C++ and PyTorch ports ☆75 · Updated 5 months ago
- Trainable, highly expressive activation functions (ECCV 2024) ☆34 · Updated 2 weeks ago