a-r-r-o-w / kanformer
Naively combining transformers and Kolmogorov-Arnold Networks to learn and experiment
☆34 · Updated 8 months ago
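The repo's one-line description, combining transformers with Kolmogorov-Arnold Networks, amounts to replacing a transformer's MLP sublayer with a KAN layer whose edges carry learnable univariate functions. Below is a minimal NumPy sketch of such a layer using a Chebyshev polynomial basis (as in one of the repos listed further down) instead of the original KAN's B-splines; the class name, shapes, and initialization are illustrative assumptions, not kanformer's actual code:

```python
import numpy as np

class ChebyKANLayer:
    """Hypothetical KAN-style layer: each edge (input i -> output j) applies a
    learnable univariate function, parameterized by Chebyshev polynomials."""

    def __init__(self, in_dim, out_dim, degree=4, seed=0):
        rng = np.random.default_rng(seed)
        # One coefficient per (output unit, input unit, basis function).
        self.coeffs = rng.normal(0.0, 0.1, size=(out_dim, in_dim, degree + 1))
        self.degree = degree

    def __call__(self, x):
        # Squash inputs into [-1, 1], the domain of Chebyshev polynomials.
        t = np.tanh(x)                              # (batch, in_dim)
        # Build T_0..T_degree via the recurrence T_k = 2 t T_{k-1} - T_{k-2}.
        T = [np.ones_like(t), t]
        for _ in range(2, self.degree + 1):
            T.append(2 * t * T[-1] - T[-2])
        basis = np.stack(T, axis=-1)                # (batch, in_dim, degree+1)
        # Sum the learned univariate edge functions over incoming edges.
        return np.einsum('bik,oik->bo', basis, self.coeffs)

layer = ChebyKANLayer(in_dim=8, out_dim=16)
y = layer(np.random.default_rng(1).normal(size=(4, 8)))
print(y.shape)  # (4, 16)
```

In a transformer block this layer would sit where the two-layer feed-forward network normally does; the attention sublayers are unchanged.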
Alternatives and similar repositories for kanformer:
Users interested in kanformer are comparing it to the repositories listed below
- Integrating Mamba/SSMs with Transformer for Enhanced Long Context and High-Quality Sequence Modeling ☆191 · Updated last week
- My attempts at implementing various bits of Sepp Hochreiter's new xLSTM architecture ☆130 · Updated 11 months ago
- A single repo with all scripts and utils to train / fine-tune the Mamba model with or without FIM ☆54 · Updated last year
- An easy-to-use PyTorch implementation of the Kolmogorov-Arnold Network and a few novel variations ☆177 · Updated 4 months ago
- PyTorch (Lightning) implementation of the Mamba model ☆25 · Updated 11 months ago
- PyTorch Implementation of Jamba: "Jamba: A Hybrid Transformer-Mamba Language Model" ☆166 · Updated 2 weeks ago
- Variations of Kolmogorov-Arnold Networks ☆114 · Updated 11 months ago
- ☆88 · Updated 10 months ago
- Benchmark for memory and time efficiency of different KAN implementations ☆121 · Updated 7 months ago
- Kolmogorov-Arnold Networks (KAN) using Chebyshev polynomials instead of B-splines ☆369 · Updated 11 months ago
- ☆40 · Updated 2 months ago
- Benchmarking and Testing FastKAN ☆74 · Updated 10 months ago
- First-principle implementations of groundbreaking AI algorithms using a wide range of deep learning frameworks, accompanied by supporting… ☆156 · Updated 2 weeks ago
- A More Fair and Comprehensive Comparison between KAN and MLP ☆164 · Updated 8 months ago
- Implementation of Agent Attention in PyTorch ☆90 · Updated 9 months ago
- KAN for Vision Transformer ☆246 · Updated 6 months ago
- LoRA: Low-Rank Adaptation of Large Language Models, implemented in PyTorch ☆101 · Updated last year
- A modified CNN architecture using Kolmogorov-Arnold Networks ☆76 · Updated 10 months ago
- Implementation of xLSTM in PyTorch from the paper "xLSTM: Extended Long Short-Term Memory" ☆119 · Updated 2 weeks ago
- ☆16 · Updated 5 months ago
- This repository contains a better implementation of Kolmogorov-Arnold networks ☆61 · Updated 11 months ago
- Official PyTorch Implementation of "The Hidden Attention of Mamba Models" ☆218 · Updated 10 months ago
- ☆57 · Updated 2 months ago
- Notes on Mamba and the S4 model (Mamba: Linear-Time Sequence Modeling with Selective State Spaces) ☆162 · Updated last year
- ☆128 · Updated 11 months ago
- Kolmogorov-Arnold Networks with modified activation (using an MLP to represent the activation) ☆103 · Updated 5 months ago
- Trying out the Mamba architecture on small examples (CIFAR-10, Shakespeare char-level, etc.) ☆45 · Updated last year
- Implementation of MoE-Mamba from the paper "MoE-Mamba: Efficient Selective State Space Models with Mixture of Experts" in PyTorch and Ze… ☆102 · Updated last week
- Collection of tests performed during the study of the new Kolmogorov-Arnold Neural Networks (KAN) ☆39 · Updated last month
- Unofficial Implementation of Selective Attention Transformer ☆16 · Updated 5 months ago