raminmh / liquid_time_constant_networks
Code Repository for Liquid Time-Constant Networks (LTCs)
☆1,733 · Updated last year
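For context on what the headline repository implements, here is a minimal, illustrative sketch of a liquid time-constant cell in PyTorch. This is not the repository's own code: it only mirrors the published LTC state equation dx/dt = -(1/τ + f(x, I))·x + f(x, I)·A with a single explicit-Euler step, and the names (`LTCCell`, `log_tau`, `A`) are placeholders rather than the repo's API.

```python
import torch
import torch.nn as nn


class LTCCell(nn.Module):
    """Sketch of a liquid time-constant cell.

    The hidden state follows dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A,
    integrated here with one explicit-Euler step per input. The actual
    repository uses a fused/hybrid ODE solver and sparse wirings instead.
    """

    def __init__(self, input_size: int, hidden_size: int, dt: float = 0.1):
        super().__init__()
        self.dt = dt
        # f(x, I): state- and input-dependent gate that modulates the effective time constant
        self.gate = nn.Linear(input_size + hidden_size, hidden_size)
        # Learnable base time constant tau (kept positive via exp) and bias vector A
        self.log_tau = nn.Parameter(torch.zeros(hidden_size))
        self.A = nn.Parameter(torch.zeros(hidden_size))

    def forward(self, inputs: torch.Tensor, state: torch.Tensor) -> torch.Tensor:
        f = torch.sigmoid(self.gate(torch.cat([inputs, state], dim=-1)))
        tau = torch.exp(self.log_tau)
        dxdt = -(1.0 / tau + f) * state + f * self.A
        return state + self.dt * dxdt


if __name__ == "__main__":
    cell = LTCCell(input_size=4, hidden_size=16)
    x = torch.randn(8, 50, 4)      # (batch, time, features)
    h = torch.zeros(8, 16)
    for t in range(x.size(1)):     # unroll the cell over the sequence
        h = cell(x[:, t], h)
    print(h.shape)                 # torch.Size([8, 16])
```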
Alternatives and similar repositories for liquid_time_constant_networks
Users interested in liquid_time_constant_networks are comparing it to the libraries listed below.
- PyTorch and TensorFlow implementation of NCP, LTC, and CfC wired neural models ☆2,207 · Updated last year
- Closed-form Continuous-time Neural Networks ☆976 · Updated last year
- Liquid Structural State-Space Models ☆372 · Updated last year
- ☆175 · Updated 2 years ago
- Structured state space sequence models ☆2,731 · Updated last year
- TensorFlow implementation of Liquid Time-Constant Neural Network layers ☆21 · Updated 3 years ago
- Hopfield Networks is All You Need ☆1,848 · Updated 2 years ago
- Official repository of the xLSTM ☆1,982 · Updated 3 weeks ago
- Implementation of Hinton's forward-forward (FF) algorithm - an alternative to back-propagation ☆1,488 · Updated 2 years ago
- Implementation of "SpikeGPT: Generative Pre-trained Language Model with Spiking Neural Networks" ☆849 · Updated 2 months ago
- Tutorial to implement a Liquid Time-Constant Neural Network from scratch (eng/rus) ☆30 · Updated last year
- Differentiable SDE solvers with GPU support and efficient sensitivity analysis ☆1,673 · Updated 8 months ago
- A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods ☆1,521 · Updated last year
- Implementation of Liquid Nets in PyTorch ☆67 · Updated last week
- A Library for Differentiable Logic Gate Networks ☆729 · Updated last year
- Deep learning with spiking neural networks (SNNs) in PyTorch ☆759 · Updated this week
- ☆740 · Updated last year
- Code for "Neural Controlled Differential Equations for Irregular Time Series" (NeurIPS 2020 Spotlight) ☆665 · Updated 2 years ago
- Advanced evolutionary computation library built directly on top of PyTorch, created at NNAISENSE ☆1,087 · Updated last week
- The official code for TOTEM: TOkenized Time Series EMbeddings for General Time Series Analysis ☆344 · Updated 7 months ago
- Official repository for the paper "Grokfast: Accelerated Grokking by Amplifying Slow Gradients" ☆563 · Updated last year
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆5,588 · Updated last week
- Convolutions for Sequence Modeling ☆898 · Updated last year
- Resources about xLSTM by Sepp Hochreiter ☆316 · Updated 10 months ago
- ☆786 · Updated 3 weeks ago
- Build high-performance AI models with modular building blocks ☆550 · Updated this week
- A collection of resources regarding the interplay between differential equations, deep learning, dynamical systems, control and numerical… ☆1,476 · Updated last year
- Continuous Thought Machines, because thought takes time and reasoning is a process ☆1,298 · Updated 2 months ago
- Generative Flow Networks ☆652 · Updated 2 years ago
- An implementation of "Retentive Network: A Successor to Transformer for Large Language Models" ☆1,206 · Updated last year