ml-jku / hopfield-layers
Hopfield Networks is All You Need
☆1,891 · Updated 2 years ago
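The repository implements the modern continuous Hopfield network from the paper, whose retrieval update ξ_new = X softmax(β Xᵀ ξ) coincides with scaled dot-product attention over the stored patterns X. Below is a minimal sketch of that update rule in plain PyTorch; the function and variable names are illustrative and this is not the repository's own layer API.

```python
# Minimal sketch of continuous modern Hopfield retrieval,
#   xi_new = X @ softmax(beta * X^T @ xi),
# the update rule described in "Hopfield Networks is All You Need".
# Plain PyTorch; names are illustrative, not the repository's API.
import torch
import torch.nn.functional as F


def hopfield_retrieve(stored: torch.Tensor, state: torch.Tensor,
                      beta: float = 1.0, n_steps: int = 1) -> torch.Tensor:
    """stored: (N, d) memory matrix X; state: (M, d) query patterns xi."""
    xi = state
    for _ in range(n_steps):
        attn = torch.softmax(beta * xi @ stored.T, dim=-1)  # (M, N) weights over memories
        xi = attn @ stored                                   # (M, d) retrieved patterns
    return xi


# Usage: retrieve a stored pattern from a noisy query.
X = torch.randn(16, 64)                # 16 stored patterns of dimension 64
query = X[3] + 0.1 * torch.randn(64)   # noisy version of pattern 3
retrieved = hopfield_retrieve(X, query.unsqueeze(0), beta=8.0)
print(F.cosine_similarity(retrieved, X[3].unsqueeze(0)))    # close to 1.0
```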
Alternatives and similar repositories for hopfield-layers
Users interested in hopfield-layers are comparing it to the libraries listed below.
- Fast and Easy Infinite Neural Networks in Python ☆2,364 · Updated last year
- higher is a pytorch library allowing users to obtain higher order gradients over losses spanning training loops rather than individual training steps ☆1,627 · Updated 3 years ago
- Long Range Arena for Benchmarking Efficient Transformers ☆771 · Updated 2 years ago
- Pytorch library for fast transformer implementations ☆1,756 · Updated 2 years ago
- JAX-based neural network library ☆3,158 · Updated 2 weeks ago
- [NeurIPS'19] Deep Equilibrium Models ☆785 · Updated 3 years ago
- Toolbox of models, callbacks, and datasets for AI/ML researchers. ☆1,756 · Updated last month
- An implementation of Performer, a linear attention-based transformer, in Pytorch ☆1,169 · Updated 3 years ago
- torch-optimizer -- collection of optimizers for Pytorch ☆3,160 · Updated last year
- ☆792 · Updated last month
- A Graph Neural Network Library in Jax ☆1,461 · Updated last year
- Fast, differentiable sorting and ranking in PyTorch ☆847 · Updated 6 months ago
- Fast, general, and tested differentiable structured prediction in PyTorch ☆1,123 · Updated 3 years ago
- Pytorch Lightning code guideline for conferences ☆1,284 · Updated 2 years ago
- Reformer, the efficient Transformer, in Pytorch ☆2,191 · Updated 2 years ago
- A PyTorch library entirely dedicated to neural differential equations, implicit models and related numerical methods ☆1,548 · Updated last year
- Implementation of Perceiver, General Perception with Iterative Attention, in Pytorch ☆1,188 · Updated 2 years ago
- Fast Block Sparse Matrices for Pytorch ☆550 · Updated 4 years ago
- Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. ☆1,468 · Updated 8 months ago
- ML Collections is a library of Python Collections designed for ML use cases. ☆1,005 · Updated 2 months ago
- disentanglement_lib is an open-source library for research on learning disentangled representations. ☆1,415 · Updated 4 years ago
- Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg) ☆1,093 · Updated 9 months ago
- ☆1,398 · Updated 3 weeks ago
- Structured state space sequence models ☆2,812 · Updated last year
- VISSL is FAIR's library of extensible, modular and scalable components for SOTA Self-Supervised Learning with images. ☆3,292 · Updated last year
- Repository for NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients" ☆1,067 · Updated last year
- functorch is JAX-like composable function transforms for PyTorch. ☆1,438 · Updated 4 months ago
- 🧠🗼 ☆1,279 · Updated last year
- Advanced evolutionary computation library built directly on top of PyTorch, created at NNAISENSE. ☆1,112 · Updated 3 weeks ago
- Transformer based on a variant of attention that has linear complexity with respect to sequence length ☆822 · Updated last year