ml-jku / hopfield-layers
Hopfield Networks is All You Need
☆1,793 · Updated 2 years ago
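The hopfield-layers repository provides PyTorch layers implementing the modern continuous Hopfield network from the paper. Independently of the library's own API, the core retrieval rule (new state = stored patterns weighted by a softmax over their similarity to the query) can be sketched in plain NumPy; the function and variable names here are illustrative, not taken from the library:

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over a 1-D array
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def hopfield_update(X, xi, beta=8.0):
    """One step of the modern (continuous) Hopfield update:
    xi_new = X @ softmax(beta * X.T @ xi)
    X: (d, N) matrix of N stored patterns, xi: (d,) query state,
    beta: inverse temperature controlling retrieval sharpness."""
    return X @ softmax(beta * (X.T @ xi))

rng = np.random.default_rng(0)
d, N = 64, 5
X = rng.standard_normal((d, N))
X /= np.linalg.norm(X, axis=0)               # unit-norm stored patterns
xi = X[:, 2] + 0.1 * rng.standard_normal(d)  # noisy query of pattern 2
xi_new = hopfield_update(X, xi)              # retrieval moves toward pattern 2
```

With a sufficiently large `beta` and well-separated patterns, a single update typically retrieves the closest stored pattern, which is the property the paper connects to transformer attention.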
Alternatives and similar repositories for hopfield-layers:
Users interested in hopfield-layers are comparing it to the libraries listed below.
- PyTorch library for fast transformer implementations ☆1,697 · Updated 2 years ago
- An implementation of Performer, a linear-attention-based transformer, in PyTorch ☆1,122 · Updated 3 years ago
- higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual tr… ☆1,613 · Updated 3 years ago
- torch-optimizer: a collection of optimizers for PyTorch ☆3,103 · Updated last year
- Reformer, the efficient Transformer, in PyTorch ☆2,163 · Updated last year
- [NeurIPS'19] Deep Equilibrium Models ☆746 · Updated 2 years ago
- Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. ☆1,428 · Updated 8 months ago
- Fast and Easy Infinite Neural Networks in Python ☆2,332 · Updated last year
- A Graph Neural Network Library in JAX ☆1,425 · Updated last year
- PyTorch Lightning code guideline for conferences ☆1,261 · Updated last year
- Long Range Arena for Benchmarking Efficient Transformers ☆751 · Updated last year
- Implementation of Perceiver, General Perception with Iterative Attention, in PyTorch ☆1,137 · Updated last year
- Toolbox of models, callbacks, and datasets for AI/ML researchers ☆1,718 · Updated 2 weeks ago
- Transformer based on a variant of attention with linear complexity with respect to sequence length ☆758 · Updated 11 months ago
- Fast, differentiable sorting and ranking in PyTorch ☆808 · Updated last year
- Fast Block Sparse Matrices for PyTorch ☆545 · Updated 4 years ago
- Structured state space sequence models ☆2,611 · Updated 9 months ago
- Repository for the NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting stepsizes by the belief in observed gradients" ☆1,059 · Updated 8 months ago
- JAX-based neural network library ☆3,013 · Updated this week
- ML Collections is a library of Python collections designed for ML use cases ☆945 · Updated last week
- functorch provides JAX-like composable function transforms for PyTorch ☆1,422 · Updated this week
- disentanglement_lib is an open-source library for research on learning disentangled representations ☆1,400 · Updated 3 years ago
- Fast, general, and tested differentiable structured prediction in PyTorch ☆1,112 · Updated 3 years ago
- Python 3.8+ toolbox for submitting jobs to Slurm ☆1,415 · Updated this week
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" ☆1,569 · Updated 4 years ago
- ☆376 · Updated last year
- Machine learning metrics for distributed, scalable PyTorch applications ☆2,253 · Updated this week
- Implementation of LambdaNetworks, a new approach to image recognition that reaches SOTA with less compute ☆1,531 · Updated 4 years ago
- The entmax mapping and its loss, a family of sparse softmax alternatives ☆432 · Updated 10 months ago
- A PyTorch library entirely dedicated to neural differential equations, implicit models, and related numerical methods ☆1,469 · Updated 11 months ago