ml-jku / hopfield-layers
Hopfield Networks is All You Need
☆1,724 · Updated last year
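The core idea behind this repo is the modern Hopfield update rule from the paper: for stored patterns X and a state pattern ξ, one retrieval step computes ξ_new = X softmax(β Xᵀ ξ), which coincides with transformer attention. Below is a minimal sketch of that one-step retrieval in plain PyTorch; the function name and toy data are illustrative and are not the repo's actual `hflayers` API.

```python
import torch

def hopfield_retrieve(stored, query, beta=8.0):
    """One update step of a modern Hopfield network (illustrative sketch):
    xi_new = X^T softmax(beta * X @ xi), with patterns as rows of `stored`.

    stored: (N, d) matrix of stored patterns.
    query:  (d,) state pattern to be cleaned up / retrieved.
    beta:   inverse temperature; larger values give sharper retrieval.
    """
    scores = beta * stored @ query          # (N,) similarity to each pattern
    weights = torch.softmax(scores, dim=0)  # (N,) attention over patterns
    return stored.T @ weights               # (d,) retrieved pattern

# Toy check: retrieve a stored pattern from a noisy version of it.
patterns = torch.randn(5, 64)
noisy = patterns[2] + 0.3 * torch.randn(64)
retrieved = hopfield_retrieve(patterns, noisy)
print(torch.nn.functional.cosine_similarity(retrieved, patterns[2], dim=0))
```

With well-separated patterns and a large enough β, the softmax concentrates on the nearest stored pattern, so retrieval is typically accurate after a single update step, which is the paper's key theoretical result.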
Related projects
Alternatives and complementary repositories for hopfield-layers
- Fast and Easy Infinite Neural Networks in Python ☆2,277 · Updated 8 months ago
- PyTorch library for fast transformer implementations ☆1,642 · Updated last year
- higher is a PyTorch library allowing users to obtain higher-order gradients over losses spanning training loops rather than individual training steps ☆1,589 · Updated 2 years ago
- torch-optimizer: a collection of optimizers for PyTorch ☆3,038 · Updated 7 months ago
- Toolbox of models, callbacks, and datasets for AI/ML researchers. ☆1,691 · Updated this week
- Type annotations and dynamic checking for a tensor's shape, dtype, names, etc. ☆1,400 · Updated 3 months ago
- Reformer, the efficient Transformer, in PyTorch ☆2,116 · Updated last year
- JAX-based neural network library ☆2,894 · Updated last week
- An implementation of Performer, a linear attention-based transformer, in PyTorch ☆1,093 · Updated 2 years ago
- Long Range Arena for Benchmarking Efficient Transformers ☆726 · Updated 10 months ago
- Implementation of Perceiver, General Perception with Iterative Attention, in PyTorch ☆1,092 · Updated last year
- Profiling and inspecting memory in PyTorch ☆1,018 · Updated 3 months ago
- functorch provides JAX-like composable function transforms for PyTorch. ☆1,395 · Updated this week
- Repository for the NeurIPS 2020 Spotlight "AdaBelief Optimizer: Adapting Stepsizes by the Belief in Observed Gradients" ☆1,050 · Updated 3 months ago
- A Graph Neural Network Library in JAX ☆1,373 · Updated 7 months ago
- Fast, general, and tested differentiable structured prediction in PyTorch ☆1,108 · Updated 2 years ago
- Flexible and powerful tensor operations for readable and reliable code (for PyTorch, JAX, TensorFlow, and others) ☆8,500 · Updated 3 weeks ago
- VISSL is FAIR's library of extensible, modular, and scalable components for SOTA self-supervised learning with images. ☆3,257 · Updated 8 months ago
- Model interpretability and understanding for PyTorch ☆4,918 · Updated this week
- PyTorch Lightning code guideline for conferences ☆1,239 · Updated last year
- A high-performance Python-based I/O system for large (and small) deep learning problems, with strong support for PyTorch. ☆2,303 · Updated last month
- Simple transformer implementation from scratch in PyTorch. ☆1,045 · Updated 5 months ago
- Fast block-sparse matrices for PyTorch ☆545 · Updated 3 years ago
- Python 3.8+ toolbox for submitting jobs to Slurm ☆1,299 · Updated last month
- disentanglement_lib is an open-source library for research on learning disentangled representations. ☆1,384 · Updated 3 years ago
- Examples of using sparse attention, as in "Generating Long Sequences with Sparse Transformers" ☆1,524 · Updated 4 years ago
- This repository contains notebook implementations of the following Neural Process variants: Conditional Neural Processes (CNPs), Neural Processes (NPs), and Attentive Neural Processes (ANPs) ☆987 · Updated 3 years ago
- A concise but complete full-attention transformer with a set of promising experimental features from various papers ☆4,760 · Updated this week
- High-quality implementations of standard and SOTA methods on a variety of tasks. ☆1,448 · Updated 3 weeks ago
- Machine learning metrics for distributed, scalable PyTorch applications. ☆2,134 · Updated this week