OATML / non-parametric-transformers
Code for "Self-Attention Between Datapoints: Going Beyond Individual Input-Output Pairs in Deep Learning"
☆415 · Updated last year
Alternatives and similar repositories for non-parametric-transformers
Users interested in non-parametric-transformers are comparing it to the libraries listed below.
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks ☆480 · Updated 3 years ago
- ☆470 · Updated 2 months ago
- ☆376 · Updated last year
- ☆312 · Updated 4 months ago
- A repository for explaining feature attributions and feature interactions in deep neural networks. ☆187 · Updated 3 years ago
- Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions ☆258 · Updated last year
- Fast Differentiable Sorting and Ranking ☆606 · Updated last year
- Fast, differentiable sorting and ranking in PyTorch ☆817 · Updated last month
- My implementation of DeepMind's Perceiver ☆63 · Updated 4 years ago
- An alternative to convolution in neural networks ☆256 · Updated last year
- This library would form a permanent home for reusable components for deep probabilistic programming. The library would form and harness a… ☆306 · Updated 2 weeks ago
- Betty: an automatic differentiation library for generalized meta-learning and multilevel optimization ☆338 · Updated last year
- BackPACK - a backpropagation package built on top of PyTorch which efficiently computes quantities other than the gradient. ☆588 · Updated 6 months ago
- ☆240 · Updated 2 years ago
- Enabling easy statistical significance testing for deep neural networks. ☆335 · Updated last year
- Hopular: Modern Hopfield Networks for Tabular Data ☆311 · Updated 3 years ago
- A library to inspect and extract intermediate layers of PyTorch models. ☆473 · Updated 3 years ago
- Code for Parameter Prediction for Unseen Deep Architectures (NeurIPS 2021) ☆491 · Updated 2 years ago
- Lightweight Hyperparameter Optimization 🚂 ☆147 · Updated 10 months ago
- Deep Learning project template best practices with PyTorch Lightning, Hydra, Tensorboard. ☆159 · Updated 4 years ago
- This repository contains the results for the paper "Descending through a Crowded Valley - Benchmarking Deep Learning Optimizers" ☆180 · Updated 3 years ago
- A Machine Learning workflow for Slurm. ☆149 · Updated 4 years ago
- The entmax mapping and its loss, a family of sparse softmax alternatives. ☆441 · Updated last year
- A library containing implementations of machine learning components in hyperbolic space ☆138 · Updated last year
- Official codebase for Pretrained Transformers as Universal Computation Engines. ☆249 · Updated 3 years ago
- ☆100 · Updated 3 years ago
- Implementation of Estimating Training Data Influence by Tracing Gradient Descent (NeurIPS 2020) ☆233 · Updated 3 years ago
- Laplace approximations for Deep Learning. ☆514 · Updated 2 months ago
- Drift Detection for your PyTorch Models ☆317 · Updated 2 years ago
- The official PyTorch implementation of the recent paper - SAINT: Improved Neural Networks for Tabular Data via Row Attention and Contrastive … ☆438 · Updated 3 years ago