samiraabnar / attention_flow
☆263 · Updated 4 years ago
Alternatives and similar repositories for attention_flow
Users interested in attention_flow are comparing it to the libraries listed below.
- Concept Bottleneck Models, ICML 2020 — ☆239 · Updated 2 years ago
- ☆163 · Updated 7 months ago
- Implementation of Visual Transformer for Small-size Datasets — ☆129 · Updated 3 years ago
- Compare neural networks by their feature similarity — ☆378 · Updated 2 years ago
- Code used in "Understanding Dimensional Collapse in Contrastive Self-supervised Learning" paper — ☆78 · Updated 3 years ago
- A Domain-Agnostic Benchmark for Self-Supervised Learning — ☆106 · Updated 2 years ago
- EsViT: Efficient self-supervised Vision Transformers — ☆411 · Updated 2 years ago
- PyTorch implementation of SimCLR: supports multi-GPU training and closely reproduces results — ☆211 · Updated last year
- Pretrained SimCLRv2 models in PyTorch — ☆105 · Updated 5 years ago
- ☆65 · Updated 3 years ago
- PyTorch implementation of the paper Exploring Simple Siamese Representation Learning — ☆76 · Updated 5 years ago
- Code for the paper "Post-hoc Concept Bottleneck Models". Spotlight @ ICLR 2023 — ☆89 · Updated last year
- Meaningfully debugging model mistakes with conceptual counterfactual explanations. ICML 2022 — ☆75 · Updated 3 years ago
- Confidence-Aware Learning for Deep Neural Networks (ICML 2020) — ☆74 · Updated 5 years ago
- Self-supervised vIsion Transformer (SiT) — ☆337 · Updated 3 years ago
- Official code for ICML 2022: Mitigating Neural Network Overconfidence with Logit Normalization — ☆154 · Updated 3 years ago
- NeurIPS 2021 | Fine-Grained Neural Network Explanation by Identifying Input Features with Predictive Information — ☆34 · Updated 4 years ago
- A simple-to-use PyTorch wrapper for contrastive self-supervised learning on any neural network — ☆150 · Updated 4 years ago
- A basic implementation of Layer-wise Relevance Propagation (LRP) in PyTorch — ☆102 · Updated 3 years ago
- (ICLR 2023) Official PyTorch implementation of "What Do Self-Supervised Vision Transformers Learn?" — ☆115 · Updated last year
- ☆122 · Updated 3 years ago
- ☆111 · Updated 2 years ago
- [NeurIPS 2021] Official codes for "Efficient Training of Visual Transformers with Small Datasets".