Kaleidophon / deep-significance
Enabling easy statistical significance testing for deep neural networks.
☆337 · Updated last year
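Below is a minimal usage sketch of the kind of comparison deep-significance enables. It assumes the package exposes an `aso` function (the Almost Stochastic Order test) importable as `from deepsig import aso`; the exact names, signature, and the example scores here are assumptions, so check the repository's README for the authoritative API.

```python
# Hedged sketch: comparing two models' scores with deep-significance,
# assuming `aso` (Almost Stochastic Order) is importable from `deepsig`.
import numpy as np
from deepsig import aso  # assumption: main entry point of deep-significance

# Hypothetical per-seed accuracies for two models (5 random seeds each).
scores_a = np.array([0.82, 0.84, 0.81, 0.85, 0.83])
scores_b = np.array([0.80, 0.79, 0.81, 0.78, 0.80])

# epsilon_min close to 0 suggests model A is (almost) stochastically
# dominant over model B; values near 1 suggest the opposite.
min_eps = aso(scores_a, scores_b, seed=1234)
print(f"epsilon_min = {min_eps:.3f}")
```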
Alternatives and similar repositories for deep-significance
Users interested in deep-significance are comparing it to the libraries listed below.
- Robustness Gym is an evaluation toolkit for machine learning. ☆442 · Updated 3 years ago
- A repository for explaining feature attributions and feature interactions in deep neural networks. ☆188 · Updated 3 years ago
- Weakly Supervised End-to-End Learning (NeurIPS 2021) ☆156 · Updated 2 years ago
- A python package for benchmarking interpretability techniques on Transformers. ☆213 · Updated last year
- ☆139 · Updated last year
- Self-training with Weak Supervision (NAACL 2021) ☆161 · Updated 2 years ago
- Course webpage for COMP 790, (Deep) Learning from Limited Labeled Data ☆304 · Updated 5 years ago
- Knowledge-Aware machine LEarning (KALE): accessible machine learning from multiple sources for interdisciplinary research, part of the 🔥… ☆476 · Updated this week
- Cockpit: A Practical Debugging Tool for Training Deep Neural Networks ☆482 · Updated 3 years ago
- ☆471 · Updated 5 months ago
- XAI Tutorial for the Explainable AI track in the ALPS winter school 2021 ☆58 · Updated 4 years ago
- Official repository for CMU Machine Learning Department's 10721: "Philosophical Foundations of Machine Intelligence". ☆262 · Updated 2 years ago
- Check if you have training samples in your test set ☆64 · Updated 3 years ago
- A collection of code snippets for my PyTorch Lightning projects ☆107 · Updated 4 years ago
- Deep Learning project template best practices with Pytorch Lightning, Hydra, Tensorboard. ☆158 · Updated 4 years ago
- Create interactive textual heat maps for Jupyter notebooks ☆196 · Updated last year
- Dataset Cartography: Mapping and Diagnosing Datasets with Training Dynamics ☆205 · Updated 3 years ago
- A curated list of programmatic weak supervision papers and resources ☆190 · Updated 2 years ago
- diagNNose is a Python library that provides a broad set of tools for analysing hidden activations of neural models. ☆82 · Updated last year
- Repository containing code for "How to Train BERT with an Academic Budget" paper ☆314 · Updated 2 years ago
- HetSeq: Distributed GPU Training on Heterogeneous Infrastructure ☆106 · Updated 2 years ago
- Pretrain and finetune ELECTRA with fastai and huggingface. (Results of the paper replicated!) ☆330 · Updated last year
- MinT: Minimal Transformer Library and Tutorials ☆258 · Updated 3 years ago
- Implicit MLE: Backpropagating Through Discrete Exponential Family Distributions ☆258 · Updated last year
- http://nlp.seas.harvard.edu/2018/04/03/attention.html ☆62 · Updated 4 years ago
- DRIFT is a tool for Diachronic Analysis of Scientific Literature. ☆115 · Updated 2 years ago
- Reduce end to end training time from days to hours (or hours to minutes), and energy requirements/costs by an order of magnitude using co… ☆342 · Updated 2 years ago
- Flexible components pairing 🤗 Transformers with Pytorch Lightning ☆612 · Updated 2 years ago
- A simple and working implementation of Electra, the fastest way to pretrain language models from scratch, in Pytorch ☆231 · Updated 2 years ago
- SummVis is an interactive visualization tool for text summarization. ☆253 · Updated 3 years ago