unlearning-challenge / starting-kit
Starting kit for the NeurIPS 2023 unlearning challenge
☆380 · Updated last year
Alternatives and similar repositories for starting-kit:
Users interested in starting-kit are comparing it to the libraries listed below.
- Awesome Machine Unlearning (A Survey of Machine Unlearning)☆811 · Updated last month
- Existing Literature about Machine Unlearning☆840 · Updated last week
- PyTorch code corresponding to my blog series on adversarial examples and (confidence-calibrated) adversarial training.☆68 · Updated last year
- A fast, effective data attribution method for neural networks in PyTorch☆203 · Updated 4 months ago
- Lab assignments for Introduction to Data-Centric AI, MIT IAP 2024 👩🏽💻☆449 · Updated last month
- A curated list of papers of interesting empirical study and insight on deep learning. Continually updating...☆317 · Updated 2 weeks ago
- A curated list of trustworthy deep learning papers. Daily updating...☆362 · Updated 2 weeks ago
- Code release for "Git Re-Basin: Merging Models modulo Permutation Symmetries"☆476 · Updated 2 years ago
- The official implementation of “Sophia: A Scalable Stochastic Second-order Optimizer for Language Model Pre-training”☆955 · Updated last year
- 💡 Adversarial attacks on explanations and how to defend them☆314 · Updated 4 months ago
- ☆285 · Updated last month
- ☆192 · Updated 5 months ago
- Code release for "Dropout Reduces Underfitting"☆312 · Updated last year
- ☆67 · Updated last year
- Editing Models with Task Arithmetic☆460 · Updated last year
- Exact method for visualizing partitions of a Deep Neural Network, CVPR 2023 Highlight☆100 · Updated last month
- Algorithms for Privacy-Preserving Machine Learning in JAX☆93 · Updated 9 months ago
- ViT Prisma is a mechanistic interpretability library for Vision Transformers (ViTs).☆216 · Updated 3 weeks ago
- Starter kit and data loading code for the Trojan Detection Challenge NeurIPS 2022 competition☆33 · Updated last year
- MLCommons Algorithmic Efficiency is a benchmark and competition measuring neural network training speedups due to algorithmic improvement…☆372 · Updated this week
- Zennit is a high-level framework in Python using PyTorch for explaining/exploring neural networks using attribution methods like LRP.☆217 · Updated 8 months ago
- OpenXAI: Towards a Transparent Evaluation of Model Explanations☆242 · Updated 7 months ago
- Probing the representations of Vision Transformers.☆321 · Updated 2 years ago
- ☆181 · Updated last year
- Tools for understanding how transformer predictions are built layer-by-layer☆481 · Updated 10 months ago
- Differentially-private transformers using HuggingFace and Opacus☆133 · Updated 7 months ago
- A codebase that makes differentially private training of transformers easy.☆171 · Updated 2 years ago
- https://slds-lmu.github.io/seminar_multimodal_dl/☆169 · Updated 2 years ago
- Effortless plug-and-play optimizer to cut model training costs by 50%. A new optimizer that is 2x faster than Adam on LLMs.☆381 · Updated 9 months ago
- The official PyTorch implementation - Can Neural Nets Learn the Same Model Twice? Investigating Reproducibility and Double Descent from t…☆78 · Updated 2 years ago