JonasGeiping / fullbatchtraining
Training vision models with full-batch gradient descent and regularization
☆37 · Updated 2 years ago
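For readers skimming the list below, here is a minimal, hypothetical PyTorch sketch of what full-batch gradient descent with explicit regularization can look like. It is not the fullbatchtraining repository's code: the model, the random data, and the hyperparameters `lr`, `weight_decay`, and `grad_penalty` are illustrative assumptions, and the gradient-norm penalty is only one common explicit stand-in for the regularization usually attributed to SGD noise.

```python
# Illustrative sketch only, not the repository's implementation.
import torch
import torch.nn as nn

def full_batch_step(model, inputs, targets,
                    lr=0.1, weight_decay=5e-4, grad_penalty=1e-3):
    """One gradient-descent step on the loss over the entire dataset,
    with weight decay and a gradient-norm penalty as explicit regularizers."""
    loss = nn.functional.cross_entropy(model(inputs), targets)

    # Gradient-norm penalty, built with create_graph=True so that
    # backpropagating through it yields the (second-order) regularizer gradient.
    grads = torch.autograd.grad(loss, list(model.parameters()), create_graph=True)
    penalty = sum(g.pow(2).sum() for g in grads)

    (loss + grad_penalty * penalty).backward()
    with torch.no_grad():
        for p in model.parameters():
            p -= lr * (p.grad + weight_decay * p)  # plain GD update + weight decay
            p.grad = None
    return loss.item()

# Toy usage: a linear classifier on random placeholder data, trained full-batch.
model = nn.Linear(32, 10)
X, y = torch.randn(512, 32), torch.randint(0, 10, (512,))
for epoch in range(5):
    print(epoch, full_batch_step(model, X, y))
```

The only structural point of the sketch is that every update uses the whole dataset at once, so any regularization effect has to be written into the objective explicitly rather than coming from mini-batch noise.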
Alternatives and similar repositories for fullbatchtraining
Users interested in fullbatchtraining are comparing it to the repositories listed below
- ☆55 · Updated 5 years ago
- Code for the paper "Understanding Generalization through Visualizations" · ☆61 · Updated 4 years ago
- Towards Understanding Sharpness-Aware Minimization [ICML 2022] · ☆35 · Updated 3 years ago
- ☆34 · Updated last year
- Weight-Averaged Sharpness-Aware Minimization (NeurIPS 2022) · ☆28 · Updated 2 years ago
- A modern look at the relationship between sharpness and generalization [ICML 2023] · ☆43 · Updated 2 years ago
- Code for the ICLR 2022 paper "Salient ImageNet: How to discover spurious features in Deep Learning?" · ☆40 · Updated 3 years ago
- Measurements of Three-Level Hierarchical Structure in the Outliers in the Spectrum of Deepnet Hessians (ICML 2019) · ☆16 · Updated 6 years ago
- A Closer Look at Accuracy vs. Robustness · ☆88 · Updated 4 years ago
- [ICML'20] Multi Steepest Descent (MSD) for robustness against the union of multiple perturbation models · ☆26 · Updated last year
- An Investigation of Why Overparameterization Exacerbates Spurious Correlations · ☆30 · Updated 5 years ago
- Computing various measures and generalization bounds on convolutional and fully connected networks · ☆35 · Updated 6 years ago
- ImageNet Testbed, associated with the paper "Measuring Robustness to Natural Distribution Shifts in Image Classification" · ☆119 · Updated 2 years ago
- ☆58 · Updated 2 years ago
- Distilling Model Failures as Directions in Latent Space · ☆47 · Updated 2 years ago
- ☆108 · Updated last year
- Simple data balancing baselines for worst-group-accuracy benchmarks · ☆42 · Updated last year
- Gradient Starvation: A Learning Proclivity in Neural Networks · ☆60 · Updated 4 years ago
- Code release for REPAIR: REnormalizing Permuted Activations for Interpolation Repair · ☆49 · Updated last year
- ☆16 · Updated 2 years ago
- ☆38 · Updated 4 years ago
- On the Loss Landscape of Adversarial Training: Identifying Challenges and How to Overcome Them [NeurIPS 2020] · ☆36 · Updated 4 years ago
- [ICML 2021] Official repo for training L_inf-dist nets with high certified accuracy · ☆42 · Updated 3 years ago
- ☆40 · Updated 2 years ago
- Winning solution of the NeurIPS 2020 Competition on Predicting Generalization in Deep Learning · ☆40 · Updated 4 years ago
- Implementation of Confidence-Calibrated Adversarial Training (CCAT) · ☆45 · Updated 5 years ago
- The Pitfalls of Simplicity Bias in Neural Networks [NeurIPS 2020] (http://arxiv.org/abs/2006.07710v2) · ☆41 · Updated last year
- ☆23 · Updated 3 years ago
- ☆34 · Updated 4 years ago
- Code for the paper "MMA Training: Direct Input Space Margin Maximization through Adversarial Training" · ☆34 · Updated 5 years ago