JingzhaoZhang / why-clipping-accelerates
A PyTorch implementation of the LSTM experiments from the paper "Why Gradient Clipping Accelerates Training: A Theoretical Justification for Adaptivity"
☆46 · Updated 5 years ago
Alternatives and similar repositories for why-clipping-accelerates
Users interested in why-clipping-accelerates are comparing it to the libraries listed below.
- Code base for SRSGD. ☆28 · Updated 5 years ago
- Reparameterize your PyTorch modules ☆71 · Updated 4 years ago
- [JMLR] TRADES + random smoothing for certifiable robustness ☆14 · Updated 4 years ago
- ☆41 · Updated 2 years ago
- This repository is no longer maintained. Check ☆81 · Updated 5 years ago
- Implementation of Methods Proposed in Preventing Gradient Attenuation in Lipschitz Constrained Convolutional Networks (NeurIPS 2019) ☆35 · Updated 5 years ago
- [NeurIPS'19] [PyTorch] Adaptive Regularization in NN ☆68 · Updated 5 years ago
- An adaptive training algorithm for residual network ☆15 · Updated 4 years ago
- Ἀνατομή is a PyTorch library to analyze representation of neural networks ☆65 · Updated last month
- ☆47 · Updated 4 years ago
- Code for Self-Tuning Networks (ICLR 2019) https://arxiv.org/abs/1903.03088