OverLordGoldDragon / keras-adamw
Keras/TF implementation of AdamW, SGDW, NadamW, Warm Restarts, and Learning Rate multipliers
☆167 · Updated 3 years ago
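To illustrate the key idea behind AdamW (the repo's headline feature), here is a minimal NumPy sketch of one update step with decoupled weight decay, per Loshchilov & Hutter (arXiv:1711.05101). This is not the keras-adamw API; the function name and defaults are illustrative only.

```python
import numpy as np

def adamw_step(w, grad, m, v, t, lr=1e-3, beta1=0.9, beta2=0.999,
               eps=1e-8, weight_decay=1e-2):
    """One AdamW update: Adam moments computed on the gradient, while
    weight decay is applied directly to the weights (decoupled from the
    gradient, unlike L2 regularization folded into the loss)."""
    m = beta1 * m + (1 - beta1) * grad          # first moment
    v = beta2 * v + (1 - beta2) * grad ** 2     # second moment
    m_hat = m / (1 - beta1 ** t)                # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * (m_hat / (np.sqrt(v_hat) + eps) + weight_decay * w)
    return w, m, v

# Single step on a toy weight vector:
w = np.array([1.0, -2.0])
m = np.zeros_like(w)
v = np.zeros_like(w)
w, m, v = adamw_step(w, np.array([0.1, -0.1]), m, v, t=1)
```

The decoupling means the decay term `weight_decay * w` is not rescaled by the adaptive denominator, which is the fix the AdamW paper proposes over naive L2 in Adam.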
Alternatives and similar repositories for keras-adamw
Users interested in keras-adamw are comparing it to the libraries listed below.
- Learning rate multiplier (☆46, updated 4 years ago)
- Lookahead mechanism for optimizers in Keras (☆50, updated 4 years ago)
- AdamW optimizer for Keras (☆115, updated 5 years ago)
- Keras callback for stochastic weight averaging (☆56, updated 3 years ago)
- RAdam implemented in Keras & TensorFlow (☆325, updated 3 years ago)
- Keras implementation of AdaBound (☆130, updated 5 years ago)
- Keras implementation of Cosine Annealing Scheduler (☆44, updated 5 years ago)
- Keras implementation of AdamW from "Fixing Weight Decay Regularization in Adam" (https://arxiv.org/abs/1711.05101) (☆71, updated 6 years ago)
- Keras implementation of Attention Augmented Convolutional Neural Networks (☆121, updated 5 years ago)
- Lookahead optimizer for Keras (☆170, updated 5 years ago)
- Implementation of the One-Cycle learning rate policy, adapted from the fast.ai library (☆287, updated 4 years ago)
- Implementation of Rectified Adam in Keras (☆70, updated 5 years ago)
- Plots the change in a Keras model's loss as the learning rate increases exponentially (☆257, updated 3 weeks ago)
- Cyclic learning rate TensorFlow implementation (☆66, updated 6 years ago)
- A simpler version of the self-attention layer from SAGAN, with some image classification results (☆212, updated 5 years ago)
- AdaBound optimizer in Keras (☆56, updated 4 years ago)
- Keras implementation of "Snapshot Ensembles: Train 1, Get M for Free" (https://arxiv.org/abs/1704.00109) (☆26, updated 6 years ago)
- Binary and categorical focal loss implementations in Keras (☆278, updated 6 months ago)
- Snapshot Ensemble in Keras (☆309, updated 7 years ago)
- Metrics for Keras; deprecated since Keras 2.3.0 (☆163, updated 3 years ago)
- PyTorch implementation of the Lookahead optimizer (☆190, updated 3 years ago)
- (No description) (☆86, updated 2 years ago)
- Repo to build on / reproduce the record-breaking Ranger-Mish-SelfAttention setup on the fast.ai ImageWoof dataset in 5 epochs
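Several of the repositories above (and keras-adamw itself) implement warm restarts with cosine annealing. As a reference for what that schedule computes, here is a small, self-contained sketch of the SGDR learning-rate formula (Loshchilov & Hutter, arXiv:1608.03983); the function name and default hyperparameters are illustrative, not taken from any listed library.

```python
import math

def sgdr_lr(epoch, lr_min=1e-5, lr_max=1e-2, t0=10, t_mult=2):
    """Learning rate at `epoch` under SGDR: cosine decay from lr_max down
    to lr_min within each cycle, then a 'warm restart' back to lr_max,
    with each cycle t_mult times longer than the previous one."""
    t_i, start = t0, 0
    while epoch >= start + t_i:   # advance to the cycle containing `epoch`
        start += t_i
        t_i *= t_mult
    frac = (epoch - start) / t_i  # position within the current cycle, in [0, 1)
    return lr_min + 0.5 * (lr_max - lr_min) * (1 + math.cos(math.pi * frac))
```

A typical use is to wrap this in a Keras `LearningRateScheduler` callback; the restarts (epochs 10, 30, 70, ... with these defaults) reset the rate to `lr_max`.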