Mish Activation Function for PyTorch
☆148 · Jan 4, 2021 · Updated 5 years ago
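Since this page indexes repositories around the Mish activation, a quick reference may help: Mish is defined as Mish(x) = x · tanh(softplus(x)). Below is a minimal pure-PyTorch sketch of that formula; it is not the fused CUDA kernel that mish-cuda provides, only the reference definition.

```python
import torch
import torch.nn.functional as F

def mish(x: torch.Tensor) -> torch.Tensor:
    # Reference definition: Mish(x) = x * tanh(softplus(x)).
    # A plain-PyTorch sketch, not the fused mish-cuda kernel.
    return x * torch.tanh(F.softplus(x))

x = torch.tensor([-1.0, 0.0, 1.0])
print(mish(x))
```

Recent PyTorch versions (1.9+) also ship a built-in `torch.nn.Mish` / `torch.nn.functional.mish`, which removes the need for a custom kernel in many cases.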
Alternatives and similar repositories for mish-cuda
Users interested in mish-cuda are comparing it to the libraries listed below.
- Official Repository for "Mish: A Self Regularized Non-Monotonic Neural Activation Function" [BMVC 2020] ☆1,302 · Updated this week
- Repo to build on / reproduce the record-breaking Ranger-Mish-SelfAttention setup on the FastAI ImageWoof dataset in 5 epochs ☆116 · Nov 16, 2019 · Updated 6 years ago
- Practical Deep Learning for Time Series / Sequential Data using fastai2 / PyTorch ☆12 · Nov 12, 2020 · Updated 5 years ago
- PyTorch implementation of YOLOv4 ☆1,913 · Nov 3, 2024 · Updated last year
- Mish Deep Learning Activation Function for PyTorch / FastAI ☆160 · Mar 26, 2020 · Updated 5 years ago
- Ranger - a synergistic optimizer combining RAdam (Rectified Adam), Gradient Centralization, and LookAhead in one codebase ☆1,207 · Dec 22, 2023 · Updated 2 years ago
- Paper reading list ☆15 · Oct 5, 2020 · Updated 5 years ago
- Over9000 optimizer ☆424 · Nov 22, 2022 · Updated 3 years ago
- Unofficial PyTorch Implementation of EvoNorm ☆123 · Aug 29, 2021 · Updated 4 years ago
- diffGrad: An Optimization Method for Convolutional Neural Networks ☆54 · Oct 12, 2022 · Updated 3 years ago
- Implementation of the OpenAI paper on Simple Noise Scale in fastai ☆49 · Nov 3, 2019 · Updated 6 years ago
- "Learning Rate Dropout" in PyTorch