ChristophReich1996 / SmeLU
PyTorch reimplementation of the Smooth ReLU activation function proposed in the paper "Real World Large Scale Recommendation Systems Reproducibility and Smooth Activations" [arXiv 2022].
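For orientation, the SmeLU activation described in the paper is a simple piecewise function: zero below `-beta`, a quadratic blend on `[-beta, beta]`, and the identity above `beta`. The sketch below is a minimal functional version for illustration, not the repository's actual implementation (which ships as a module with its own API):

```python
import torch

def smelu(x: torch.Tensor, beta: float = 1.0) -> torch.Tensor:
    """Smooth ReLU (SmeLU) sketch.

    Piecewise definition from the paper:
        0                        for x <= -beta
        (x + beta)^2 / (4*beta)  for -beta < x < beta
        x                        for x >= beta
    """
    return torch.where(
        x >= beta,
        x,
        torch.where(
            x <= -beta,
            torch.zeros_like(x),
            (x + beta) ** 2 / (4.0 * beta),
        ),
    )
```

The quadratic middle segment makes the function continuously differentiable at both joints, which is the reproducibility-friendly property the paper argues for over the non-smooth ReLU.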
☆22, updated 3 years ago
Alternatives and similar repositories for SmeLU
Users interested in SmeLU compare it with the repositories listed below:
- DiWA: Diverse Weight Averaging for Out-of-Distribution Generalization (☆31, updated 2 years ago)
- Recycling diverse models (☆44, updated 2 years ago)
- ☆37, updated 3 years ago
- A regularized self-labeling approach to improve the generalization and robustness of fine-tuned models (☆28, updated 2 years ago)
- PyTorch implementation of FNet: Mixing Tokens with Fourier transforms