wrmthorne / cycleformers
A Python library for efficient and flexible cycle-consistency training of transformer models via iterative back-translation. Memory- and compute-efficient techniques such as PEFT adapter switching allow models up to 7.5x larger to be trained on the same hardware.
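The iterative back-translation loop the description refers to can be sketched as follows. Everything below is an illustrative assumption: `toy_translate`, `cycle_round`, and the dict-based toy "models" are hypothetical stand-ins for a real generate/train step, not cycleformers' actual API.

```python
def toy_translate(model, text):
    # Stand-in for a real model.generate() call: apply a word-level mapping.
    return " ".join(model.get(word, word) for word in text.split())

def cycle_round(model_ab, model_ba, mono_a):
    """One half-cycle of back-translation over monolingual data in language A:
    1. translate A -> B with the (frozen) A->B model,
    2. pair the synthetic B output with the original A text, yielding
       (source, target) examples to train the B->A model on.
    Returning the pairs keeps the sketch framework-agnostic."""
    pairs = []
    for src in mono_a:
        synthetic_b = toy_translate(model_ab, src)  # forward translation
        pairs.append((synthetic_b, src))            # target is the original
    return pairs

# Toy "models": word-level en<->de mappings standing in for two
# transformer directions (or two PEFT adapters on one base model).
en_de = {"hello": "hallo", "world": "welt"}
de_en = {v: k for k, v in en_de.items()}

training_pairs = cycle_round(en_de, de_en, ["hello world"])
print(training_pairs)  # [('hallo welt', 'hello world')]
```

In a real setup, the two directions can share one base model with two PEFT adapters that are swapped between the generate and train steps, which is where the memory savings come from.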
11 stars · Jan 13, 2025 · Updated last year

Alternatives and similar repositories for cycleformers

Users that are interested in cycleformers are comparing it to the libraries listed below
