Lance0218 / Pytorch-DistributedDataParallel-Training-Tricks

A guide that integrates PyTorch DistributedDataParallel (DDP), Apex mixed-precision training, learning-rate warmup, and a learning-rate scheduler, and also covers setting up early stopping and a fixed random seed.
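A minimal sketch of three of the tricks mentioned above: fixing the random seed, linear learning-rate warmup via `LambdaLR`, and a simple early-stopping helper. The names `seed_everything` and `EarlyStopping` are illustrative assumptions, not code from this repository, and the DDP model wrapping itself is omitted since it requires a multi-process launch.

```python
import random
import numpy as np
import torch
from torch.optim.lr_scheduler import LambdaLR

def seed_everything(seed: int) -> None:
    # Fix all RNG sources so runs are reproducible (illustrative helper).
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)
    torch.cuda.manual_seed_all(seed)

class EarlyStopping:
    # Stop when the monitored loss has not improved for `patience` checks
    # (illustrative helper, not the repository's implementation).
    def __init__(self, patience: int = 2, min_delta: float = 0.0):
        self.patience = patience
        self.min_delta = min_delta
        self.best = float("inf")
        self.counter = 0
        self.should_stop = False

    def step(self, loss: float) -> bool:
        if loss < self.best - self.min_delta:
            self.best = loss
            self.counter = 0
        else:
            self.counter += 1
            if self.counter >= self.patience:
                self.should_stop = True
        return self.should_stop

seed_everything(42)
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
# Linear warmup over the first 5 steps, then hold the base LR.
scheduler = LambdaLR(optimizer, lr_lambda=lambda step: min(1.0, (step + 1) / 5))

lrs = []
for _ in range(7):
    optimizer.step()        # normally preceded by forward/backward
    scheduler.step()
    lrs.append(optimizer.param_groups[0]["lr"])

stopper = EarlyStopping(patience=2)
stop_step = None
for i, val_loss in enumerate([1.0, 0.9, 0.95, 0.96, 0.97]):
    if stopper.step(val_loss):
        stop_step = i
        break
```

With a base LR of 0.1 and 5 warmup steps, the LR climbs 0.04 → 0.06 → 0.08 → 0.1 and then stays flat; the early stopper triggers after two consecutive non-improving validation losses.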
