ParCIS / Ok-Topk

Ok-Topk is a scheme for distributed training with sparse gradients. It integrates a novel sparse allreduce algorithm, whose communication volume is less than 6k and asymptotically optimal, with the decentralized parallel Stochastic Gradient Descent (SGD) optimizer; its convergence is proven theoretically and verified empirically.
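To make the idea concrete, below is a minimal sketch of the general top-k gradient sparsification pattern that schemes like Ok-Topk build on. This is illustrative only and is not the repository's implementation or API: it selects each worker's k largest-magnitude gradient entries and combines them with a plain dense allreduce, whereas Ok-Topk's contribution is replacing that dense exchange with a sparse allreduce whose communication volume stays below 6k. The function names and the PyTorch/torch.distributed setup are assumptions for the example.

```python
# Illustrative sketch only: generic top-k gradient sparsification with a
# dense-allreduce fallback, NOT the Ok-Topk sparse allreduce itself.
# Assumes torch.distributed has already been initialized elsewhere.
import torch
import torch.distributed as dist

def topk_sparsify(grad: torch.Tensor, k: int):
    """Keep only the k largest-magnitude entries of a flattened gradient."""
    flat = grad.flatten()
    _, idx = torch.topk(flat.abs(), k)
    return idx, flat[idx]

def sparsified_allreduce(grad: torch.Tensor, k: int) -> torch.Tensor:
    """Naive baseline: scatter the local top-k values into a dense buffer and
    allreduce it. Ok-Topk avoids this dense exchange, communicating O(k)
    (below 6k) values per worker instead of the full gradient."""
    idx, values = topk_sparsify(grad, k)
    buf = torch.zeros_like(grad).flatten()
    buf[idx] = values
    dist.all_reduce(buf, op=dist.ReduceOp.SUM)  # dense exchange, for clarity
    buf /= dist.get_world_size()                # average across workers
    return buf.view_as(grad)
```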
