
What is: Gradient Sparsification?

Source: Gradient Sparsification for Communication-Efficient Distributed Optimization
Year: 2018
Data Source: CC BY-SA - https://paperswithcode.com

Gradient Sparsification is a technique for distributed training that sparsifies stochastic gradients to reduce communication cost, at the price of a minor increase in the number of iterations. The key idea is to drop some coordinates of the stochastic gradient and appropriately amplify the remaining coordinates so that the sparsified stochastic gradient remains unbiased. This approach can significantly reduce the coding length of the stochastic gradient while only slightly increasing its variance.
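A minimal sketch of the drop-and-amplify idea in NumPy (the uniform keep probability `p` is a simplifying assumption; the paper derives per-coordinate probabilities by solving an optimization problem): each coordinate is kept independently with probability `p` and scaled by `1/p`, which makes the sparsified gradient an unbiased estimator of the original one.

```python
import numpy as np

def sparsify(grad, p, rng=None):
    """Unbiased gradient sparsification (illustrative sketch).

    Each coordinate is kept independently with probability `p` and
    amplified by 1/p, so the expectation of the result equals `grad`.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(grad.shape) < p  # which coordinates to keep
    return np.where(mask, grad / p, 0.0)

# Unbiasedness check: averaging many sparsified samples recovers the gradient.
g = np.array([0.5, -1.2, 3.0, 0.0, 0.7])
est = np.mean(
    [sparsify(g, p=0.3, rng=np.random.default_rng(s)) for s in range(20000)],
    axis=0,
)
```

Amplifying by `1/p` is what trades communication for variance: smaller `p` means fewer transmitted coordinates but a larger scaling factor, and hence a noisier (higher-variance) gradient estimate.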