NeurIPS 2019
Sun Dec 8th through Sat Dec 14th, 2019, at the Vancouver Convention Center
Paper ID: 1862
Title: Communication-Efficient Distributed Learning via Lazily Aggregated Quantized Gradients

The paper adds quantization to lazily aggregated gradient methods. The contributions were uniformly liked by the reviewers, and the feedback round was productive. We hope the authors will incorporate the detailed reviewer comments in the camera-ready version, and in particular will discuss relations to a handful of recent papers on error compensation/feedback. In the latter, quantization quality is known to be required only in expectation, not at every step, which already allows some communication steps to be skipped.
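
To make the last point concrete, below is a minimal sketch of error feedback combined with communication skipping, assuming a uniform scalar quantizer and a norm-based skipping rule; the names quantize, ErrorFeedbackWorker, and skip_threshold are illustrative, not taken from the paper or the error-feedback literature.

import numpy as np

def quantize(v, levels=16):
    # Uniform scalar quantizer; an illustrative choice, not the paper's scheme.
    scale = np.max(np.abs(v)) + 1e-12
    half = levels // 2
    return np.round(v / scale * half) / half * scale

class ErrorFeedbackWorker:
    # Carries the quantization residual forward, so the compressor only
    # needs to be accurate in expectation; when the corrected update is
    # small, the whole communication round is skipped.
    def __init__(self, dim, skip_threshold=1e-3):
        self.residual = np.zeros(dim)         # accumulated compression error
        self.skip_threshold = skip_threshold  # hypothetical skipping rule

    def step(self, grad):
        corrected = grad + self.residual
        if np.linalg.norm(corrected) < self.skip_threshold:
            self.residual = corrected    # skip this round; nothing is sent
            return None
        msg = quantize(corrected)
        self.residual = corrected - msg  # keep what the quantizer lost
        return msg                       # sent to the server

# Example: worker = ErrorFeedbackWorker(dim=10); msg = worker.step(np.random.randn(10))

Because the residual is never discarded, the compression error is corrected in later rounds rather than in every round, which is what permits skipped transmissions.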