Implementation of Compressed SGD with Compressed Gradients in PyTorch


kirillacharya/Compressed_SGD_PyTorch

 
 


Code guidelines

This implementation is based on PyTorch (1.5.0) in Python (3.8).

It enables running simulated distributed optimization, with a master node and any number of workers, built on the PyTorch SGD optimizer with gradient compression. Communication can be compressed at both the worker and the master level, and error feedback is also supported.
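
As a concrete illustration, here is a minimal, self-contained sketch of the mechanism. The names top_k and ef_sgd_step are hypothetical and do not reflect this repo's actual classes or interfaces; it uses top-k sparsification as the compression operator, plus error feedback so that whatever the compressor drops is re-injected into the next step.

import torch

def top_k(tensor, k):
    # Hypothetical compressor: keep the k largest-magnitude entries, zero the rest.
    flat = tensor.flatten()
    if k >= flat.numel():
        return tensor.clone()
    _, idx = torch.topk(flat.abs(), k)
    out = torch.zeros_like(flat)
    out[idx] = flat[idx]
    return out.view_as(tensor)

def ef_sgd_step(param, grad, error, lr, k):
    # Error-feedback SGD step for one parameter tensor:
    #   p_t     = lr * g_t + e_t   (re-add what compression dropped earlier)
    #   delta_t = C(p_t)           (only delta_t would be communicated)
    #   e_{t+1} = p_t - delta_t    (residual kept locally)
    #   x_{t+1} = x_t - delta_t
    corrected = lr * grad + error
    delta = top_k(corrected, k)
    error.copy_(corrected - delta)
    param.sub_(delta)

# Toy run on f(x) = 0.5 * ||x||^2, whose gradient is x itself.
x = torch.randn(100)
err = torch.zeros_like(x)
for _ in range(500):
    ef_sgd_step(x, x.clone(), err, lr=0.1, k=10)
print(x.norm())  # shrinks toward 0 despite heavy compression

Without the error accumulator, the coordinates that top-k never selects would simply stop being updated; error feedback is what keeps the compressed method converging.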

This is a fork of https://github.com/SamuelHorvath/Compressed_SGD_PyTorch

Installation

To install the requirements, run:

$ pip install -r requirements.txt

Example Notebook

To run our code, see the example notebook.

Theory

If you are interested in the theoretical results, see the Keynote files in the theory folder.
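
For reference, the single-worker error-feedback update typically analyzed in this setting (the slides may state it differently) reads, for learning rate \gamma, compressor \mathcal{C}, and error accumulator e_t:

p_t = \gamma g_t + e_t, \qquad \Delta_t = \mathcal{C}(p_t), \qquad x_{t+1} = x_t - \Delta_t, \qquad e_{t+1} = p_t - \Delta_t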

License

License: MIT
