🔥 A lightweight PyTorch library for distributed training, like the Accelerate library

thisisiron/irontorch



Getting Started

Install Irontorch:

```shell
pip install irontorch
```

Example

You can set up the distributed environment as follows:

```python
import argparse

from irontorch import distributed as dist


def main():
    ...


parser = argparse.ArgumentParser()
parser.add_argument("--config_path", type=str, default="config/fine.yaml")
parser.add_argument("--epoch", type=int, default=10)
parser.add_argument("--batch_size", type=int, default=64)

conf = dist.setup_config(parser)
conf.distributed = conf.n_gpu > 1
dist.run(main, conf.launch_config.nproc_per_node, conf=conf)
```
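Judging from the `--config_path` argument, `dist.setup_config` presumably merges values loaded from the config file with the argparse defaults. Here is a hypothetical, self-contained sketch of that general pattern — it is not irontorch's actual implementation, and the YAML loading is replaced by a plain dict so the example runs on its own:

```python
import argparse


def setup_config(parser, file_values=None):
    """Merge config-file values over argparse defaults (illustration only)."""
    conf = parser.parse_args([])            # start from the argparse defaults
    for key, value in (file_values or {}).items():
        setattr(conf, key, value)           # file values override the defaults
    return conf


parser = argparse.ArgumentParser()
parser.add_argument("--epoch", type=int, default=10)
parser.add_argument("--batch_size", type=int, default=64)

# Pretend the config file set epoch to 20:
conf = setup_config(parser, file_values={"epoch": 20})
print(conf.epoch, conf.batch_size)  # prints: 20 64
```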

This is an example of creating a dataset sampler:

```python
import torch
import torchvision
import torchvision.transforms as transforms

transform = transforms.ToTensor()
trainset = torchvision.datasets.MNIST(root="./data", train=True, download=True, transform=transform)
sampler = dist.get_data_sampler(trainset, shuffle=True, distributed=conf.distributed)
trainloader = torch.utils.data.DataLoader(trainset, batch_size=conf.batch_size, sampler=sampler)
```

Here `conf` is the configuration object returned by `dist.setup_config` in the previous example.
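In distributed mode, such a sampler must hand each process a disjoint shard of the dataset, which is what `torch.utils.data.DistributedSampler` does under the hood. A minimal pure-Python sketch of that round-robin sharding (illustration only, not irontorch's actual code):

```python
def shard_indices(num_samples, num_replicas, rank):
    """Give each process (rank) a disjoint, interleaved slice of the indices.

    Mirrors the round-robin sharding used by torch.utils.data.DistributedSampler;
    this is an illustration, not irontorch's implementation.
    """
    indices = list(range(num_samples))
    return indices[rank::num_replicas]


# With 10 samples split across 2 processes:
print(shard_indices(10, 2, 0))  # [0, 2, 4, 6, 8]
print(shard_indices(10, 2, 1))  # [1, 3, 5, 7, 9]
```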
