🚀 Feature Request: Support splitting model weights and training states into separate checkpoint files
Feature Request
Currently, PyTorch Lightning saves the entire training state (model weights, optimizer states, scheduler states, trainer state, etc.) into a single `.ckpt` file.
I would like an option to save the model weights (and config) separately from the training states when writing checkpoints.
For example, the desired checkpoint structure could look like this:
```
checkpoints/
  pretrained_model/
    config.json          # model configuration
    model.safetensors    # model weights only
  training_states.pth    # optimizer, LR scheduler, trainer states
```
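To illustrate the requested split, here is a minimal sketch of the partitioning logic. The key names (`state_dict`, `hyper_parameters`, `optimizer_states`, `lr_schedulers`, etc.) follow the layout of a standard Lightning checkpoint dict, but the `split_checkpoint` helper itself is hypothetical and not part of Lightning's API:

```python
# Hypothetical helper: split a Lightning-style checkpoint dict into a
# model part (weights + config) and a training-state part. In practice
# the dict would come from torch.load("last.ckpt"); here we use plain
# Python values so the sketch is self-contained.

MODEL_KEYS = {"state_dict", "hyper_parameters"}

def split_checkpoint(ckpt: dict) -> tuple[dict, dict]:
    """Return (model_part, training_part) for a checkpoint dict."""
    model_part = {k: v for k, v in ckpt.items() if k in MODEL_KEYS}
    training_part = {k: v for k, v in ckpt.items() if k not in MODEL_KEYS}
    return model_part, training_part

# Minimal fake checkpoint for demonstration:
ckpt = {
    "state_dict": {"layer.weight": [0.1, 0.2]},   # model weights
    "hyper_parameters": {"lr": 1e-3},             # model config
    "optimizer_states": [{"step": 100}],
    "lr_schedulers": [{"last_epoch": 1}],
    "epoch": 1,
    "global_step": 100,
}
model_part, training_part = split_checkpoint(ckpt)
# model_part would then be written to model.safetensors / config.json,
# and training_part to training_states.pth.
```

With this split, inference-only users could download just the `pretrained_model/` directory, while `training_states.pth` would only be needed to resume training.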