How to use the full-parameter training checkpoint for sample_t2i.py? #107
PYTHONPATH=./ sh hydit/train.sh --index-file dataset/porcelain/jsons/porcelain.json

I used the command above to run full-parameter training, and the checkpoints were saved under 005-dit_g2_full_1024p/checkpoints. How do I use one of these checkpoints with sample_t2i.py?

Comments
Same question here.
I think you can use 005-dit_g2_full_1024p/checkpoints/0010000.pt/mp_rank_00_model_states.pt
I tried it, but it doesn't quite work:
key: 'ema' |
Thanks, got it working.
Running into the same issue. What do you mean by this?
Is there some way to simply point to the checkpoint path from the command line?
Figured it out. After loading the weights from 'mp_rank_00_model_states.pt', you have to index the resulting dict with 'ema' before calling load_state_dict. Code ref
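For anyone landing here later, a minimal sketch of that loading step. This assumes `model` is the HunyuanDiT module that sample_t2i.py has already constructed before loading weights, and uses the example checkpoint path from this thread:

```python
import torch

# Example path from this thread: the DeepSpeed full-parameter checkpoint
# written by hydit/train.sh (step number will differ per run).
ckpt_path = "005-dit_g2_full_1024p/checkpoints/0010000.pt/mp_rank_00_model_states.pt"

# Load on CPU first to avoid GPU memory pressure. The file holds a dict,
# not a bare state dict.
state = torch.load(ckpt_path, map_location="cpu")

# The usable weights sit under the 'ema' key, so index it before loading.
# 'model' is assumed to be the already-built network from sample_t2i.py.
model.load_state_dict(state["ema"])
```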