Conversation

zhangfanTJU
Contributor

fix loader when ckpt is fp16

@xiezipeng-ML
Contributor

Maybe it should be changed to load fp16 directly when loading the weights.

@zhangfanTJU
Contributor Author

Yes, you're quite right. However, when the model is constructed, the parameters default to fp32 (e.g., LayerNorm). So the simplest approach for now may be to convert the model to fp16 first and then load the checkpoint.
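
For context, here's a minimal PyTorch-style sketch of the workaround described above; `build_model` and the checkpoint path are hypothetical, and the project's actual loader may differ:

```python
import torch

# Hypothetical model builder; parameters are created in fp32 by default
# (e.g., nn.LayerNorm weights and biases).
model = build_model()

# Checkpoint whose tensors were saved in fp16.
state_dict = torch.load("ckpt_fp16.pt", map_location="cpu")

# Workaround: cast the model to fp16 first so parameter dtypes match
# the checkpoint, then load the fp16 weights directly.
model.half()
model.load_state_dict(state_dict)
```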
