
Inference #3

Open
antirez opened this issue Jan 16, 2024 · 3 comments

Comments


antirez commented Jan 16, 2024

Hi! Thank you for porting MiniGPT. This looks a lot faster than running PyTorch on Metal, which is indeed very poorly supported (the CPU is faster in my tests!). I see that generation is implemented in the model itself, but it is not used anywhere, nor is the model saved after training or tested to produce some output text. I wonder whether inference would work if I saved and reloaded the model. Thanks.
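For context, the usual save/reload-then-generate pattern being asked about looks roughly like the sketch below. This is a generic PyTorch illustration, not this repo's actual API; the tiny `nn.Linear` stands in for the real model class, and the file name is an assumption.

```python
import torch
import torch.nn as nn

# Hypothetical tiny model standing in for the trained GPT;
# the real architecture and constructor arguments differ.
model = nn.Linear(8, 8)

# After training: persist only the weights.
torch.save(model.state_dict(), "model.pt")

# Later, for inference: rebuild the same architecture, reload the
# weights, and switch to eval mode before generating.
restored = nn.Linear(8, 8)
restored.load_state_dict(torch.load("model.pt"))
restored.eval()

with torch.no_grad():
    x = torch.zeros(1, 8)
    # The reloaded model must reproduce the original's outputs.
    assert torch.equal(restored(x), model(x))
```

The key point of the question is exactly this round trip: unless the weights are written out after training and the reloaded model produces the same outputs, the built-in generation code can't be exercised.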

@vithursant (Owner)

Hey @antirez, thanks! Just a heads up: I am working on a PR to support saving/loading pre-trained models for inference. You can expect the feature sometime this week.

@vithursant (Owner)

@antirez This issue has been addressed in PR #4. Check it out and let me know, thanks!

@ivanfioravanti (Contributor)

The model is saved, but inference from it is not working at all; see issue #5.
