This repository has been archived by the owner on May 12, 2023. It is now read-only.
I am trying to run the new MPT models by MosaicML with pygpt4all. When loading the model as shown below, I get a "bad magic" error. How do I get past it? I've checked https://github.com/ggerganov/llama.cpp/issues and found no similar reports for the MPT models.
Code:
from pygpt4all.models.gpt4all_j import GPT4All_J

# Attempt to load the MPT ggml checkpoint with the GPT-J loader
model = GPT4All_J('./models/ggml-mpt-7b-chat.bin')
Error:
runfile('C:/Data/gpt4all/gpt4all_cpu2.py', wdir='C:/Data/gpt4all')
gptj_model_load: invalid model file './models/ggml-mpt-7b-chat.bin' (bad magic)
Windows fatal exception: int divide by zero
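To narrow it down, here is a minimal diagnostic sketch (stdlib only) that reads the file's leading four bytes the way ggml-style loaders do. It assumes the gptj loader is checking for the classic ggml magic 0x67676d6c ("ggml"); if the file reports a different value, that would explain the "bad magic" rejection. The model path is the same one used above.

Diagnostic code:

import struct

# Read the first four bytes of the model file and interpret them as a
# little-endian uint32, matching how ggml-based loaders read the header.
with open('./models/ggml-mpt-7b-chat.bin', 'rb') as f:
    (magic,) = struct.unpack('<I', f.read(4))

# 0x67676d6c is the classic ggml magic; any other value (e.g. a newer
# or MPT-specific file format) would trip the loader's magic check.
print(hex(magic))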