GPT4ALL + MPT ---> Bad Magic error? #110

Open
gykung opened this issue May 11, 2023 · 0 comments

gykung commented May 11, 2023

I am trying to run the new MPT models from MosaicML with pygpt4all. When loading the model as shown below, I get a "bad magic" error. How do I overcome it? I've searched https://github.com/ggerganov/llama.cpp/issues and found no similar reports for the MPT models.

Code:

from pygpt4all.models.gpt4all_j import GPT4All_J

model = GPT4All_J('./models/ggml-mpt-7b-chat.bin')

Error:

runfile('C:/Data/gpt4all/gpt4all_cpu2.py', wdir='C:/Data/gpt4all')

gptj_model_load: invalid model file './models/ggml-mpt-7b-chat.bin' (bad magic)

Windows fatal exception: int divide by zero
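
For reference, the loader rejects the file before reading any tensors, so the complaint is about the leading 4-byte format "magic" that gptj_model_load checks first. A minimal sketch to inspect it directly, assuming only the same model path as above:

import struct

# Read the leading 4-byte magic that the loader validates before
# anything else. A mismatch between this value and the one the
# loader expects is what produces the "bad magic" error above.
MODEL_PATH = './models/ggml-mpt-7b-chat.bin'

with open(MODEL_PATH, 'rb') as f:
    (magic,) = struct.unpack('<I', f.read(4))

print(f'magic: 0x{magic:08x}')

This at least confirms whether the file downloaded intact. Note that gptj_model_load is the GPT-J loader, while ggml-mpt-7b-chat.bin is an MPT-architecture model, so a format mismatch here would not be surprising.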
