(ollama) ╭─hougelangley at Arch-Legion in ~/ollama on main✘✘✘ 24-04-17 - 22:40:01
╰─(ollama) ⠠⠵ python llm/llama.cpp/convert-llama-ggml-to-gguf.py -i ming/ggml-adapter-model.bin -o ming.bin
* Using config: Namespace(input=PosixPath('ming/ggml-adapter-model.bin'), output=PosixPath('ming.bin'), name=None, desc=None, gqa=1, eps='5.0e-06', context_length=2048, model_metadata_dir=None, vocab_dir=None, vocabtype='spm,hfft')
=== WARNING === Be aware that this conversion script is best-effort. Use a native GGUF model if possible. === WARNING ===
- Note: If converting LLaMA2, specifying "--eps 1e-5" is required. 70B models also need "--gqa 8".
* Scanning GGML input file
Traceback (most recent call last):
  File "/home/hougelangley/ollama/llm/llama.cpp/convert-llama-ggml-to-gguf.py", line 441, in <module>
    main()
  File "/home/hougelangley/ollama/llm/llama.cpp/convert-llama-ggml-to-gguf.py", line 415, in main
    offset = model.load(data, 0) # noqa
             ^^^^^^^^^^^^^^^^^^^
  File "/home/hougelangley/ollama/llm/llama.cpp/convert-llama-ggml-to-gguf.py", line 175, in load
    offset += self.validate_header(data, offset)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/hougelangley/ollama/llm/llama.cpp/convert-llama-ggml-to-gguf.py", line 160, in validate_header
    raise ValueError(f"Unexpected file magic {magic!r}! This doesn't look like a GGML format file.")
ValueError: Unexpected file magic b'algg'! This doesn't look like a GGML format file.
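For context on the failure: `b'algg'` is `b'ggla'` written byte-reversed, which, as far as I can tell from llama.cpp's convert-lora-to-ggml.py, is the magic used for GGML LoRA adapter files, whereas convert-llama-ggml-to-gguf.py only accepts full-model containers. A minimal sketch for checking what a file actually is; the magic values reflect my reading of the llama.cpp converter scripts and may drift between revisions:

```python
# Minimal sketch: identify a GGML-family file by its first four bytes.
# The magic -> description mapping below is my own summary of llama.cpp's
# converter scripts, an assumption rather than authoritative documentation.
KNOWN_MAGICS = {
    b"lmgg": "GGML full model (accepted by convert-llama-ggml-to-gguf.py)",
    b"fmgg": "GGMF full model (accepted by convert-llama-ggml-to-gguf.py)",
    b"tjgg": "GGJT full model (accepted by convert-llama-ggml-to-gguf.py)",
    b"GGUF": "already GGUF, no conversion needed",
    b"algg": "GGML LoRA adapter ('ggla' reversed) -- not a full model",
}

def classify(path: str) -> str:
    with open(path, "rb") as f:
        magic = f.read(4)
    return f"{magic!r}: {KNOWN_MAGICS.get(magic, 'unknown format')}"

print(classify("ming/ggml-adapter-model.bin"))  # b'algg': GGML LoRA adapter ...
```

In other words, the converter rejects the file because it is an adapter rather than a full model, not because the file is corrupt.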
I downloaded the model from BlueZeros/MING-MOE-14B:

git clone https://huggingface.co/BlueZeros/MING-MOE-14B ming

The subsequent step of converting the LoRA to GGML succeeded. The next step is to convert the resulting ggml-adapter-model.bin to GGUF, which fails with the error shown above.
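If it helps to confirm what the LoRA-to-GGML step actually produced, the adapter file starts with a small fixed header. Here is a sketch that reads it back; the field layout (4-byte magic, then three little-endian int32 fields: file version, LoRA r, LoRA alpha) is my reading of llama.cpp's convert-lora-to-ggml.py and may change across revisions:

```python
# Sketch: read the header that convert-lora-to-ggml.py writes at the start of
# ggml-adapter-model.bin. Layout is an assumption based on my reading of that
# script, not documented behaviour.
import struct

with open("ming/ggml-adapter-model.bin", "rb") as f:
    magic = f.read(4)  # expected: b'algg', i.e. 'ggla' reversed
    version, lora_r, lora_alpha = struct.unpack("<3i", f.read(12))

print(f"magic={magic!r} version={version} r={lora_r} alpha={lora_alpha}")
```

If the magic reads back as b'algg', the file really is a LoRA adapter, and the GGML-to-GGUF full-model converter will refuse it by design; the adapter needs an adapter-aware path rather than the full-model conversion.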