System Info / 系統信息

Managed server, account without sudo privileges. Singularity is available:

$ singularity --version
singularity-ce version 4.1.2-focal

OS version:

NAME="Ubuntu"
VERSION="20.04.6 LTS (Focal Fossa)"
ID=ubuntu
ID_LIKE=debian
PRETTY_NAME="Ubuntu 20.04.6 LTS"
VERSION_ID="20.04"
HOME_URL="https://www.ubuntu.com/"
SUPPORT_URL="https://help.ubuntu.com/"
BUG_REPORT_URL="https://bugs.launchpad.net/ubuntu/"
PRIVACY_POLICY_URL="https://www.ubuntu.com/legal/terms-and-policies/privacy-policy"
VERSION_CODENAME=focal
UBUNTU_CODENAME=focal
Running Xinference with Docker? / 是否使用 Docker 运行 Xinference?

No; Docker is unavailable without sudo, so Xinference runs in a Singularity container.

Version info / 版本信息

v1.1.0

The command used to start Xinference / 用以启动 xinference 的命令

singularity exec --fakeroot \
  --env XINFERENCE_MODEL_SRC=huggingface \
  --bind xinference/.xinference:/root/.xinference \
  --nv \
  --bind /tmp/.X11-unix:/tmp/.X11-unix \
  xinference/xinference_v1.1.0.sif \
  xinference-local -H 0.0.0.0 --log-level debug
Reproduction / 复现过程

Access via localhost:port/v1/chat/completions:

Error handling webview message: {
  "msg": {
    "messageId": "b88dd8cb-da49-4938-83d2-429313093b9c",
    "messageType": "llm/streamChat",
    "data": {
      "messages": [
        { "role": "user", "content": [ { "type": "text", "text": "hi" } ] },
        { "role": "assistant", "content": "" }
      ],
      "title": "CodeGeeX4",
      "completionOptions": {}
    }
  }
}
Error: Malformed JSON sent from server: {"error": "[address=0.0.0.0:34705, pid=3795840] ChatGLM4Tokenizer._pad() got an unexpected keyword argument 'padding_side'"}

This error message is returned for the request above.
Expected behavior / 期待表现

The request should complete normally; the llama.cpp build of the model handles the exact same query correctly.

See THUDM/GLM-4#578 for the related discussion in the ChatGLM4 repo.
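For reference, the failing call can be reproduced directly against Xinference's OpenAI-compatible endpoint. A minimal sketch, assuming a placeholder port 9997 and model name codegeex4-all-9b (substitute the values from your own deployment):

```python
# Sketch of the request that triggers the error, built with only the stdlib.
# The host, port, and model name are assumptions, not taken from the log above.
import json
from urllib import request


def build_chat_request(base_url: str, model: str, prompt: str) -> request.Request:
    """Build the POST /v1/chat/completions request sent to the server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        url=f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )


req = build_chat_request("http://localhost:9997", "codegeex4-all-9b", "hi")
# Sending it with urllib.request.urlopen(req) reproduces the
# "Malformed JSON sent from server" response shown above.
```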
@codingl2k1 Can you help with this?
The model tokenizer requires an update; related issue: https://huggingface.co/THUDM/codegeex4-all-9b/discussions/20
A fix like the one applied to LongWriter-glm4-9b is needed: https://huggingface.co/THUDM/LongWriter-glm4-9b/commit/778b5712634889f5123d6c463ca383bc6dd5c621
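To illustrate why the error occurs and what the linked fix does, here is a minimal sketch (not the actual ChatGLM code): newer transformers releases forward a `padding_side` keyword to the tokenizer's `_pad()` override, so a custom tokenizer whose `_pad()` signature predates that argument raises a TypeError.

```python
# Minimal stand-in classes illustrating the failure mode; names and
# signatures are simplified assumptions, not the real transformers API.

class BaseTokenizer:
    """Stands in for the base tokenizer class in transformers."""

    def pad(self, inputs, padding_side="left"):
        # Recent transformers versions pass padding_side through to _pad().
        return self._pad(inputs, padding_side=padding_side)

    def _pad(self, inputs, padding_side="left"):
        return inputs


class OldCustomTokenizer(BaseTokenizer):
    # Signature predates the new argument -> TypeError at call time,
    # matching "_pad() got an unexpected keyword argument 'padding_side'".
    def _pad(self, inputs):
        return inputs


class FixedCustomTokenizer(BaseTokenizer):
    # The fix: accept (and, in the real tokenizer, honor) padding_side.
    def _pad(self, inputs, padding_side="left"):
        return inputs
```

The real fix lands in the model repo's custom tokenizer file, as in the LongWriter-glm4-9b commit linked above.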
This issue is stale because it has been open for 7 days with no activity.