Figure 1: the error message. Figure 2: the location of the error, in the official WhisperEncoding source code. trt-llm version: 0.14.0.dev2024091700. I used the nvcr.io/nvidia/tritonserver:24.07-py3 image directly.
@yuekaizhang Could you help take a look?
@tianchengcheng-cn Please use the trt-llm version pinned at https://github.com/k2-fsa/sherpa/blob/master/triton/whisper/Dockerfile.server#L6, or simply use docker-compose.
For the latest trt-llm, you can refer to https://github.com/NVIDIA/TensorRT-LLM/blob/main/examples/whisper/run.py and adapt the code yourself, or wait for my upcoming update.
OK, thanks.