[BUG] NotImplementedError: get_num_tokens_from_messages() is not presently implemented for model cl100k_base #4302

Open
XiaoTongDeng opened this issue Jun 24, 2024 · 0 comments
Labels
bug Something isn't working

2024-06-24 16:42:40,847 uvicorn.error 24624 ERROR Exception in ASGI application
Traceback (most recent call last):
File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 269, in __call__
await wrap(partial(self.listen_for_disconnect, receive))
File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 258, in wrap
await func()
File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 215, in listen_for_disconnect
message = await receive()
^^^^^^^^^^^^^^^
File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 524, in receive
await self.message_event.wait()
File "E:\Anaconda\envs\langchainchatchat3.0\Lib\asyncio\locks.py", line 213, in wait
await fut
asyncio.exceptions.CancelledError: Cancelled by cancel scope 20e18628510

During handling of the above exception, another exception occurred:

  • Exception Group Traceback (most recent call last):
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\uvicorn\protocols\http\h11_impl.py", line 396, in run_asgi
    | result = await app( # type: ignore[func-returns-value]
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\uvicorn\middleware\proxy_headers.py", line 70, in __call__
    | return await self.app(scope, receive, send)
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\fastapi\applications.py", line 1054, in __call__
    | await super().__call__(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\applications.py", line 123, in __call__
    | await self.middleware_stack(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\middleware\errors.py", line 186, in __call__
    | raise exc
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\middleware\errors.py", line 164, in __call__
    | await self.app(scope, receive, _send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\middleware\cors.py", line 83, in __call__
    | await self.app(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\middleware\exceptions.py", line 62, in __call__
    | await wrap_app_handling_exceptions(self.app, conn)(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\routing.py", line 758, in __call__
    | await self.middleware_stack(scope, receive, send)
    | await self.middleware_stack(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\routing.py", line 778, in app
    | await route.handle(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\routing.py", line 299, in handle
    | await self.app(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\routing.py", line 79, in app
    | await wrap_app_handling_exceptions(app, request)(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\_exception_handler.py", line 64, in wrapped_app
    | raise exc
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\_exception_handler.py", line 53, in wrapped_app
    | await app(scope, receive, sender)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\starlette\routing.py", line 77, in app
    | await response(scope, receive, send)
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 255, in __call__
    | async with anyio.create_task_group() as task_group:
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\anyio\_backends\_asyncio.py", line 680, in __aexit__
    | raise BaseExceptionGroup(
    | ExceptionGroup: unhandled errors in a TaskGroup (1 sub-exception)
    +-+---------------- 1 ----------------
    | Traceback (most recent call last):
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\model_providers\core\model_runtime\model_providers\__base\large_language_model.py", line 450, in _invoke_result_generator
    | for chunk in result:
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\model_providers\core\model_runtime\model_providers\deepseek\llm\llm.py", line 819, in _handle_chat_generate_stream_response
    | prompt_tokens = self._num_tokens_from_messages(
    | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\model_providers\core\model_runtime\model_providers\deepseek\llm\llm.py", line 1028, in _num_tokens_from_messages
    | raise NotImplementedError(
    | NotImplementedError: get_num_tokens_from_messages() is not presently implemented for model cl100k_base. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
    |
    | During handling of the above exception, another exception occurred:
    |
    | Traceback (most recent call last):
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 258, in wrap
    | await func()
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\sse_starlette\sse.py", line 245, in stream_response
    | async for data in self.body_iterator:
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\model_providers\bootstrap_web\message_convert\core.py", line 211, in _stream_openai_chat_completion
    | for chunk in response:
    | File "E:\Anaconda\envs\langchainchatchat3.0\Lib\site-packages\model_providers\core\model_runtime\model_providers\__base\large_language_model.py", line 474, in _invoke_result_generator
    | raise self._transform_invoke_error(e)
    | model_providers.core.model_runtime.errors.invoke.InvokeError: [deepseek] Error: get_num_tokens_from_messages() is not presently implemented for model cl100k_base. See https://github.com/openai/openai-python/blob/main/chatml.md for information on how messages are converted to tokens.
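Per the traceback, the deepseek provider's `_num_tokens_from_messages` (llm.py, line 1028) raises `NotImplementedError` because `cl100k_base` is not in its recognized model names, and the exception propagates up and kills the SSE stream. Until the provider implements counting for this model, one caller-side mitigation is to catch the error and substitute a rough character-based estimate. A minimal sketch — `safe_num_tokens`, `unsupported_counter`, and the ~4 chars/token ratio are hypothetical illustrations, not part of the project's API:

```python
# Hypothetical workaround sketch (not part of the Langchain-Chatchat API):
# wrap the provider's token counter so a NotImplementedError degrades to a
# rough estimate instead of aborting the response stream.

def safe_num_tokens(count_fn, messages, fallback_chars_per_token=4):
    """Return count_fn(messages), falling back to a crude estimate.

    fallback_chars_per_token is an assumed average (~4 chars/token for
    English text); this is NOT an exact tokenizer.
    """
    try:
        return count_fn(messages)
    except NotImplementedError:
        # Concatenate all message fields and estimate by character count.
        text = "".join(str(v) for m in messages for v in m.values())
        return max(1, len(text) // fallback_chars_per_token)


def unsupported_counter(messages):
    # Stands in for _num_tokens_from_messages() raising for cl100k_base.
    raise NotImplementedError(
        "get_num_tokens_from_messages() is not presently implemented "
        "for model cl100k_base."
    )


messages = [{"role": "user", "content": "hello, how are you today?"}]
print(safe_num_tokens(unsupported_counter, messages))  # crude estimate only
```

A proper fix would map the model name to a known tokenizer inside the provider itself, so that usage reporting stays accurate; the wrapper above only keeps the stream alive.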
XiaoTongDeng added the bug label on Jun 24, 2024
imClumsyPanda changed the title from "[BUG] Concise description of the issue" to "[BUG] NotImplementedError: get_num_tokens_from_messages() is not presently implemented for model cl100k_base" on Jun 27, 2024