Works well in Jupyternaut chat, but fails in inline completion #1238
Comments
JupyterLab 4.2.4 is a bit outdated btw.
@songqiangchina It's possible that this is not an issue with Jupyter AI. This could be the OpenAI servers rate-limiting/throttling requests issued by the inline completer. Are you able to see any code completions before the error surfaces?
I can't see any code completions in the JupyterLab notebook. I also use VS Code Continue, and the same local LLM works well for code completions there. Is Jupyter AI trying to connect to the OpenAI server address instead of my local base URL? Is the local base URL not passed to the OpenAI request parameters during completion?
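One way to narrow this down is to confirm that the local OpenAI-compatible endpoint is reachable from the environment that runs `jupyter_server` (not just from VS Code). The sketch below calls the endpoint directly with the `openai` client; the base URL, API key, and model ID are placeholders, not values taken from this issue.

```python
# Minimal connectivity check. Run this in the same environment (and
# container, if any) as jupyter_server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:8000/v1",  # replace with your API base URL
    api_key="sk-placeholder",             # many local servers ignore the key
)

resp = client.chat.completions.create(
    model="local-model",                  # replace with your local model ID
    messages=[{"role": "user", "content": "Say hello"}],
)
print(resp.choices[0].message.content)
```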
jupyter_ai==2.28.5. I use OpenRouter; my config looks like:
Completion model:
Local model ID
API Base URL (optional)
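To check that the API Base URL entered here is actually reachable from where JupyterLab runs, one option is to hit the standard model-listing route of the OpenAI-compatible API. This is only a sketch: it assumes the server implements the usual `/models` route (OpenRouter and most local servers do), and the URL and key are placeholders.

```python
# List models from an OpenAI-compatible server to verify the base URL resolves
# and responds from this environment.
import httpx

base_url = "http://localhost:8000/v1"                  # replace with your API base URL
headers = {"Authorization": "Bearer sk-placeholder"}   # key may be ignored locally

resp = httpx.get(f"{base_url}/models", headers=headers, timeout=10)
resp.raise_for_status()
print([m["id"] for m in resp.json()["data"]])
```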
@srdas Can you help test OpenRouter completions when using the
Issue
I searched the issues but couldn't find a similar answer. Here is my issue: I set up the Language model and the inline completions model with my local LLM config. Jupyternaut chat works well, but I get an error in inline completion (Inline completion failed: APIConnectionError).
Description
Traceback (most recent call last):
  File "/usr/lib/python3.10/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
    yield
  File "/usr/lib/python3.10/site-packages/httpx/_transports/default.py", line 377, in handle_async_request
    resp = await self._pool.handle_async_request(req)
  File "/usr/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 256, in handle_async_request
    raise exc from None
  File "/usr/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 236, in handle_async_request
    response = await connection.handle_async_request(
  File "/usr/lib/python3.10/site-packages/httpcore/_async/connection.py", line 101, in handle_async_request
    raise exc
  File "/usr/lib/python3.10/site-packages/httpcore/_async/connection.py", line 78, in handle_async_request
    stream = await self._connect(request)
  File "/usr/lib/python3.10/site-packages/httpcore/_async/connection.py", line 124, in _connect
    stream = await self._network_backend.connect_tcp(**kwargs)
  File "/usr/lib/python3.10/site-packages/httpcore/_backends/auto.py", line 31, in connect_tcp
    return await self._backend.connect_tcp(
  File "/usr/lib/python3.10/site-packages/httpcore/_backends/anyio.py", line 113, in connect_tcp
    with map_exceptions(exc_map):
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1582, in _request
    response = await self._client.send(
  File "/usr/lib/python3.10/site-packages/httpx/_client.py", line 1674, in send
    response = await self._send_handling_auth(
  File "/usr/lib/python3.10/site-packages/httpx/_client.py", line 1702, in _send_handling_auth
    response = await self._send_handling_redirects(
  File "/usr/lib/python3.10/site-packages/httpx/_client.py", line 1739, in _send_handling_redirects
    response = await self._send_single_request(request)
  File "/usr/lib/python3.10/site-packages/httpx/_client.py", line 1776, in _send_single_request
    response = await transport.handle_async_request(request)
  File "/usr/lib/python3.10/site-packages/httpx/_transports/default.py", line 376, in handle_async_request
    with map_httpcore_exceptions():
  File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/usr/lib/python3.10/site-packages/httpx/_transports/default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ConnectError: All connection attempts failed

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/usr/lib/python3.10/site-packages/jupyter_ai/completions/handlers/base.py", line 120, in handle_request_and_catch
    await handle_request
  File "/usr/lib/python3.10/site-packages/jupyter_ai/completions/handlers/base.py", line 148, in _handle_request
    await self.handle_request(request)
  File "/usr/lib/python3.10/site-packages/jupyter_ai/completions/handlers/default.py", line 15, in handle_request
    reply = await llm.generate_inline_completions(request)
  File "/usr/lib/python3.10/site-packages/jupyter_ai_magics/providers.py", line 486, in generate_inline_completions
    suggestion = await chain.ainvoke(input=model_arguments)
  File "/usr/lib/python3.10/site-packages/langchain_core/runnables/base.py", line 2923, in ainvoke
    input = await asyncio.create_task(part())
  File "/usr/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 298, in ainvoke
    llm_result = await self.agenerate_prompt(
  File "/usr/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 787, in agenerate_prompt
    return await self.agenerate(
  File "/usr/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 747, in agenerate
    raise exceptions[0]
  File "/usr/lib/python3.10/site-packages/langchain_core/language_models/chat_models.py", line 923, in _agenerate_with_cache
    result = await self._agenerate(
  File "/usr/lib/python3.10/site-packages/langchain_openai/chat_models/base.py", line 843, in _agenerate
    response = await self.async_client.create(**payload)
  File "/usr/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1727, in create
    return await self._post(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1849, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1543, in request
    return await self._request(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1606, in _request
    return await self._retry_request(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1606, in _request
    return await self._retry_request(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1676, in _retry_request
    return await self._request(
  File "/usr/lib/python3.10/site-packages/openai/_base_client.py", line 1616, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
Expected behavior
Since the configuration of my local LLM is the same for the Language model and the inline completions model, inline completion should not raise
openai.APIConnectionError: Connection error.
Context
IPython : 8.13.2
ipykernel : 6.29.5
ipywidgets : not installed
jupyter_client : 8.6.1
jupyter_core : 5.7.2
jupyter_server : 2.15.0
jupyterlab : 4.2.4
nbclient : 0.10.0
nbconvert : 7.16.4
nbformat : 5.10.4
notebook : 7.2.3
qtconsole : not installed
traitlets : 5.14.3