
[Bug]: After loading the app, the front-end reported an error: Error connecting to the LLM: API error: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: stream_options (request id: 20240617162225973326812zW5O0B98)', 'type': 'invalid_request_error', 'param': '', 'code': None}} #1022

Open
wdhq4261761 opened this issue Jun 17, 2024 · 0 comments
Labels
bug Something isn't working

Version

VisualStudio Code extension

Operating System

Windows 10

What happened?

Here is my log; how can I fix this problem?
2024-06-17 16:11:20,374 WARNING [core.llm.base] API error: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: stream_options. Please contact us through an Azure support request at: https://go.microsoft.com/fwlink/?linkid=2213926 for further questions. (request id: 20240617161152391779276mx4XTBdF)', 'type': 'invalid_request_error', 'param': '', 'code': None}}
Traceback (most recent call last):
  File "f:\Gptpilot\gpt-pilot\core\llm\base.py", line 189, in __call__
    response, prompt_tokens, completion_tokens = await self._make_request(
                                                 ^^^^^^^^^^^^^^^^^^^^^^^^^
  File "f:\Gptpilot\gpt-pilot\core\llm\openai_client.py", line 51, in _make_request
    stream = await self.client.chat.completions.create(**completion_kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "f:\Gptpilot\gpt-pilot\venv\Lib\site-packages\openai\resources\chat\completions.py", line 1181, in create
    return await self._post(
           ^^^^^^^^^^^^^^^^^
  File "f:\Gptpilot\gpt-pilot\venv\Lib\site-packages\openai\_base_client.py", line 1790, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "f:\Gptpilot\gpt-pilot\venv\Lib\site-packages\openai\_base_client.py", line 1493, in request
    return await self._request(
           ^^^^^^^^^^^^^^^^^^^^
  File "f:\Gptpilot\gpt-pilot\venv\Lib\site-packages\openai\_base_client.py", line 1584, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'message': 'Unrecognized request argument supplied: stream_options. Please contact us through an Azure support request at: https://go.microsoft.com/fwlink/?linkid=2213926 for further questions. (request id: 20240617161152391779276mx4XTBdF)', 'type': 'invalid_request_error', 'param': '', 'code': None}}
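The 400 response says the endpoint (here apparently an Azure-backed proxy) rejects the `stream_options` request argument, which newer versions of the `openai` client send when streaming. A minimal workaround sketch, assuming the caller can filter its kwargs before the request is made; the helper name `sanitize_kwargs` and the `UNSUPPORTED_ARGS` set are illustrative assumptions, not gpt-pilot's actual code:

```python
# Hypothetical sketch: drop request arguments that a gateway rejects with
# "Unrecognized request argument supplied" before calling the API.
# The helper and the UNSUPPORTED_ARGS set are assumptions for illustration.

UNSUPPORTED_ARGS = {"stream_options"}

def sanitize_kwargs(completion_kwargs: dict) -> dict:
    """Return a copy of the completion kwargs with unsupported keys removed."""
    return {k: v for k, v in completion_kwargs.items() if k not in UNSUPPORTED_ARGS}

# Example kwargs like those that triggered the 400 above
kwargs = {
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "hello"}],
    "stream": True,
    "stream_options": {"include_usage": True},
}

cleaned = sanitize_kwargs(kwargs)
# The request would then be made with the filtered kwargs, e.g.:
# stream = await client.chat.completions.create(**cleaned)
```

Alternatively, pinning the `openai` package to a version that predates `stream_options` may avoid the error without code changes.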

@wdhq4261761 wdhq4261761 added the bug Something isn't working label Jun 17, 2024