Status: Closed
Labels: bug (Something isn't working)
Description
What happened?
When using gpt-oss:120b with Ollama, LiteLLM raises an APIConnectionError while streaming the response: the Ollama chunk parser fails on chunks whose text arrives in a "thinking" field while "response" is empty (see the log below).
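For reference, a minimal sketch that should reproduce the error, assuming a local Ollama server at its default address with gpt-oss:120b pulled (the prompt, host, and script structure are illustrative, not from the original report):

```python
# Minimal repro sketch. Assumes Ollama is running locally on its default
# port with gpt-oss:120b available; only the model name comes from the
# report, everything else here is illustrative.
import asyncio

import litellm


async def main():
    resp = await litellm.acompletion(
        model="ollama/gpt-oss:120b",
        api_base="http://localhost:11434",
        messages=[{"role": "user", "content": "Say hello."}],
        stream=True,
    )
    # Streaming fails as soon as a chunk carries only "thinking" text.
    async for chunk in resp:
        print(chunk)


asyncio.run(main())
```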
Relevant log output
litellm.APIConnectionError: Unable to parse ollama chunk - {'model': 'gpt-oss:120b', 'created_at': '2025-08-06T11:07:39.075180921Z', 'response': '', 'thinking': 'The', 'done': False}
Traceback (most recent call last):
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1680, in __anext__
    async for chunk in self.completion_stream:
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/base_llm/base_model_iterator.py", line 128, in __anext__
    chunk = self._handle_string_chunk(str_line=str_line)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 424, in _handle_string_chunk
    return self.chunk_parser(json.loads(str_line))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 465, in chunk_parser
    raise e
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 463, in chunk_parser
    raise Exception(f"Unable to parse ollama chunk - {chunk}")
Exception: Unable to parse ollama chunk - {'model': 'gpt-oss:120b', 'created_at': '2025-08-06T11:07:39.075180921Z', 'response': '', 'thinking': 'The', 'done': False}
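The failing chunk carries its text in a "thinking" field while "response" is empty. A hedged sketch of how a parser could tolerate such chunks follows; this is not LiteLLM's actual chunk_parser, and the output field names are assumptions chosen for illustration:

```python
# Hypothetical parser sketch, NOT LiteLLM's chunk_parser. It only shows
# how a chunk shaped like the one in the log above could be handled:
# reasoning models stream text under "thinking" while "response" stays
# empty, so that field should be routed rather than treated as an error.
import json


def parse_ollama_chunk(str_line: str) -> dict:
    chunk = json.loads(str_line)
    return {
        "text": chunk.get("response", ""),
        # Surface "thinking" text separately instead of raising on it.
        "reasoning_content": chunk.get("thinking", ""),
        "is_finished": chunk.get("done", False),
    }
```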
Are you an ML Ops Team?
No
What LiteLLM version are you on?
v1.74.12