[Bug]: litellm.APIConnectionError: Unable to parse ollama chunk #13333

@svenseeberg

Description

What happened?

When using gpt-oss:120b with Ollama, LiteLLM throws an error while parsing the streamed response. The failing chunk carries the model's reasoning tokens in a `thinking` field while `response` is still empty.

Relevant log output

litellm.APIConnectionError: Unable to parse ollama chunk - {'model': 'gpt-oss:120b', 'created_at': '2025-08-06T11:07:39.075180921Z', 'response': '', 'thinking': 'The', 'done': False}
Traceback (most recent call last):
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/litellm_core_utils/streaming_handler.py", line 1680, in __anext__
    async for chunk in self.completion_stream:
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/base_llm/base_model_iterator.py", line 128, in __anext__
    chunk = self._handle_string_chunk(str_line=str_line)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 424, in _handle_string_chunk
    return self.chunk_parser(json.loads(str_line))
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 465, in chunk_parser
    raise e
  File "/opt/litellm/.venv/lib/python3.11/site-packages/litellm/llms/ollama/completion/transformation.py", line 463, in chunk_parser
    raise Exception(f"Unable to parse ollama chunk - {chunk}")
Exception: Unable to parse ollama chunk - {'model': 'gpt-oss:120b', 'created_at': '2025-08-06T11:07:39.075180921Z', 'response': '', 'thinking': 'The', 'done': False}
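The chunk in the log is valid JSON; it fails only because the parser expects content in `response` and does not account for a `thinking` field. As a minimal sketch (not LiteLLM's actual API; `parse_ollama_chunk` is a hypothetical helper), a tolerant parser could map `thinking` to a reasoning delta instead of raising:

```python
import json


def parse_ollama_chunk(chunk: dict) -> dict:
    """Map a raw Ollama streaming chunk to a simplified delta.

    Accepts chunks where the model streams reasoning tokens in a
    'thinking' field while 'response' is still empty -- the exact
    shape that triggers the error above.
    """
    if "response" not in chunk and "thinking" not in chunk:
        # Only raise when the chunk has neither text field.
        raise ValueError(f"Unable to parse ollama chunk - {chunk}")
    return {
        "content": chunk.get("response") or None,
        "reasoning_content": chunk.get("thinking") or None,
        "done": chunk.get("done", False),
    }


# The failing chunk from the log above now parses cleanly:
raw = json.loads(
    '{"model": "gpt-oss:120b", "response": "", '
    '"thinking": "The", "done": false}'
)
delta = parse_ollama_chunk(raw)
# delta["reasoning_content"] == "The"; delta["content"] is None
```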

Are you a ML Ops Team?

No

What LiteLLM version are you on ?

v1.74.12

Metadata

Labels

bug (Something isn't working)