
Conversation


imranq2 commented on Oct 9, 2025

This pull request introduces support for OpenAI's Responses API alongside the existing Chat Completions API by implementing an abstraction layer with wrapper classes. The refactoring simplifies code, reduces duplication, and reorganizes user profile and memory management into a cleaner structure.

Key Changes:

  • Introduced wrapper classes (ChatRequestWrapper, ChatMessageWrapper) to abstract differences between the Chat Completions and Responses APIs, enabling unified processing logic (a hypothetical sketch follows this list)
  • Added a /responses endpoint with comprehensive test coverage for both streaming and non-streaming scenarios, including MCP tool integration
  • Refactored user profile and conversation memory management into a repository/validator/serializer pattern with proper separation of concerns
  • Updated dependencies (pypdf 6.4.0, mypy 1.19.0) and the OpenWebUI configuration for better API key management
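
To make the abstraction concrete, here is a minimal sketch of what the wrapper layer could look like. The class names (ChatRequestWrapper, ChatCompletionAPIRequestWrapper, ResponsesAPIRequestWrapper) come from this PR's file list, but every method name, field, and wire-format detail below is an illustrative assumption, not the actual implementation.

```python
from abc import ABC, abstractmethod
import json
from typing import Any, Dict, List


class ChatRequestWrapper(ABC):
    """Unified interface over Chat Completions and Responses API requests (hypothetical)."""

    @abstractmethod
    def get_messages(self) -> List[Dict[str, Any]]:
        """Return the conversation messages in a common dict form."""

    @abstractmethod
    def format_sse_chunk(self, delta_text: str) -> str:
        """Render one streamed text delta as an SSE event in this API's wire format."""


class ChatCompletionAPIRequestWrapper(ChatRequestWrapper):
    def __init__(self, request: Dict[str, Any]) -> None:
        self._request = request

    def get_messages(self) -> List[Dict[str, Any]]:
        return list(self._request.get("messages", []))

    def format_sse_chunk(self, delta_text: str) -> str:
        # Chat Completions streams "chat.completion.chunk" objects.
        payload = {"object": "chat.completion.chunk", "choices": [{"delta": {"content": delta_text}}]}
        return f"data: {json.dumps(payload)}\n\n"


class ResponsesAPIRequestWrapper(ChatRequestWrapper):
    def __init__(self, request: Dict[str, Any]) -> None:
        self._request = request

    def get_messages(self) -> List[Dict[str, Any]]:
        # The Responses API accepts either a plain string or a list of input items.
        raw_input = self._request.get("input", [])
        if isinstance(raw_input, str):
            return [{"role": "user", "content": raw_input}]
        return list(raw_input)

    def format_sse_chunk(self, delta_text: str) -> str:
        # The Responses API streams typed events such as "response.output_text.delta".
        payload = {"type": "response.output_text.delta", "delta": delta_text}
        return f"data: {json.dumps(payload)}\n\n"
```

With this shape, the converter and streaming manager can be written once against ChatRequestWrapper and stay unaware of which API the caller used.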

This pull request refactors the langgraph_to_openai_converter.py module to streamline the handling of chat requests and responses, replacing raw OpenAI types and message dicts with new wrapper classes. It also updates configuration and dependency files to improve compatibility and security. The main changes are grouped below:

Refactoring and Codebase Simplification:

  • Replaced direct usage of OpenAI types (ChatCompletionMessageParam, ChatCompletionChunk, etc.) with new wrapper classes (ChatMessageWrapper, ChatRequestWrapper) throughout language_model_gateway/gateway/converters/langgraph_to_openai_converter.py, simplifying message handling and response generation. [1] [2] [3] [4] [5] [6] [7]
  • Refactored streaming and non-streaming response logic to use methods on the new wrapper classes, reducing duplicated code and centralizing formatting logic (a minimal sketch follows this list). [1] [2]
  • Removed unused imports and legacy code related to OpenAI message types and formatting utilities. [1] [2]
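
Building on the sketch above, the centralized streaming path could reduce to a single generator. stream_response and the delta iterator are illustrative names under the same assumptions, not the converter's real API.

```python
from typing import Any, AsyncIterator


async def stream_response(
    wrapper: Any,  # e.g. a ChatRequestWrapper from the sketch above
    deltas: AsyncIterator[str],
) -> AsyncIterator[str]:
    """Yield SSE events for either API; the wrapper owns the wire format."""
    async for delta in deltas:
        yield wrapper.format_sse_chunk(delta)
    # A terminal event ("[DONE]" for Chat Completions, "response.completed" for
    # the Responses API) would likewise be delegated to a wrapper method.
```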

Configuration and Dependency Updates:

  • Updated dependencies in Pipfile to newer versions for improved compatibility and security: pypdf to 6.4.0, mypy to 1.19.0, and adjusted types-beautifulsoup4 version constraint. [1] [2] [3]
  • Added a new authentication option "headers" to the AgentConfig model in language_model_gateway/configs/config_schema.py.
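
For context, a minimal sketch of what the AgentConfig change might look like. Only the addition of "headers" as an accepted auth value comes from the PR; the other fields and literal values are placeholders.

```python
from typing import Dict, Literal, Optional

from pydantic import BaseModel


class AgentConfig(BaseModel):
    name: str
    url: Optional[str] = None
    # "headers" is the newly accepted value; the other literals are placeholders
    # standing in for whatever auth modes the schema already supported.
    auth: Optional[Literal["token", "oauth", "headers"]] = None
    # Static headers to send to the agent when auth="headers".
    headers: Optional[Dict[str, str]] = None
```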

Environment and Compose File Changes:

  • Switched to ENABLE_API_KEYS in docker-compose-openwebui.yml for better API key management and compatibility with OpenWebUI documentation. [1] [2]

imranq2 added 30 commits October 8, 2025 21:52
imranq2 marked this pull request as ready for review on December 2, 2025 at 17:12
A Copilot AI review was requested due to automatic review settings on December 2, 2025 at 17:12
Copilot finished reviewing on behalf of imranq2 on December 2, 2025 at 17:16

Copilot AI left a comment

Pull request overview

Reviewed changes

Copilot reviewed 46 out of 57 changed files in this pull request and generated 11 comments.

Summary of changes per file:

  • language_model_gateway/gateway/structures/openai/request/responses_api_request_wrapper.py: New wrapper for Responses API requests with SSE message formatting and MCP tool extraction
  • language_model_gateway/gateway/structures/openai/request/chat_completion_api_request_wrapper.py: New wrapper for Chat Completions API requests with the existing response formatting logic
  • language_model_gateway/gateway/structures/openai/request/chat_request_wrapper.py: Abstract base class defining the unified interface for request wrappers
  • language_model_gateway/gateway/structures/openai/message/*_message_wrapper.py: Wrapper classes for messages from both APIs with conversions to LangChain format
  • language_model_gateway/gateway/utilities/openai/responses_api_converter.py: Utility functions to convert Responses API structures to LangChain messages
  • language_model_gateway/gateway/schema/openai/responses.py: Pydantic schema for Responses API requests
  • language_model_gateway/gateway/schema/openai/completions.py: Converted ChatRequest from a TypedDict to a Pydantic BaseModel with strict validation
  • language_model_gateway/gateway/routers/chat_completion_router.py: Added the /responses endpoint and refactored common logic into a _chat_completions method
  • language_model_gateway/gateway/converters/langgraph_to_openai_converter.py: Refactored to use wrapper classes instead of raw OpenAI types, removing ~200 lines of code
  • language_model_gateway/gateway/converters/streaming_manager.py: Updated to use wrapper methods for SSE formatting, eliminating direct OpenAI type construction
  • language_model_gateway/gateway/managers/chat_completion_manager.py: Updated to accept ChatRequestWrapper instead of a ChatRequest dict
  • language_model_gateway/gateway/providers/*.py: Updated providers to use a ChatRequestWrapper parameter
  • language_model_gateway/gateway/tools/user_profile/*: Reorganized user profile management into a modular structure with validator, serializer, and repository (a hypothetical sketch follows this table)
  • language_model_gateway/gateway/tools/memories/*: Reorganized memory management with a similar structural pattern
  • language_model_gateway/configs/config_schema.py: Added "headers" as a new authentication option for AgentConfig
  • tests/gateway/test_openai_responses*.py: Comprehensive tests for the Responses API covering streaming, history, and MCP tool integration
  • docker-compose-openwebui.yml: Fixed ENABLE_API_KEY to ENABLE_API_KEYS per the OpenWebUI documentation
  • docker-compose-openwebui-auth.yml: Changed JWT expiration from 12h to 5m (potentially problematic for UX)
  • docker-compose-llm.yml: New compose file for a local Ollama setup with the Qwen3 8B model
  • Pipfile: Updated pypdf to 6.4.0, mypy to 1.19.0, relaxed the types-beautifulsoup4 constraint
  • openwebui-config/functions/open-webui.Dockerfile: Updated the OpenWebUI image from v0.6.36 to v0.6.40
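
The repository/validator/serializer reorganization listed above for the user-profile and memory tools might decompose roughly as follows. The PR only names the pattern, so every class, method, and the in-memory store here is a hypothetical placeholder.

```python
import json
from dataclasses import dataclass
from typing import Dict, Optional


@dataclass
class UserProfile:
    user_id: str
    preferences: Dict[str, str]


class UserProfileValidator:
    """Checks invariants before anything is persisted."""

    def validate(self, profile: UserProfile) -> None:
        if not profile.user_id:
            raise ValueError("user_id is required")


class UserProfileSerializer:
    """Owns the storage representation (JSON here, purely for illustration)."""

    def to_json(self, profile: UserProfile) -> str:
        return json.dumps({"user_id": profile.user_id, "preferences": profile.preferences})

    def from_json(self, raw: str) -> UserProfile:
        data = json.loads(raw)
        return UserProfile(user_id=data["user_id"], preferences=data.get("preferences", {}))


class UserProfileRepository:
    """Persistence boundary; validation and (de)serialization are injected."""

    def __init__(self, validator: UserProfileValidator, serializer: UserProfileSerializer) -> None:
        self._validator = validator
        self._serializer = serializer
        self._store: Dict[str, str] = {}  # in-memory stand-in for the real backend

    def save(self, profile: UserProfile) -> None:
        self._validator.validate(profile)
        self._store[profile.user_id] = self._serializer.to_json(profile)

    def get(self, user_id: str) -> Optional[UserProfile]:
        raw = self._store.get(user_id)
        return self._serializer.from_json(raw) if raw else None
```

The same split would apply to the memories package, with the repository as the only component that touches storage.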

if isinstance(tool["allowed_tools"], (list, tuple))
else "",
headers=tool.get("headers"),
auth="headers",

Copilot AI commented on Dec 2, 2025

The extract_mcp_agent_configs method sets auth="headers" unconditionally for all MCP tools. However, this doesn't consider the case where the tool might specify a different authentication method or none at all. The authentication method should be configurable or at least checked before being hardcoded.

Suggested change
auth="headers",
auth=tool.get("auth", "headers"),
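
As a rough illustration of the suggestion with the fallback applied: build_agent_config and the returned dict shape are invented for this example and only stand in for the mapping that extract_mcp_agent_configs performs.

```python
from typing import Any, Dict


def build_agent_config(tool: Dict[str, Any]) -> Dict[str, Any]:
    """Toy stand-in for the MCP tool -> agent config mapping discussed above."""
    return {
        "headers": tool.get("headers"),
        # Previously hardcoded to "headers"; with the suggestion it becomes a default
        # that a tool-level "auth" setting can override.
        "auth": tool.get("auth", "headers"),
    }


print(build_agent_config({"headers": {"Authorization": "Bearer <token>"}}))  # auth falls back to "headers"
print(build_agent_config({"auth": "token"}))                                 # tool-specified auth wins
```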

imranq2 merged commit 454b21a into main on Dec 2, 2025
4 checks passed