
Conversation


@shatfield4 (Collaborator) commented Sep 19, 2025

Pull Request Type

  • ✨ feat
  • πŸ› fix
  • ♻️ refactor
  • πŸ’„ style
  • πŸ”¨ chore
  • πŸ“ docs

Relevant Issues

resolves #4403
resolves #4304
closes #4307

What is in this change?

  • Replaces `chat.completions` calls from the OpenAI package with the `responses` API for chat, via the OpenAI Responses API
  • Simplifies the OpenAI LLM provider implementation by removing unnecessary O-series model checks
  • Adds support for gpt-5 models
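The core of the migration is reshaping a `chat.completions`-style request into the shape the Responses API expects. The helper below is a hypothetical sketch of that mapping, not the PR's actual code; the function name `toResponsesRequest` and the decision to route system messages into `instructions` are illustrative assumptions.

```javascript
// Hypothetical sketch of the request-shape migration (not the PR's actual code).
// chat.completions request: { model, messages: [{ role, content }, ...], temperature }
// responses request:        { model, input, instructions?, temperature }
function toResponsesRequest({ model, messages = [], temperature }) {
  // Assumption: system-style guidance moves to the `instructions` field
  // rather than riding along as a leading system message.
  const system = messages.filter((m) => m.role === "system");
  const rest = messages.filter((m) => m.role !== "system");

  const req = {
    model,
    input: rest.map(({ role, content }) => ({ role, content })),
  };
  if (system.length) req.instructions = system.map((m) => m.content).join("\n");
  if (temperature !== undefined) req.temperature = temperature;
  return req;
}

// Example: a chat.completions-style payload mapped onto the responses shape.
const req = toResponsesRequest({
  model: "gpt-5",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Hello!" },
  ],
});
// req.instructions is "You are a helpful assistant."
// req.input holds only the user turn: { role: "user", content: "Hello!" }
```

The payload would then be passed to something like `client.responses.create(req)` instead of `client.chat.completions.create(...)`.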

Additional Information

Developer Validations

  • I ran yarn lint from the root of the repo & committed changes
  • Relevant documentation has been updated
  • I have tested my code functionality
  • Docker build succeeds locally

@shatfield4 shatfield4 linked an issue Sep 19, 2025 that may be closed by this pull request
@shatfield4 shatfield4 added the `PR:needs review` (Needs review by core team) label Sep 19, 2025
@timothycarambat (Member) commented:

  • Image support was missing/broken and needed responses object updates
  • Set the manual runPromptTokenCalculation flag (which defaults to true) to false on streams, since usage is now returned on responses API queries
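The second bullet above relies on the Responses API reporting token usage at the end of a stream, which removes the need for a manual prompt-token estimate. The sketch below is a hedged illustration of extracting that usage from a final stream event; the event and field names (`response.completed`, `input_tokens`, `output_tokens`) reflect the Responses API's streaming shape as I understand it, and the helper itself is hypothetical.

```javascript
// Hypothetical sketch: with the Responses API, a streamed query ends with a
// completed event that carries token usage, so the old manual
// runPromptTokenCalculation path is no longer needed for streams.
function usageFromStreamEvent(event) {
  // Assumed event shape:
  // { type: "response.completed",
  //   response: { usage: { input_tokens, output_tokens } } }
  if (event?.type !== "response.completed") return null;
  const { input_tokens = 0, output_tokens = 0 } = event.response?.usage ?? {};
  // Normalize to the chat.completions-style names callers may still expect.
  return { prompt_tokens: input_tokens, completion_tokens: output_tokens };
}

// A mock final stream event, as a network-free usage example.
const usage = usageFromStreamEvent({
  type: "response.completed",
  response: { usage: { input_tokens: 12, output_tokens: 34 } },
});
// usage is { prompt_tokens: 12, completion_tokens: 34 }
```

Delta events (e.g. text chunks) would return `null` here, so a stream handler can call this on every event and record usage only once, at the end.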

@timothycarambat timothycarambat merged commit 1209606 into master Sep 19, 2025
1 check passed
@timothycarambat timothycarambat deleted the 4403-feat-migrate-openai-llm-provider-to-responses-api branch September 19, 2025 04:15