[inference] Add support for o3
#213652
Labels: `bug`, `Team:AI Infra`
It has been reported that using the inference APIs with OpenAI's `o3` results in an error, because we always send the `temperature` parameter and `o3` doesn't support it.

Unfortunately, I don't think there's a "good" way to detect this, so my only idea is to check the model's name, which would be somewhat similar to what we do to "detect" native function calling:
https://github.com/elastic/kibana/blob/main/x-pack/platform/plugins/shared/inference/server/chat_complete/utils/function_calling_support.ts#L11-L26
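For illustration, here is a minimal sketch of that idea in TypeScript, mirroring the shape of the linked function-calling detection. The helper names (`supportsTemperature`, `withTemperature`) and the exact list of affected model prefixes are hypothetical, not part of the existing inference plugin:

```ts
/**
 * Sketch only: detect models that reject the `temperature` parameter
 * by inspecting the model name, in the same spirit as the linked
 * function-calling detection. All names here are hypothetical.
 */

// Model name prefixes assumed (as of this issue) to reject `temperature`.
const MODELS_WITHOUT_TEMPERATURE: RegExp[] = [/^o1/, /^o3/];

export const supportsTemperature = (modelName: string): boolean => {
  const normalized = modelName.toLowerCase().trim();
  return !MODELS_WITHOUT_TEMPERATURE.some((pattern) => pattern.test(normalized));
};

// At request-building time, only include `temperature` when the model supports it.
export const withTemperature = (
  request: Record<string, unknown>,
  modelName: string,
  temperature?: number
): Record<string, unknown> => {
  if (temperature === undefined || !supportsTemperature(modelName)) {
    return request;
  }
  return { ...request, temperature };
};
```

The trade-off is the one already implied above: name-based detection is brittle (a new model name silently gets the default behavior), but absent a capability-discovery API on the provider side, checking the name is the pragmatic option.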