feat(llm): stop sending temperature to non‑supporting models
Gate the temperature and max_tokens parameters in chat requests so they are only sent to models that support them. Add gpt-5 and gpt-5-mini to NO_SUPPORT_TEMPERATURE_MODELS to prevent provider 400 errors (e.g., 'Unsupported value: temperature; only default is supported').
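A minimal sketch of the gating described above. Only the set name NO_SUPPORT_TEMPERATURE_MODELS and the gpt-5 / gpt-5-mini entries come from this commit; the payload-builder helper, its signature, and the comparison model name are illustrative assumptions.

```python
# Models known to reject a non-default temperature (and related sampling
# parameters). gpt-5 and gpt-5-mini are added per this commit; the real
# set in the codebase may contain additional entries.
NO_SUPPORT_TEMPERATURE_MODELS = {"gpt-5", "gpt-5-mini"}


def build_chat_payload(model, messages, temperature=None, max_tokens=None):
    """Hypothetical helper: build a chat request body, omitting
    temperature/max_tokens for models that do not support them."""
    payload = {"model": model, "messages": messages}
    # Gate: only attach the optional sampling parameters for models
    # outside the unsupported set, avoiding provider 400 responses.
    if model not in NO_SUPPORT_TEMPERATURE_MODELS:
        if temperature is not None:
            payload["temperature"] = temperature
        if max_tokens is not None:
            payload["max_tokens"] = max_tokens
    return payload
```

With this gate in place, a request for gpt-5 carries only model and messages, while other models still receive any explicitly set temperature or max_tokens.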