Bug Report: Cannot use GLM for both main agent and auxiliary title generation simultaneously
Summary
When using GLM as the main agent via the Anthropic-compatible endpoint (`/api/anthropic`), auxiliary tasks like title generation cannot use GLM's OpenAI-compatible endpoint (`/api/paas/v4`), even when `base_url` is explicitly set in `config.yaml`. This is because `GLM_BASE_URL` always overrides any `base_url` specified under `auxiliary.title_generation` in `config.yaml`.
Environment
- Hermes Web UI (latest, cloned from main)
- GLM main agent endpoint: https://open.bigmodel.cn/api/anthropic
- OS: Ubuntu (WSL2)
Steps to Reproduce
- Set `GLM_BASE_URL=https://open.bigmodel.cn/api/anthropic` in `~/.hermes/.env` (required for the main agent to work via the Anthropic-compatible endpoint).
- Configure `auxiliary.title_generation` in `config.yaml` with an explicit OpenAI-compatible base URL:
```yaml
auxiliary:
  title_generation:
    provider: zai
    model: glm-4-flash
    base_url: https://open.bigmodel.cn/api/paas/v4
    api_key: ''
    timeout: 30
    extra_body: {}
```
- Start Hermes and send a chat message.
Expected Behavior
The `base_url` explicitly set under `auxiliary.title_generation` in `config.yaml` should take priority for auxiliary tasks, allowing the main agent and title generation to use different GLM endpoints.
Actual Behavior
Title generation fails with:

```
⚠ Auxiliary title generation failed: HTTP 401: 令牌已过期或验证不正确
```

(The Chinese error message means "token has expired or failed verification.")
This happens because `_resolve_zai_base_url()` in `hermes_cli/auth.py` gives `GLM_BASE_URL` absolute priority:

```python
def _resolve_zai_base_url(api_key, default_url, env_override):
    if env_override:  # GLM_BASE_URL always wins; config.yaml base_url is ignored
        return env_override
```

So the Anthropic-format endpoint is used for title generation too, which returns 401 because it does not accept OpenAI-format requests.
Root Cause
`GLM_BASE_URL` is a single global variable controlling the GLM endpoint for the entire application. There is no way to specify different endpoints for the main agent vs. auxiliary tasks. The `base_url` field in the `auxiliary.title_generation` config is silently ignored for the `zai` provider.
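The silent override can be demonstrated in isolation. The sketch below is a simplified, renamed stand-in for the resolution logic (the real function lives in `hermes_cli/auth.py` and may differ in signature); it only illustrates how an env override discards a per-task config value:

```python
# Simplified sketch of the current resolution behavior (names are
# illustrative, not the exact hermes_cli API).

ANTHROPIC_URL = "https://open.bigmodel.cn/api/anthropic"  # GLM_BASE_URL value
OPENAI_URL = "https://open.bigmodel.cn/api/paas/v4"       # auxiliary config value

def resolve_base_url(config_base_url, env_override, default_url):
    # Current behavior: the env var always wins, so the auxiliary task's
    # config.yaml base_url never takes effect for the zai provider.
    if env_override:
        return env_override
    return config_base_url or default_url

# Title generation asks for the OpenAI-compatible endpoint...
url = resolve_base_url(config_base_url=OPENAI_URL,
                       env_override=ANTHROPIC_URL,
                       default_url=OPENAI_URL)
# ...but receives the Anthropic-format endpoint, hence the 401.
assert url == ANTHROPIC_URL
```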
Suggested Fix
In `_resolve_zai_base_url()`, check for an explicit `base_url` passed from the auxiliary config before falling back to `GLM_BASE_URL`:
```python
def _resolve_zai_base_url(api_key, default_url, env_override, explicit_base_url=None):
    if explicit_base_url:  # config.yaml base_url takes priority for aux tasks
        return explicit_base_url
    if env_override:  # GLM_BASE_URL wins for the main agent
        return env_override
    ...
```
Alternatively, introduce a separate `GLM_AUX_BASE_URL` environment variable for auxiliary tasks.
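With the first approach, the main agent (which passes no explicit `base_url`) keeps using `GLM_BASE_URL`, while title generation gets its configured endpoint. A self-contained sketch of the proposed precedence (function name and signature are illustrative, not the exact `hermes_cli` API):

```python
# Sketch of the proposed precedence ordering; an assumption about how the
# fix could look, not the actual hermes_cli implementation.

def resolve_zai_base_url(default_url, env_override=None, explicit_base_url=None):
    # 1. An explicit base_url from config.yaml (e.g. under
    #    auxiliary.title_generation) wins, so auxiliary tasks can pin the
    #    OpenAI-compatible endpoint without touching GLM_BASE_URL.
    if explicit_base_url:
        return explicit_base_url
    # 2. GLM_BASE_URL still governs the main agent, preserving current
    #    behavior for users who only set the env var.
    if env_override:
        return env_override
    # 3. Fall back to the provider default.
    return default_url

# Auxiliary task: configured endpoint wins over GLM_BASE_URL.
aux_url = resolve_zai_base_url(
    "https://open.bigmodel.cn/api/paas/v4",
    env_override="https://open.bigmodel.cn/api/anthropic",
    explicit_base_url="https://open.bigmodel.cn/api/paas/v4",
)
assert aux_url == "https://open.bigmodel.cn/api/paas/v4"
```

This keeps the change backward-compatible: callers that never pass `explicit_base_url` see exactly the current behavior.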
Impact
Any user running GLM via the Anthropic-compatible endpoint as their main model cannot use GLM for auxiliary tasks (title generation, vision, web extract, etc.) without breaking their main agent configuration.