fix(llm): pin LLM edge functions to US/EU regions to fix geo-block 403s #2541
OpenRouter returns 403 "This model is not available in your region" when Vercel routes through edge nodes in regions where Google Gemini is blocked. Pins the chat-analyst, news, and intelligence edge functions to iad1/lhr1/fra1/sfo1. Also improves error logging in callLlmReasoningStream to include the model name and full response body on non-2xx responses, for easier future diagnosis.
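The pin itself is a one-line config change per function. A minimal sketch of what such a change could look like — the file path, surrounding code, and comment are assumptions, not the actual source:

```typescript
// Illustrative edge function config (hypothetical file: api/chat-analyst.ts).
// Vercel reads the exported `config` object; `regions` restricts which edge
// locations may execute this function.
export const config = {
  runtime: 'edge',
  // US/EU nodes where all OpenRouter-routed providers (incl. Gemini) are available
  regions: ['iad1', 'lhr1', 'fra1', 'sfo1'],
};
```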
Greptile Summary: This PR pins three LLM-calling Vercel edge functions (`/api/chat-analyst`, `/api/news/v1/*`, `/api/intelligence/v1/*`) to the `iad1`/`lhr1`/`fra1`/`sfo1` regions and improves error logging in `callLlmReasoningStream`.
Confidence Score: 5/5 — Safe to merge: targeted infrastructure fix with no logic changes and proper error handling guards. All four files contain only mechanical, low-risk changes: three one-line config additions and one tightly-scoped logging improvement. No auth, caching, or data-path logic is touched. No files require special attention.
Sequence Diagram

```mermaid
sequenceDiagram
    participant Client
    participant Vercel_Edge as Vercel Edge<br/>(iad1/lhr1/fra1/sfo1)
    participant OpenRouter
    participant LLM as LLM Provider<br/>(e.g. Gemini)

    Client->>Vercel_Edge: POST /api/chat-analyst<br/>or /api/news/v1/* or /api/intelligence/v1/*
    Note over Vercel_Edge: regions config pins routing<br/>to US/EU nodes only
    Vercel_Edge->>OpenRouter: POST streaming request<br/>(model, messages, stream:true)
    OpenRouter->>LLM: Forward to LLM provider
    alt Success (2xx + body)
        LLM-->>OpenRouter: SSE stream
        OpenRouter-->>Vercel_Edge: SSE chunks
        Vercel_Edge-->>Client: data: {"delta":"..."}
        Vercel_Edge-->>Client: data: {"done":true}
    else Non-2xx or no body (e.g. 403 geo-block)
        OpenRouter-->>Vercel_Edge: HTTP 4xx/5xx
        Note over Vercel_Edge: clearTimeout, read errBody<br/>log: HTTP 403 model=X body={"error":...}
        Vercel_Edge->>OpenRouter: Retry next provider
    end
```
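The non-2xx branch of the diagram corresponds to the logging improvement. A hedged sketch of what that error path might look like — the helper name, parameters, and log format here are assumptions, not the actual `callLlmReasoningStream` implementation:

```typescript
// Hypothetical sketch of the non-2xx handling described in the PR:
// clear the streaming timeout, drain the error body, and log the model
// name plus the full response body before the caller retries.
async function readErrorForLogging(
  res: Response,
  model: string,
  timeoutId: ReturnType<typeof setTimeout>,
): Promise<string> {
  clearTimeout(timeoutId); // stop the stream timeout before draining the body
  const errBody = await res.text().catch(() => '<unreadable body>');
  const msg = `LLM stream failed: HTTP ${res.status} model=${model} body=${errBody}`;
  console.error(msg);
  return msg; // caller can decide to retry the next provider
}
```

Including the model name in the log line is what makes a geo-block diagnosable: a 403 that only ever appears for Gemini-backed models points at provider availability rather than auth.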
Reviews (1): Last reviewed commit: "fix(llm): pin LLM edge functions to US/E..."
Summary
- Fixes 403 "This model is not available in your region" when Vercel routes LLM-calling edge functions through nodes in regions where Google Gemini is geo-blocked (Middle East, East Asia, etc.)
- Affected functions: `/api/chat-analyst`, `/api/news/v1/*` (article summarization), `/api/intelligence/v1/*`
- Pins regions to `['iad1', 'lhr1', 'fra1', 'sfo1']` (Virginia, London, Frankfurt, San Francisco), where all OpenRouter-routed LLM providers have full availability
- Improves `callLlmReasoningStream` error logging to include model name + full response body on non-2xx for easier future diagnosis
- Diagnosed by pulling Vercel production logs
Test plan
- Exercise `/api/chat-analyst` from a VPN endpoint in a geo-blocked region (e.g. UAE, Turkey)
- Watch for `llm_unavailable` errors in Vercel function logs
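The test plan above could be scripted as a small smoke check. Everything here is an assumption for illustration — the request body shape, the helper name, and the expected status are not taken from the repo; injecting the fetch implementation just makes the check testable offline:

```typescript
// Hypothetical smoke check: call the deployed endpoint (e.g. through a VPN
// exit in a geo-blocked region) and confirm the response is not a 403
// geo-block. Accepts a fetch implementation so it can be stubbed in tests.
type FetchLike = (url: string, init?: RequestInit) => Promise<Response>;

async function smokeCheck(
  baseUrl: string,
  fetchImpl: FetchLike = fetch,
): Promise<number> {
  const res = await fetchImpl(`${baseUrl}/api/chat-analyst`, {
    method: 'POST',
    headers: { 'content-type': 'application/json' },
    body: JSON.stringify({ messages: [{ role: 'user', content: 'ping' }] }),
  });
  return res.status; // expect 2xx after the region pin, not 403
}
```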