Commit 0e6561d

leestott and Copilot authored

Document Microsoft Foundry Local setup and usage (#461)

* Document Microsoft Foundry Local setup and usage

  Added section for Microsoft Foundry Local with installation and usage instructions.

* Update docs/auth/byok.md

  Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
1 parent eaf06cd commit 0e6561d

File tree

1 file changed: +48 −0 lines changed

docs/auth/byok.md

Lines changed: 48 additions & 0 deletions
````diff
@@ -10,6 +10,7 @@ BYOK allows you to use the Copilot SDK with your own API keys from model provide
 | Azure OpenAI / Azure AI Foundry | `"azure"` | Azure-hosted models |
 | Anthropic | `"anthropic"` | Claude models |
 | Ollama | `"openai"` | Local models via OpenAI-compatible API |
+| Microsoft Foundry Local | `"openai"` | Run AI models locally on your device via OpenAI-compatible API |
 | Other OpenAI-compatible | `"openai"` | vLLM, LiteLLM, etc. |
 
 ## Quick Start: Azure AI Foundry
````
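The row added above maps Foundry Local to the `"openai"` provider type, like Ollama. As a quick illustration of how the `type` strings in this table might be used, here is a sketch; the option shape, ports, and placeholder key are illustrative examples, not exact SDK definitions:

```typescript
// Illustrative provider configs following the table's `type` column.
// The baseUrl values below are examples, not SDK defaults.
const ollamaProvider = {
  type: "openai" as const,
  baseUrl: "http://localhost:11434/v1", // Ollama's default local endpoint
};

const foundryLocalProvider = {
  type: "openai" as const,
  // Use the port reported by `foundry service status`; 5273 is a placeholder
  baseUrl: "http://localhost:5273/v1",
};

const anthropicProvider = {
  type: "anthropic" as const,
  apiKey: "<YOUR_ANTHROPIC_API_KEY>", // hosted providers require a key
};
```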
````diff
@@ -250,6 +251,37 @@ provider: {
 }
 ```
 
+### Microsoft Foundry Local
+
+[Microsoft Foundry Local](https://foundrylocal.ai) lets you run AI models locally on your own device with an OpenAI-compatible API. Install it via the Foundry Local CLI, then point the SDK at your local endpoint:
+
+```typescript
+provider: {
+  type: "openai",
+  baseUrl: "http://localhost:<PORT>/v1",
+  // No apiKey needed for local Foundry Local
+}
+```
+
+> **Note:** Foundry Local starts on a **dynamic port** — the port is not fixed. Use `foundry service status` to confirm the port the service is currently listening on, then use that port in your `baseUrl`.
+
+To get started with Foundry Local:
+
+```bash
+# Windows: Install Foundry Local CLI (requires winget)
+winget install Microsoft.FoundryLocal
+
+# macOS / Linux: see https://foundrylocal.ai for installation instructions
+# List available models
+foundry model list
+
+# Run a model (starts the local server automatically)
+foundry model run phi-4-mini
+
+# Check the port the service is running on
+foundry service status
+```
+
 ### Anthropic
 
 ```typescript
````
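The configuration added above uses a static `<PORT>` placeholder. Since the port is assigned dynamically, it can help to build the provider config from the port reported by `foundry service status` instead of editing it by hand. A minimal TypeScript sketch; the `foundryLocalProvider` helper and its config type are illustrative, not part of the Copilot SDK:

```typescript
// Illustrative helper, not part of the Copilot SDK: build an
// OpenAI-compatible provider config for Foundry Local from the port
// reported by `foundry service status`.
type OpenAICompatibleProvider = {
  type: "openai";
  baseUrl: string;
  apiKey?: string; // omitted below: Foundry Local needs no API key
};

function foundryLocalProvider(port: number): OpenAICompatibleProvider {
  if (!Number.isInteger(port) || port < 1 || port > 65535) {
    throw new Error(`Invalid port: ${port}`);
  }
  return { type: "openai", baseUrl: `http://localhost:${port}/v1` };
}

// Example: if `foundry service status` reports port 5273 (a placeholder)
const provider = foundryLocalProvider(5273);
// provider.baseUrl === "http://localhost:5273/v1"
```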
````diff
@@ -305,6 +337,7 @@ Some Copilot features may behave differently with BYOK:
 |----------|-------------|
 | Azure AI Foundry | No Entra ID auth; must use API keys |
 | Ollama | No API key; local only; model support varies |
+| [Microsoft Foundry Local](https://foundrylocal.ai) | Local only; model availability depends on device hardware; no API key required |
 | OpenAI | Subject to OpenAI rate limits and quotas |
 
 ## Troubleshooting
````
````diff
@@ -368,6 +401,21 @@ curl http://localhost:11434/v1/models
 ollama serve
 ```
 
+### Connection Refused (Foundry Local)
+
+Foundry Local uses a dynamic port that may change between restarts. Confirm the active port:
+
+```bash
+# Check the service status and port
+foundry service status
+```
+
+Update your `baseUrl` to match the port shown in the output. If the service is not running, start a model to launch it:
+
+```bash
+foundry model run phi-4-mini
+```
+
 ### Authentication Failed
 
 1. Verify your API key is correct and not expired
````
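Returning to the dynamic-port case in the Connection Refused hunk: rather than copying the port by hand, a script can parse it out of the output of `foundry service status`. A TypeScript sketch; the status-line format matched here (an `http://localhost:<port>` URL in the output) is an assumption about the CLI output and may differ between versions:

```typescript
// Illustrative: extract the listening port from `foundry service status`
// output. The matched format is an assumption; adjust the pattern to
// whatever your CLI version actually prints.
function portFromStatus(statusOutput: string): number | null {
  const match = statusOutput.match(/https?:\/\/(?:localhost|127\.0\.0\.1):(\d{2,5})/);
  return match ? Number(match[1]) : null;
}

// Hypothetical status line for illustration:
const port = portFromStatus("Service is running on http://localhost:5273/openai/status");
// port === 5273; returns null when no endpoint URL appears in the output
```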

0 commit comments