From 59faa503ef1b4a386544d8d3d264540602b49425 Mon Sep 17 00:00:00 2001
From: Richard Macarthy
Date: Fri, 19 Apr 2024 13:58:14 +0100
Subject: [PATCH] Update providers.md Open WebUI

---
 docs/providers.md | 12 +++++++-----
 1 file changed, 7 insertions(+), 5 deletions(-)

diff --git a/docs/providers.md b/docs/providers.md
index d040f82a..6bbe330b 100644
--- a/docs/providers.md
+++ b/docs/providers.md
@@ -19,21 +19,23 @@ These example configurations serve as a starting point. Individual adjustments m
 - **Path:** `/v1/chat/completions`
 - **Model Name:** `codellama:7b-instruct` or any effective instruct model
 
-### Ollama Web UI
+### Open WebUI
+
+Open WebUI can be used as a proxy API for twinny; simply configure the endpoint to match what is served by Open WebUI.
 
 #### FIM (Auto-complete)
 
 - **Hostname:** `localhost`
-- **Port:** `11434`
-- **Path:** `/ollama/api/generate`
+- **Port:** See the Open WebUI documentation
+- **Path:** See the Open WebUI documentation
 - **Model Name:** `codellama:7b-code`
 - **FIM Template:** Use the template corresponding to your model, similar to the desktop configuration.
 
 #### Chat Configuration
 
 - **Hostname:** `localhost`
-- **Port:** `11434`
-- **Path:** `/ollama/v1/chat/completions`
+- **Port:** See the Open WebUI documentation
+- **Path:** See the Open WebUI documentation
 - **Model Name:** `codellama:7b-instruct` or another reliable instruct model
 
 ### LM Studio
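
As a quick sanity check for the chat configuration described in this patch, the sketch below sends one OpenAI-style chat request to an Open WebUI endpoint before pointing twinny at it. The base URL (`http://localhost:3000`), the proxy path (`/ollama/v1/chat/completions`), and the API key header are assumptions for illustration only; substitute the port, path, and credentials from your own Open WebUI deployment.

```typescript
// Sketch: probe an Open WebUI proxy endpoint with an OpenAI-style chat
// request. Requires Node 18+ (global fetch). The port, path, and API key
// below are ASSUMPTIONS -- check your Open WebUI deployment for real values.

const OPENWEBUI_BASE = "http://localhost:3000"; // assumed port
const CHAT_PATH = "/ollama/v1/chat/completions"; // assumed proxy path

async function probeChatEndpoint(): Promise<void> {
  const response = await fetch(`${OPENWEBUI_BASE}${CHAT_PATH}`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Many Open WebUI setups require an API key; placeholder here.
      Authorization: "Bearer YOUR_OPENWEBUI_API_KEY",
    },
    body: JSON.stringify({
      model: "codellama:7b-instruct",
      messages: [{ role: "user", content: "Say hello in one word." }],
    }),
  });

  if (!response.ok) {
    throw new Error(`Endpoint check failed: HTTP ${response.status}`);
  }

  // An OpenAI-compatible endpoint returns a `choices` array.
  const data = await response.json();
  console.log(data.choices?.[0]?.message?.content);
}

probeChatEndpoint().catch(console.error);
```

A 200 response containing a `choices` array should confirm the endpoint speaks the OpenAI chat completions format, at which point the same hostname, port, and path can be entered into twinny's chat provider settings.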