Configured providers but twinny not sending any requests to provider. #242
Maybe try a restart? The settings look correct to me. Also check the Ollama settings in the extension settings: click the cog in the extension header; there are some API settings for Ollama in there too.
There is an issue with Twinny and WSL-connected VS Code windows. The Continue extension works, but I can't get Twinny to work. Let me know if there's a way for me to help debug this. I've also tried setting the host value to

Relevant console logs(?):

```
ERR [Extension Host] Fetch error: TypeError: fetch failed
    at node:internal/deps/undici/undici:12345:11
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async t.streamResponse (/home/user/.vscode-server/extensions/rjmacarthy.twinny-3.11.45/out/index.js:2:138539)
console.ts:137 [Extension Host] Fetch error: TypeError: fetch failed
    at node:internal/deps/undici/undici:12345:11
    at process.processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async t.streamResponse (/home/user/.vscode-server/extensions/rjmacarthy.twinny-3.11.45/out/index.js:2:138539)
```
Describe the bug
I have set up the following providers and verified with curl that the /api/generate endpoint at http://duodesk.duo:11434 works. The extension shows a loading circle but never sends any requests. I also tried setting the Ollama Hostname setting to duodesk.duo, but no luck.
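For anyone trying to reproduce the curl check programmatically: the request the extension should be sending can be built by hand. Below is a minimal sketch assuming Ollama's standard /api/generate payload; the host and model name come from this report, while the prompt string is just a placeholder. The block only builds the request object and does not actually send it.

```python
import json
import urllib.request

OLLAMA_URL = "http://duodesk.duo:11434/api/generate"  # host from this report

# Minimal non-streaming payload; "model" and "prompt" are the required
# fields for Ollama's /api/generate endpoint.
payload = {
    "model": "codellama:7b-code",  # model name from this report
    "prompt": "def fib(n):",       # placeholder prompt
    "stream": False,
}

def build_request(url: str = OLLAMA_URL, body: dict = payload) -> urllib.request.Request:
    """Build the POST request a client would issue (not sent here)."""
    data = json.dumps(body).encode("utf-8")
    return urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )

req = build_request()
```

Passing `req` to `urllib.request.urlopen` from the same machine the extension runs on should return a JSON response if the endpoint is reachable; if that call fails with a similar fetch/connect error, the problem is likely DNS or networking (e.g. name resolution inside WSL) rather than twinny itself.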
To Reproduce
Just added the providers I have attached.
Expected behavior
It should work with the providers I have configured, I think.
Screenshots
Logging
Logging is enabled, but I'm not sure where I'm supposed to see the logs; I checked the Output tab but there is no entry for twinny.
API Provider
Ollama running at http://duodesk.duo:11434 in local network.
Chat or Auto Complete?
Both
Model Name
codellama:7b-code
Desktop (please complete the following information):
Additional context