Description
Describe the bug
Observed behavior: After a series of successful prompts, Bolt stops applying any changes to the codebase.
For Ollama: The prompt and reply appear to execute successfully, yet no code changes are applied. The response just says "Here's how we'll implement it:" followed by a "Click to Open Workbench" link below.
For OpenAILike: The prompt and reply execute successfully, but after a few turns in which code changes are applied correctly, the new code is only listed as a snippet and is no longer applied by Bolt; it just displays the code.
The last post in this thread shows another user (on Ollama) with the same or a similar issue: https://thinktank.ottomator.ai/t/ollama-bolt-diy-on-local-homeserver-ubuntu/7254/5
Link to the Bolt URL that caused the error
http://localhost:5173/chat/imported-files
Steps to reproduce
- Install bolt.diy locally on Arch Linux using git / pnpm, etc. (see the setup sketch after this list)
- Open in Chrome Canary
- Add ollama as local model provider
- Add openAIlike as local model provider
- In Ollama, manually set the context window to be relatively large (~64k+)
- In OpenWebUI, manually set the context window to be relatively large (~64k+)
- In bolt.diy v1.0, load a reasonably sized model (Devstral Small Q8, GLM4 Q6, etc.)
- Keep prompting the model
- Eventually the model stops committing changes
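For reference, a rough sketch of the local setup (the repo URL, model tag, and exact context value below are illustrative, and the Modelfile is just one way to raise Ollama's context window; provider env variable names are assumed from bolt.diy's .env.example):

```sh
# Install and run bolt.diy locally (standard pnpm workflow)
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
pnpm install
pnpm run dev        # UI served at http://localhost:5173

# Provider endpoints were pointed at the local network servers, either via the
# in-app provider settings or env vars such as:
#   OLLAMA_API_BASE_URL=http://<ollama-host>:11434
#   OPENAI_LIKE_API_BASE_URL=http://<openwebui-host>:<openai-compatible-endpoint>

# Raise the Ollama context window by creating a model variant with a larger num_ctx
# (64k shown as an example; replace "devstral" with the exact local tag of the Q8 quant)
cat > Modelfile <<'EOF'
FROM devstral
PARAMETER num_ctx 65536
EOF
ollama create devstral-64k -f Modelfile
```

(The OpenWebUI context length was raised separately inside OpenWebUI itself, not via this Modelfile.)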
Expected behavior
Expected behavior: When using either Ollama or OpenAILike as a local model provider, you should be able to chat perpetually with Bolt and, after each prompt, have changes applied to the codebase.
Screen Recording / Screenshot
No response
Platform
Arch Linux (local)
Bolt.diy v1.0 (local)
Ollama (local network server)
OpenWebUI (local network server)
Browser: Chrome Canary
Provider Used
Ollama, OpenAILike (via OpenWebUI)
Model Used
Devstral Q8, GLM4 Q6, Qwen3 30B Q4
Additional context
This Ollama behavior was also observed in bolt.diy v0.7.