feat(ollama): add local Ollama model integration with admin controls, analytics, and docs #79
## Ollama Integration Setup & Testing
This PR adds Ollama integration to the local Weam backend. Follow the steps below to set up and test:
1. Install and run Ollama locally (see the install sketch after this list).
2. Set environment variables in the Node.js service (example exports below):
   - `OLLAMA_URL` (default: `http://localhost:11434`)
   - `OLLAMA_FALLBACK_ENABLED` (optional)
3. Validate environment and dependencies (see the reachability check below).
4. Start the Weam backend:

   ```bash
   npm run dev   # or use `npm start`, depending on your setup
   ```
5. Run the Ollama integration tests:

   ```bash
   node test-comprehensive-ollama.js
   node test-local-ollama.js   # optionally, if present
   ```
6. Run manual API smoke tests (example `curl` calls below):
   - Call `/api/ollama/chat` with a short prompt to confirm responses and streaming behavior.
   - Call `/api/ollama/models` to verify allowed models and admin settings.
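
For step 1, a minimal install-and-run sketch, assuming a Linux or macOS host; the `llama3` model is just an example choice:

```bash
# Install Ollama via the official script (see ollama.com for other platforms)
curl -fsSL https://ollama.com/install.sh | sh

# Start the Ollama server (listens on port 11434 by default)
ollama serve &

# Pull a model to test against -- the specific model is an arbitrary example
ollama pull llama3
```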
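For step 2, a sketch of the environment setup, assuming the Node.js service reads these from the shell environment (or an equivalent `.env` file):

```bash
export OLLAMA_URL=http://localhost:11434   # where the Ollama server is reachable
export OLLAMA_FALLBACK_ENABLED=true        # optional; enables fallback behavior
```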
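For step 3, a quick reachability check; `/api/tags` is Ollama's standard endpoint for listing installed models, and the `npm install` is a generic dependency check:

```bash
# Should return JSON listing the locally installed models
curl -s "${OLLAMA_URL:-http://localhost:11434}/api/tags"

# Confirm Node is available and project dependencies are installed
node -v && npm install
```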
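For step 6, example smoke-test calls, assuming the backend listens on port 3000; the request body for `/api/ollama/chat` is a guess and should be adjusted to the route's actual schema:

```bash
# Short prompt against the chat route (body shape is hypothetical)
curl -s -X POST http://localhost:3000/api/ollama/chat \
  -H "Content-Type: application/json" \
  -d '{"model": "llama3", "prompt": "Reply with one short sentence."}'

# List allowed models and admin settings
curl -s http://localhost:3000/api/ollama/models
```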
### Test Configuration

- `OLLAMA_URL` (adjust if needed)
- Port: `3000` (adjust as needed)