@NiladriHazra commented on Sep 1, 2025

Ollama Integration Setup & Testing

This PR adds Ollama integration to the local Weam backend. Follow the steps below to set up and test:

1. Install and start Ollama locally (e.g. run `ollama serve`)

2. Set environment variables in the Node.js service (a read-with-defaults sketch follows the list)

  • OLLAMA_URL (default: http://localhost:11434)
  • OLLAMA_FALLBACK_ENABLED (optional)
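For reference, a minimal sketch of how the service can read these variables with safe defaults; the actual config loader in the Weam backend may centralize this differently:

```js
// Hedged sketch: the variable names come from this PR, but the real
// config module in the codebase may wrap or validate them differently.
const OLLAMA_URL = process.env.OLLAMA_URL || 'http://localhost:11434';
const OLLAMA_FALLBACK_ENABLED = process.env.OLLAMA_FALLBACK_ENABLED === 'true';
```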

3. Validate environment and dependencies

node validate-ollama-env.js
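To see roughly what such a validation amounts to, here is an illustrative standalone check against Ollama's `/api/tags` endpoint (which lists locally installed models); the actual `validate-ollama-env.js` in this PR may perform additional checks:

```js
// Illustrative only: a standalone reachability check against Ollama.
// /api/tags is Ollama's endpoint for listing installed models.
// Requires Node 18+ for the global fetch API.
const url = process.env.OLLAMA_URL || 'http://localhost:11434';

async function main() {
  const res = await fetch(`${url}/api/tags`);
  if (!res.ok) throw new Error(`Ollama responded with HTTP ${res.status}`);
  const { models } = await res.json();
  console.log(`Ollama reachable at ${url}; ${models.length} model(s) installed.`);
}

main().catch((err) => {
  console.error('Validation failed:', err.message);
  process.exit(1);
});
```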

4. Start the Weam backend

npm run dev
# or use `npm start` depending on your setup

5. Run Ollama integration tests

node test-comprehensive-ollama.js
# optionally, if present
node test-local-ollama.js
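As a sanity check independent of the test files above, the following sketch exercises Ollama's `/api/generate` endpoint directly; the model name `llama3` is an assumption, so substitute any model you have pulled:

```js
// Standalone sanity check against Ollama's /api/generate endpoint.
// The model name "llama3" is an assumption -- use any model you have
// pulled (e.g. via `ollama pull llama3`). Requires Node 18+.
const url = process.env.OLLAMA_URL || 'http://localhost:11434';

async function main() {
  const res = await fetch(`${url}/api/generate`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      model: 'llama3',              // assumed model name
      prompt: 'Reply with one word: hello',
      stream: false,                // single JSON response instead of a stream
    }),
  });
  if (!res.ok) throw new Error(`HTTP ${res.status}`);
  const data = await res.json();
  console.log('Model replied:', data.response);
}

main().catch((err) => { console.error(err); process.exit(1); });
```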

6. Manual API smoke tests (a scripted version is sketched after the list)

  • POST /api/ollama/chat with a short prompt to confirm responses and streaming behavior.
  • GET /api/ollama/models to verify allowed models and admin settings.
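A scripted version of these smoke tests might look like the sketch below; the request body shape (`{ prompt }`), the `WEAM_URL` variable, and the absence of auth headers are assumptions to adjust against the actual route handlers:

```js
// Hedged smoke-test sketch for the two Weam endpoints above.
// Assumptions: WEAM_URL, the { prompt } body shape, and no auth headers
// -- adjust to match the real route handlers and any required session
// or JWT. Requires Node 18+.
const base = process.env.WEAM_URL || 'http://localhost:3000';

async function main() {
  const chat = await fetch(`${base}/api/ollama/chat`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ prompt: 'ping' }), // assumed request shape
  });
  console.log('POST /api/ollama/chat ->', chat.status);

  const models = await fetch(`${base}/api/ollama/models`);
  console.log('GET /api/ollama/models ->', models.status, await models.text());
}

main().catch(console.error);
```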

Test Configuration

  • Local Ollama endpoint: http://localhost:11434 (adjust OLLAMA_URL if it differs)
  • Weam backend: default port 3000 (adjust as needed)
  • Ensure the user account used for admin endpoints has the admin role.
