Hi,
As I'm using Ollama on a dedicated host, I quickly added these:
- `/config.py`, line 16:
  `LLM_BASEURL = "http://myollamahost:11434"`
- `/core/rag_system.py`, line 24:
  `llm = ChatOllama(model=config.LLM_MODEL, temperature=config.LLM_TEMPERATURE, base_url=config.LLM_BASEURL)`
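For context, here is a minimal sketch of how the two changes fit together. The `LLM_MODEL` / `LLM_TEMPERATURE` values and the `langchain_ollama` import path are assumptions on my side; keep whatever the project already uses.

```python
# config.py (sketch -- LLM_MODEL / LLM_TEMPERATURE are placeholders,
# keep the values the project already defines)
LLM_MODEL = "llama3"
LLM_TEMPERATURE = 0.1
# New setting: base URL of the Ollama server running on the dedicated host.
LLM_BASEURL = "http://myollamahost:11434"

# core/rag_system.py (sketch -- only the LLM construction changes)
from langchain_ollama import ChatOllama  # or langchain_community.chat_models,
                                         # depending on the project's dependencies
import config

llm = ChatOllama(
    model=config.LLM_MODEL,
    temperature=config.LLM_TEMPERATURE,
    base_url=config.LLM_BASEURL,  # new: point the client at the remote Ollama host
)
```

If this went into a PR, defaulting `LLM_BASEURL` to `http://localhost:11434` (Ollama's default address) would keep existing local setups working unchanged.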
I'm not sure whether this would break anything if I made a PR... The other tweaks I made in the code surely would, though, so I'll avoid breaking this excellent project ;)
Cheers,
JC