trouble with litellm proxy #110
Comments
Hi @goodb!
The ChromaDB embedding utility is what's asking for the API key here, and that may need to be explicitly told this is an Azure endpoint since Azure will expect a value for the API version along with the other details (i.e., the URL base and the headers). Another potential workaround - use duckdb instead of chromadb with the |
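One way to sketch the suggested fix is to assemble the keyword arguments for chromadb's `OpenAIEmbeddingFunction` conditionally, adding the Azure-specific fields only when a non-OpenAI endpoint is in play. This is a minimal sketch, not curateGPT's actual code; the helper name and the exact set of Azure fields (`api_type`, `api_base`, `api_version`) are assumptions based on chromadb's embedding-function signature:

```python
def embedding_function_kwargs(model_name, api_key, api_base=None, api_version=None):
    """Hypothetical helper: build the kwargs passed to chromadb's
    OpenAIEmbeddingFunction, adding the extra fields an Azure
    endpoint expects on top of the plain-OpenAI defaults."""
    kwargs = {"model_name": model_name, "api_key": api_key}
    if api_base is not None:
        # Azure deployments need the endpoint URL, an api_type marker,
        # and an explicit API version; a stock OpenAI endpoint needs none.
        kwargs["api_type"] = "azure"
        kwargs["api_base"] = api_base
        if api_version is not None:
            kwargs["api_version"] = api_version
    return kwargs
```

The same conditional shape would also cover a litellm proxy, which presents an OpenAI-compatible base URL.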
In practice this shouldn't raise an error at all and should just attempt to use the provided model in the same way it would use an OpenAI API endpoint, so I'll look into that.
Thanks @caufieldjh! Adding that I also see `openai.APIConnectionError: Connection error` originating in the same call, from chat_agent.py at line 92. I will try duckdb next as suggested.
Does your Azure endpoint have its own API key?
Yes, the Azure endpoint has its own key and endpoint.
Trying to get duckdb loaded is proving challenging. `ontology index -p stagedb/duck.db -c ont-hp sqlite:obo:hp -D duckdb` results in an error.
OK. Will try adding a fix for using Azure endpoints with chromadb.
Thanks @caufieldjh. I think it's related. Chasing it down, it happened because the model was not loaded, so there was a None in there. It seems there are a fair number of places in the code that assume an OpenAI key / service. It would be cool if it were a little more decoupled. The litellm proxy option seems a pretty reasonable way to do that, as suggested in the readme. Maybe a little push there so that it becomes the default way of doing everything, with OpenAI as just one service underneath the umbrella, would lead to a more flexible system in the end. Just a thought.
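The None-propagation described above could be caught with an explicit guard before the embedding function is built. A minimal sketch, assuming a module-level default; the names here are hypothetical, not curateGPT's:

```python
DEFAULT_EMBEDDING_MODEL = "text-embedding-ada-002"  # assumed default, not from the source

def resolve_embedding_model(name):
    """Hypothetical guard: collection metadata with no embedding model
    yields None, which otherwise falls through to chromadb's OpenAI
    path and triggers the API-key check there. Fail over to an explicit
    default (or raise a clear error) instead of passing None along."""
    return name if name is not None else DEFAULT_EMBEDDING_MODEL
```

Failing loudly at this boundary would also make the OpenAI-coupling points easier to find and decouple later.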
I'm attempting to use an azure-hosted chatgpt model with curateGPT via the litellm proxy system.
I have the proxy running and tested, and the LLM package underlying curateGPT successfully uses it, but when I try curateGPT it keeps asking for an OpenAI key. It seems to be ignoring the `-m` option?
Thanks for any tips and for this very helpful piece of work! Go Berkeley Bop :). And thanks to @justaddcoffee for pointing me here.
(showing I do have the proxy working and my local llm talking to it)

```shell
(curategpt-py3.10) (base) bgood@MacBook-Pro-5 curategpt % llm -m azure/gpt-4o-azure "what is the capital of canada?"
The capital of Canada is Ottawa.
```
(but curategpt not working with it)

```shell
(curategpt-py3.10) (base) bgood@MacBook-Pro-5 curategpt % curategpt ask -m azure/gpt-4o-azure -c ont_cl "What neurotransmitter is released by the hippocampus?"
Traceback (most recent call last):
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/bin/curategpt", line 6, in <module>
    sys.exit(main())
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/click/core.py", line 1688, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/cli.py", line 1852, in ask
    response = chatbot.chat(query, collection=collection, conversation=conversation)
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/agents/chat_agent.py", line 92, in chat
    kb_results = list(
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/store/chromadb_adapter.py", line 426, in search
    yield from self._search(text=text, **kwargs)
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/store/chromadb_adapter.py", line 441, in _search
    yield from self.diversified_search(
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/store/chromadb_adapter.py", line 561, in diversified_search
    ef = self._embedding_function(metadata.venomx.embedding_model.name)
  File "/Users/bgood/Documents/GitHub/curategpt/src/curategpt/store/chromadb_adapter.py", line 129, in _embedding_function
    return embedding_functions.OpenAIEmbeddingFunction(
  File "/Users/bgood/Library/Caches/pypoetry/virtualenvs/curategpt-ZGKDA_LV-py3.10/lib/python3.10/site-packages/chromadb/utils/embedding_functions.py", line 160, in __init__
    raise ValueError(
ValueError: Please provide an OpenAI API key. You can get one at https://platform.openai.com/account/api-keys
```
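As a possible stopgap until the Azure path is handled, it may be enough to satisfy chromadb's key check via the environment: the `ValueError` above is only raised when `OpenAIEmbeddingFunction` can find no key at all, and the underlying openai client falls back to the `OPENAI_API_KEY` environment variable. This is an assumption about a workaround, not a tested fix, and the placeholder value is hypothetical:

```python
import os

# Workaround sketch (assumption, untested against curategpt): export a
# key for the proxy/Azure deployment before invoking curategpt, so the
# embedding function's fallback lookup finds something and the key
# check passes. The value below is a placeholder, not a real key.
os.environ.setdefault("OPENAI_API_KEY", "sk-proxy-placeholder")
```

Note this only silences the key check; the embedding requests would still go wherever the base URL points, so it does not by itself route embeddings through the proxy.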