[Feature] support for OpenAI-like mock servers & OpenAI proxy servers #14
Thanks tranhoangnguyen03 for this request; I'm trying to figure out how to do this right now too! :) Edit: It's far from perfect and the token generation limits are way too high for me, but I hope it helps someone.
+1 Also looking at how to connect to e.g. Azure OpenAI endpoints. It seems somewhat significant code changes are needed to support those endpoints. If somebody has figured this out, please share it in this issue!
I second this. For most developers in the corporate world, Azure is the only compliant way to access OpenAI models, or an open-source model deployed on their own cloud infrastructure. In either case, we simply need all OpenAI API connection options to be configurable - that's all. Ideally, just evaluate the same environment variables the OpenAI Python module does.
LlamaIndexTS should use Azure if the following environment variables are set:
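The specific variable list didn't survive in this thread. As a minimal sketch, assuming the convention of the legacy `openai` Python module (`OPENAI_API_TYPE`, `OPENAI_API_BASE`, `OPENAI_API_KEY`, `OPENAI_API_VERSION`), the detection logic could look like this; the hypothetical `should_use_azure` helper and the exact names LlamaIndexTS checks are assumptions, not confirmed by this issue:

```python
import os

# Variable names follow the legacy openai Python module's convention;
# the exact names LlamaIndexTS evaluates are not preserved in this thread.
AZURE_VARS = ("OPENAI_API_BASE", "OPENAI_API_KEY", "OPENAI_API_VERSION")

def should_use_azure(env=os.environ):
    """Return True when the API type is 'azure' and all required vars are set."""
    return (
        env.get("OPENAI_API_TYPE", "").lower() == "azure"
        and all(env.get(v) for v in AZURE_VARS)
    )
```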
Can you set these variables in … ?
But I am not able to access the embedding model; I couldn't find the respective variable in …
Any idea how I can use this?
I very much agree with this view: Azure is the only compliant way to access OpenAI models, or using an open-source model deployed on their cloud infrastructure. It would be best to just evaluate the same environment variables as the OpenAI Python module.
Is there any news?
@frazur that might be an issue in LlamaIndexTS - can you use https://ts.llamaindex.ai/modules/llms/available_llms/azure with your Azure account?
Currently, when I want to use OpenAI-like mock servers or proxy servers, there's no apparent way to manually modify openai.api_base or add headers to an openai Completion/ChatCompletion request.
The mock server requires changing openai.api_base and specifying the model name.
The proxy server requires changing openai.api_base, providing openai.api_key, specifying the model name, and adding custom headers to the request.