Supported LLM: Azure OpenAI? #52
Comments
Chat LlamaIndex can use any LLM that is supported by LlamaIndexTS; you just have to plug it in here: chat-llamaindex/app/api/llm/route.ts, lines 149 to 154 at commit 86aca07.
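For anyone looking for that spot: the route constructs the LLM that the chat engine uses, so swapping providers means changing that constructor call. Below is a minimal sketch, assuming the surrounding code roughly follows LlamaIndexTS's OpenAI class; the option names and values shown are illustrative, not the exact code at those lines.

```ts
// Rough sketch of the LLM construction around app/api/llm/route.ts lines 149-154.
// In the real route the option values come from the request body, so treat
// everything here as an assumption rather than the actual code.
import { OpenAI } from "llamaindex";

const llm = new OpenAI({
  model: "gpt-3.5-turbo", // model requested by the client
  temperature: 0.5,       // sampling temperature from the bot config
  maxTokens: 4096,        // cap on the response length
});
// Any other LLM class that LlamaIndexTS ships could be constructed here
// instead and handed to the chat engine further down in the route.
```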
Hi, I am getting the error below:
[LlamaIndex] BadRequestError: 400 Unsupported data type
    at APIError.generate (C:\LAAMA\chat-llamaindex\node_modules\openai\error.js:43:20)
Can you please help me figure out what the issue could be?
Sorry, we currently don't have an Azure example. I would start with this example https://github.com/run-llama/LlamaIndexTS/blob/main/examples/openai.ts and modify the parameters.
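To make that concrete, here is a hedged sketch of how that example might be adapted for Azure. The `azure` option and its field names (and the exact `chat` call signature) vary between LlamaIndexTS releases, so check the type definitions of the version you have installed; everything below is an assumption, not a confirmed configuration.

```ts
// Sketch only: examples/openai.ts adapted for Azure OpenAI.
// Field names under `azure` (and the chat() signature) differ across
// LlamaIndexTS versions -- verify against your installed release.
import { OpenAI } from "llamaindex";

async function main() {
  const llm = new OpenAI({
    model: "gpt-3.5-turbo",
    temperature: 0.1,
    azure: {
      apiKey: process.env.AZURE_OPENAI_KEY,                 // assumed env var name
      endpoint: "https://<your-resource>.openai.azure.com", // your Azure resource endpoint
      deploymentName: "<your-deployment>",                  // deployment to call
      apiVersion: "2023-07-01-preview",                     // example API version
    },
  });

  // Same call shape as the upstream example, just pointed at Azure.
  const response = await llm.chat({
    messages: [{ role: "user", content: "Say hello from Azure OpenAI." }],
  });
  console.log(response.message.content);
}

main().catch(console.error);
```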
You have indicated that the ChatGPT-Next-Web project was used as a starter template for this project. Can you please confirm whether LlamaIndex Chat supports Azure OpenAI?
If yes, please provide instructions for switching to Azure OpenAI.
If no, will this be treated as a feature enhancement? Is there a quick way to make the switch to Azure OpenAI?
Content of my .env.development.local file:
# Your OpenAI API key (required)
OPENAI_API_KEY=sk-xxxx
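If Azure support is wired in along the lines of the sketch above, the .env file would presumably carry the Azure settings as well. Purely as an illustration, with every AZURE_OPENAI_* variable name being hypothetical rather than something chat-llamaindex reads today, the mapping into code could look like:

```ts
// Illustration only: these AZURE_OPENAI_* variables are hypothetical and are
// not read by chat-llamaindex today; they mirror the azure option sketched above.
import { OpenAI } from "llamaindex";

const llm = new OpenAI({
  model: "gpt-3.5-turbo",
  azure: {
    apiKey: process.env.AZURE_OPENAI_KEY,
    endpoint: process.env.AZURE_OPENAI_ENDPOINT,
    deploymentName: process.env.AZURE_OPENAI_DEPLOYMENT,
    apiVersion: process.env.AZURE_OPENAI_API_VERSION,
  },
});
```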