
[Bug]: Please add LLM config demos for Gemini, OpenAI, and Claude; it would be more helpful #1630

@rudra-sah00

Description


Platform

All or multiple (please specify)

Platform details

Like this:

llm: {
  url: AGORA_AI_CONFIG.LLM.URL,
  api_key: AGORA_AI_CONFIG.LLM.API_KEY,
  style: "gemini",
  system_messages: [
    {
      role: "system",
      content: AGORA_AI_CONFIG.LLM.SYSTEM_MESSAGE
    }
  ],
  params: {
    model: AGORA_AI_CONFIG.LLM.MODEL
  },
  max_history: AGORA_AI_CONFIG.LLM.MAX_HISTORY,
  greeting_message: AGORA_AI_CONFIG.LLM.GREETING_MESSAGE,
  failure_message: AGORA_AI_CONFIG.LLM.FAILURE_MESSAGE
},
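To illustrate the kind of demo being requested, here is a sketch of per-provider variants of the same `llm` block. Field names mirror the snippet above; the endpoint URLs, model names, and the `style` value are assumptions for illustration only, not confirmed Agora-supported values, so verify them against the Conversational AI Engine docs before use.

```javascript
// Shared system prompt — placeholder content for the sketch.
const SYSTEM_MESSAGES = [
  { role: "system", content: "You are a helpful voice assistant." },
];

// OpenAI: chat completions endpoint (assumed model name).
const openaiLlm = {
  url: "https://api.openai.com/v1/chat/completions",
  api_key: "<OPENAI_API_KEY>", // placeholder — never commit real keys
  system_messages: SYSTEM_MESSAGES,
  params: { model: "gpt-4o-mini" },
  max_history: 32,
  greeting_message: "Hi, how can I help?",
  failure_message: "Sorry, something went wrong.",
};

// Gemini: OpenAI-compatible endpoint (assumption), with the "style"
// field set as in the snippet above.
const geminiLlm = {
  url: "https://generativelanguage.googleapis.com/v1beta/openai/chat/completions",
  api_key: "<GEMINI_API_KEY>",
  style: "gemini",
  system_messages: SYSTEM_MESSAGES,
  params: { model: "gemini-1.5-flash" },
  max_history: 32,
  greeting_message: "Hi, how can I help?",
  failure_message: "Sorry, something went wrong.",
};

// Claude: Anthropic messages endpoint (assumed model name).
const claudeLlm = {
  url: "https://api.anthropic.com/v1/messages",
  api_key: "<ANTHROPIC_API_KEY>",
  system_messages: SYSTEM_MESSAGES,
  params: { model: "claude-3-5-sonnet-latest" },
  max_history: 32,
  greeting_message: "Hi, how can I help?",
  failure_message: "Sorry, something went wrong.",
};
```

A documented demo along these lines for each provider would make it obvious which fields change per vendor (URL, key, model, and possibly `style`) and which stay the same.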

Product

Other

Product details

Conversational AI Engine

Business Case

It would be helpful to developers. I am getting confused: I have set up the configuration for the LLM as far as I can tell, but it still always returns an error response when I talk to an agent.

Subject Matter Expert

No response

Documentation Link

https://docs.agora.io/en/conversational-ai/rest-api/join

Scope

No response

Acceptance Criteria

  • The documentation update must be technically accurate.
  • It should be easy for users to follow and understand.
  • It should adhere to the template structure.

Additional Information

No response
