Async capabilities of chat engine #14733
Replies: 1 comment 2 replies
-
Hello @SarahSohana27! I'm here to help you with any questions or issues you have regarding the chat engine. Feel free to ask me anything! To make your chat engine asynchronous so that multiple users can use the app simultaneously, you can utilize the asynchronous methods the engine exposes, such as `achat`.
Here is an example of how you might modify your code to handle multiple users asynchronously:

```python
import asyncio

from llama_index.legacy.chat_engine.context import ContextChatEngine
from llama_index.legacy.memory import ChatMemoryBuffer
from llama_index.legacy.core.llms.llm import LLM
from llama_index.legacy.indices.base_retriever import BaseRetriever


async def handle_user(chat_engine, user_id, user_message, user_chat_history):
    # achat is the async counterpart of chat, so concurrent calls don't block each other.
    response = await chat_engine.achat(message=user_message, chat_history=user_chat_history)
    print(f"User {user_id} got response: {response.response}")


async def main():
    retriever = BaseRetriever()  # Initialize your retriever
    llm = LLM()  # Initialize your LLM
    chat_engine = ContextChatEngine.from_defaults(
        retriever=retriever,
        llm=llm,
        memory=ChatMemoryBuffer.from_defaults(),
    )

    users = [
        {"id": 1, "message": "Hello!", "history": []},
        {"id": 2, "message": "How are you?", "history": []},
        # Add more users as needed
    ]
    tasks = [
        handle_user(chat_engine, user["id"], user["message"], user["history"])
        for user in users
    ]
    await asyncio.gather(*tasks)


# Run the main function
asyncio.run(main())
```

This example demonstrates how to handle multiple users' messages asynchronously using the `achat` method.
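The concurrency pattern itself (fan out coroutines, await them with `asyncio.gather`) is independent of LlamaIndex. As a minimal self-contained sketch, here is the same structure with a hypothetical `EchoChatEngine` standing in for the real chat engine, so it runs without any retriever or LLM configured:

```python
import asyncio


class EchoChatEngine:
    """Hypothetical stand-in for a real chat engine with an awaitable achat."""

    async def achat(self, message, chat_history=None):
        # Simulate I/O latency so the two requests genuinely overlap.
        await asyncio.sleep(0.01)
        return f"echo: {message}"


async def handle_user(engine, user_id, message):
    reply = await engine.achat(message)
    return (user_id, reply)


async def main():
    engine = EchoChatEngine()
    users = [(1, "Hello!"), (2, "How are you?")]
    # gather schedules all coroutines concurrently on one event loop
    # and returns their results in the original order.
    return await asyncio.gather(
        *(handle_user(engine, uid, msg) for uid, msg in users)
    )


results = asyncio.run(main())
print(results)  # [(1, 'echo: Hello!'), (2, 'echo: How are you?')]
```

In a web app you typically would not call `asyncio.run` yourself; an async framework such as FastAPI already runs an event loop, so each request handler can simply `await chat_engine.achat(...)`.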
-
I am using a chat engine for my app. The problem is that it is not async. How do I make it async so that multiple users can use the app at the same time? Is there any function like achat for query_engine?