I am using `RAGStringQueryEngine` with a recursive retriever, leveraging Azure OpenAI and embeddings. My question is about the use of a reranker in my query engine. Is it necessary to include one? If so, can I use the same Azure "text-embedding-ada-002" embedding model for reranking? If yes, could you share some sample code I can integrate into my query engine?
Strictly speaking, a reranker is not required, but adding one to your `RAGStringQueryEngine` with a recursive retriever usually improves the relevance of the retrieved results. One caveat: "text-embedding-ada-002" is an embedding model, not a cross-encoder reranker, and LlamaIndex does not ship an `AzureRerank` postprocessor. You can, however, use your Azure embedding model to re-score retrieved content with the `SentenceEmbeddingOptimizer` node postprocessor. Here is a sketch; the deployment name, API version, and `percentile_cutoff` value are placeholders you will need to adjust:
from llama_index.core import get_response_synthesizer
from llama_index.core.postprocessor import SentenceEmbeddingOptimizer
from llama_index.core.query_engine import RetrieverQueryEngine
from llama_index.embeddings.azure_openai import AzureOpenAIEmbedding
# Azure embedding model used for re-scoring retrieved content
embed_model = AzureOpenAIEmbedding(
    model="text-embedding-ada-002",
    deployment_name="YOUR_DEPLOYMENT_NAME",
    api_key="YOUR_AZURE_API_KEY",
    azure_endpoint="YOUR_AZURE_ENDPOINT",
    api_version="2023-05-15",
)
# Embedding-based postprocessor: keeps only the sentences most similar to the query
reranker = SentenceEmbeddingOptimizer(
    embed_model=embed_model,
    percentile_cutoff=0.5,
)
response_synthesizer = get_response_synthesizer(
    response_mode="tree_summarize",
)
# Assuming `google_index` is your initialized GoogleIndex
retriever = google_index.as_retriever(similarity_top_k=5)
query_engine = RetrieverQueryEngine.from_args(
    retriever=retriever,
    response_synthesizer=response_synthesizer,
    node_postprocessors=[reranker],
)
response = query_engine.query("Which program did this author attend?")
print(response.response)
Replace the YOUR_AZURE_* placeholders with your own deployment name, API key, and endpoint.
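For intuition on what embedding-based reranking does under the hood: it simply re-orders the retrieved candidates by cosine similarity between the query vector and each candidate's vector. A minimal self-contained sketch (with toy hand-written vectors standing in for real embedding-model output) looks like this:

```python
import math

def cosine(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

def rerank(query_vec, candidates, top_n=2):
    """Re-order (text, vector) candidates by similarity to the query vector."""
    scored = [(cosine(query_vec, vec), text) for text, vec in candidates]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [text for _, text in scored[:top_n]]

# Toy vectors stand in for embeddings from a model like text-embedding-ada-002.
query = [1.0, 0.0, 0.0]
candidates = [
    ("passage about cooking", [0.0, 1.0, 0.0]),
    ("passage about the author's degree", [0.9, 0.1, 0.0]),
    ("passage about travel", [0.1, 0.9, 0.2]),
]
print(rerank(query, candidates))
# → ["passage about the author's degree", "passage about travel"]
```

A true cross-encoder reranker scores each (query, passage) pair jointly instead of comparing independently computed vectors, which is usually more accurate but slower.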
To accommodate the evaluation process for Azure embeddings, you can modify the provided code snippet to use the AzureOpenAIEmbedding class. Here is the updated code snippet: