"system_prompt is not supported for CondenseQuestionChatEngine" #14726
Unanswered
sad-mathematician asked this question in Q&A
Replies: 1 comment 1 reply
-
Yes, the error message "system_prompt is not supported for CondenseQuestionChatEngine" means that you cannot set an external system prompt with this chat engine. To instruct the model to return its response as formatted HTML, you can instead supply a custom condense prompt, for example to CondensePlusContextChatEngine:

from llama_index.legacy.prompts.base import PromptTemplate
# Note: the import path varies by llama-index version; newer releases expose
# this class as llama_index.core.chat_engine.CondensePlusContextChatEngine.
from llama_index.legacy.chat_engine import CondensePlusContextChatEngine

HTML_TEMPLATE = """\
Given a conversation (between Human and Assistant) and a follow up message from Human, \
rewrite the message to be a standalone question that captures all relevant context \
from the conversation. The response should be formatted in HTML.

<Chat History>
{chat_history}

<Follow Up Message>
{question}

<Standalone question in HTML>
"""
html_prompt = PromptTemplate(HTML_TEMPLATE)

# Initialize the CondensePlusContextChatEngine with the custom HTML prompt
chat_engine = CondensePlusContextChatEngine.from_defaults(
    retriever=retriever,
    llm=llm,
    memory=memory,
    condense_prompt=html_prompt,
    # other parameters...
)

This way, the custom condense prompt carries the HTML formatting instruction, so the engine's output should come back as HTML.
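As a quick sanity check (a sketch only; the question string below is an illustrative placeholder, and chat_engine is the engine constructed above), you can call the engine and inspect the reply for HTML markup:

# Ask an example question and check the reply for HTML tags.
response = chat_engine.chat("How do I configure the retriever?")
print(response.response)  # expect markup such as <p> or <ul> in the text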
1 reply
-
Does this mean that we cannot set an external system prompt with this chat engine? For example, if we wanted the response formatted as HTML, is there no way to set that instruction? The condense_question_prompt doesn't seem to be a system prompt that could achieve this.