Passing chat history to LLM #7456
-
Novice user here, just want to confirm I'm thinking about this correctly: when using the ChatHistory object, the conversational context is maintained even if the history variable is not directly referenced in the YAML/system prompt, as long as queries and responses are appended to the object. I'm assuming the chat history is still passed to the LLM behind the scenes on each call, and therefore the size of the history contributes to token usage. Is that correct?
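For context, here's roughly the pattern I mean (a simplified sketch, assuming the Semantic Kernel Python SDK; `chat_service` and `settings` are placeholders for my actual completion service and execution settings, and import paths may vary by SDK version):

```python
from semantic_kernel.contents import ChatHistory

history = ChatHistory()
history.add_system_message("You are a helpful assistant.")

# Each turn, I append the user's query and the model's reply to the same object.
history.add_user_message("What's the capital of France?")
history.add_assistant_message("Paris.")

history.add_user_message("And roughly how many people live there?")
# reply = await chat_service.get_chat_message_content(chat_history=history, settings=settings)
# history.add_assistant_message(str(reply))
```

My assumption is that the whole `history` object is serialized into the request on every call, so earlier turns keep counting toward prompt tokens even though the variable never appears in the YAML prompt itself.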
-
Would you mind providing some code snippets of what you're working on and referencing in the question above? We'd like to understand further before replying.
-
Yes, that's correct: everything you send in the request contributes to your token usage. To keep that down, we'd recommend a strategy such as sending only the most recent messages rather than the entire chat history.
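For example, here's a rough sketch of that idea, assuming Semantic Kernel's Python `ChatHistory` (the `trimmed_copy` helper is illustrative, not an SDK API, and it assumes your first message is the system prompt):

```python
from semantic_kernel.contents import ChatHistory

def trimmed_copy(history: ChatHistory, keep_last: int = 4) -> ChatHistory:
    """Build a smaller ChatHistory to send: the system prompt plus the most recent turns.

    Assumes the first message in `history` is the system prompt.
    """
    messages = list(history.messages)
    keep = messages[:1] + messages[1:][-keep_last:]
    trimmed = ChatHistory()
    for msg in keep:
        trimmed.add_message(msg)
    return trimmed

# Keep appending every turn to the full history for your own records,
# but pass only trimmed_copy(history) to the completion call to cap prompt tokens.
```

Keep in mind that whatever you drop is genuinely forgotten by the model, so pick a window (or add a summarization step) that preserves the context your scenario actually needs.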