Passing Metadata to LLM #2023
Unanswered
mikehudson2
asked this question in
Q&A
-
We have websites scraped to Vectara so that each Vectara document contains a web page of text and its metadata, including that page's web address. When querying Vectara with the Conversational Retrieval QA Chain and ChatOpenAI (GPT-4), the matching chunks are currently sent to GPT-4. Is it possible to also send GPT-4 the URL metadata of the page from which each chunk was extracted?
Or, to put it more generally: within Flowise, how can we send chunk metadata to GPT-4?
Replies: 1 comment 1 reply
-
Hmm, to send both the content and the URL, I don't think there is a straightforward way to do this yet, but it is something worth exploring. Perhaps we can use VectorStore To Document node -> Custom Function -> Prompt Template.
1 reply
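The Custom Function step in the suggested chain could be sketched roughly as below. This is a minimal, hypothetical example, assuming the retrieved chunks arrive as LangChain-style document objects (`{ pageContent, metadata }`) and that the page URL lives under `metadata.source`; the variable name `docs` and the metadata key are assumptions, so check what the upstream VectorStore To Document node actually outputs in your flow.

```javascript
// Sketch of a Custom Function body: combine each retrieved chunk's text
// with its source URL so both can be injected into a Prompt Template.
// Assumption: `docs` is an array of LangChain-style documents, i.e.
// objects shaped like { pageContent: string, metadata: { source?: string } }.
function formatDocsWithSources(docs) {
  return docs
    .map((doc, i) => {
      // Fall back to a placeholder when a chunk carries no URL metadata.
      const url = (doc.metadata && doc.metadata.source) || "unknown source";
      return `[${i + 1}] (${url})\n${doc.pageContent}`;
    })
    .join("\n\n");
}
```

The returned string could then be bound to a variable in the Prompt Template (for example a `{context}` placeholder), so GPT-4 sees each chunk alongside the URL it came from.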